What's the best way to eliminate "429 : Received HTTP status 429" errors?
-
My company website is built on WordPress. It receives very few crawl errors, but it does regularly receive a few (typically 1-2 per crawl) "429 : Received HTTP status 429" errors through Moz.
Based on my research, my understanding is that my server is essentially telling Moz to cool it with the requests. That means it could be doing the same for search engines' bots and even visitors, right? This raises two questions for me, which I would greatly appreciate your help with:
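(For anyone landing here later: a 429 is the server's standard "slow down" response, often sent with a Retry-After header saying how long the client should wait. Below is a minimal Python sketch, not Moz's actual code, of how a polite crawler would interpret that header; the function name is made up for illustration.)

```python
import email.utils
import time

def retry_after_seconds(header_value, now=None):
    """Interpret a Retry-After header from a 429 response.

    The header may be either a plain number of seconds or an
    HTTP-date (RFC 7231). Returns the number of seconds a polite
    client should wait, or None if no header was sent.
    """
    if header_value is None:
        return None
    try:
        # Most servers send a simple integer, e.g. "Retry-After: 120".
        return max(0, int(header_value))
    except ValueError:
        pass
    # Otherwise it is an HTTP-date; wait until that moment.
    when = email.utils.parsedate_to_datetime(header_value)
    now = time.time() if now is None else now
    return max(0.0, when.timestamp() - now)
```

A crawler that honors this value before retrying will stop triggering 429s; one that doesn't (or that ignores Crawl-delay) is what rate-limiting modules on the host are designed to catch.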
-
Are "429 : Received HTTP status 429" errors harmful for my SEO? I imagine the answer is "yes" because Moz flags them as high priority issues in my crawl report.
-
What can I do to eliminate "429 : Received HTTP status 429" errors?
Any insight you can offer is greatly appreciated!
Thanks,
Ryan
-
I have a customer that is using GoDaddy website hosting (at least according to BuiltWith) and I'm experiencing this same issue.
Any updates on this experiment from user rsigg? I'd love to know if I can remove this from my customer's robots file...
FWIW, Netrepid is a hosting provider for colocation, infrastructure and applications (website hosting being considered an application), and we would never force a crawl delay on a WordPress install!
Not hating on the other hosting service providers... #justsayin
-
I am also on the same hosting and they have not been able to help with the 429s. I have now started getting 429 errors when I attempt to log in. Definitely something wrong with WP premium hosting.
-
Interesting. I look forward to hearing your results, as my robots.txt file is also set to:
Crawl-delay: 1.
-
We host on Media Temple's Premium WordPress hosting (which I do not recommend, but that's another post for another place), and the techs there told me that it could be an issue with the robots.txt file:
"The issue may be with the settings in the robots.txt file. It looks fine to me but the "Crawl-delay" line might be causing issues. I understand. For the most part, crawlers tend to use robots.txt to determine how to crawl your site, so you may want to see if Moz requires some special settings in there to work correctly."
Ours is set to:
Crawl-delay: 1
I haven't tried changing these values yet in our file, but may experiment with this very soon. If I get results, I'll post back here as well as start a new forum thread.
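For anyone wanting to try the same experiment: one option is to drop the blanket Crawl-delay and scope it to specific bots instead. A sketch of what that robots.txt might look like (which directives your host and each crawler actually honor is worth verifying; note that Googlebot ignores Crawl-delay entirely, while Moz's rogerbot respects it):

```
# Allow all crawlers with no global delay
User-agent: *
Disallow:

# If a delay is still wanted, apply it only to specific
# crawlers rather than to everyone:
User-agent: rogerbot
Crawl-delay: 10
```

Keep in mind the 429s may be coming from server-side rate limiting rather than robots.txt at all, so changing this file alone may not make them disappear.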
-
Chase,
They ran a bunch of internal diagnostic tools on my site, and were unable to replicate the 429 errors. They ended up telling me exactly what they told you. I haven't noticed any issues with my site's rankings, or any flags in Webmaster Tools, so it looks like they are right so far. I just hate logging into Moz and seeing all those crawl errors!
-
What'd they say Ryan? Having the same issue and just contacted Godaddy who told me that basically Moz's software is pinging my client's server too frequently so Godaddy is temporarily blocking their IP. They said it's not a concern though as they would never block Google from pinging/indexing the site.
-
Many thanks - I will contact them now!
-
Contact your host and let them know about the errors. More than likely they have mod_security enabled to limit request rates. Ask them to raise the limit, and explain that you are getting 429 errors from crawlers and you do not want them.
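If the host does run ModSecurity and is willing to accept a custom rule, one approach is to exempt a known-good crawler from the rate-limiting rules by user agent. A hedged sketch (the rule id is arbitrary, and matching on User-Agent is spoofable, so treat this as a convenience rather than a security boundary):

```apache
# Skip ModSecurity processing for requests identifying as Moz's
# crawler (rogerbot), so rate-limiting rules don't return 429s.
# Rule id 1000001 is a placeholder; pick an unused id range.
SecRule REQUEST_HEADERS:User-Agent "@contains rogerbot" \
    "id:1000001,phase:1,pass,nolog,ctl:ruleEngine=Off"
```

On managed hosting you usually cannot add this yourself, which is why the practical first step is the support ticket.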