What's the best way to eliminate "429 : Received HTTP status 429" errors?
-
My company website is built on WordPress. It receives very few crawl errors, but it does regularly receive a few (typically 1-2 per crawl) "429 : Received HTTP status 429" errors through Moz.
Based on my research, my understanding is that my server is essentially telling Moz to cool it with the requests. That means it could be doing the same for search engines' bots and even visitors, right? This raises two questions for me, which I would greatly appreciate your help with:
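From what I've read, 429 is the standard "Too Many Requests" status (RFC 6585), and the response can carry a Retry-After header telling the client how long to back off. Just as a sketch of my own (the helper name below is hypothetical, not anything Moz publishes), here's how a well-behaved crawler might react to it:

```python
def backoff_seconds(status, headers, default=60):
    """How long a polite client should wait before retrying.

    A 429 response may include a Retry-After header giving a delay in
    seconds; when it is absent or unparseable, fall back to a default.
    (Retry-After can also be an HTTP-date, which this sketch ignores.)
    """
    if status != 429:
        return 0  # no throttling signal, carry on
    value = headers.get("Retry-After", "").strip()
    return int(value) if value.isdigit() else default

# A throttled response asking for a 120-second pause:
print(backoff_seconds(429, {"Retry-After": "120"}))  # 120
print(backoff_seconds(200, {}))                      # 0
```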
-
Are "429 : Received HTTP status 429" errors harmful for my SEO? I imagine the answer is "yes" because Moz flags them as high priority issues in my crawl report.
-
What can I do to eliminate "429 : Received HTTP status 429" errors?
Any insight you can offer is greatly appreciated!
Thanks,
Ryan -
-
I have a customer that is using GoDaddy website hosting (at least according to BuiltWith) and I'm experiencing this same issue.
Any updates on this experiment from user rsigg? I'd love to know if I can remove this from my customer's robots file...
FWIW, Netrepid is a hosting provider for colocation, infrastructure, and applications (website hosting being considered an application), and we would never force a crawl delay on a WordPress install!
Not hating on the other hosting service providers... #justsayin
-
I am also on the same hosting, and they have not been able to help with the 429 errors. I have now started getting 429 errors when I attempt to log in. There is definitely something wrong with WP Premium hosting.
-
Interesting. I look forward to hearing your results, as my robots.txt file is also set to:
Crawl-delay: 1
-
We host on Media Temple's Premium WordPress hosting (which I do not recommend, but that's another post for another place), and the techs there told me that it could be an issue with the robots.txt file:
"The issue may be with the settings in the robots.txt file. It looks fine to me, but the "Crawl-delay" line might be causing issues. For the most part, crawlers tend to use robots.txt to determine how to crawl your site, so you may want to see if Moz requires some special settings in there to work correctly."
Ours is set to:
Crawl-delay: 1
I haven't tried changing these values yet in our file, but may experiment with this very soon. If I get results, I'll post back here as well as start a new forum thread.
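One option I've seen suggested, which I also haven't tested yet, is to give Moz's crawler (its user agent is rogerbot) its own robots.txt group so the site-wide Crawl-delay no longer applies to it. A hypothetical sketch:

```
# Sketch only - untested. Moz's crawler gets its own group with no
# delay (an empty Disallow means "allow everything"); everyone else
# keeps the existing crawl delay.
User-agent: rogerbot
Disallow:

User-agent: *
Crawl-delay: 1
```

Since crawlers obey the most specific matching group, rogerbot would follow its own rules and ignore the delayed wildcard group.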
-
Chase,
They ran a bunch of internal diagnostic tools on my site, and were unable to replicate the 429 errors. They ended up telling me exactly what they told you. I haven't noticed any issues with my site's rankings, or any flags in Webmaster Tools, so it looks like they are right so far. I just hate logging into Moz and seeing all those crawl errors!
-
What'd they say, Ryan? I'm having the same issue and just contacted GoDaddy, who told me that basically Moz's software is pinging my client's server too frequently, so GoDaddy is temporarily blocking their IP. They said it's not a concern, though, as they would never block Google from pinging/indexing the site.
-
Many thanks - I will contact them now!
-
Contact your host and let them know about the errors. More than likely they have mod_security enabled to limit request rates. Ask them to raise the limit, and explain that you are getting 429 errors from crawlers and you do not want them.
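If your host is open to it, a cleaner fix than raising the global limit is an exemption for the crawler's user agent. A rough mod_security sketch (the rule id is arbitrary, and your host would need to adapt this to their own config rather than you applying it yourself):

```
# Sketch only: hosts normally manage these rules themselves.
# Skip mod_security processing entirely for Moz's crawler (rogerbot),
# so its requests are never counted against the rate limit.
SecRule REQUEST_HEADERS:User-Agent "@contains rogerbot" \
    "id:1000001,phase:1,pass,nolog,ctl:ruleEngine=Off"
```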