20 x '400' errors on site, but URLs work fine in browser...
-
Hi, I have a new client set up in SEOmoz and the crawl completed this morning... I am picking up 20 x '400' errors, but the pages listed in the crawl report load fine... any ideas?
example -
-
Most major robots obey crawl delays. You could check your errors in Google Webmaster Tools to see if your site is serving a lot of error pages when Google crawls.
I suspect Google is pretty smart about slowing down its crawl rate when it encounters too many errors, so it's probably safe to not include a crawl delay for Google.
-
Sorry, one last question.
Do I need to add a similar delay for Googlebot, or is this issue specific to rogerbot?
Thanks
-
Fantastic, thanks Cyrus and Tampa; you've saved me many more hours of head-scratching!!!
-
Hi Justin,
Sometimes when rogerbot crawls a site, the server and/or the content management system can get overwhelmed if roger is going too fast, which causes your site to deliver error pages as roger crawls.
If the problem persists, you might consider installing a crawl delay for roger in your robots.txt file. It would look something like this:
User-agent: rogerbot
Crawl-delay: 5

This would cause the SEOmoz crawler to wait 5 seconds before fetching each page. Then, if the problem still persists, feel free to contact the help team at help@seomoz.org.
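If you want to sanity-check the directive before deploying it, Python's standard library ships a robots.txt parser. A minimal sketch, feeding it the two lines above in-memory rather than fetching a live file:

```python
import urllib.robotparser

# Parse the robots.txt rules in-memory so we can confirm the
# syntax does what we expect before uploading the file.
rules = [
    "User-agent: rogerbot",
    "Crawl-delay: 5",
]

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules)

# crawl_delay() returns the delay in seconds for the given user agent,
# or None when no Crawl-delay directive applies to it.
print(parser.crawl_delay("rogerbot"))   # 5
print(parser.crawl_delay("googlebot"))  # None
```

Since the rule names rogerbot explicitly, other crawlers (like Googlebot) see no delay at all.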
Hope this helps! Best of luck with your SEO!
-
Thanks Tampa SEO, good advice.
Interestingly, the URL listed in SEOmoz is as follows:
www.morethansport.co.uk/brand/adidas?sortDirection=ascending&sortField=Price&category=sport and leisure
But when I look at the link in the referring page it is as follows:
/brand/adidas?sortDirection=ascending&sortField=Price&category=sport%20and%20leisure
notice the "%20" sequences instead of the spaces.
The actual URL is the one listed in SEOmoz, but even if I copy and paste the %20 version, the browser decodes the "%20" back to spaces and the page loads fine.
I still can't get the site to throw up a 400.
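For what it's worth, a literal space isn't a legal URL character, so the %20 version is actually the correctly encoded form of the same address; that may be why a strict crawler flags the raw-space version while your browser quietly fixes it. A quick sketch with Python's standard library showing the two forms are equivalent:

```python
from urllib.parse import quote, unquote

raw = "sport and leisure"

# Spaces are not legal in URLs; clients are supposed to
# percent-encode each one as %20 before sending the request.
encoded = quote(raw)
print(encoded)           # sport%20and%20leisure

# Decoding %20 gives back the original spaces, which is why
# both versions load the same page in a browser.
print(unquote(encoded))  # sport and leisure
```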
-
Just ran the example link that you provided through two independent HTTP response code checkers, and both are giving me a 200 response, i.e. the site is OK.
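If you'd rather reproduce that check yourself than rely on an online checker, a minimal sketch using Python's standard library (the URL in the comment is just a placeholder, not the actual site):

```python
import urllib.request
import urllib.error

def status_code(url: str) -> int:
    """Fetch a URL with a HEAD request and return the HTTP status code."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request) as response:
            return response.status
    except urllib.error.HTTPError as err:
        # 4xx/5xx responses raise HTTPError; the code is on the exception.
        return err.code

# Example (placeholder URL):
# print(status_code("https://www.example.com/"))
```

A 200 means the server thinks the page is fine; a 400 at this step would confirm what the crawler reported.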
This question has been asked before on here; you're definitely not the first person to run into the issue.
One way to diagnose what's going on is to dig a little deeper into the crawl report that SEOmoz generated. Download the CSV file and look at the referring link, i.e. the page on which Roger found the link. Then go to that page and check whether your CMS is doing anything weird with the way it outputs the links that you create. I recall someone back in December having the same issue, who eventually resolved it by noticing that his CMS put all sorts of weird slashes (i.e. /.../...) into the links.
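If the CSV is large, a short script can pull out just the error rows and their referrers. This is only a sketch: the column names ("URL", "HTTP Status Code", "Referrer") are hypothetical and you'd need to adjust them to match whatever headers your exported CSV actually uses:

```python
import csv

def error_referrers(path: str, status: str = "400") -> list[tuple[str, str]]:
    """Return (url, referring page) pairs for rows with the given status.

    NOTE: "URL", "HTTP Status Code", and "Referrer" are assumed column
    names; adjust them to match the headers in your actual crawl CSV.
    """
    pairs = []
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle):
            if row.get("HTTP Status Code") == status:
                pairs.append((row.get("URL", ""), row.get("Referrer", "")))
    return pairs
```

That gives you the list of pages to inspect by hand for oddly formatted links.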
Good luck!