20 x '400' errors on site, but URLs work fine in browser...
-
Hi, I have a new client set up in SEOmoz and the crawl completed this morning... I am picking up 20 x '400' errors, but the pages listed in the crawl report load fine... any ideas?
example -
-
Most major robots obey crawl delays. You could check your errors in Google Webmaster Tools to see if your site is serving a lot of error pages when Google crawls.
I suspect Google is pretty smart about slowing down its crawl rate when it encounters too many errors, so it's probably safe to not include a crawl delay for Google.
-
Sorry, one last question.
Do I need to add a similar delay for Googlebot, or is this issue specific to rogerbot?
Thanks
-
Fantastic, thanks Cyrus and Tampa, that's prevented many more hours of head scratching!!!
-
Hi Justin,
Sometimes when rogerbot crawls a site, the servers and/or the content management system can get overwhelmed if roger is going too fast, and this causes your site to deliver error pages as roger crawls.
If the problem persists, you might consider installing a crawl delay for roger in your robots.txt file. It would look something like this:
User-agent: rogerbot
Crawl-delay: 5
This would cause the SEOmoz crawlers to wait 5 seconds before fetching each page. Then, if the problem still persists, feel free to contact the help team at help@seomoz.org
Hope this helps! Best of luck with your SEO!
-
Thanks Tampa SEO, good advice.
Interestingly, the URL listed in SEOmoz is as follows:
www.morethansport.co.uk/brand/adidas?sortDirection=ascending&sortField=Price&category=sport and leisure
But when I look at the link in the referring page it is as follows:
/brand/adidas?sortDirection=ascending&sortField=Price&category=sport%20and%20leisure
Notice the "%20" encoding instead of the spaces.
The actual URL is the one listed in SEOmoz, but even if I copy and paste the "%20" version, the browser removes the encoding and the page loads fine.
I still can't get the site to throw up a 400.
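-
A side note on those two URLs: they are actually the same address. %20 is just the percent-encoded form of a space, and the browser quietly decodes it for display, which is why the encoding seems to disappear when you paste the link. If a link ever goes out with a literal, unencoded space, that form is technically invalid and a crawler may treat it differently from a browser, which encodes the space for you before sending the request. A quick sketch in Python (standard library only, using the relative link quoted above) shows the round trip:
from urllib.parse import quote, unquote

# The URL as SEOmoz lists it, with literal spaces in the category parameter
raw = "/brand/adidas?sortDirection=ascending&sortField=Price&category=sport and leisure"

# Percent-encoding turns the spaces into %20; '/', '?', '&' and '=' are left
# alone so the query-string structure stays intact
encoded = quote(raw, safe="/?&=")
print(encoded)
# /brand/adidas?sortDirection=ascending&sortField=Price&category=sport%20and%20leisure

# Decoding gives the original back, which is why the browser shows spaces again
print(unquote(encoded) == raw)  # True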
-
Just ran the example link that you provided through two independent HTTP response code checkers, and both are giving me a 200 response, i.e. the site is OK.
This question has been asked before on here; you're definitely not the first person to run into the issue.
One way to diagnose what's going on is to dig a little deeper into the crawl report that SEOmoz generated. Download the CSV file and look at the referring link, i.e. the page on which Roger found the link. Then go to that page and check whether your CMS is doing anything weird with the way it outputs the links you create. I recall someone back in December who had the same issue and eventually resolved it by noticing that his CMS put all sorts of weird slashes (i.e. /.../...) into the links.
Good luck!
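-
If it helps, the same status-code check can be scripted rather than run through a checker one URL at a time. A rough sketch in Python, using only the standard library: the user-agent string is made up, and the hard-coded URL is just the example from this thread, so you could swap in the URLs from the exported crawl CSV to test the whole list:
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

# URLs to test; replace with the full list exported from the crawl report
urls = [
    "http://www.morethansport.co.uk/brand/adidas"
    "?sortDirection=ascending&sortField=Price&category=sport%20and%20leisure",
]

for url in urls:
    try:
        req = Request(url, headers={"User-Agent": "status-check-script"})
        with urlopen(req, timeout=10) as resp:
            print(resp.status, url)      # 200 means the page was served normally
    except HTTPError as e:
        print(e.code, url)               # 4xx / 5xx responses end up here
    except URLError as e:
        print("request failed:", e.reason, url)

Bear in mind that a one-off script can still see a 200 where rogerbot saw a 400 if, as described above, the server only starts returning errors under crawl load.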