Rogerbot does not catch all existing 4XX Errors
-
Hi, after each new crawl Rogerbot presents me with new 4XX errors. Why doesn't it report them all at once?
I have a small static site. Nine crawls ago it had 10 4XX errors, so I tried to fix them all.
On the next crawl Rogerbot still found 5 errors, so I thought I hadn't fixed them all... but this has now happened many times, so before the latest crawl I checked, 101%, that I had really fixed every error. Today, although I really had corrected those 5 errors, Rogerbot digs out 2 "new" errors. So does Rogerbot not catch all the errors that have been on my site for many weeks?
Please see the screenshot of how I was chasing the errors.
-
I understand,
I am not using a CMS and the site is not very big, so I wondered why Rogerbot did not find all the 404 errors the first time, because they have been there for many months.
Holger
-
Hey Holger,
Our crawler will catch as many errors as it can. It's possible that these errors were not present, or just were not found, at the time of the earlier crawls. I'm running a crawl test to see if there's any discrepancy between your current campaign crawl and mine, just to double-check.
In general, Kyle is correct that sometimes those errors just crop up, especially if you're using any sort of CMS.
I hope that helps. I'll update here after my crawl test is done.
Cheers,
Joel.
-
Hi Holger,
4XX errors can be quite common depending on your site setup, so don't be surprised that Roger will keep returning errors for you to fix.
I would advise checking this data against GWT's own crawl error data, which you can find in Webmaster Tools under Health > Crawl Errors.
I hope that helps,
K
-
Related Questions
-
Yahoo Store Beginner with "duplicate content" errors. Can I pay for support? $$$
Hi. I have a Yahoo store that seems to have many errors. We built the site for utility, knowing NOTHING about SEO. We just started with Moz and would love to PAY someone to help get us past the beginning stages. Is there someone familiar with the Yahoo! Store format who can charge us, perhaps in hourly blocks, to walk us through possible solutions to issues? One issue we are having seems to be with our subsections, which contain the items that are the endpoints... I know of no way to label the sections anything but an "item". I'm wondering if this might be causing the "duplicate" error, because a specific item is listed both in the section and on its own page. Please help! Thom 888-567-5194
Moz Pro | TITOJAX0
-
Find a 4xx or 5xx link referenced in an SEO Crawl Report
So I just got the Crawl Diagnostics report for a client site and it came back with a number of 4xx errors and even one 5xx error. While I can find the URL that has the problem, I cannot find the pages that have the links pointing to these non-existent or problematic pages. Normally I would just search the database for the site, but in this case I don't have access to it, as the site is on a proprietary platform with no access other than to the CMS. Is there any way to get the linking URL from the report? Thanks!
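Not from the thread, but one workaround when you can't query the site's database is to crawl the site yourself from the outside and record, for each URL that returns a 4xx/5xx, the page it was linked from. A minimal Python sketch, assuming the requests and beautifulsoup4 packages; the start URL is a placeholder:

from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # placeholder: the client site's homepage
DOMAIN = urlparse(START_URL).netloc

seen, queue = set(), [START_URL]
broken = {}  # broken URL -> set of pages that link to it

while queue:
    page = queue.pop()
    if page in seen:
        continue
    seen.add(page)
    try:
        resp = requests.get(page, timeout=10)
    except requests.RequestException:
        continue
    if resp.status_code >= 400 or "text/html" not in resp.headers.get("Content-Type", ""):
        continue
    for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
        link = urljoin(page, a["href"]).split("#")[0]
        if urlparse(link).netloc != DOMAIN:
            continue  # stay on the client's domain
        try:
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            continue
        if status >= 400:
            broken.setdefault(link, set()).add(page)  # remember the referring page
        elif link not in seen:
            queue.append(link)

for url, referrers in sorted(broken.items()):
    print(url, "<-", ", ".join(sorted(referrers)))

It re-checks each link with a HEAD request, so it is slow on large sites, but for a single client site it is usually enough to pinpoint which pages carry the dead links.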
Moz Pro | farlandlee0
-
Does Rogerbot respect the robots.txt file for wildcards?
Hi All, Our robots.txt file has wildcards in it, which Googlebot recognizes. Can anyone tell me whether or not Rogerbot recognizes wildcards in the robots.txt file? We've done a Rogerbot site crawl since updating the robots.txt file and the pages that are set to disallow using the wildcards are still showing. BTW, Googlebot is not crawling these pages according to Webmaster Tools. Thanks in advance, Robert
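For anyone comparing behavior, this is the kind of wildcard syntax the question is about; Google documents support for * (any sequence of characters) and $ (end of URL). The paths below are placeholders, not the poster's actual rules:

User-agent: *
Disallow: /*?sessionid=
Disallow: /private/*
Disallow: /*.pdf$

Whether Rogerbot honors these patterns the same way Googlebot does is exactly what the question asks; the original robots.txt specification only defines prefix matching, so wildcard support has historically been crawler-specific.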
Moz Pro | AC_Pro0
-
After fixing errors can I re-crawl for diagnostics?
As I am fixing errors, will the campaign automatically update to show where I have fixed issues?
Moz Pro | eidna220
-
Will canonical tag get rid of duplicate page title errors?
I have a directory on my website, paginated in groups of 10. On page 2 of the results, the title tag is the same as on the first page, as it is on the 3rd page and so on. This is giving me duplicate page title errors. If I use rel=canonical tags on the subsequent pages, with the href pointing to the first page of my results, will my duplicate page title warnings go away? Thanks.
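For illustration only, the markup the question describes would look roughly like this in the head of page 2, 3, and so on (the directory path is a placeholder, not the poster's actual URL):

<link rel="canonical" href="https://www.example.com/directory/" />

Each paginated page would point its canonical at the first page of the results, which is the setup the question is proposing.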
Moz Pro | fourthdimensioninc0
-
Blocking all robots except rogerbot
I'm in the process of working with a site under development and wish to run the SEOmoz crawl test before we launch it publicly. Unfortunately rogerbot is reluctant to crawl the site. I've set my robots.txt to disallow all bots besides rogerbot. It currently looks like this:
User-agent: *
Disallow: /
User-agent: rogerbot
Disallow:
All pages within the site are meta tagged index,follow. The crawl report says: "Search Engine blocked by robots.txt: Yes". Am I missing something here?
Moz Pro | ignician0
-
Site Explorer reporting an error for over a week
Site Explorer is unable to display anchor text and shows this error: "Doh! Roger is still working out the kinks with the new index and is having issues untangling anchor text data. We're currently showing anchor text data from the previous index, but we will update as soon as we can."
Moz Pro | 1step2heaven120
-
The Site Explorer crawl shows errors for files/folders that do not exist.
I'm fairly certain there is ultimately something amiss on our server, but the Site Explorer report on my website (www.kpmginstitutes.com) is showing thousands of folders that do not exist. Example: for my "About Us" page (www.kpmginstitutes.com/about-us.aspx), the report shows a link: www.kpmginstitutes.com/rss/industries/404-institute/404-institute/about-us.aspx. We do have "rss", "industries", and "404-institute" folders, but they are parallel in the architecture, not sequential as indicated in the error URL. Has anyone else seen these types of errors in your Site Explorer reports?
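Not from the thread, but a quick way to sanity-check one of the reported URLs is to request it directly and look at the status code. A small Python sketch, using a URL quoted in the question above:

import requests

url = "http://www.kpmginstitutes.com/rss/industries/404-institute/404-institute/about-us.aspx"
resp = requests.head(url, allow_redirects=True, timeout=10)
# A 200 here would suggest the server resolves these compound paths (often a sign of
# relative links being re-appended on every page the crawler visits); a 404 means the
# crawler really did find links pointing at pages that don't exist.
print(resp.status_code)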
Moz Pro | dturkington0