Rogerbot does not catch all existing 4XX Errors
-
Hi, I've noticed that after each new crawl Rogerbot presents me with new 4XX errors. Why doesn't it report them all at once?
I have a small static site. Nine crawls ago it had ten 4XX errors, so I tried to fix them all.
On the next crawl Rogerbot still found 5 errors, so I assumed I had not fixed them all... but this has now happened many times, so before the latest crawl I double-checked that I really had fixed every single error. Today, although I really did correct 5 errors, Rogerbot dug out 2 "new" errors. So does Rogerbot not catch all the errors that have been on my site for many weeks?
Please see the screenshot of how I was chasing the errors.
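As a sanity check between crawls, you can verify the flagged URLs' status codes yourself. Below is a minimal sketch using only the Python standard library; the helper names and the 10-second timeout are illustrative assumptions, not anything Moz documents:

```python
# Minimal sketch: check a list of URLs and report anything returning 4xx/5xx.
# Assumptions: simple GET requests, 10-second timeout; network errors other
# than HTTP error statuses will propagate as exceptions.
from urllib import request, error

def check_url(url, opener=request.urlopen):
    """Return the HTTP status code for url; 4xx/5xx are caught and returned."""
    try:
        with opener(url, timeout=10) as resp:
            return resp.status
    except error.HTTPError as exc:
        return exc.code

def broken_urls(urls, checker=check_url):
    """Return the subset of urls whose status code is 400 or above."""
    return [u for u in urls if checker(u) >= 400]
```

Running this against the URLs from Rogerbot's CSV export tells you immediately whether a reported 4XX is still live or has already been fixed, independent of the crawl schedule.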
-
I understand,
but I am not using a CMS and the site is not very big, so I wondered why Rogerbot did not find all the 404 errors the first time, because they have been there for many months.
Holger
-
Hey Holger,
Our crawler will catch as many errors as it can. It's possible that these errors were not present, or simply were not found, at the time of the crawl. I'm running a crawl test to see if there's any discrepancy between your current campaign crawl and mine, just to double-check.
In general, Kyle is correct that sometimes those errors just crop up, especially if you're using any sort of CMS.
I hope that helps. I'll update here after my crawl test is done.
Cheers,
Joel. -
Hi Holger,
4XX errors can be quite common depending on your site setup, so don't be surprised if Roger keeps returning errors for you to fix.
I would advise checking this data against Google Webmaster Tools' own crawl error report, which you can find in Webmaster Tools under Health > Crawl Errors.
I hope that helps,
K
Related Questions
-
Account Error
Hey, I have a free 30-day Moz trial. Whenever I analyze a website that is about machines, https://lattemachinehub.com/, an error is shown. Please solve my problem.
Moz Pro | | alihamughal6930 -
Crawler errors or page load time: which affects SEO more?
Hello, I have a page with a forum, and at the moment the Moz report says it has 15.1k issues such as URL too long, meta noindex, title too long, etc. But the page also has a really slow load time of 11 seconds. I know I need to fix all those errors (I'm working on it), but what is more important for SEO: the page load time, or errors like duplicate titles? Thank you!
Moz Pro | | DanielExposito1 -
5XX (Server Error) on all URLs
Hi, I created a couple of new campaigns a few days back and waited for the initial crawl to complete. I have just checked and both are reporting 5XX (Server Error) on all the pages they tried to look at (on one site I have 110 of these; on the other it only crawled the homepage). This is very odd. I have checked both sites on my local PC, on an alternative PC, and via my Windows VPS browser, which is located in the US (I am in the UK), and everything works fine. Any idea what could be the cause of this failure to crawl? I have pasted a few examples from the report:
500 : TimeoutError http://everythingforthegirl.co.uk/index.php/accessories.html
500 : Error http://everythingforthegirl.co.uk/index.php/accessories/bags.html
500 : Error http://everythingforthegirl.co.uk/index.php/accessories/gloves.html
500 : Error http://everythingforthegirl.co.uk/index.php/accessories/purses.html
500 : TimeoutError http://everythingforthegirl.co.uk/index.php/accessories/sunglasses.html
I am extra puzzled why the messages say timeout. The server is a dedicated 8-core machine with 32 GB of RAM, and the pages load for me in about 1.2 seconds. What is the Rogerbot crawler timeout? Many thanks, Carl
Moz Pro | | GrumpyCarl0 -
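Since the report shows timeouts rather than real application errors, it can help to time the pages yourself while sending a crawler-style User-Agent, in case the server treats bots differently from browsers. A rough sketch using only the Python standard library; the 10-second budget is an assumption, as Moz does not publish Rogerbot's exact timeout:

```python
# Sketch: fetch a URL with a bot-like User-Agent and time the full download.
# Assumption: a 10-second budget stands in for the crawler's unknown timeout.
import time
from urllib import request

def timed_fetch(url, user_agent="rogerbot", timeout=30):
    """Fetch url with a crawler-style User-Agent; return (status, seconds)."""
    req = request.Request(url, headers={"User-Agent": user_agent})
    start = time.monotonic()
    with request.urlopen(req, timeout=timeout) as resp:
        resp.read()  # time the whole body, not just the headers
        status = resp.status
    return status, time.monotonic() - start

def exceeds_budget(seconds, budget=10.0):
    """True if a response took longer than the assumed crawler budget."""
    return seconds > budget
```

If the pages respond in about 1.2 seconds for a browser but a bot User-Agent gets much slower responses (or is blocked outright), a server rule that throttles crawlers is one plausible culprit.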
How to solve duplicate page title & content errors
I got a lot of errors in Duplicate Page Title - 5000. Here the result page is the same and the content is also the same; the pages differ only by the page number in the meta title. Title missing error: in the SEOmoz report I got empty values for title, meta description, meta robots, and meta refresh. But if I check the link that produced the error, it shows all the meta tags; we have added all meta tags on our site, so I don't know why I got a title missing error. 404 error: in this report, if I click the link that produced the error, it goes to the main page of our site, but the URL differs. E.g., the error link is www.example.com/buy/requirement-2-0-inmumbai-property and it automatically goes to the www.example.com page. Let me know how to solve these issues.
Moz Pro | | Rajesh.Chandran0 -
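One quick way to confirm which URLs really share an identical title is to group a crawl export by title. A minimal sketch; the `pages` dict of URL-to-title pairs is an illustrative stand-in for your exported crawl data, not Moz's actual export format:

```python
# Sketch: group crawled URLs by their <title> and keep only the duplicates.
from collections import defaultdict

def duplicate_titles(pages):
    """pages: {url: title}. Return {title: [urls]} for titles used twice or more."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}
```

If paginated results differ only by a page number in the title, they will not appear here, which helps separate true duplicates from pagination noise.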
Error on duplicated content, but when checking it shouldn't be possible
Dear all, every week I look at the different crawl reports for our website. Since the start of my SEOmoz membership, the errors for duplicated content and duplicated titles have been rising. But if I export the .csv file, look in more detail, and select a page which is marked as duplicated content, a canonical actually exists on that page, so it shouldn't be a warning. I have no idea what the issue could be. For example, these pages are marked as duplicated content:
http://www.zylom.com/es/descargar-juegos/3-en-raya/?sortby=2
http://www.zylom.com/es/descargar-juegos/3-en-raya/?startnumber=60&sortby=2
http://www.zylom.com/es/descargar-juegos/3-en-raya/?startnumber=80&sortby=2
The parameters after the '?' (question mark) are necessary for our internal system. To overcome duplicated content, we made sure a canonical tag is placed on every page with parameters, and the main page is http://www.zylom.com/es/descargar-juegos/3-en-raya/, but it doesn't seem to work, because my error warnings are still rising. Please advise me. Kind regards, Ms Letty van Eembergen
Moz Pro | | Letty0 -
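To verify that the canonical tag is actually being emitted on the parameterised pages, you can parse each fetched page with the standard library's `html.parser`. A minimal sketch, assuming the canonical is a plain `<link rel="canonical">` in the HTML head:

```python
# Sketch: extract the rel="canonical" href from an HTML document, if present.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of any <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            d = dict(attrs)
            if d.get("rel") == "canonical":
                self.canonical = d.get("href")

def find_canonical(html):
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical
```

Checking the parameterised URLs this way confirms the tag is live; note that crawl reports may still flag canonicalised pages for a while after the tag is added, so a lag alone does not prove the fix failed.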
4xx status codes on pages that can be accessed
All the errors are because of the Danish letters "åøæ" in the URLs. But I can access all the pages, therefore the errors aren't true errors! Why is it an issue? Google can read "øæå", so why can't SEOmoz?
Moz Pro | | seopeter290 -
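Non-ASCII characters such as "åøæ" work in a browser's address bar, but per RFC 3986 they must be percent-encoded in the URL a crawler actually fetches, which is one plausible reason the pages open for a human but trip up a bot. A small sketch of the encoding using the Python standard library:

```python
# Sketch: percent-encode non-ASCII characters in a URL's path (RFC 3986).
# Assumption: only the path needs encoding; query strings are left untouched.
from urllib.parse import quote, urlsplit, urlunsplit

def encode_path(url):
    """Return url with non-ASCII path characters percent-encoded as UTF-8."""
    parts = urlsplit(url)
    return urlunsplit(parts._replace(path=quote(parts.path, safe="/%")))
```

If your internal links emit the raw "åøæ" form, serving (or linking) the percent-encoded form instead should make the URLs unambiguous for every crawler.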
Why am I getting 400 client errors on pages that work?
Hi, I just ran the initial crawl on my domain and I seem to have 80 400-class client errors. However, when I visit the URLs, the pages load fine. Any ideas why this is happening and how I can resolve the problem?
Moz Pro | | moesian0