Is there any way to view crawl errors historically?
-
One of the websites we monitor has been getting a high number of duplicate page titles. As we work through the pages, we see changes, and the number of duplicate page titles has been decreasing.
Lately, however, the number of duplicate page titles has gone up again.
I wanted to ask if there's any way to view the new errors and the old errors separately, or sorted in a way that can help me identify why we are getting new page crawl errors.
Any advice would be great.
Thanks!
-
Do you have any server logs? Those can give you the information you need.
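If you do have raw access logs, you can pull crawl errors straight out of them by filtering for Googlebot requests that came back with 4xx status codes. A minimal sketch, assuming a combined-format Apache/Nginx log (the log path is a placeholder):

```python
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # placeholder -- point at your own log

# Matches the standard "combined" log format:
# ip - - [time] "METHOD /path HTTP/1.1" status size "referer" "user-agent"
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

errors = Counter()
with open(LOG_PATH) as log:
    for line in log:
        m = LINE_RE.match(line)
        if not m:
            continue
        # Keep only Googlebot requests that returned a 4xx error.
        if "Googlebot" in m.group("agent") and m.group("status").startswith("4"):
            errors[(m.group("status"), m.group("path"))] += 1

for (status, path), count in errors.most_common(20):
    print(f"{count:5d}  {status}  {path}")
```

Because logs are timestamped, you can run this over date-bounded slices of the log to see exactly when a given error first appeared.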
Google Webmaster Tools keeps errors from a couple of years back, at least the ones not yet fixed. It might also be that you are working on the internal link structure and improving the crawling, or that you recently submitted a more up-to-date sitemap, which surfaced old errors.
How do you currently track the errors?
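One simple way to separate new errors from old ones, whatever tool you use: keep the error export from every crawl and diff consecutive exports. A rough sketch, assuming two CSV exports that share a "URL" column (the file names and column name are assumptions):

```python
import csv

def error_urls(path):
    """Read the set of error URLs from a crawl export (assumes a 'URL' column)."""
    with open(path, newline="") as f:
        return {row["URL"] for row in csv.DictReader(f)}

old = error_urls("crawl_errors_previous.csv")  # placeholder file names
new = error_urls("crawl_errors_latest.csv")

print("New errors since the previous crawl:")
for url in sorted(new - old):
    print("  ", url)

print("Errors that disappeared since the previous crawl:")
for url in sorted(old - new):
    print("  ", url)
```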
Related Questions
-
Is there a way for me to find out how a keyword would rank if it were on a specific site?
For example, let's say that XYZ.com does not have the keyword "ABC". Is there a way for me to find out how the keyword "ABC" would rank if it were on XYZ.com?
Moz Pro | TurboH
-
Rogerbot does not catch all existing 4XX Errors
Hi, after each new crawl Rogerbot presents me with new 4XX errors, so why doesn't it report them all at once? I have a small static site that had ten 4XX errors nine crawls ago, and I tried to fix them all. On the next crawl Rogerbot still found five errors, so I thought I hadn't fixed them all... but this has now happened many times, so before the latest crawl I checked that I had really fixed all the errors, 101%. Today, although I really corrected five errors, Rogerbot dug out two "new" errors. So does Rogerbot not catch all the errors that have been on my site for many weeks? Please see the screenshot of how I was chasing the errors 😉 404.png
Moz Pro | inlinear
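A crawler only reports an error once it has reached a page that links to it, so new 4XX errors can keep surfacing over several crawls as discovery deepens. On a small static site you can enumerate every broken internal link in a single pass with a tiny crawler of your own; a rough sketch (the start URL is a placeholder, and this is in no way Rogerbot's actual implementation):

```python
import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

START = "https://example.com/"  # placeholder -- your site's root URL

class LinkParser(HTMLParser):
    """Collects href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

seen, queue, broken = set(), [START], {}
while queue:
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    try:
        with urllib.request.urlopen(url) as resp:
            if "html" not in resp.headers.get("Content-Type", ""):
                continue
            html = resp.read().decode("utf-8", errors="replace")
    except urllib.error.HTTPError as err:
        if 400 <= err.code < 500:
            broken[url] = err.code  # record the 4xx and move on
        continue
    except urllib.error.URLError:
        continue  # unreachable URL; skip it
    parser = LinkParser()
    parser.feed(html)
    for link in parser.links:
        absolute = urljoin(url, link).split("#")[0]
        if urlparse(absolute).netloc == urlparse(START).netloc:
            queue.append(absolute)  # stay on the same host

for url, code in sorted(broken.items()):
    print(code, url)
```

-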
Crawl Diagnostics 403 on home page...
In the crawl diagnostics it says oursite.com/ has a 403. It doesn't say what's causing it, but it mentions there is no robots.txt. There is a robots.txt, and I see no problems. How can I find out more information about this error?
Moz Pro | martJ
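One way to get more information: request the page and robots.txt yourself with a crawler-style user agent and compare the status codes against what a normal browser sees. A rough sketch; the domain and user-agent string below are placeholders:

```python
import urllib.error
import urllib.request

SITE = "https://oursite.com"  # placeholder -- the affected domain
UA = "Mozilla/5.0 (compatible; examplebot/1.0)"  # illustrative crawler UA

for path in ("/", "/robots.txt"):
    req = urllib.request.Request(SITE + path, headers={"User-Agent": UA})
    try:
        with urllib.request.urlopen(req) as resp:
            print(path, "->", resp.status)
    except urllib.error.HTTPError as err:
        # A 403 here but not in a normal browser points to UA- or IP-based blocking.
        print(path, "->", err.code)
```

A 403 that only appears for crawler user agents usually means a firewall, CDN rule, or .htaccess directive is blocking bots, rather than anything in robots.txt itself.

-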
Question #3) My last question has to do with some SEOmoz crawl diagnostics
I recently fixed (or well, I am asking to make sure this was the right thing to do, in my first question posted a few minutes ago) a problem where all of my internal main sidebar category pages were linking using https://, which to my knowledge means SECURE pages. Anyway, OSE and Google seem not to be recognizing the link juice, and my rank fell for one of my main keywords by two positions about a week after I made the fix to have the pages be indexable. Making my pages properly linked can't be a bad thing, right? That's what I said. So I looked deeper, and my crawl diagnostics reports showed a MASSIVE reduction in warnings (about 3,000 301 redirects were removed by changing the https:// to http://, because all the secure pages were redirecting to the regular http:// structure) and an INCREASE in Duplicate Page Titles and Temporary redirects... Could that have been the reason the rank dropped? I think I am going to fix all the Duplicate Page Title problems tonight, but still, I am a little confused as to why such a major fix didn't help and appeared to hurt me. I feel like it hurt the rank not because of what I did, but because what I did caused a few extra redirects and opened the doors for the search engine to discover more pages that had problems (which could have triggered an algo that says: hey, these people have too many duplicate problems). Any thoughts will be GREATLY appreciated, thumbed, thanked, and marked as best answers! Thanks in advance for your time, Tyler A.
Moz Pro | TylerAbernethy
-
Third crawl of my sites back to 250 pages
Hi all, I've been waiting some days for the third crawl of my sites, but SEOmoz only crawled 277 pages. The following appeared on my crawl report: "Pages Crawled: 277 | Limit: 250". My last 2 crawls had a limit of about 10K. Any idea? Kind regards, Simon.
Moz Pro | Aureka
-
Links & page authority crawl
I see the links and page authority have not been updated in over a month... does anyone know how often they get updated?
Moz Pro | nazmiyal
-
Duplicate page error from SEOmoz
SEOmoz's Crawl Diagnostics is complaining about a duplicate page error. I'm trying to use a rel=canonical but maybe I'm not doing it right. This page is the original, definitive version of the content: https://www.borntosell.com/covered-call-newsletter/sent-2011-10-01 This page is an alias that points to it (each month the alias is changed to point to the then-current issue): https://www.borntosell.com/covered-call-newsletter/latest-issue The alias page contains a rel=canonical tag pointing to the original issue (the tag is also updated each month when a new issue comes out) in the <head> section. Is that not correct? Is the https (vs. http) messing something up? Thanks!
Moz Pro | scanlin
-
Question about when new crawls start
Hi everyone, I'm currently using the trial of SEOmoz and I absolutely love what I'm seeing. However, I have 2 different websites (one has over 10,000 pages and one has about 40 pages). I've noticed that the smaller website is crawled every few days; the larger site, however, hasn't been crawled in a few days. Although both campaigns state that the sites won't be crawled until next Monday, is there any way to get the crawl to start sooner on the large site? The reason I ask is that I've implemented some changes that will likely decrease the number of pages that are crawled, based on the recommendations on this site. So, I'm excited to see the potential changes. Thanks, Brian
Moz Pro | beeneeb