Has anyone else experienced a spike in crawl errors?
-
Hi,
Since our sites were last crawled by SEOmoz, they have all shown a spike in errors (mainly duplicate page titles and duplicate content).
We haven't changed anything in the structure of the sites, but they all use the same content management system.
The image is an example of what we are seeing across all our sites built on that system.
Is anyone else experiencing anything similar? Or does anyone know of any changes SEOmoz has implemented that may be affecting this?
Anthony.
-
Thanks for all your replies.
We haven't changed anything on any of the sites. We use our own CMS which has not changed either.
Webmaster Tools doesn't show the same errors as SEOmoz.
We appear to be in the same situation as Mike. We know we have duplicate titles and content, but we took care of our duplicate issues using canonical and noindex tags, which drastically reduced our errors. For some reason, SEOmoz doesn't seem to have paid heed to them on its latest crawl.
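For anyone following along, these are the two tags we mean (the URL here is just a placeholder, not one of our actual pages):

```html
<!-- In the <head> of a duplicate variant, pointing at the original page: -->
<link rel="canonical" href="http://www.example.com/original-page" />

<!-- Or, on pages that shouldn't be indexed at all: -->
<meta name="robots" content="noindex, follow" />
```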
Thanks Mike. At least we are not on our own.
Maybe I should see whether this is rectified after the next SEOmoz crawl before I pursue this any further?
-
This leads me to a problem then. As per Dave (the author of the article), "using canonical tags will result in duplicate errors being suppressed. If one page refers to another as a duplicate, then that pair will not be reported as duplicates. Also, if two pages both refer to the same third page as their canonical, then they will not be reported as duplicates of each other, either."
But now that this change has gone into effect I have 2000+ more duplicate content errors appearing and they are all pages with rel="canonical" pointing to the original page. So, as he stated earlier in the post this has caused "the most negative customer experience we anticipate: having a behind-the-scenes change of our duplicate detection heuristic causing a sudden rash of incorrect "duplicate page" errors to appear for no apparent good reason."
Is this something that will eventually correct itself or is this something that will need tweaking of the new detection method?
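For anyone trying to reason about the quoted rule, here is a toy sketch of the two suppression cases Dave describes. This is only an illustration of the stated logic, not Moz's actual code, and the URLs are made up:

```python
def suppressed(page_a, page_b, canonical):
    """Return True if a duplicate report for (page_a, page_b) should be
    suppressed under the rule quoted above.

    `canonical` maps each URL to the URL its rel="canonical" tag points at;
    pages without a canonical tag are treated as pointing at themselves.
    """
    a = canonical.get(page_a, page_a)
    b = canonical.get(page_b, page_b)
    # Case 1: one page names the other as its canonical.
    if a == page_b or b == page_a:
        return True
    # Case 2: both pages point at the same third page.
    if a == b and a not in (page_a, page_b):
        return True
    return False
```

By that sketch, a pair of sorted/print variants that both canonicalize to the same original should never show up as duplicates of each other, which is exactly what seems to have stopped working in the latest crawl.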
-
We did change the way we detected duplicate content earlier this month. Here's a blog post about it at http://www.seomoz.org/blog/visualizing-duplicate-web-pages.
Hope this helps explain things for you! Let me know if you have any more questions.
-
I saw a huge spike after the last crawl. In my case, the canonicals we set on our site months ago to handle some duplicate content issues appear not to be seen by SEOmoz's crawl. Yet when I check for duplicate title and meta issues in Webmaster Tools, I don't see the offending pages that SEOmoz is showing me. That leads me to believe something is happening with either how SEOmoz is reporting or how its bot is crawling.
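When the two tools disagree like this, a quick independent check is to group your own crawl's (URL, title) pairs and see which pages genuinely share a title. A minimal sketch; the URLs and titles below are made up for illustration:

```python
from collections import defaultdict

def duplicate_title_groups(pages):
    """Given (url, title) pairs, return only the titles shared by
    more than one URL, mapped to the URLs that share them."""
    groups = defaultdict(list)
    for url, title in pages:
        groups[title.strip().lower()].append(url)
    return {t: urls for t, urls in groups.items() if len(urls) > 1}

# Hypothetical crawl output:
pages = [
    ("/widgets", "Widgets | Example Store"),
    ("/widgets?sort=price", "Widgets | Example Store"),
    ("/about", "About Us | Example Store"),
]
```

Whichever tool's list matches what you find this way is the one reporting correctly.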
-
What CMS are you using?
Did you add any menus to your home or sub-pages (i.e. footer menus or anything like that)?
Have you gone into the Errors and seen which pages are being duplicated?
Have you implemented rel=canonical on the pages?
Is your CMS creating Titles for you or are they manually created?
Have you checked WMT to see if the duplicate issue is there too? (Under HTML Improvements.)
-
No spikes in either of our campaigns.
You said that yours were related to duplicate page titles/content, which likely means your CMS is generating duplicate pages. It could be related to reviews, sorting, comments, etc.
Have you had a chance to research the errors and see if those pages actually exist? We had an issue with osCommerce and page sorting causing this same problem; we fixed it by implementing rel="canonical" tags.
Hope that helps