Has anyone else experienced a spike in crawl errors?
-
Hi,
Since our sites were last crawled in SEOmoz, they have all been showing a spike in errors (mainly duplicate page titles and duplicate content).
We haven't changed anything in the structure of the sites, but they all use the same content management system.
The image is an example of what we are witnessing for all our sites based on the same system.
Is anyone else experiencing anything similar? Or does anyone know of any changes that SEOmoz has implemented which may be affecting this?
Anthony.
-
Thanks for all your replies.
We haven't changed anything on any of the sites. We use our own CMS which has not changed either.
Webmaster Tools doesn't show the same errors as SEOmoz.
We appear to be in the same situation as Mike. We know that we have duplicate titles and content, but we have taken care of our duplicate issues using canonical and noindex tags, which drastically reduced our errors. For some reason SEOmoz doesn't seem to have paid heed to them on its latest crawl.
Thanks Mike. At least we are not on our own.
Maybe I should see if this is rectified after the next SEOmoz crawl before I pursue this any further?
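For reference, the canonical and noindex tags mentioned above look like this in a page's head (the URLs here are illustrative examples, not our actual pages):

```html
<!-- On a duplicate page, point search engines at the original (illustrative URL) -->
<link rel="canonical" href="http://www.example.com/original-page" />

<!-- Or, to keep a page out of the index entirely -->
<meta name="robots" content="noindex, follow" />
```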
-
This leads me to a problem then. As per Dave (the author of the article), "using canonical tags will result in duplicate errors being suppressed. If one page refers to another as a duplicate, then that pair will not be reported as duplicates. Also, if two pages both refer to the same third page as their canonical, then they will not be reported as duplicates of each other, either."
But now that this change has gone into effect, I have 2000+ more duplicate content errors appearing, and they are all pages with rel="canonical" pointing to the original page. So, as he stated earlier in the post, this has caused "the most negative customer experience we anticipate: having a behind-the-scenes change of our duplicate detection heuristic causing a sudden rash of incorrect "duplicate page" errors to appear for no apparent good reason."
Is this something that will eventually correct itself or is this something that will need tweaking of the new detection method?
-
We did change the way we detected duplicate content earlier this month. Here's a blog post about it at http://www.seomoz.org/blog/visualizing-duplicate-web-pages.
Hope this helps explain things for you! Let me know if you have any more questions.
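The linked post covers the details of the change. As a generic illustration of how near-duplicate detection of this kind works (a simplified sketch, not Moz's actual algorithm), pages can be broken into overlapping word "shingles" and compared by Jaccard similarity:

```python
def shingles(text, k=4):
    """Break text into overlapping k-word shingles."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(a, b):
    """Jaccard similarity of two pages' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two hypothetical product pages that differ by a single word:
page_a = "acme widgets red widget product page with free shipping on all orders"
page_b = "acme widgets blue widget product page with free shipping on all orders"

print(jaccard_similarity(page_a, page_b))  # → 0.5
```

A crawler flags a pair as duplicates when the score crosses some threshold, so changing the threshold or the fingerprinting scheme can suddenly reclassify many pages, which is consistent with the spike described above.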
-
I saw a huge spike after the last crawl. In my case, the canonicals we set on our site months ago to handle some duplicate content issues appear not to be seen by SEOmoz's crawl. Yet when I check for duplicate title and meta issues in Webmaster Tools, I don't see the offending pages that SEOmoz is showing me. That leads me to believe something is happening with either how SEOmoz is reporting or how its bot is crawling.
-
What CMS are you using?
Did you add any menus to your home or sub-pages (i.e. footer menus or anything like that)?
Have you gone into the Errors and seen which pages are being duplicated?
Have you implemented rel=canonical on the pages?
Is your CMS creating titles for you, or are they manually created?
Have you checked WMT to see if the duplicate issue is there too (under HTML Improvements)?
-
No spikes in either of our campaigns.
You said that yours were related to duplicate page titles/content, which likely means your CMS is generating duplicate pages. It could be related to reviews, sorting, comments, etc.
Have you had a chance to research the errors and see if those pages actually exist? We had an issue with osCommerce and page sorting causing this same problem; we fixed it by implementing rel=canonical tags.
Hope that helps
Related Questions
-
Site Crawl Stalled and Can't Restart
In my GreenSeed campaign, the site crawl continues to say "in progress." I can't figure out how to stop it or how to restart the site crawl. Can you please help?
-
Block Moz (or any other robot) from crawling pages with specific URLs
Hello! Moz reports that my site has around 380 duplicate page content issues. Most of them come from dynamically generated URLs that have some specific parameters. I have sorted this out for Google in Webmaster Tools (the new Google Search Console) by blocking the pages with these parameters. However, Moz is still reporting the same number of duplicate content pages and, to stop it, I know I must use robots.txt. The trick is that I don't want to block every page, just the pages with specific parameters. I want to do this because among these 380 pages there are some other pages with no parameters (or different parameters) that I need to take care of. Basically, I need to clean this list to be able to use the feature properly in the future. I have read through the Moz forums and found a few topics related to this, but there is no clear answer on how to block only pages with specific URLs. Therefore, I have done my research and come up with these lines for robots.txt:

User-agent: dotbot
Disallow: /*numberOfStars=0

User-agent: rogerbot
Disallow: /*numberOfStars=0

My questions: 1. Are the above lines correct, and would they block Moz (dotbot and rogerbot) from crawling only pages that have the numberOfStars=0 parameter in their URLs, leaving other pages intact? 2. Do I need an empty line between the two groups (between "Disallow: /*numberOfStars=0" and "User-agent: rogerbot"), or does it even matter? I think this would help many people, as there is no clear answer on how to block crawling of only pages with specific URLs. Moreover, this should be valid for any robot out there. Thank you for your help!
-
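Since the wildcard is the crux of the question, a pattern like /*numberOfStars=0 can be sanity-checked by translating it to a regex, treating * as "any sequence of characters" and $ as end-of-URL. This is a generic sketch of wildcard semantics; whether a given crawler actually honors * is up to that crawler:

```python
import re

def robots_rule_matches(pattern, path):
    """Check a URL path against a robots.txt Disallow pattern,
    with '*' meaning any characters and a trailing '$' anchoring the end."""
    regex = re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(regex, path) is not None

# Pages with the parameter are matched (i.e. would be blocked)...
print(robots_rule_matches("/*numberOfStars=0", "/shop?sort=price&numberOfStars=0"))  # → True
# ...while pages without it are left intact.
print(robots_rule_matches("/*numberOfStars=0", "/shop?sort=price"))  # → False
```

On the second question: robots.txt groups are conventionally separated by a blank line, and most parsers expect that, so keeping the empty line between the dotbot and rogerbot records is the safe choice.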
Webmaster Tools shows mystery errors that Moz does not
One of my campaigns is doing great in the sense that the website has been running fault-free for a few months now. Great, of course! But in Google Webmaster Tools, errors keep coming in showing older media documents and pages, and it does not say where they are from. Probably this is more a Google question, but I thought I'd try to find some answers here first. I would appreciate any suggestions and help. Monique
-
Duplicate content error?
I am getting a duplicate content error for the following pages: http://www.bluelinkerp.com/products/accounting/index.asp http://www.bluelinkerp.com/products/accounting/ But, of course, the 2nd link is just an automatic redirect to the index file, is it not? Why does it think these are different URLs? See image.
-
Slowing down SEOmoz Crawl Rate
Is there a way to slow down the SEOmoz crawl rate? My site is pretty huge, and I'm getting 10k pages crawled every week, which is great. However, I sometimes get multiple page requests in one second, which slows down my site a bit. If this feature exists I couldn't find it; if it doesn't, it would be a great one to have, similar to how Googlebot does it. Thanks.
-
Getting Redirect Loop and Oauth Error When Adding Facebook Page
Hi all, I keep getting the following error when trying to add my Facebook page. It worked fine in the past and has suddenly stopped working: The webpage at https://graph.facebook.com/oauth/authorize?client_id=142287725855094&redirect_uri=http%3A%2F%2Fpro.seomoz.org%2Fcampaigns%2F173488%2Fsocial%2Fcreate%2Ffacebook%2F127833296954.html&scope=read_stream%2Cuser_videos%2Cuser_photos%2Cuser_photo_video_tags%2Cmanage_pages%2Cread_insights has resulted in too many redirects. Clearing your cookies for this site or allowing third-party cookies may fix the problem. If not, it is possibly a server configuration issue and not a problem with your computer. I've tried clearing the cache and deleting cookies. Any other ideas I should try? Thanks!
-
Has anyone else not had an SEOmoz crawl since Dec 22?
Before the holidays, I completed a website redesign on an eCommerce website. One of the reasons for this was duplicate content (product descriptions on two pages); the new design has removed all of it. I took a look at my Crawl Diagnostics Summary this morning and this is what I saw: Last Crawl Completed: Dec. 15th, 2011. Next Crawl Starts: Dec. 22nd, 2011. Thinking it might have something to do with the holidays. I would like to see this data as soon as possible, though. Is there a way I can request a crawl from SEOmoz? Thanks, John Parker
-
Schedule crawls for 2 subdomains every 24 hours
I saw at this link: http://pro.seomoz.org/tools/crawl-test "As a PRO member, you can schedule crawls for 2 subdomains every 24 hours, and you'll get up to 3,000 pages crawled per subdomain." However, I am having trouble finding where to schedule this 24-hour crawl in my Pro dashboard. I did not see the option for this setting in the Crawl Diagnostics tab or in the campaign settings section from the dashboard home page. Can you help? Thanks! Michael