SEOmoz suddenly reporting duplicate content with no changes???
-
I am told the crawler has been updated and wanted to know if anyone else is seeing the same thing I am.
SEOmoz reports show many months with no duplicate content problems. As of last week, though, a little over a thousand pages are reported as duplicate content errors.
Checking these pages, I find similar content (unchanged) with keywords that are definitely different. Many of these pages rank well in Google, but SEOmoz is calling them out as duplicate content. Is SEOmoz attempting to closely imitate Google's perspective in this matter and therefore telling me that I need to seriously change the similar content?
Anyone else seeing something like this?
-
Hi
We see an extreme rise in duplicate content on our site too. If the sensitivity is adjusted, will these graphs come down again?
What is your opinion on how Google sees a webshop with lots of products and filter options? Our site www.dmlights.com/massive, for example, can have a lot of filtering, but we try to counter this in Webmaster Tools with the URL parameters.
Do you suggest adapting this for good SEO?
Wondering about your opinions. Thanks.
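For what it's worth, the filter-parameter problem in the question can be sketched in code. A minimal illustration (the parameter names `color`, `sort`, and `page_size` are hypothetical, not taken from dmlights.com):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical filter parameters that produce duplicate-looking URLs.
# A crawler that ignores them sees one canonical page; one that does
# not sees a separate URL for every filter combination.
IGNORED_PARAMS = {"color", "sort", "page_size"}

def canonicalize(url: str) -> str:
    """Drop known filter parameters so variant URLs collapse to one."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunparse(parts._replace(query=urlencode(sorted(kept))))

variants = [
    "https://www.example.com/massive?color=white&sort=price",
    "https://www.example.com/massive?sort=price",
    "https://www.example.com/massive",
]
print({canonicalize(u) for u in variants})  # all three collapse to one URL
```

This is roughly what the Webmaster Tools URL-parameter setting asks Google to do for you: treat the filtered variants as one page rather than many.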
-
Hey Scott,
Again, we're sorry about the odd jump in duplicate content errors!
We just launched a new crawler, and it is currently extremely sensitive to duplicate content. As of now, we are picking up duplicate pages on your domain via:
https clones of URLs
Some pages have a “/” trailing after the URL and some don’t
We are also ignoring some rel=canonical directives
This is an issue that other users are seeing with their crawls. Our engineers have made some changes to scale back the crawler's sensitivity to these issues, and you should see the changes within a week or two.
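The first two sources listed above are URL-normalization issues: a crawler that does not fold protocol and trailing-slash variants together counts each variant as a separate page. A minimal sketch, using example.com as a placeholder domain:

```python
from urllib.parse import urlparse, urlunparse

def normalize(url: str) -> str:
    """Treat http/https clones and trailing-slash variants as one page."""
    parts = urlparse(url)
    path = parts.path.rstrip("/") or "/"  # "/about/" -> "/about", keep bare "/"
    return urlunparse(parts._replace(scheme="https", path=path))

pages = [
    "http://example.com/about",
    "https://example.com/about/",
    "https://example.com/about",
]
# Without normalization a crawler counts three pages; with it, one.
print(len({normalize(p) for p in pages}))  # 1
```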
We're really sorry for the confusion.
Best of Luck,
Chiaryn
-
Two good suggestions so far, and I had checked both. Thanks KJ Rogers and Ryan Kent.
This is starting to look like it boils down to how much the new SEOmoz crawler sees content in the same way that Google does.
We did not make any site-wide changes, and the URLs identified as duplicate in the report are valid URLs that actually hold similar content (keywords and so forth were changed for each version of a slightly different product via an Excel CONCATENATE formula used to build the content). We have actually seen these pages climb in rank over the months since the content was added.
So, like I said, the sudden identification of these as duplicate by the moz crawler is suspicious to me. Not sure it sees things the way Google does.
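A common way duplicate detectors score "similar content with different keywords" is shingle overlap: word n-grams compared with Jaccard similarity. A rough sketch with invented product copy (real crawlers use more elaborate fingerprinting, such as simhash):

```python
def shingles(text: str, n: int = 3) -> set:
    """Word n-grams ('shingles') used for near-duplicate detection."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str) -> float:
    """Fraction of shingles the two texts share."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

# Invented templated product copy: only the wattage keyword differs.
page_a = "Buy the Acme 40W pendant lamp with a brushed steel finish today"
page_b = "Buy the Acme 60W pendant lamp with a brushed steel finish today"
score = jaccard(page_a, page_b)
print(round(score, 2))  # one changed word still knocks out several shingles
```

Where a crawler draws the flag-as-duplicate threshold on a score like this is exactly the sensitivity question at issue in this thread.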
-
Without examining your site and the pages involved it is not possible for me to share feedback.
Is it possible you made any recent site-wide changes? Changes to your header, navigation, footer or sidebar could have pushed you past a certain threshold of duplicate content, which triggered a flag.
-
I got the same thing last week. I later found out that mine, using dynamic content on the same page, had special characters in the URL which were taking crawlers to an error page. The error page was showing a list of pages with the URLs capitalized. I was able to fix some of them, but it scared the heck out of me.
I had to run a crawl test from SEOmoz to figure out what was going on. Perhaps you have something similar?
Related Questions
-
How much content is duplicate content? Differentiate between website pages, help-guides and blog-posts.
Hi all, I wonder whether duplicate content is the main reason behind our ranking drop. We have multiple pages on the same topic (not exactly the same content; not even 30% similar) spread across different page types: website pages (product info), blog posts, and help guides. This happens with many websites, and I wonder if there is any specific way we need to differentiate the content. Does Google tell the difference between website pages and blog posts on the same topic? Any good reference about this? Thanks
Algorithm Updates | vtmoz
-
How important is fresh content?
Let's say the website you are working on has covered most of the important topics on your subject. How important is it that you continue to add content when there really may not be much left that is relevant to your users? Can a site continue to rank well if nothing new is added for a year but it continues to get good-quality links?
Algorithm Updates | DemiGR
-
Is it still a rule that Google will only index pages up to three tiers deep? Or has this changed?
I haven't looked into this in a while, it used to be that you didn't want to bury pages beyond three clicks from the main page. What is the rule now in order to have deep pages indexed?
Algorithm Updates | seoessentials
-
Does it impact the ranking of a website if its content is being used by other external sources?
Hi Moz & members, I just want to ask about the website www.1st-care.org: does it impact this website's ranking if the same content (the about-us or home-care-services pages) is being used by other external sources or local citation sites? Does that duplicated content create a ranking-drop issue and weaken the site's content? I was in 9th position in Google.com before; now it has slipped to 29th position. Why? Is there a content issue, or something else I am not aware of?
See the content used:
Home page content
About us page content
Regards, Teginder Ravi
Algorithm Updates | Futura
-
Rel="alternate" hreflang="x" or Unique Content?
Hi All, I have 3 sites: brand.com, brand.co.uk and brand.ca. They all have the same content with very, very minor changes. What's best practice: to use rel="alternate" hreflang="x" or to have unique content written for all of them? Just wondering, after Panda, Penguin and the rest of the zoo, what is the best way to run multinational sites and achieve top positions for all of them in their individual countries. If you think it would be better to have unique content for each of them, please let us know your reasons. Thanks!
Algorithm Updates | Tug-Agency
-
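If the hreflang route is chosen, the annotations must be reciprocal: every alternate a page lists must list the same set back, or Google may ignore them. A minimal reciprocity check, with an invented URL map for the three brand domains:

```python
# Invented hreflang map: page URL -> {language code: alternate URL}.
# All three pages must carry the same set of annotations.
hreflang = {
    "https://brand.com/":   {"en-us": "https://brand.com/", "en-gb": "https://brand.co.uk/", "en-ca": "https://brand.ca/"},
    "https://brand.co.uk/": {"en-us": "https://brand.com/", "en-gb": "https://brand.co.uk/", "en-ca": "https://brand.ca/"},
    "https://brand.ca/":    {"en-us": "https://brand.com/", "en-gb": "https://brand.co.uk/", "en-ca": "https://brand.ca/"},
}

def reciprocal(annotations: dict) -> bool:
    """Every alternate a page points to must list the same set back."""
    return all(annotations[alt] == langs
               for langs in annotations.values()
               for alt in langs.values())

print(reciprocal(hreflang))  # True
```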
Moving content into tabs
Hi, I'm kind of an SEO noobie, so please bear with me 🙂 On one of the sites I'm working on, I got a request to move large blocks of content, currently just placed on the page, into tabs. This makes sense. We tried it, and it makes navigating through the information much easier for visitors. My question is: will Google consider this hiding information? It's not loaded dynamically. It's all there when the page is loaded, in the source, but not displayed until the visitor clicks the tab. Will this cause SEO issues? Thank you!
Algorithm Updates | eladlachmi
-
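On the question above: content that is in the source but hidden behind a tab is still visible to anything that reads the raw HTML. A small stdlib sketch with invented markup:

```python
from html.parser import HTMLParser

# Invented markup: the second tab is hidden with CSS, not loaded dynamically.
PAGE = """
<div class="tab" id="overview">Product overview text.</div>
<div class="tab" id="specs" style="display:none">Full specifications text.</div>
"""

class TextExtractor(HTMLParser):
    """Collects all text nodes, exactly as a source-reading crawler would."""
    def __init__(self):
        super().__init__()
        self.text = []

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

parser = TextExtractor()
parser.feed(PAGE)
print(parser.text)  # both tabs' text is visible to the parser
```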
Are you seeing changes in your sites today? Panda 2.2?
I've heard rumblings of some Panda sites recovering in the last few days and wondered if the much-discussed Panda 2.2 has been rolled out. My own site (which actually had a significant boost after Panda) has seen a significant increase in traffic today (it started about noon EST yesterday) and a nice increase in AdSense revenue as well. How are your sites doing?
Algorithm Updates | MarieHaynes