How critical are duplicate content warnings?
-
Hi,
So I have created my first campaign here, and I have to say the tools, the user interface, and the on-page optimization are all useful. I am happy with SEOmoz.
However, the crawl report returned thousands of errors, and most of them are duplicate content warnings.
As we use Drupal as our CMS, the duplicate content is caused by Drupal's pagination problems. Let's say there is a page called "/top5list"; the crawler decided that "/top5list?page=1" is a duplicate of "/top5list". There is no real solution for pagination problems in Drupal (as far as I know).
I don't have any warnings about this in Google's Webmaster Tools, and the sitemap I submitted to Google doesn't include those problematic deep pages (the ones the SEOmoz crawler detects as duplicate content).
So my question is, should I be worried about the thousands of error messages in crawler diagnostics?
Any ideas appreciated.
-
Personally, I'd keep an eye on it. These things do have a way of expanding over time, so you may want to be proactive. At the moment, though, you probably don't have to lose sleep over it.
-
Thanks for that command, Dr. Meyers. Apparently, only 5 such pages are indexed. I suppose I shouldn't worry about this, then?
-
One clarification on Vahe's answer - if these continue (?page=2, ?page=3, etc.), then it's traditional pagination. You could use the GWT solution Adam mentioned, although, honestly, I find it's hit-or-miss. It is simpler than the other solutions, though. The "ideal" Google solution (rel="next"/"prev", from the post Vahe linked) is very hard to implement (and I actually have issues with it). The other option is to META NOINDEX the variants, but that would take adjusting the template code dynamically.
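To make the NOINDEX option concrete, here's a minimal sketch of the tag the paginated variants would need to emit - the URLs are hypothetical, borrowed from the question, and the conditional logic would live in your Drupal template:

<!-- Emitted only on paginated variants such as /top5list?page=1 (hypothetical URL) -->
<!-- "follow" lets the links on the page keep passing equity while the page itself stays out of the index -->
<meta name="robots" content="noindex,follow">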
If it's just an issue of a bunch of "page=1" duplicates, and this isn't "true" pagination, then canonical tags are probably your best bet. There may be a Drupal plug-in or fix - unfortunately, I don't have much Drupal experience.
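If you go the canonical route, a minimal sketch of what each parameterized variant would carry - example.com and the paths are assumptions based on the question:

<!-- In the <head> of /top5list?page=1 (hypothetical) -->
<link rel="canonical" href="http://example.com/top5list">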
The question is whether these pages are being indexed by Google, and how many of them there are. At large scale, these kinds of near-duplicates can dilute your index, harm rankings, and even contribute to Panda issues. At smaller scale, though, they might have no impact at all. So, it's not always clear-cut, and you have to run the risk/cost calculation.
You can run a command in Google like:
site:example.com inurl:page=
...and try to get a sense of how much of this content is being indexed.
The GWT approach won't hurt, and it's fine to try. I just find that Google doesn't honor it consistently.
-
Thanks Adam and Vahe. Your suggestions are definitely helpful.
-
For pagination problems, it would be better to use this canonical method: http://googlewebmastercentral.blogspot.com.au/2012/03/video-about-pagination-with-relnext-and.html
Having duplicate content in the form of paginated results will not penalise you; rather, the page/link equity will be split between all these pages. This means you would need to spend more time and energy on the original page to outrank your competitors.
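For reference, the markup that post describes looks roughly like this - a sketch with hypothetical URLs from the question; the first page in a series carries only rel="next" and the last only rel="prev":

<!-- In the <head> of /top5list?page=2 (hypothetical) -->
<link rel="prev" href="http://example.com/top5list?page=1">
<link rel="next" href="http://example.com/top5list?page=3">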
To see these errors in Google Webmaster Tools, you should go to the HTML suggestions area, where it reviews the site's metadata. I'm sure you'll find the same issues there, rather than in the sitemaps section.
So, to improve the overall health of your website, I would suggest that you try to verify and address this issue.
Hope this helps. Any issues, best to contact me directly.
Regards,
Vahe
-
OK, this is just what I've done, and it might not work for everyone.
As far as I can tell, the duplicate content warnings don't hurt my rankings. When I first signed up for SEOmoz, they really alarmed me. If they are hurting my rankings, it's not by much: we perform well for many competitive keywords in our industry, and our website traffic has been growing ~20% year over year for many years now.
The fix for auto-generated duplicate content on our site (which I inherited as my responsibility when I started at my company) would be very expensive. It's something I plan on doing eventually along with some other overhauls, but right now it's not in the budget, because it would basically involve re-architecting how the site and databases function on the back end (ugh).
So, in order to help mitigate any issues and keep Google from indexing all the duplicate content that can be generated by our system, I use the "URL Parameters" setting in Google Webmaster Tools (under Site Configuration). I've set up a few parameters for Google to specifically NOT index, to keep the duplicate content out of the search engine. I've also set some parameters to specifically reinforce content I want indexed (along with including the original content in sitemaps I've curated myself, rather than having auto-generated sitemaps potentially polluted with duplicate content).
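To illustrate the curated-sitemap part of that: the hand-built file lists only the canonical URLs and leaves every parameterized variant out. A minimal sketch, with hypothetical URLs borrowed from the question:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Canonical pages only; no ?page= variants are listed -->
  <url>
    <loc>http://example.com/top5list</loc>
  </url>
</urlset>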
My thinking is that while Roger the SEOMoz bot is still finding this stuff and generating warnings, Googlebot is not.
I don't work at an agency - I'm in-house, and I've had to learn everything by trial and error; I often fly by the seat of my pants with this sort of thing. So my conclusions/solutions may be wrong or may not work for you, but this seems to work for me.
It's a band-aid fix at best, but it seems to be better than nothing!
Hope this helps,
-Adam