How critical are duplicate content warnings?
-
Hi,
So I have created my first campaign here, and I have to say the tools, the user interface, and the on-page optimization are all useful. I am happy with SEOmoz.
However, the crawl report returned thousands of errors, most of them duplicate content warnings.
As we use Drupal as our CMS, the duplicate content is caused by Drupal's pagination problems. Let's say there is a page called "/top5list": the crawler decided "/top5list?page=1" to be a duplicate of "/top5list". There is no real solution for pagination problems in Drupal (as far as I know).
I don't have any warnings in Google's Webmaster Tools regarding this, and the sitemap I submitted to Google doesn't include those problematic deep pages (the ones the SEOmoz crawler flags as duplicate content).
So my question is, should I be worried about the thousands of error messages in crawler diagnostics?
Any ideas appreciated.
-
Personally, I'd keep an eye on it. These things do have a way of expanding over time, so you may want to be proactive. At the moment, though, you probably don't have to lose sleep over it.
-
Thanks for that command, Dr. Meyers. Apparently, only 5 such pages are indexed. I suppose I shouldn't worry about this then?
-
One clarification on Vahe's answer: if these continue (?page=2, ?page=3, etc.), then it's traditional pagination. You could use the GWT solution Adam mentioned, although, honestly, I find it's hit-or-miss. It is simpler than the other solutions, though. The "ideal" Google solution is very hard to implement (and I actually have issues with it). The other option is to META NOINDEX the variants, but that would take adjusting the template code dynamically; see the sketch below.
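For reference, the NOINDEX option just means each ?page=N variant serves a robots meta tag in its <head>. A minimal sketch (the "follow" part is my assumption of what you'd want, so link equity still flows through the series):

<meta name="robots" content="noindex, follow"> <!-- keep this URL out of the index, but crawl its links -->

You'd have to generate that tag conditionally, only when the page query parameter is present, which is where the template work comes in.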
If it's just an issue of a bunch of "page=1" duplicates, and this isn't "true" pagination, then canonical tags are probably your best bet. There may be a Drupal plug-in or fix - unfortunately, I don't have much Drupal experience.
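To illustrate with the /top5list example from the question (hypothetical URLs, adjust for your domain), every parameter variant would point back to the clean URL from its <head>:

<link rel="canonical" href="http://www.example.com/top5list"> <!-- served on /top5list?page=1 and friends -->

Google then consolidates the variants onto the canonical URL instead of indexing each one separately.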
The question is whether these pages are being indexed by Google, and how many of them there are. At large scale, these kinds of near-duplicates can dilute your index, harm rankings, and even contribute to Panda issues. At smaller scale, though, they might have no impact at all. So, it's not always clear cut, and you have to work the risk/cost calculation.
You can run a command in Google like:
site:example.com inurl:page=
...and try to get a sense of how much of this content is being indexed.
The GWT approach won't hurt, and it's fine to try. I just find that Google doesn't honor it consistently.
-
Thanks Adam and Vahe. Your suggestions are definitely helpful.
-
For pagination problems, it would be better to use this canonical method: http://googlewebmastercentral.blogspot.com.au/2012/03/video-about-pagination-with-relnext-and.html (see the sketch below).
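In practice, that rel="next"/rel="prev" markup means each page in the series declares its neighbours in its <head>. A minimal sketch for the second page of the question's example series (hypothetical URLs, assuming /top5list is page one and ?page=1 is page two):

<link rel="prev" href="http://www.example.com/top5list">
<link rel="next" href="http://www.example.com/top5list?page=2">

That tells Google the URLs form one sequence, so indexing signals get consolidated across the series.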
Having dup content in the form of paginated results will not penalise you; rather, the page/link equity will be split between all these pages. This means you would need to spend more time and energy on the original page to outrank your competitors.
To see these errors in Google Webmaster Tools, you should go to the HTML Improvements area, where it reviews the site's metadata. I'm sure you'll find the same issues there, rather than in the sitemaps report.
So, to improve the overall health of your website, I would suggest that you do try to verify this issue.
Hope this helps. Any issues, best to contact me directly.
Regards,
Vahe
-
OK, this is just what I've done, and it might not work for everyone.
As far as I can tell, the duplicate content warnings do not hurt my rankings. When I first signed up for SEOmoz they really alarmed me. If they are hurting my rankings, it's not by much: we perform well on many competitive keywords in our industry, and our website traffic has been growing ~20% year over year for many years now.
The fix for auto-generated duplicate content on our site (which I inherited as my responsibility when I started at my company) would be very expensive. It's something I plan on doing eventually along with some other overhauls, but right now it's not in the budget, because it would basically involve re-architecting how the site and databases function on the back end (ugh).
So, in order to help mitigate any issues and keep Google from indexing all the duplicate content our system can generate, I use the "URL Parameters" setting in Google Webmaster Tools (under Site Configuration). I've set up a few parameters for Google to specifically NOT index, to keep the duplicate content out of the search engine. I've also set some parameters to specifically reinforce content I want indexed (along with including the original content in sitemaps I've curated myself, rather than having auto-generated sitemaps potentially polluted with duplicate content); there's a sketch of a minimal hand-curated sitemap below.
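For what it's worth, a hand-curated sitemap only needs to list the canonical URLs. A minimal sketch following the sitemaps.org protocol (hypothetical URL):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/top5list</loc> <!-- clean URL only; no ?page=N variants -->
  </url>
</urlset>

Keeping the parameter variants out of the sitemap at least avoids actively inviting Google to crawl them.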
My thinking is that while Roger the SEOMoz bot is still finding this stuff and generating warnings, Googlebot is not.
I don't work at an agency - I'm in-house, and I've had to learn everything by trial and error, often flying by the seat of my pants with this sort of thing. So my conclusions/solutions may be wrong or may not work for you, but they seem to work for me.
It's a band-aid fix at best, but it seems to be better than nothing!
Hope this helps,
-Adam