Best way to deal with over 1000 pages of duplicate content?
-
Hi
Using the Moz tools I have over 1,000 pages of duplicate content, which is a bit of an issue!
95% of the issues arise from our news and news archive, as it's been going for some time now.
We upload around five full articles a day. Each article has a standalone page but can only be reached via a master archive. The master archive sits in a top-level section of the site and shows snippets of the articles; clicking a snippet takes the user to the full article page. When a news article is added, the existing snippets move onto the next page, and they keep moving through the pages as new articles are added.
The problem is that the standalone articles can only be reached via the snippets on the master page, and Google is flagging this as duplicate content because each snippet is a duplicate of its article.
What is the best way to solve this issue?
From what I have read, using a meta noindex seems to be the answer (not that I know what that is). I have also read that you can only use a canonical tag on a page-by-page basis, so that is going to take too long.
Thanks, Ben
-
Hi Guys,
Thanks for your help.
I decided that updating the robots.txt would be the best option.
Ben
-
Technically, your URL:
http://www.capitalspreads.com/news
is really:
http://www.capitalspreads.com/news/index.php
So just add this line to robots.txt:
Disallow: /news/index.php
You won't be disallowing the pages underneath it, but you will be blocking the page that contains all the duplicate content.
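For reference, a Disallow rule needs to sit under a user-agent group, so the minimal robots.txt entry would look something like this:
User-agent: *
Disallow: /news/index.php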
Also, if you prefer to do this with a meta tag on the news page, you could always use "noindex, follow" so that Google still follows the links but just doesn't index the page.
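That meta tag goes in the head of the news page and would look something like this:
<meta name="robots" content="noindex, follow">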
-
It may not be helpful to you in this situation. I was just saying that if your server creates multiple URLs containing the same content, as long as those URLs also contain the identical rel=canonical directive, a single canonical version of that content will be established.
-
Hi Chris,
I've read about canonicalization, but from what I could work out I'd have to tag each of the 400-plus pages individually to solve the issue, and I don't think that is the best use of anyone's time.
I don't understand how placing the tag and pointing it back at itself will help. Can you explain a little more?
Ideally I want the full article page to be indexed, as this will be more beneficial to the user. By placing the canonical tag on the snippets page and pointing it to itself, wouldn't I be telling the spider that this is the page to index?
Here are some examples:
http://www.capitalspreads.com/news - Snippets page
http://www.capitalspreads.com/news/uk-economic-recovery-will-take-years - Full article; ideally this is the page that should be indexed.
Regards
Ben
-
Ben, you use the rel=canonical directive in the header of the page that is the original source of the content (pointing to itself), and every reproduction of that page also carries a rel=canonical directive pointing to that original source. So it's not necessarily a page-by-page solution. Have you read through this yet? Canonicalization and the Canonical Tag - Learn SEO - Moz
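To illustrate, a self-referencing canonical in the head of the full article page from your example would look something like this:
<link rel="canonical" href="http://www.capitalspreads.com/news/uk-economic-recovery-will-take-years" />
Any other URL that reproduces that article would carry the same tag, still pointing at that one address, so the duplicates consolidate into a single canonical version.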
Related Questions
-
Best way to link to multiple location pages
I am a magician and have multiple location pages, one for each county I cover. I currently have them linked off the menu under locations/<county> and also in the footer. However, I have heard that a link from within a page is much stronger, so I am experimenting with removing the menu and footer links and just linking to these pages from within the content. It's not really a navigation item, and most people come in through search to the right page. Am I diluting the link by having it in the menu, page, and footer? I read a long time ago that Google only considers the first link to a page and ignores the rest; is that the case? Thanks, Roger https://www.rogerlapin.co.uk/
Technical SEO | Rogerperk0
-
Email and landing page duplicate content issue?
Hi Mozers, my question is: if there is a web-based email that goes to subscribers, and clicking a link in it lands them on a WordPress page with very similar content, will Google penalize us for duplicate content? If so, is the best workaround to make the email noindex, nofollow? Thanks!
Technical SEO | CalamityJane770
-
Duplicate content pages on different domains, best practice?
Hi, we are running directory sites on different country domains (we have the country name in the domain name of each site) and we have the same static page on each one; actually we have more of them, but I'll use one static page as an example for the sake of simplicity. So we have http://firstcountry.com/faq.html, http://secondcountry.com/faq.html and so on for 6-7 sites, and the faq.html pages from one country and another have 94% similarity when checked for duplicate content. We would like an alternative approach to canonical, because the content couldn't belong to only one of these sites; it belongs to all of them. A second option would be to unindex all but one country. It's syndicated content, but we cannot link back to the source because there is none. Thanks for taking the time to read this.
Technical SEO | seosogood0
-
Duplicate content problem
Hi there, I have a couple of related questions about the crawl report finding duplicate content. We have a number of pages that feature mostly media (just a picture or just a slideshow) with very little text. These pages are rarely viewed, and they are identified as duplicate content even though they are indeed unique to the user. Does anyone have an opinion about whether we'd be better off just removing them, since we do not have the time to add enough text at this point to make them unique to the bots? The other question: we have a redirect for any 404 on our site that follows the pattern immigroup.com/news/*; the redirect merely sends the user back to immigroup.com/news. However, Moz's crawl seems to be reading this as duplicate content as well. I'm not sure why that is, but is there anything we can do about it? These pages do not exist; they just come from someone typing in the wrong URL or clicking a bad link. But we want the traffic; after all, the users are landing on a page that has a lot of content. Any help would be great! Thanks very much! George
Technical SEO | canadageorge0
-
Why are some pages now duplicate content?
It is probably a silly question, but all of a sudden the following pages of one of my clients are reported as duplicate content. I cannot understand why; they weren't before...
http://www.ciaoitalia.nl/product/pizza-originale/mediterranea-halal
http://www.ciaoitalia.nl/product/pizza-originale/gyros-halal
http://www.ciaoitalia.nl/product/pizza-originale/döner-halal
http://www.ciaoitalia.nl/product/pizza-originale/vegetariana
http://www.ciaoitalia.nl/product/pizza-originale/seizoen-pizza-estate
http://www.ciaoitalia.nl/product/pizza-originale/contadina
http://www.ciaoitalia.nl/product/pizza-originale/4-stagioni
http://www.ciaoitalia.nl/product/pizza-originale/shoarma
Thanks for any help in the right direction 🙂
Technical SEO | MarketingEnergy
-
Bad Duplicate content issue
Hi, for grappa.com I have about 2,700 warnings of duplicate page content. My CMS generates long URLs like http://www.grappa.com/deu/news.php/categoria=latest_news/idsottocat=5 and http://www.grappa.com/deu/news.php/categoria%3Dlatest_news/idsottocat%3D5 (this is duplicated content). What's the best solution to fix this problem? Do I have to set up a 301 redirect for all the duplicated pages, or insert rel=canonical or rel=prev/next? It's complicated because it's a multilingual site, and it's my first time dealing with this stuff. Thanks in advance.
Technical SEO | nico860
-
Duplicate Page Title with Prestashop
We have our main website and blog in WordPress under www.enasport.com and our shop in Prestashop under www.enasport.com/productos, so all our products have URLs like www.enasport.com/productos/56-creatina-monohidrato.html. I wonder if this is the cause of the Duplicate Page Title issue, as it seems we have more than 200 instances of it. Is there any way to solve this?
Technical SEO | ENASports0
-
Is there an easier way from the server to prevent duplicate page content?
I know that using either a 301 or a 302 will fix the problem of duplicate page content. My question is: is there an easier way of preventing duplicate page content when it's an issue with the URL? For example: http://example.com and http://www.example.com. My guess is, as it says here, that it's a setting issue with the server. If anyone has some pointers on how to prevent this from occurring, it would be greatly appreciated.
Technical SEO | brianhughes2
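For what it's worth, on an Apache server this is usually handled with a redirect rule in .htaccess; a rough sketch, assuming www is the preferred version and mod_rewrite is available (other servers have their own equivalents):
RewriteEngine On
# Send any non-www request to the www version with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]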