Duplicate content on yearly product models.
-
TL;DR - Is creating a page where 80% of the content is duplicated from the past year's product model and 20% covers the new model's changes going to cause duplicate content problems? Is there a better way to handle minor yearly model changes without duplicating content?
Full Question - We create landing pages for yearly products. Some years the models change drastically; other years there are only a few minor changes.
The years where the product features change significantly aren't an issue; the problem is the years when there isn't much change to the product description but I still want to rank for the new year's searches.
I don't want to create duplicate content by copying last year's model content to a new page and simply changing the year (2013 to 2014), so I thought we could write a short paragraph describing the changes and then include last year's description of the product below it.
Since 80% of the content on the page would be duplicated from last year's model, how detrimental do you think this would be from a duplicate content standpoint?
The reason I'm leaving the old model's page up is to maintain the authority that page has and to keep ranking for the old model, which is still sold.
Does anyone have a better idea than rewriting the same information in a different way with the few minor product changes added in?
-
The exact threshold Google uses to decide that content is duplicate is hard to pin down.
The more important question is: are you noticing a problem? Are both pages being indexed (2013 & 2014)? When you search for the 2014 model in Google, does it show up in the search results, or is it being filtered out as duplicate content? If your content isn't being indexed or isn't ranking, then you have a problem.
Panda can also be an issue, but only if a large portion of your site is duplicated. Is this model-upgrade process something you apply to one or two products, or are you talking hundreds? What proportion of your site is near-duplicate content compared to original content? If the percentage is too large, you could be at risk from Panda.
If this only affects a couple of pages, you can always take a little time to rewrite last year's description to improve its uniqueness, and also bolster the page with unique content like user reviews and new photos & videos of your product. Once a product model is discontinued, you can also 301 redirect its page to the newer model's page.
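For the discontinued-model redirect, a minimal sketch of what that might look like in an Apache `.htaccess` file (the paths here are purely illustrative, substitute your real URLs):

```apache
# Permanently (301) redirect the discontinued 2013 model page
# to the current 2014 model page. Example paths only.
Redirect 301 /products/widget-2013 /products/widget-2014
```

A 301 tells search engines the move is permanent, so the old page's link equity is consolidated onto the new URL; other server platforms (nginx, IIS) have their own equivalents.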
-
First, yes, it's an issue if 80% of the content is duplicated. Panda will not like this at all.
There are a few strategies that will fix the issue, but the primary one is to write new content. I know it's time-consuming, but content is king. Take the old description and reword it, tweak it, change it... anything to make it different from last year's product content.
You can also iframe the old product description on the new page. This will prevent Google from indexing the old content as part of the new page. But then your new page might have too little content of its own, which brings you back to the original solution: write more content.
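As a rough illustration of the iframe approach (the source URL, dimensions, and file name are placeholders, not from the original thread):

```html
<!-- Embed last year's description from its own URL; the framed document
     is generally attributed to its source page rather than to this one. -->
<iframe src="/products/widget-2013-description.html"
        width="100%" height="400"
        title="2013 model description"></iframe>
```

Because the framed content lives at its own URL, it doesn't count toward the new page's on-page text, which is exactly why the new page can end up thin if you rely on this alone.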
Rand had a video about this issue a few weeks ago. http://moz.com/blog/handling-duplicate-content-across-large-numbers-of-urls
Hope this helps