Suggestions on dealing with duplicate content?
-
What are the best ways to protect against / deal with duplicate content? I've added an example scenario:
- Nike Trainer model 1 – has an overview page that also links to a sub-page about cushioning, one about Gore-Tex and one about breathability.
- Nike Trainer models 2, 3, 4 and 5 – each has an overview page that also links to sub-pages about cushioning, Gore-Tex and breathability.
Each sub-page's URL is a child of its parent, so each is a distinct page, e.g.
- /nike-trainer/model-1/gore-tex
- /nike-trainer/model-2/gore-tex
There are some differences in material composition, some different images, and of course the product name is referred to multiple times. This makes each page in the region of 80% unique.
Any suggestions welcome about the above example or any other ways you guys know of dealing with duplicate content.
-
So the issue is that you have sub-pages which contain a lot of the same information about cushioning, Gore-Tex etc?
If the content is reasonably unique, then it's a case of optimising each sub-page for 'trainer model name cushioning' or 'trainer model name Gore-Tex' as appropriate.
Alternatively, you could create one high-level glossary page covering all the terms used across all the trainers, and then canonicalise each sub-page to that one. That way you won't risk the duplicate content filter, and you'll provide a useful resource for people who might be looking for 'Gore-Tex trainers' rather than 'Nike Model One with Gore-Tex trainers'...
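If you went the glossary route, the element on each sub-page would look something like the sketch below. The sub-page paths come from the example above; the domain and glossary URL are hypothetical placeholders.

```typescript
// Sketch: point every model-specific Gore-Tex sub-page at one shared glossary page.
// The domain and glossary path are hypothetical; the sub-page paths are from the example above.
const GLOSSARY_URL = "https://www.example.com/nike-trainer/technology-glossary";

const subPages: string[] = [
  "/nike-trainer/model-1/gore-tex",
  "/nike-trainer/model-2/gore-tex",
];

// Each sub-page would carry this element in its <head>:
for (const path of subPages) {
  console.log(`${path}  ->  <link rel="canonical" href="${GLOSSARY_URL}" />`);
}
```

Bear in mind that rel=canonical is a hint rather than a directive, so the glossary page still needs to be a genuinely close match for the sub-pages pointing at it.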
Related Questions
-
Duplicate ecommerce domains and canonical
Hi everybody! I'd like to discuss the SEO strategy I've planned for a client of mine and ask for help, because it's a serious case of duplicate content. There is a main website (the core business model) where he compares the cost of medicines across several pharmacies, to show the customer the cheapest shopping cart. But the purchase has to be made on another domain, that of the selected pharmacy, because my country's law in Europe makes it compulsory to sell medicines only on the pharmacy's own website. So my client has started to create domains, one for each pharmacy, where the only differences between them are some products, the pharmacy's business information and the template's colour. All of them share the same product database.
My aim is to rank the comparison website (it contains all the products), not each pharmacy site, so I've started to create different content for it. Should I place rel=canonical on the pharmacy domains pointing to the original one? For instance:
- www.pharmacie1.com >> www.originaltorank.com
- www.pharmacie2.com >> www.originaltorank.com
- www.pharmacie1.com/product-10 >> www.originaltorank.com/product-10
I've already discussed the possibility of focusing all the content on only one website, but it's compulsory to have separate domains in order to sell the medicines. I also can't use 301 redirects, because these websites need to exist for the same (legal) reason. He is creating 1-3 new domains every week, so he has had a drop in his SEO traffic that I need to solve fast. Do you think the canonical will be the best solution? I don't want to noindex these domains because we're creating Google Local pages for each one in order to be found in their villages. Please, I'd appreciate any piece of advice. Thanks!
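For what it's worth, the mapping being proposed here can be written down as a small sketch. The domains are the hypothetical ones from the examples above, and it assumes each pharmacy URL shares its path with the comparison site:

```typescript
// Sketch: derive the cross-domain canonical target for a pharmacy page.
// Domains are the hypothetical ones from the question; adjust to the real sites.
const MAIN_SITE = "https://www.originaltorank.com";
const PHARMACY_HOSTS = ["www.pharmacie1.com", "www.pharmacie2.com"];

function canonicalTagFor(pageUrl: string): string | null {
  const url = new URL(pageUrl);
  if (!PHARMACY_HOSTS.includes(url.hostname)) {
    return null; // not one of the duplicate pharmacy domains
  }
  // Same path on the main comparison site, e.g. /product-10 -> /product-10
  const target = new URL(url.pathname, MAIN_SITE).toString();
  return `<link rel="canonical" href="${target}" />`;
}

console.log(canonicalTagFor("https://www.pharmacie1.com/product-10"));
// -> <link rel="canonical" href="https://www.originaltorank.com/product-10" />
```

Since rel=canonical is only a hint, it also relies on the paired pages being near-identical; if the pharmacy pages diverge much from the comparison site's versions, Google may ignore it.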
On-Page Optimization | Estherpuntu
-
Content Mismatch
Hi, I've added my app to Search Console, and it reports 480 content mismatch pages. How can I solve this problem?
On-Page Optimization | Silviu
-
Category Page Content
Hey Mozzers, I've recently been doing a content audit on the category and sub-category pages on our site. The old pages had the following "profile":
Above the fold:
- Page heading
- Image links to categories / products
Below the fold:
- The rest of the image links to categories / products
- 600+ words of content duplicated from articles, sub-categories and products
My criticisms of the page were:
1. No content (text) above the fold
2. Page content was mostly duplicated content
3. No keyword structure; many pages competed for the same keywords, and often unwanted pages outranked the desired page for the keyword
I cleaned this up to the following structure:
Above the fold:
- H1 page heading, with 80-200 words of content (including a link to a supporting article)
- H2 page heading (an expansion or variation of the H1, kept relevant), with another 80-200 words of content
- Image links to categories / products
Below the fold:
- The rest of the image links to categories / products
The new pages are now all unique content, targeted towards 1-2 themed keywords. I have a few worries I was hoping you could address.
1. The new pages are only 180-300 words of text, simply because that is all that is needed to describe that category and provide some supporting information; the pages previously contained 600 words. Should I be looking to get more content on these pages?
2. If I do need more content, it won't fit "above the fold" without pushing the products and sub-categories below the fold, which isn't ideal. Should I be putting it there anyway, should I insert additional text below the products and below the fold, or would this just be a waste?
3. Keyword structure. I have designed each page to target a selection of keywords, for example:
a) The main widget page targets all general "widget" terms and provides supporting information.
b) The sub-category blue widget page targets anything related, including terms such as "navy widgets", because navy widgets are a type of blue widget, etc.
Is this keyword structure over-optimised, or exactly what I should be doing? I don't want to spread content too thin by being over-selective in my categories. Any other criticisms or comments welcome.
On-Page Optimization | ATP
-
Duplicate Content - What can be duplicate in two different product pages?
I am having a hard time understanding why my 3 different product pages are showing up as duplicate content in a crawl. Some of my 21 pages are being flagged as duplicates. Here are 3 of those:
1. http://champu.in/korn-rock-band-mens-round-neck-t-shirt-india
2. http://champu.in/stop-the-burning-mens-round-neck-t-shirt-india
3. http://champu.in/funny-t-shirts/absolut-punjabi-red-men-s-round-neck-t-shirt
Can someone help me with this? Thanks in advance 🙂
On-Page Optimization | sidjain4you
-
Content Writing for Ecommerce Products
Any idea where I can find content writers, or get content written, for my online shop's product descriptions? I need to get a lot of volume done fast. Thanks
On-Page Optimization | bjs2010
-
How to avoid duplicate page content
I have over 5,000 duplicate content pages because my URLs contain parameters such as ?district=1&sort=&how=ASC&currency=EUR. How can I fix this?
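One way to handle the duplication described here is to normalise the parameterised URLs down to one clean version, as in this rough sketch; the parameter names are the ones quoted in the question, and the domain is a placeholder:

```typescript
// Sketch: collapse filter/sort/currency parameters down to one canonical URL.
// Parameter names come from the question; the domain is a placeholder.
const IGNORED_PARAMS = ["district", "sort", "how", "currency"];

function canonicalUrl(raw: string): string {
  const url = new URL(raw);
  for (const param of IGNORED_PARAMS) {
    url.searchParams.delete(param);
  }
  return url.toString();
}

console.log(canonicalUrl("https://www.example.com/listing?district=1&sort=&how=ASC&currency=EUR"));
// -> https://www.example.com/listing
```

The cleaned value is what you would put in each variant's rel=canonical tag, so the parameterised versions all consolidate to a single indexable URL.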
On-Page Optimization | bruki
-
Duplicate content harms individual pages or whole site?
Hi, One section of my site is a selection of Art and Design books. I have about 200 individual posts, each with a book image and a description retrieved from Amazon (using their API). For several reasons not worth mentioning, I decided to use the Amazon description. I don't mind whether those pages rank well or not, but I need them as additional content for my visitors as they browse my site. The value lies in the selection of books. My question is whether the duplicate content taken from Amazon harms only each book page, or the whole site. The rest of the site has unique content. Thanks! Enrique
On-Page Optimization | enriquef
-
Geo-targeted content and SEO?
I am wondering what effect geo-targeted "cookie cutter" content has on SEO. For example, one might have a list of "Top US Comedians" which appears as "Top UK Comedians" for users from the United Kingdom. The data would be populated from a database in both cases, but would be completely different for each region, with the exception of a few words. Is this essentially giving Google's (US-based) crawler different content from what users see? I know that plenty of sites do it, but is it legitimate? Would it be better to redirect to a unique page based on location, rather than change the content of one static page? I know what the logical SEO answer is here, but even some of the big players use the "wrong" tactic. I am very interested to hear your thoughts.
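The "unique page per location" alternative mentioned here might look roughly like the sketch below; the /us/ and /uk/ paths and the region detection are hypothetical placeholders:

```typescript
// Sketch: send visitors to a region-specific URL instead of swapping the content
// of one URL in place. Paths and the region detection are hypothetical placeholders.
type Region = "us" | "uk";

function regionalPath(basePath: string, region: Region): string {
  // e.g. "/top-comedians" -> "/uk/top-comedians"
  return `/${region}${basePath}`;
}

function handleRequest(path: string, visitorRegion: Region): { status: number; location: string } {
  // Each regional list gets its own stable URL that any crawler can fetch directly,
  // rather than one URL whose content depends on who is asking.
  return { status: 302, location: regionalPath(path, visitorRegion) };
}

console.log(handleRequest("/top-comedians", "uk"));
// -> { status: 302, location: '/uk/top-comedians' }
```

Each regional URL can then be crawled and indexed on its own merits, which sidesteps the question of showing the crawler different content to users.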
On-Page Optimization | HalogenDigital