Tricky Decision to make regarding duplicate content (that seems to be working!)
-
I have a really tricky decision to make concerning one of our clients. Their site to date was developed by someone else.
They have a successful eCommerce website, and the strength of their search engine performance lies in their product category pages. In their case, a product category is an audience niche: the buyer's gender and age.
In this hypothetical example my client sells lawnmowers:
http://www.example.com/lawnmowers/men/age-34
http://www.example.com/lawnmowers/men/age-33
http://www.example.com/lawnmowers/women/age-25
http://www.example.com/lawnmowers/women/age-3
For all searches combining lawnmowers with the buyer's gender and age (and the real store has many such combinations), these pages rank number one for every combination they have a page for.
The issue is the specific product pages, which take the form of the following:
http://www.example.com/lawnmowers/men/age-34/fancy-blue-lawnmower
The same product, with the same content (save a reference to the gender and age on the page), can also be found at the few other gender/age combinations the product is targeted at. For instance:
http://www.example.com/lawnmowers/women/age-34/fancy-blue-lawnmower
http://www.example.com/lawnmowers/men/age-33/fancy-blue-lawnmower
http://www.example.com/lawnmowers/women/age-32/fancy-blue-lawnmower
So: duplicate content. As they are currently doing so well, I am agonising over this. I dislike seeing the same content on multiple URLs, and though it wasn't a malicious effort on the previous developer's part, I think it is a little dangerous in terms of SEO.
On the other hand, if I change it I'll shrink the website and severely reduce the number of pages that are contextually relevant to the gender/age category pages. In short, I don't want to sabotage the performance of the category pages by cutting off all their on-site relevant content.
My options as I see them are:
- Stick with the duplicate content model, but add some unique content to each gender/age product page, to differentiate the variants a little.
- Move each product to a single, distinct URL. Whilst this could boost individual product SEO performance, that isn't an objective here, and it carries the risks I describe above.
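(For anyone weighing the same two options: a common middle ground is to keep every gender/age variant URL live for visitors, but point each variant at one preferred URL with a `rel="canonical"` link element, so the category pages keep their contextually relevant product pages while search engines consolidate the duplicates. As a rough sketch of the idea, assuming the hypothetical path layout from the examples above and an arbitrarily chosen canonical form of `/lawnmowers/<product-slug>`, the mapping from variant to canonical URL might look like:)

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_product_url(url: str) -> str:
    """Map a gender/age product URL variant to a single canonical URL.

    Assumes the hypothetical path layout
    /lawnmowers/<gender>/<age>/<product-slug> and picks
    /lawnmowers/<product-slug> as the canonical form — the chosen
    canonical shape is an illustration, not a recommendation.
    """
    parts = urlsplit(url)
    segments = [s for s in parts.path.split("/") if s]
    # Drop the gender and age segments, keeping category and product slug.
    category, _gender, _age, slug = segments
    return urlunsplit((parts.scheme, parts.netloc, f"/{category}/{slug}", "", ""))

variants = [
    "http://www.example.com/lawnmowers/men/age-34/fancy-blue-lawnmower",
    "http://www.example.com/lawnmowers/women/age-25/fancy-blue-lawnmower",
    "http://www.example.com/lawnmowers/men/age-33/fancy-blue-lawnmower",
]
# Every variant collapses to the same canonical URL, which is what the
# <link rel="canonical" href="..."> tag on each variant page would emit.
canonicals = {canonical_product_url(u) for u in variants}
```

The variant pages themselves stay indexable and keep linking into the category structure; only the ranking signals for the product content are consolidated.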
What are your thoughts?
Many thanks,
Tom
-
Thanks Russ - that's a good idea. It turns out to be a bit of a pain from a tech perspective, but it's definitely the way we'll tackle the issue post-launch.
Thanks again,
Tom
-
Do you have multiple categories on the site? Why not collapse just one category and see how it performs? (Give it a good six months, though, before reaching a final verdict.)