Tricky Decision to make regarding duplicate content (that seems to be working!)
-
I have a really tricky decision to make concerning one of our clients, whose site has to date been developed by someone else.
They have a successful eCommerce website, and the strength of their Search Engine performance lies in their product category pages. In their case, a product category is an audience niche, defined by the buyer's gender and age.
In this hypothetical example, my client sells lawnmowers:
http://www.example.com/lawnmowers/men/age-34
http://www.example.com/lawnmowers/men/age-33
http://www.example.com/lawnmowers/women/age-25
http://www.example.com/lawnmowers/women/age-3
For all searches combining lawnmowers with the buyer's gender and age (and there are a lot of these combinations for the 'real' store), these pages rank number one for every combination they have a page for.
The issue is the specific product pages, which take the form of the following:
http://www.example.com/lawnmowers/men/age-34/fancy-blue-lawnmower
This same product, with the same content (save a reference to the gender and age on the page), can also be found under a few of the other gender/age combinations the product is targeted at. For instance:
http://www.example.com/lawnmowers/women/age-34/fancy-blue-lawnmower
http://www.example.com/lawnmowers/men/age-33/fancy-blue-lawnmower
http://www.example.com/lawnmowers/women/age-32/fancy-blue-lawnmower
So, duplicate content. As they are currently doing so well, I am agonising over this - I dislike seeing the same content on multiple URLs, and although it wasn't a malicious effort on the previous developer's part, I think it's a little dangerous in terms of SEO.
On the other hand, if I change it I'll shrink the website and severely reduce the number of pages that are contextually relevant to the gender/age category pages. In short, I don't want to sabotage the performance of the category pages by cutting off all of their relevant on-site content.
My options as I see them are:
- Stick with the duplicate content model, but add some unique content to each gender/age page. This will differentiate the product category page content a little.
- Move products to single distinct URLs. Whilst this could boost individual product SEO performance, this isn't an objective, and it carries the risks I perceive above. (A rough sketch of what I imagine this would look like is below.)
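To make option two concrete, here's a rough sketch of what I imagine the consolidation would involve - either a rel=canonical from each gender/age variant to one product URL, or an outright 301 if we retired the variant URLs altogether. This is just an illustration in Python; the /lawnmowers/products/ base path and the helper names are entirely hypothetical, not anything that exists on the site today.

```python
# Hypothetical sketch: map each gender/age product variant URL to a single
# canonical product URL, then emit either a rel="canonical" tag or an
# Apache-style 301 rule for it. Paths and slugs are made up.
import re

CANONICAL_BASE = "http://www.example.com/lawnmowers/products"

VARIANT_PATTERN = re.compile(
    r"^/lawnmowers/(?:men|women)/age-\d+/(?P<slug>[a-z0-9-]+)$"
)

def canonical_url(variant_path):
    """Return the single canonical URL for a gender/age product variant,
    or None if the path isn't a product variant."""
    match = VARIANT_PATTERN.match(variant_path)
    if not match:
        return None
    return f"{CANONICAL_BASE}/{match.group('slug')}"

def canonical_link_tag(variant_path):
    """The <link rel="canonical"> element to place in each variant's <head>."""
    url = canonical_url(variant_path)
    return f'<link rel="canonical" href="{url}" />' if url else None

def redirect_rule(variant_path):
    """A 301 rule (mod_alias style) if the variant URLs were retired outright."""
    url = canonical_url(variant_path)
    return f"Redirect 301 {variant_path} {url}" if url else None

if __name__ == "__main__":
    for path in [
        "/lawnmowers/men/age-34/fancy-blue-lawnmower",
        "/lawnmowers/women/age-32/fancy-blue-lawnmower",
    ]:
        print(canonical_link_tag(path))
        print(redirect_rule(path))
```

The canonical route would keep the variant pages live for visitors browsing a gender/age category while telling Google which URL to index; the 301 route actually removes them, which is exactly the loss of category-relevant pages I'm worried about.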
What are your thoughts?
Many thanks,
Tom
-
Thanks Russ - that's a good idea. Bit of a pain from a tech perspective as it turns out, but definitely the way we'll tackle the issue post launch.
Thanks again,
Tom
-
Do you have multiple categories on the site? Why not just collapse one category and see how it performs? (Give it a good 6 months, though, before making a final verdict.)
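For example - and this is just a rough sketch, assuming a hypothetical CSV export of organic landing-page sessions from whatever analytics tool you use (the column names and file names are made up) - you could compare the collapsed category against one you leave alone:

```python
# Rough sketch: compare organic landing-page sessions for the collapsed
# (test) category against an untouched (control) category, before and after.
# Assumes a hypothetical CSV export with "landing_page" and "sessions" columns.
import csv
from collections import defaultdict

TEST_PREFIX = "/lawnmowers/men/"       # the category you collapse
CONTROL_PREFIX = "/lawnmowers/women/"  # a comparable category left alone

def sessions_by_group(csv_path):
    """Sum sessions for landing pages under the test and control prefixes."""
    totals = defaultdict(int)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            page = row["landing_page"]
            if page.startswith(TEST_PREFIX):
                totals["test"] += int(row["sessions"])
            elif page.startswith(CONTROL_PREFIX):
                totals["control"] += int(row["sessions"])
    return totals

if __name__ == "__main__":
    before = sessions_by_group("organic_landing_pages_before.csv")
    after = sessions_by_group("organic_landing_pages_after.csv")
    for group in ("test", "control"):
        b, a = before[group], after[group]
        change = (a - b) / b * 100 if b else float("nan")
        print(f"{group}: {b} -> {a} sessions ({change:+.1f}%)")
```

If the collapsed category holds its category-page rankings and traffic while losing the duplicate product URLs, you can roll the change out to the rest of the site with much more confidence.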