Handling "legitimate" duplicate content in an online shop.
-
The scenario:
Online shop selling consumables for machinery.
Consumable range A (CA) contains consumables w, x, y, z. The individual consumables are not a problem; it's the consumable groups I'm having trouble with.
The Problem:
Several machines use the same range of consumables. E.g., the Machine A (MA) consumables page contains the list (CA) with the contents w, x, y, z. The Machine B (MB) consumables page contains exactly the same list (CA) with contents w, x, y, z.
Machine A page = Machine B page = Consumables range A page
Some people will search Google for the consumables by the range name (CA). Most people will search by individual machine (MA Consumables, MB Consumables etc).
If I use canonical tags on the Machine consumable pages (MA + MB) pointing to the consumables range page (CA) then I'm never going to rank for the Machine pages which would represent a huge potential loss of search traffic.
However, if I don't use canonical tags then all the pages get slammed as duplicate content.
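For reference, the canonical tags in question would be `<link>` elements in the `<head>` of each machine page, all pointing at the range page (URLs below are invented for illustration):

```html
<!-- In the <head> of both the Machine A and Machine B consumables pages
     (hypothetical URLs) -->
<link rel="canonical" href="https://www.example.com/consumables/range-a/" />
```

With this in place, search engines consolidate ranking signals onto the range page, which is exactly why the individual machine pages would stop ranking on their own.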
For somebody who owns machine A, a page titled "Machine A consumables" with the list of consumables is exactly what they are looking for, and it makes sense to serve it to them in that format.
However, for somebody who owns machine B, the page only makes sense titled "Machine B consumables", even though the content is exactly the same.
The Question:
What is the best way to handle this from both a user and search engine perspective?
-
That's good solid advice. Thank you. Other ecommerce sites in the niche are nothing to write home about. Where they win is where this site has some major issues (larger than the one I'm asking about) that need fixing too.
I'm just trying to come up with a cohesive plan for a site that will blow the competition out of the water on Google (achievable) and increase sales per visitor. This is part of that.
Much as I don't like your suggestion due to the amount of work it is going to take to implement, I do think you are right and it's a better solution than the canonical tags.
That said, I suspect the canonical tags will be tried first, and then we will end up going with the content writing.
-
I have spent an inordinate amount of time cleaning up sites with templated pages and duplicate content. I can tell you that the potential gains are real, and the potential risks of inaction are often large.
Some text is better than no text. Google prefers a solid base of text-based content, period. It's their bread and butter, and it helps them figure out what your page is about. Some time spent discussing with your team/writers how to best differentiate each page could be time very well spent. I don't know that it needs to be a solid block of prose; categories like manufacturer, machine type, year, etc. could be used in list or paragraph form (perhaps you already do this?).
You could look at other ecommerce sites that are ranking in your niche, and in others, to see what they do.
I'm not sure exactly what you mean by "The text would have to come before the products for SEO." What I will say is that the position of the text on the page should probably be dictated by whatever is best for the user. Test it in different positions on the page (even the left or right sidebar) and see what converts better. I doubt the text's position on the page will affect your rankings a great deal. As for the "boilerplate-ness," the crawlers can see whether it's unique or not.
-
I did consider that. It's in the hundreds and it could be done, but I'm not sure that's the way to go for the following reasons:
1. The number of consumables in the list is going to be at least 8 per page (each with snippet information and order box). That means it would take a significant amount of additional text to make the content meaningfully different.
2. There is not a lot of difference between many of the machines, so writing a decent amount of text per item would be a major task. The more text written, the more it affects point 3.
3. The text would have to come before the products for SEO (after the products just looks like the boilerplate-esque text that it would actually be) and that's not good for the consumer who just wants to see the consumables.
Also, we are finding more problems with the site every hour, and we may not have the resources to get the text accomplished in a reasonable time-frame. Certainly, I'd have to be more certain of getting a "win" from it than I currently am before I suggest spending on it over other issues.
-
Ian,
Is it feasible to write unique text for the machine pages? I.e., are they in the hundreds or thousands? Do you have a budget to hire a writer(s)?
-
I may be missing something, but wouldn't canonical tags sort out your sort orders at least?
-
I have the same problem, but mine is flagged as duplicate content within my site as a result of sort mechanisms and category pages. The consumer wants the sort mechanisms and category pages in order to find the products they are looking for quickly. I've tried everything and still have "duplicate content" listed in Moz crawls, Google Webmaster Tools, and just about everywhere else. The image approach won't work for me, since the "duplicates" are the result of database search mechanisms. I also have canonical URLs on the pages, but that doesn't solve the problem either. I think we are damned if we do and damned if we don't.
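One common approach for sort and filter variants (a suggestion on my part, not something anyone in this thread has confirmed works for this site) is a robots meta tag that keeps the variants out of the index while still letting crawlers follow their links to products:

```html
<!-- On a sorted/filtered variant such as /category?sort=price
     (hypothetical URL) -->
<meta name="robots" content="noindex, follow" />
```

The unsorted category page keeps no such tag, so it remains the only indexable version.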
-
Hi Ian,
There is a way around it, but first an opinion on duplicate content. I think that duplicate content issues are really about duplication across websites, not duplication within websites. Store ABC is expected to have a fair amount of text that repeats across its own pages. The problem arises when Store ABC and Store LMN and Store TUV all have the same bits of content (like product descriptions).
But anyway, if you really do not want your lists of consumables repeated on multiple pages, just turn the lists into images...
Then on the Machine A page display the image of the list and give it a file name and alt tag like "Machine A Consumables" and "Consumables for Machine A".
And on the Machine B page display a COPY of the image of the list and give it a NEW file name and alt tag like "Machine B Consumables" and "Consumables for Machine B". Etc, etc...
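Concretely, the markup on the two machine pages might look like this (file names are invented for illustration):

```html
<!-- Machine A page -->
<img src="/images/machine-a-consumables.png"
     alt="Consumables for Machine A" />

<!-- Machine B page: a copy of the same image under a new file name -->
<img src="/images/machine-b-consumables.png"
     alt="Consumables for Machine B" />
```

Because the two files have different names and alt text, crawlers see no duplicated text between the pages.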
For the visitor, there is no difference between reading the words from text or an image (unless they have sight issues and are using a screen reader).
Does this solve the problem?