Handling "legitimate" duplicate content in an online shop.
-
The scenario:
Online shop selling consumables for machinery.
Consumable range A (CA) contains consumables w, x, y, z. The individual consumables are not a problem; it is the consumable groups I'm having trouble with.
The Problem:
Several machines use the same range of consumables. For example, the Machine A (MA) consumables page contains the list (CA) with contents w, x, y, z, and the Machine B (MB) consumables page contains exactly the same list (CA) with contents w, x, y, z.
Machine A page = Machine B page = Consumables range A page
Some people will search Google for the consumables by the range name (CA). Most people will search by individual machine (MA Consumables, MB Consumables etc).
If I use canonical tags on the machine consumable pages (MA and MB) pointing to the consumables range page (CA), then I'm never going to rank for the machine pages, which would represent a huge potential loss of search traffic.
However, if I don't use canonical tags, then all the pages get flagged as duplicate content.
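For clarity, the canonical setup I'm describing would look like this in the head of the Machine A page (the URLs here are just placeholders):

```html
<!-- On the Machine A consumables page, e.g. https://example.com/machine-a-consumables -->
<link rel="canonical" href="https://example.com/consumables-range-a">
```

The Machine B page would carry the same tag, so both machine pages would defer to the range page in search results.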
For somebody who owns Machine A, a page titled "Machine A consumables" with the list of consumables is exactly what they are looking for, and it makes sense to serve it to them in that format.
However, for somebody who owns Machine B, it only makes sense for the page to be titled "Machine B consumables", even though the content is exactly the same.
The Question:
What is the best way to handle this from both a user and search engine perspective?
-
That's good solid advice. Thank you. Other ecommerce sites in the niche are nothing to write home about. Where they win is in areas where this site has some major issues (larger than the one I'm asking about) that need fixing too.
I'm just trying to come up with a cohesive plan for a site that will blow the competition out of the water on Google (achievable) and increase sales per visitor. This is part of that.
Much as I don't like your suggestion due to the amount of work it is going to take to implement, I do think you are right and it's a better solution than the canonical tags.
That said, I suspect the canonical tags will be tried first, and then we will end up going with the content writing.
-
I have spent an inordinate amount of time cleaning up sites with templated pages and duplicate content. I can tell you that the potential gains are real, and the potential risks of inaction are often large.
Some text is better than no text. Google prefers a solid base of text-based content, period. It's their bread and butter, and it helps them figure out what your page is about. Some time spent discussing with your team/writers how best to differentiate each page could be time very well spent. I don't know that it needs to be a solid block of prose; categories like manufacturer, machine type, year, etc. could be used in list or paragraph form (perhaps you already do this?).
You could look at other ecommerce sites that are ranking in your niche, and in others, to see what they do.
I'm not sure exactly what you mean by "The text would have to come before the products for SEO." What I will say is that the position of the text on the page should probably be dictated by whatever is best for the user. Test it in different positions on the page (even the left or right sidebar) and see what converts better. I doubt the text's position on the page will affect your rankings a great deal. As for the "boilerplate-ness," the crawlers can see whether or not it's unique.
-
I did consider that. It's in the hundreds and it could be done, but I'm not sure that's the way to go for the following reasons:
1. The number of consumables in each list is going to be at least 8 per page (each with snippet information and an order box). That means it would take a significant amount of text to make the content meaningfully different.
2. There is not a lot of difference between many of the machines, so writing a decent amount of text per item would be a major task. The more text written the more it affects point 3.
3. The text would have to come before the products for SEO (after the products just looks like the boilerplate-esque text that it would actually be) and that's not good for the consumer who just wants to see the consumables.
Also, we are finding more problems with the site every hour, and we may not have the resources to get the text written in a reasonable timeframe. Certainly, I'd have to be more certain of getting a "win" from it than I currently am before I suggest spending on it over other issues.
-
Ian,
Is it feasible to write unique text for the machine pages? I.e., are they in the hundreds or thousands? Do you have the budget to hire a writer (or writers)?
-
I may be missing something, but wouldn't canonical tags sort out your sort orders at least?
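For the sort-order case, the idea would be that every sorted or filtered variant of a category page points back at the default listing; something like this (URLs hypothetical):

```html
<!-- On /category?sort=price-asc, /category?sort=name, and any other sort variant -->
<link rel="canonical" href="https://example.com/category">
```

Google treats the canonical as a hint rather than a directive, so crawl reports may still list the variant URLs, but the duplicate signals should consolidate onto the default page over time.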
-
I have the same problem, but mine is listed as duplicate content within my site as a result of sort mechanisms and category pages. The consumer wants the sort mechanisms and category pages in order to find the products they are looking for quickly, and I've tried everything, yet "duplicate content" still shows up in Moz crawls, Google Webmaster Tools, and just about everywhere else. The imaging approach won't work for me since the "duplicates" are caused by database search mechanisms. I also have canonical URLs on pages, but that doesn't solve the problem either. I think we are damned if we do and damned if we don't.
-
Hi Ian,
There is a way around it, but first an opinion on duplicate content. I think that duplicate content issues are really about duplication across websites, not duplication within websites. Store ABC is expected to have a fair amount of text that repeats across its own pages. The problem arises when Store ABC, Store LMN, and Store TUV all have the same bits of content (like product descriptions).
Anyway, if you really do not want to have your lists of consumables repeated on multiple pages, just turn the lists into images...
Then on the Machine A page display the image of the list and give it a file name and alt tag like "Machine A Consumables" and "Consumables for Machine A".
And on the Machine B page display a COPY of the image of the list and give it a NEW file name and alt tag like "Machine B Consumables" and "Consumables for Machine B". Etc, etc...
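If you went this route, each machine page would embed its own copy of the image; a sketch of the markup (file names and alt text are illustrative):

```html
<!-- Machine A page -->
<img src="/images/machine-a-consumables.png" alt="Consumables for Machine A">

<!-- Machine B page: identical list, but a separate file with its own name and alt text -->
<img src="/images/machine-b-consumables.png" alt="Consumables for Machine B">
```

One trade-off to weigh: text rendered inside an image is not indexable the way on-page text is, so the consumable names themselves would no longer appear in the page copy.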
For the visitor, there is no difference between reading the words from text or from an image (unless they have sight issues and are using a screen reader).
Does this solve the problem?