Handling "legitimate" duplicate content in an online shop.
-
The scenario:
Online shop selling consumables for machinery.
Consumable range A (CA) contains consumables w, x, y, z. The individual consumable pages are not a problem; it's the consumables group pages I'm having trouble with.
The Problem:
Several machines use the same range of consumables. For example, the Machine A (MA) consumables page contains the list (CA) with the contents w, x, y, z. The Machine B (MB) consumables page contains exactly the same list (CA) with the contents w, x, y, z.
Machine A page = Machine B page = Consumables range A page
Some people will search Google for the consumables by the range name (CA). Most people will search by individual machine (MA Consumables, MB Consumables etc).
If I use canonical tags on the Machine consumable pages (MA + MB) pointing to the consumables range page (CA) then I'm never going to rank for the Machine pages which would represent a huge potential loss of search traffic.
However, if I don't use canonical tags then all the pages get slammed as duplicate content.
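To be clear about what I mean, the tag in question would sit in the head of each machine page and point at the range page, something like this (URLs are just illustrative):

```html
<!-- In the <head> of the Machine A consumables page -->
<link rel="canonical" href="https://www.example.com/consumables/range-a/" />

<!-- And the same tag in the <head> of the Machine B consumables page -->
<link rel="canonical" href="https://www.example.com/consumables/range-a/" />
```

That tells Google to treat the range page as the one to index, which is exactly why the machine pages would stop ranking in their own right.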
For somebody who owns machine A, a page titled "Machine A consumables" with the list of consumables is exactly what they are looking for, and it makes sense to serve it to them in that format.
For somebody who owns machine B, however, it only makes sense for the page to be titled "Machine B consumables", even though the content is exactly the same.
The Question:
What is the best way to handle this from both a user and search engine perspective?
-
That's good solid advice. Thank you. Other ecommerce sites in the niche are nothing to write home about. Where they win is where this site has some major issues (larger than the one I'm asking about) that need fixing too.
I'm just trying to come up with a cohesive plan for a site that will blow the competition out of the water on Google (achievable) and increase sales / visitor. This is part of that.
Much as I don't like your suggestion due to the amount of work it is going to take to implement, I do think you are right and it's a better solution than the canonical tags.
That said, I suspect the canonical tags will be tried first, and then we will end up going with the content writing.
-
I have spent an inordinate amount of time cleaning up sites with templated pages and duplicate content. I can tell you that the potential gains are real, and the potential risks of inaction are often large.
Some text is better than no text. Google prefers a solid base of text-based content, period. It's their bread and butter and it helps them figure out what your page is about. Some time spent discussing with your team/writers how to best differentiate each page could be time very well spent. I don't know that it needs to be a solid block of prose; categories like manufacturer, machine type, year, etc. could be used in list or paragraph form (perhaps you already do this?).
You could look at other ecommerce sites that are ranking in your niche, and in others, to see what they do.
I'm not sure exactly what you mean by "The text would have to come before the products for SEO." What I will say is that the position of the text on the page should probably be dictated by whatever is best for the user. Test it in different positions on the page (even the left or right sidebar) and see what converts better. I doubt the text's position on the page will affect your rankings a great deal. As for the "boilerplate-ness," the crawlers can see whether it's unique or not.
-
I did consider that. It's in the hundreds and it could be done, but I'm not sure that's the way to go for the following reasons:
1. The number of consumables in the list is going to be at least 8 per page (each with snippet information and an order box). That means it would take a significant amount of text to make the content meaningfully different.
2. There is not a lot of difference between many of the machines, so writing a decent amount of text per item would be a major task. The more text written the more it affects point 3.
3. The text would have to come before the products for SEO (after the products just looks like the boilerplate-esque text that it would actually be) and that's not good for the consumer who just wants to see the consumables.
Also, we are finding more problems with the site every hour and we may not have the resources to get the text accomplished in a reasonable time-frame. Certainly, I'd have to be more certain of getting a "win" from it than I currently am before I suggest spending on it over other issues.
-
Ian,
Is it feasible to write unique text for the machine pages? I.e., are they in the hundreds or thousands? Do you have a budget to hire a writer(s)?
-
I may be missing something, but wouldn't canonical tags sort out your sort orders at least?
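For the sort-order case, every sorted variant of a category would carry a canonical pointing back at the default view, something like this (parameter names are just examples):

```html
<!-- On /category.php?sort=price_asc, /category.php?sort=price_desc, etc. -->
<link rel="canonical" href="https://www.example.com/category.php" />
```

That way the sorted URLs stay usable for visitors, but Google is told they are all the same page.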
-
I have the same problem, but in my case it's flagged as duplicate content within my site, caused by sort mechanisms and category pages. The consumer wants the sort mechanisms and category pages in order to find the products they are looking for quickly. I've tried everything and still have "duplicate content" flagged in Moz crawls, Google Webmaster Tools, and just about everything else. The image approach won't work for me, since the "duplicates" come from database search mechanisms. I also have canonical URLs on the pages, but that doesn't solve the problem either. I think we are damned if we do and damned if we don't.
-
Hi Ian,
There is a way around it, but first an opinion on duplicate content. I think that duplicate content issues are really about duplication across websites, not duplication within websites. Store ABC is expected to have a fair amount of text that repeats across its own pages. The problem arises when Store ABC, Store LMN, and Store TUV all have the same bits of content (like product descriptions).
But anyways, if you really do not want to have your lists of consumables repeated on multiple pages, just turn the lists into images....
Then on the Machine A page display the image of the list and give it a file name and alt tag like "Machine A Consumables" and "Consumables for Machine A".
And on the Machine B page display a COPY of the image of the list and give it a NEW file name and alt tag like "Machine B Consumables" and "Consumables for Machine B". Etc, etc...
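In markup terms, the two pages would look something like this (file names are purely illustrative):

```html
<!-- Machine A consumables page -->
<img src="/images/machine-a-consumables.png" alt="Consumables for Machine A">

<!-- Machine B consumables page: a copy of the same image under a new name -->
<img src="/images/machine-b-consumables.png" alt="Consumables for Machine B">
```

Because each page references a differently named file with a different alt attribute, the crawler sees no repeated text between the two pages.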
For the visitor, there is no difference between reading the words from text or from an image (unless they have sight issues and are using a screen reader).
Does this solve the problem?