Handling "legitimate" duplicate content in an online shop.
-
The Scenario:
Online shop selling consumables for machinery.
Consumable range A (CA) contains consumables w, x, y, z. The individual consumables are not a problem; it's the consumable groups I'm having problems with.
The Problem:
Several machines use the same range of consumables. For example, the Machine A (MA) consumables page contains the list (CA) with contents w, x, y, z, and the Machine B (MB) consumables page contains exactly the same list (CA) with contents w, x, y, z.
Machine A page = Machine B page = Consumables range A page
Some people will search Google for the consumables by the range name (CA). Most people will search by individual machine ("MA consumables", "MB consumables", etc.).
If I use canonical tags on the machine consumables pages (MA + MB) pointing to the consumables range page (CA), then I'm never going to rank for the machine pages, which would represent a huge potential loss of search traffic.
However, if I don't use canonical tags, then all the pages get slammed as duplicate content.
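To be clear, this is the kind of tag I mean on the machine pages (the URLs are just placeholders):
  <!-- In the <head> of both the MA and MB consumables pages -->
  <link rel="canonical" href="https://www.example.com/consumables-range-a/">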
For somebody who owns Machine A, a page titled "Machine A consumables" with the list of consumables is exactly what they are looking for, and it makes sense to serve it to them in that format.
However, for somebody who owns Machine B, it only makes sense for the page to be titled "Machine B consumables", even though the content is exactly the same.
The Question:
What is the best way to handle this from both a user and search engine perspective?
-
That's good, solid advice. Thank you. Other ecommerce sites in the niche are nothing to write home about. Where they win is in areas where this site has some major issues (larger than the one I'm asking about) that need fixing too.
I'm just trying to come up with a cohesive plan for a site that will blow the competition out of the water on Google (achievable) and increase sales per visitor. This is part of that.
Much as I don't like your suggestion due to the amount of work it is going to take to implement, I do think you are right and it's a better solution than the canonical tags.
That said, I suspect the canonical tags will be tried first, and then we will end up going with the content writing.
-
I have spent an inordinate amount of time cleaning up sites with templated pages and duplicate content. I can tell you that the potential gains are real, and the potential risks of inaction are often large.
Some text is better than no text. Google prefers a solid base of text-based content, period. It's their bread and butter, and it helps them figure out what your page is about. Some time spent discussing with your team/writers how best to differentiate each page could be time very well spent. I don't know that it needs to be a solid block of prose; categories like manufacturer, machine type, year, etc. could be used in list or paragraph form (perhaps you already do this?).
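As a rough sketch, and purely with made-up machine details, even a short structured block like this would make each page's text unique:
  <!-- Machine-specific intro block on the Machine A consumables page -->
  <h1>Machine A Consumables</h1>
  <p>Consumables for the Machine A (Acme, 2015 onwards), a mid-size widget press. Everything listed below also fits the MA-100 and MA-200 variants.</p>
  <ul>
    <li>Manufacturer: Acme</li>
    <li>Machine type: Widget press</li>
    <li>Years: 2015 onwards</li>
  </ul>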
You could look at other ecommerce sites that are ranking in your niche, and in others, to see what they do.
I'm not sure exactly what you mean by "The text would have to come before the products for SEO." What I will say is that the position of the text on the page should probably be dictated by whatever is best for the user. Test it in different positions on the page (even the left or right sidebar) and see what converts better. I doubt that the text's position on the page will affect your rankings a great deal. As for the "boilerplate-ness," the crawlers can see whether it's unique or not.
-
I did consider that. It's in the hundreds and it could be done, but I'm not sure that's the way to go for the following reasons:
1. The number of consumables in the list is going to be at least 8 per page (each with snippet information and an order box). That means it would take a significant amount of text to make the content meaningfully different.
2. There is not a lot of difference between many of the machines, so writing a decent amount of text per item would be a major task. The more text written, the more it affects point 3.
3. The text would have to come before the products for SEO (placed after the products, it just looks like the boilerplate-esque text that it would actually be), and that's not good for the consumer, who just wants to see the consumables.
Also, we are finding more problems with the site every hour, and we may not have the resources to get the text written in a reasonable timeframe. Certainly, I'd have to be more certain of getting a "win" from it than I currently am before I suggest spending on it over other issues.
-
Ian,
Is it feasible to write unique text for the machine pages? I.e., are they in the hundreds or thousands? Do you have the budget to hire a writer (or writers)?
-
I may be missing something, but wouldn't canonical tags sort out your sort orders at least?
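I'm thinking of something like this on each sorted or filtered version of a page, pointing back at the base URL (hypothetical URLs, of course):
  <!-- On /category/widgets/?sort=price-asc and any other sorted variants -->
  <link rel="canonical" href="https://www.example.com/category/widgets/">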
-
I have the same problem, but in my case it is flagged as duplicate content within my site as a result of sort mechanisms and category pages. The consumer wants the sort mechanisms and category pages in order to find the products they are looking for quickly. I've tried everything and still have "duplicate content" listed in Moz crawls, Google Webmaster Tools, and just about everything else. The image approach won't work for me, since the "duplicates" are caused by database search mechanisms. I also have canonical URLs on the pages, but that doesn't solve the problem either. I think we are damned if we do and damned if we don't.
-
Hi Ian,
There is a way around it, but first an opinion on duplicate content. I think that duplicate content issues are really about duplication across websites, not duplication within websites. Store ABC is expected to have a fair amount of text that repeats across its own pages. The problem arises when Store ABC, Store LMN, and Store TUV all have the same bits of content (like product descriptions).
But anyway, if you really do not want to have your lists of consumables repeated on multiple pages, just turn the lists into images...
Then, on the Machine A page, display the image of the list and give it a file name and alt tag like "Machine A Consumables" and "Consumables for Machine A".
And, on the Machine B page, display a COPY of the image of the list and give it a NEW file name and alt tag like "Machine B Consumables" and "Consumables for Machine B". Etc., etc...
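Very roughly, the markup on the two pages might look like this (the file names and paths are just examples):
  <!-- Machine A consumables page -->
  <img src="/images/machine-a-consumables.png" alt="Consumables for Machine A">
  <!-- Machine B consumables page (a copy of the same image under a new name) -->
  <img src="/images/machine-b-consumables.png" alt="Consumables for Machine B">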
For the visitor, there is no difference between reading the words as text or as an image (unless they have sight issues and are using a screen reader).
Does this solve the problem?