Duplicate content on ecommerce sites
-
I just want to confirm something about duplicate content.
On an eCommerce site, if the meta-titles, meta-descriptions and product descriptions are all unique, yet a big chunk at the bottom (featuring "why buy with us" etc) is copied across all product pages, would each page be penalised, or not indexed, for duplicate content?
Does the whole page need to be a duplicate for this to be a concern, or could this large chunk of text, bigger than the product description itself, have an effect on the page?
If this would be a problem, what are some ways around it? The content is quite powerful, and is relevant to all products...
Cheers,
-
Yes, duplicate content can harm your e-commerce site. It can confuse search engines, making it hard for your pages to rank well. Here are some simple ways to deal with it:
Use Canonical Tags: This tells search engines which version of a page is the main one.
Unique Product Descriptions: Try to write unique descriptions for each product, even if they are similar.
Noindex, Follow Tags: For pages that you don't want indexed, use a noindex, follow robots meta tag so search engines drop the page from their listings but still follow its links. For a full guide on handling duplicate content, check out this blog: https://www.resultfirst.com/blog/ecommerce-seo/how-to-handle-duplicate-content-on-your-ecommerce-site/
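To make the canonical-tag and noindex suggestions above concrete, here is a minimal sketch of the tags involved (the product URL is a made-up example, not from the thread):

```html
<!-- In the <head> of a near-duplicate page: point search engines at the main version -->
<link rel="canonical" href="https://example.com/products/blue-running-shoe" />

<!-- In the <head> of a page you want crawled but kept out of the index -->
<meta name="robots" content="noindex, follow" />
```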
I hope it will be helpful for you.
-
@Dr-Pete Thanks, exactly what I was looking for. Thank you very much!
-
With the caveat that this is a 7-yo thread -- I'd say that it's generally more of a filter these days (vs. a Capital-P penalty). The OEM or large resellers are almost always going to win these battles, and you'll be at a disadvantage if you duplicate their product descriptions word-for-word.
Can you still rank? Sure, but you're going to have an easier time if you can add some original value. If you aren't allowed to modify the info, is there anything you can add to it -- custom reviews (not from users, but say an editorial-style review), for example? You don't have to do it for thousands of products. You could start with ten or 25 top sellers and see how things go.
-
What do you suggest as a solution if you are a reseller of a product and you are using the same descriptions (measurements, characteristics, etc.)? Especially if your wholesaler demands that you not alter the titles and descriptions.
-
So you are saying that all resellers selling, for example, model X of a sports shoe will get penalised because they are using the same description? A test: take a phrase or a paragraph from the most authoritative brand and paste it into Google. You will get results from other resellers. They don't actually look "penalized" if you check their PA scores...
-
I'm going to generally agree with (and thumb up) Mark, but a couple of additional comments:
(1) It really varies wildly. You can, with enough duplication, make your pages look thin enough to get filtered out. I don't think there's a fixed word-count or percentage, because it depends on the nature of the duplicate content, the non-duplicate content, the structure/code of the page, etc. Generally speaking, I would not add a long chunk of "Why Buy With Us" text - not only is it going to increase duplicate-content risks, but most people won't read it. Consider something short and punchy - maybe even an image or link that goes to a site with a full description. That way, most people will get the short message and people who are worried can get more details on a stand-alone page. You could even A/B test it - I suspect the long-form content may not be as powerful as you think.
(2) While duplicate content is not "penalized" in the traditional sense, the impact of it can approach penalty-like levels since the Panda updates.
(3) Definitely agreed with Mark that you have to watch both internal and external duplication. If you're a product reseller, for example, and you have a duplicate block in your own site AND you duplicate the manufacturer's product description, then you're at even more risk.
-
James, great question. Let me provide a little guidance; we help manage SEO for a bunch of ecommerce sites.
I am going to lump several of Google's "focus areas" together: duplicate content, shallow content, and copied duplicate content. On an ecommerce site, all three can be the same or interchangeable thing. Here are the major issues to focus on:
A lot of ecommerce sites, in the past, were able to generate substantial SEO value by listing products in variations of sizes and colors with brief descriptions, creating thousands of pages of what used to be considered unique content (shallow content). Those days are gone. Assuming you still have the standard information copied and pasted on every page, as you mention above, you ideally want 250 unique words of product description; at a bare minimum you should have 100 words. In addition to the on-page content, make sure your meta descriptions are unique. Remember, "unique" means relevant content that is different.
With duplicate content issues, Google isn't penalizing you to hurt your ranking, but it will only give SEO value to the page it thinks is unique. For example, if you have 40 pages of the same product with small variations in color, size, or SKU, and little to differentiate the pages, Google will count those 40 pages as one page; you lose the opportunity to build 39 pages of unique content value.
The last thing to be careful of is carrying products that other companies also have (you are a distributor, supplier, or wholesaler rather than the manufacturer). The manufacturer posts standard info and a bunch of people copy and use it. You will be penalized by Google for this, because it is copied duplicate content.
The most important point to re-emphasize: you know you are going to have some duplicate content on a website, and if you are selling different variations of the same product, you will have a lot of the same material. Again, make sure you have unique and different content focused on your keywords. Target at least 50% different or unique content on each page as a minimum.
Hope this helps. Mark
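Mark's 50%-unique guideline can be sanity-checked mechanically. The sketch below is illustrative only: the helper names, the sample page texts, and the 3-word shingle size are my assumptions, not anything Google publishes. It estimates what fraction of one page's text is shared with another page:

```python
# Rough estimate of shared ("duplicate") text between two product pages,
# using word 3-gram (shingle) overlap. The sample texts and shingle size
# are illustrative assumptions, not an official duplicate-content metric.

def shingles(text, n=3):
    """Return the set of n-word sequences in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def shared_ratio(page_a, page_b):
    """Fraction of page A's shingles that also appear on page B."""
    a, b = shingles(page_a), shingles(page_b)
    if not a:
        return 0.0
    return len(a & b) / len(a)

# Two short product pages sharing a "why buy with us" boilerplate block
boilerplate = "why buy with us free shipping on all orders easy returns"
page_a = "blue running shoe with mesh upper " + boilerplate
page_b = "red hiking boot with leather upper " + boilerplate

print(f"{shared_ratio(page_a, page_b):.0%} of page A is shared with page B")  # → 67%
```

By Mark's rule of thumb, a page where this number creeps past 50% is a candidate for more unique copy.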
Related Questions
-
Using same copy on different domain
I have a client that currently has a .com domain (not using hreflang). They have a new partner in the UK and want to replicate the website on a .co.uk domain under a different brand name. Will this cause any SEO issues?
Duplicate content, although page has "noindex"
Hello, I had an issue with some pages being listed as duplicate content in my weekly Moz report. I've since discussed it with my web dev team and we decided to stop the pages from being crawled. The web dev team added this coding to the pages <meta name='robots' content='max-image-preview:large, noindex dofollow' />, but the Moz report is still reporting the pages as duplicate content. Note from the developer "So as far as I can see we've added robots to prevent the issue but maybe there is some subtle change that's needed here. You could check in Google Search Console to see how its seeing this content or you could ask Moz why they are still reporting this and see if we've missed something?" Any help much appreciated!
How to stop /tag creating duplicate content - Wordpress
Hi, I keep getting alert for duplicate content. It seems Wordpress is creating it through a /tag https://www.curveball-media.co.uk/tag/cipr/ https://www.curveball-media.co.uk/tag/pr-agencies/ Something in the way we've got Wordpress set up?
When is Too Many Categories Too Many on an eCommerce Site?
We all know that more and more people are increasing the amount of different categories that eCommerce sites have. Say for example, you have over 3,000 different products, all categories contain unique text at the top of each, all of the categories link to each other (so loads on internal linking) and no two categories contain the exact same products. My question is this, is there ever a stage that you could create too many categories? Alternatively, do you think you should just keep creating categories based on what our customers search for?
Duplicate ecommerce sites, SEO implications & others?
We have an established eCom site built out with custom php, dedicated SERPs, traffic, etc.. The question has arisen on how to extend commerce on social and we have found a solution with Shopify. In order to take advantage of this, we'd need to build out a completely new site in Shopify and would have to have the site live in order to have storefronts on Pinterest and Twitter. Aside from the obvious problem with having two databases, merchant processing, etc, does anyone know whether there are SEO implications to having two live sites with duplicate products? Could we just disavow a Shopify store in Webmaster Tools? Any other thoughts or suggestions? TIA!
About using robots.txt to resolve duplicate content
I have a problem with duplicate content and titles. I have tried many ways to resolve it, but because of the site's code I am still stuck, so I have decided to use robots.txt to block the duplicate content. The first question: what directive do I use in robots.txt to block all URLs like these?
http://vietnamfoodtour.com/foodcourses/Cooking-School/
http://vietnamfoodtour.com/foodcourses/Cooking-Class/
User-agent: *
Disallow: /foodcourses
(Is that right?) And the parameter URLs:
http://vietnamfoodtour.com/?mod=vietnamfood&page=2
http://vietnamfoodtour.com/?mod=vietnamfood&page=3
http://vietnamfoodtour.com/?mod=vietnamfood&page=4
User-agent: *
Disallow: /?mod=vietnamfood
(Is that right? I have a folder containing the module; could I use Disallow: /module/*?) The second question is: which takes priority, robots.txt or the meta robots tag? For example, if I use robots.txt to block a URL, but that URL's meta robots is "index, follow"?
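As a rough check on the first question (this is my sketch, not an answer from the thread), Python's standard-library robots.txt parser can verify what those proposed Disallow rules would block, using the URLs from the question:

```python
from urllib.robotparser import RobotFileParser

# The Disallow rules proposed in the question above
rules = """\
User-agent: *
Disallow: /foodcourses
Disallow: /?mod=vietnamfood
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Both URL patterns from the question are blocked for all crawlers...
print(rp.can_fetch("*", "http://vietnamfoodtour.com/foodcourses/Cooking-School/"))  # False
print(rp.can_fetch("*", "http://vietnamfoodtour.com/?mod=vietnamfood&page=2"))      # False
# ...while the rest of the site stays crawlable.
print(rp.can_fetch("*", "http://vietnamfoodtour.com/"))                             # True
```

On the second question: a URL blocked by robots.txt is never fetched, so any meta robots tag on that page goes unread; in that sense robots.txt takes priority, and a blocked URL can still appear in results as a bare link. If the goal is deindexing, the usual advice is to allow crawling and use a noindex meta tag instead.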
Duplicate content, website authority and affiliates
We've got a dilemma at the moment with the content we supply to an affiliate. We currently supply the affiliate with our product database which includes everything about a product including the price, title, description and images. The affiliate then lists the products on their website and provides a Commission Junction link back to our ecommerce store which tracks any purchases with the affiliate getting a commission based on any sales via a cookie. This has been very successful for us in terms of sales but we've noticed a significant dip over the past year in ranking whilst the affiliate has achieved a peak...all eyes are pointing towards the Panda update. Whenever I type one of our 'uniquely written' product descriptions into Google, the affiliate website appears higher than ours suggesting Google has ranked them the authority. My question is, without writing unique content for the affiliate and changing the commission junction link. What would be the best option to be recognised as the authority of the content which we wrote in the first place? It always appears on our website first but Google seems to position the affiliate higher than us in the SERPS after a few weeks. The commission junction link is written like this: http://www.anrdoezrs.net/click-1428744-10475505?sid=shopp&url=http://www.outdoormegastore.co.uk/vango-calisto-600xl-tent.html
Handling Similar page content on directory site
Hi All, SEOMOZ is telling me I have a lot of duplicate content on my site. The pages are not duplicates, but very similar, because the site is a directory website with a page for cities in multiple states in the US. I do not want these pages indexed and want to know the best way to go about this. I was thinking I could add rel="nofollow" to all the links to those pages, but I'm not sure that is the correct approach. Since the folders are deep within the site and not under one main folder, it would mean a disallow for many folders if I did this through robots.txt. The other thing I am considering is a meta noindex, follow, but I would have to get my programmer to add a meta tag just for this section of the site. Any thoughts on the best way to achieve this so I can eliminate these dup pages from my SEO report and from the search engine index? Thanks!