Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies.
Duplicate content on ecommerce sites
-
I just want to confirm something about duplicate content.
On an eCommerce site, if the meta-titles, meta-descriptions and product descriptions are all unique, yet a big chunk at the bottom (featuring "why buy with us" etc) is copied across all product pages, would each page be penalised, or not indexed, for duplicate content?
Does the whole page need to be a duplicate for this to be a concern, or would this large chunk of text, which is bigger than the product description, have an effect on the page?
If this would be a problem, what are some ways around it? The content is quite powerful, and it's relevant to all products...
Cheers,
-
Yes, duplicate content can harm an e-commerce site. It can confuse search engines and make it harder for your pages to rank well. Here are some simple ways to deal with it:
Use Canonical Tags: This tells search engines which version of a page is the main one.
Unique Product Descriptions: Try to write unique descriptions for each product, even if they are similar.
Noindex, Follow Tags: For pages that you don't want indexed, use these tags to prevent search engines from listing them.
For a full guide on handling duplicate content, check out this blog: https://www.resultfirst.com/blog/ecommerce-seo/how-to-handle-duplicate-content-on-your-ecommerce-site/
I hope it will be helpful for you.
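As a minimal sketch of auditing the two tags mentioned above, the stdlib-only snippet below parses a page's HTML and reports its rel="canonical" URL and robots directives. The sample HTML and URL are hypothetical, purely for illustration.

```python
# Minimal sketch: extract rel="canonical" and meta robots from a page's HTML.
# The sample page below is hypothetical.
from html.parser import HTMLParser

class TagAudit(HTMLParser):
    """Collects the canonical URL and robots directives from <head> tags."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.robots = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")
        elif tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = a.get("content")

def audit(html: str):
    parser = TagAudit()
    parser.feed(html)
    return parser.canonical, parser.robots

page = """<html><head>
<link rel="canonical" href="https://example.com/product/blue-shoe" />
<meta name="robots" content="noindex, follow" />
</head><body>...</body></html>"""

canonical, robots = audit(page)
print(canonical)  # https://example.com/product/blue-shoe
print(robots)     # noindex, follow
```

Running something like this across a crawl of your product URLs makes it easy to spot variant pages that are missing a canonical or carrying the wrong robots directive.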
-
@Dr-Pete Thanks, exactly what I was looking for. Thank you very much!
-
With the caveat that this is a 7-yo thread -- I'd say that it's generally more of a filter these days (vs. a Capital-P penalty). The OEM or large resellers are almost always going to win these battles, and you'll be at a disadvantage if you duplicate their product descriptions word-for-word.
Can you still rank? Sure, but you're going to have an easier time if you can add some original value. If you aren't allowed to modify the info, is there anything you can add to it -- custom reviews (not from users, but say an editorial-style review), for example? You don't have to do it for thousands of products. You could start with ten or 25 top sellers and see how things go.
-
What do you suggest as a solution if you are a reseller of a product and you are using the same description, measurements, characteristics, etc.? Especially if your wholesaler demands that you not alter the titles and descriptions.
-
So you are saying that all resellers selling, for example, model X of a sports shoe will get penalised because they are using the same description? Test: take a phrase or a paragraph from the most authoritative brand and paste it into Google. You will get results from other resellers. They don't actually look "penalized" if you check their PA scores...
-
I'm going to generally agree with (and thumb up) Mark, but a couple of additional comments:
(1) It really varies wildly. You can, with enough duplication, make your pages look thin enough to get filtered out. I don't think there's a fixed word-count or percentage, because it depends on the nature of the duplicate content, the non-duplicate content, the structure/code of the page, etc. Generally speaking, I would not add a long chunk of "Why Buy With Us" text - not only is it going to increase duplicate-content risks, but most people won't read it. Consider something short and punchy - maybe even an image or link that goes to a page with the full description. That way, most people will get the short message and people who are worried can get more details on a stand-alone page. You could even A/B test it - I suspect the long-form content may not be as powerful as you think.
(2) While duplicate content is not "penalized" in the traditional sense, the impact of it can approach penalty-like levels since the Panda updates.
(3) Definitely agreed with Mark that you have to watch both internal and external duplication. If you're a product reseller, for example, and you have a duplicate block in your own site AND you duplicate the manufacturer's product description, then you're at even more risk.
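Since point (1) above notes there's no fixed percentage, one rough way to reason about it is to measure what share of a page's words a shared block like "Why Buy With Us" accounts for. The sketch below assumes you've already extracted plain text from the page; the sample strings are invented for illustration.

```python
# Rough sketch: estimate what fraction of a product page's text is a
# shared boilerplate block. Sample text below is hypothetical.
import difflib

def duplicate_share(page_text: str, boilerplate: str) -> float:
    """Fraction of the page's words that match the shared block."""
    page_words = page_text.split()
    boiler_words = boilerplate.split()
    matcher = difflib.SequenceMatcher(None, page_words, boiler_words)
    matched = sum(block.size for block in matcher.get_matching_blocks())
    return matched / max(len(page_words), 1)

boiler = "Why buy with us Free shipping easy returns expert support"
page = ("Blue Trail Runner lightweight mesh upper cushioned sole "
        "sizes 6 to 13 " + boiler)

share = duplicate_share(page, boiler)
print(round(share, 2))  # 0.45
```

A page where the shared block dwarfs the unique description (share well above 0.5) is exactly the kind of page that risks looking thin, which is one more argument for keeping the shared blurb short.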
-
James, great question. Let me provide a little guidance; we help manage SEO for a bunch of e-commerce sites. I am going to lump several of Google's "focus areas" together: duplicate content, shallow content, and copied duplicate content. On an e-commerce site, all three of these can be the same or interchangeable thing. Here are the major issues to focus on:
In the past, a lot of e-commerce sites were able to generate substantial SEO value by listing products in variations of sizes and colors with brief descriptions, creating thousands of pages of what used to be considered unique content (shallow content). THOSE DAYS ARE GONE. Assuming you still have the standard information copied and pasted on every page, as you mention above, ideally you want 250 unique words of description for a product; at a bare minimum you should have 100 words. In addition to the on-page content, make sure your meta descriptions are unique. Remember, unique means relevant content that is different.
With duplicate content issues, Google isn't penalizing you to hurt your ranking, but it will only give SEO value to the page it thinks is unique. For example, if you have 40 pages of the same product with small variations in color, size, or SKU, and little to differentiate the pages, Google will count those 40 pages as one page. You lose the opportunity to build 39 pages of unique content value.
The last thing to be careful of is carrying products that other companies also have (you are a distributor, supplier, or wholesaler, not the manufacturer). The manufacturer posts standard info, and a bunch of people copy it and use it. YOU WILL BE PENALIZED BY GOOGLE FOR THIS BECAUSE IT IS COPIED DUPLICATE CONTENT.
The most important point to re-emphasize: you know you are going to have some duplicate content on a website, and if you are selling different variations of the same product, a lot of it will be the same. So make sure you have unique, different content focused on your keywords. Target at least 50% different or unique content on each page as a MINIMUM. Hope this helps. Mark
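Mark's 50%-unique rule of thumb above can be sanity-checked with a simple word-shingle comparison between two variant pages: if their Jaccard similarity is well above 0.5, the pages are likely too close. The threshold and sample product text below are illustrative assumptions, not anything Google publishes.

```python
# Quick sketch: word-shingle Jaccard similarity between two product pages.
# Threshold and sample text are illustrative only.
def shingles(text: str, k: int = 3) -> set:
    """All k-word sequences in the text, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str, k: int = 3) -> float:
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "Classic running shoe with breathable mesh upper and foam midsole in red"
page_b = "Classic running shoe with breathable mesh upper and foam midsole in blue"

print(similarity(page_a, page_b) > 0.5)  # True - the color variants overlap heavily
```

Two color variants that differ by a single word, as here, score far above 0.5: exactly the "40 pages counted as 1" situation Mark describes, and a signal those pages need either consolidation via canonical tags or genuinely different copy.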