Duplicate content on ecommerce sites
-
I just want to confirm something about duplicate content.
On an eCommerce site, if the meta titles, meta descriptions and product descriptions are all unique, but a big chunk at the bottom of every product page (a "why buy with us" section, for example) is copied across all of them, would each page be penalised, or not indexed, for duplicate content?
Does the whole page need to be a duplicate for this to be a concern, or could this large chunk of text, which is bigger than the product description itself, have an effect on the page?
If this would be a problem, what are some ways around it? The content is quite persuasive and is relevant to all products...
Cheers,
-
Yes, duplicate content can harm an ecommerce site. It can confuse search engines and make it harder for your pages to rank well. Here are some simple ways to deal with it (a quick markup example follows the list):
Use canonical tags: these tell search engines which version of a page is the main one.
Unique product descriptions: try to write a unique description for each product, even when the products are similar.
Noindex, follow tags: for pages that you don't want indexed, these tags keep them out of search results while still letting crawlers follow their links. For a full guide on handling duplicate content, check out this blog: https://www.resultfirst.com/blog/ecommerce-seo/how-to-handle-duplicate-content-on-your-ecommerce-site/
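As a minimal sketch of what those two tags look like in a product page's head section (the example.com URLs are placeholders, not real pages):

```html
<head>
  <!-- Canonical tag: names the preferred URL for this product so search engines
       consolidate ranking signals onto it (placeholder URL). -->
  <link rel="canonical" href="https://www.example.com/products/blue-widget/" />

  <!-- Noindex, follow: keeps a page you don't want listed (e.g. a near-duplicate
       filtered or printer-friendly version) out of the index while still letting
       crawlers follow its links. -->
  <meta name="robots" content="noindex, follow" />
</head>
```

In practice you would usually pick one approach per page - a canonical pointing at the preferred version, or a noindex, follow tag - rather than stacking both.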
I hope this helps.
-
@Dr-Pete Thanks, that's exactly what I was looking for. Thank you very much.
-
With the caveat that this is a 7-yo thread -- I'd say that it's generally more of a filter these days (vs. a Capital-P penalty). The OEM or large resellers are almost always going to win these battles, and you'll be at a disadvantage if you duplicate their product descriptions word-for-word.
Can you still rank? Sure, but you're going to have an easier time if you can add some original value. If you aren't allowed to modify the info, is there anything you can add to it -- custom reviews (not from users, but say an editorial-style review), for example? You don't have to do it for thousands of products. You could start with ten or 25 top sellers and see how things go.
-
-
What do you suggest as a solution if you are a reseller of a product and you are using the same description, measurements, characteristics, etc.? Especially if your wholesaler demands that you not alter the titles and descriptions.
-
So you are saying that all resellers selling, for example, model X of a sports shoe will get penalised because they are using the same description? As a test, take a phrase or a paragraph from the most authoritative brand's page and paste it into Google. You will get results from other resellers. They don't actually look "penalised" if you check their PA scores...
-
-
I'm going to generally agree with (and thumb up) Mark, but a couple of additional comments:
(1) It really varies wildly. With enough duplication, you can make your pages look thin enough to get filtered out. I don't think there's a fixed word count or percentage, because it depends on the nature of the duplicate content, the non-duplicate content, the structure/code of the page, etc. Generally speaking, I would not add a long chunk of "Why Buy With Us" text - not only is it going to increase duplicate-content risk, but most people won't read it. Consider something short and punchy - maybe even an image or link that goes to a stand-alone page with the full description (a rough markup sketch of that idea follows at the end of this reply). That way, most people get the short message, and people who are worried can get more details on the stand-alone page. You could even A/B test it - I suspect the long-form content may not be as powerful as you think.
(2) While duplicate content is not "penalized" in the traditional sense, the impact of it can approach penalty-like levels since the Panda updates.
(3) Definitely agreed with Mark that you have to watch both internal and external duplication. If you're a product reseller, for example, and you have a duplicate block in your own site AND you duplicate the manufacturer's product description, then you're at even more risk.
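For illustration only, here is one minimal way the "short and punchy" replacement could look, with the duplicated block reduced to a teaser that links to a single stand-alone page (the class name, copy and URL are made up for the example):

```html
<!-- Short teaser shown on every product page instead of the full duplicated block -->
<aside class="why-buy-teaser">
  <p>Free returns, a 2-year warranty and expert support on every order.</p>
  <!-- The full "Why Buy With Us" copy lives once, on its own page -->
  <a href="/why-buy-with-us/">See why customers shop with us</a>
</aside>
```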
-
James - great question. Let me provide a little guidance; we help manage SEO for a bunch of ecommerce sites.

I'm going to lump several of Google's "focus areas" together: duplicate content, shallow content and copied duplicate content. On an ecommerce site, all three of these can be the same or interchangeable thing. Here are the major issues to focus on.

A lot of ecommerce sites, in the past, were able to generate substantial SEO value by listing products in variations of sizes and colors with brief descriptions, creating thousands of pages of what used to be considered unique content (shallow content). Those days are gone. Assuming you still have the standard information copied and pasted on every page, as you mention above, you ideally want 250 unique words of description per product - 100 words at a bare minimum. In addition to the on-page content, make sure your meta descriptions are unique. Remember, unique means relevant content that is different.

With duplicate content issues, Google isn't penalizing you to hurt your ranking, but it will only give SEO value to the page it thinks is unique. For example, if you have 40 pages of the same product with small variations in color, size or SKU, and little to differentiate the pages, Google will count those 40 pages as one page - you lose the opportunity to build 39 pages of unique content value.

The last thing to be careful of is carrying products that other companies also sell (you are a distributor, supplier or wholesaler rather than the manufacturer). The manufacturer posts standard info, and a bunch of people copy it and use it. You will be penalized by Google for this, because it is copied duplicate content.

The most important point to re-emphasize: you know you are going to have some duplicate content on a website, and if you are selling different variations of the same product, a lot of the material will inevitably be the same. Again, make sure you have unique, different content focused on your keywords - target at least 50% different or unique content on each page as a minimum.

Hope this helps.
Mark
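As a rough sketch of the variant-consolidation point above: if size and color variants have to live on separate URLs, a canonical tag on each variant pointing at the main product page asks Google to treat them as one page rather than as 40 competing near-duplicates (the URLs below are placeholders):

```html
<!-- On a variant URL such as /product/x-running-shoe?color=red&size=10
     (placeholder), the canonical points at the main product page so the
     near-duplicate variants are consolidated instead of competing: -->
<link rel="canonical" href="https://www.example.com/product/x-running-shoe/" />
```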