Duplicate content - how to diagnose duplicate content from another domain before publishing pages?
-
Hi,
My company has signed a new distributor contract, and we are starting to sell products on our own webshop.
The industry in question is biotechnology, and there are over 1,000 products. Writing product descriptions from scratch would take many hours, so the plan is to rewrite the existing ones.
With permission from our contractors, we will import their product descriptions onto our webshop. But I am concerned about being penalized by Google for duplicate content.
If we rewrite them we should be fine, I guess. But how can we be sure? Is there a good tool for comparing only the text (I don't want to publish the pages just to compare URLs)?
What else should we be aware of besides checking the product descriptions for duplicate content?
Duplicate content is a big issue for all of us, so I hope the answers here will be helpful to many.
Keep up the hard work, and thank you very much for your answers.
Cheers,
Dusan
-
Thank you again, Monica. The reviews will definitely be implemented. Good luck!
-
I think you should stay above 90% unique.
I would strongly encourage you to add user generated content to the pages. Even rewriting the content so that it is completely unique will not be enough to guarantee your pages can rank: the content will be written in different words, but it will essentially say the same thing. You need something that is uniquely valuable to the user.
No, you won't necessarily be penalized, but it will become harder to rank for branded searches.
-
Thank you, Monica, really good answer!
Now, with copyscape.com, what percentage of uniqueness is 'all right'?
We are selling laboratory products for cell analysis, and the titles and descriptions of our products are highly scientific. The manufacturer is the only one who can write them, so rewriting is our only option.
However, I am not so scared of being penalized any more. Thank you very much!
-
I would add that if you are going to have user generated content, make sure there's a review process so it doesn't get spammed/abused.
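To illustrate the kind of review process I mean, here is a very rough sketch of a pre-moderation filter that holds suspicious reviews for a human instead of publishing them straight away. The rules, threshold, and example reviews are made-up assumptions for illustration, not a proven spam filter:

```python
import re

# Naive pre-moderation rules -- illustrative assumptions only.
# Anything flagged is held for a human moderator instead of going
# straight onto the product page.
SUSPICIOUS_PATTERNS = [
    re.compile(r"https?://", re.IGNORECASE),  # links in reviews are a common spam signal
    re.compile(r"(.)\1{5,}"),                 # long runs of one repeated character
]


def needs_manual_review(review_text: str) -> bool:
    """Return True if the review should be held for a moderator."""
    if len(review_text.strip()) < 20:         # too short to be a useful review
        return True
    return any(p.search(review_text) for p in SUSPICIOUS_PATTERNS)


print(needs_manual_review("Great pipette tips, fit our P200 perfectly."))  # False
print(needs_manual_review("Buy cheap meds at http://spam.example"))        # True
```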
-
Monica said basically everything I would have, and probably a little better. A review system is extremely helpful for generating unique content, for a few reasons: first, you don't have to write the content yourself (just moderate it); second, customers want to hear from other customers; and third, the way a user describes a product may surface keyword opportunities you wouldn't have thought of on your own.
-
Hi Dusan,
I have a couple of suggestions for you. The first, to answer your question, is that copyscape.com will let you compare two pieces of content for free, and then it will tell you the percentage of uniqueness. This will be a good way to tell if your rewrites are adequate.
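If you would rather check the rewrites locally before anything goes live (so you never have to publish a page just to compare URLs), a small script can approximate the same idea. The sketch below is only an illustration: the file names and the threshold are placeholder assumptions, and this is not how Copyscape itself scores things.

```python
from difflib import SequenceMatcher


def similarity_percent(original: str, rewrite: str) -> float:
    """Return how similar two texts are, as a percentage from 0 to 100."""
    # Comparing word sequences rather than raw characters gives a rougher,
    # "did we just shuffle the same sentences?" style of score.
    return SequenceMatcher(None, original.split(), rewrite.split()).ratio() * 100


if __name__ == "__main__":
    # Placeholder file names -- substitute your own exported descriptions.
    original = open("manufacturer_description.txt", encoding="utf-8").read()
    rewrite = open("our_rewrite.txt", encoding="utf-8").read()

    score = similarity_percent(original, rewrite)
    print(f"Rewrite is {score:.1f}% similar to the manufacturer text")
    if score > 85:  # hypothetical cutoff, not an official Google or Copyscape number
        print("Still very close to the source -- consider rewriting further.")
```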
My second suggestion is to implement a review system that streams user generated content onto your pages. The duplicate content "penalty" you are referring to is not really a penalty in the strict sense. When it comes to ranking, Google will look at two sites with the same content and pick one to display; usually the branded site wins a branded search. Many other factors, such as PageRank and domain authority, can influence which page is displayed, and the user's query can influence it as well.
Having uniquely valuable content on your site, like user generated comments and reviews, can offset your duplicate content issue. Rewriting the manufacturer's descriptions isn't really going to accomplish the goal of offering the searcher something they can't find anywhere else. In my opinion, just rewriting the content isn't enough of an advantage to beat out other sites. You have to offer something valuable that can only be found on your site. User generated content is (in my opinion) the best content you can have. Every consumer reads reviews when they are available. They want to find out what regular people have to say about a product. Is it the right size, is the color consistent, how long did it last, is this a fair price? These are all questions a review system can answer.
I added reviews to my ecommerce site about six months ago and have seen great success. 80% of my content is the same as four other sites', except my category pages. I write completely unique content for those pages, which helps me target long-tail and branded key terms. Then my product pages have the manufacturer's descriptions, tech specs, and warranty info, plus the user generated content. It has worked very well everywhere I have implemented it.
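One implementation note on the review idea: if reviews end up on the product pages, marking them up with schema.org structured data makes it easier for search engines to recognise them as unique, user generated content. The sketch below builds the JSON-LD with Python's json module; the product, rating, and review values are invented placeholders, so check the current schema.org Product and Review documentation before relying on the exact properties.

```python
import json

# Hypothetical product and review data -- every value here is a placeholder.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Cell Analysis Reagent Kit",
    "description": "Manufacturer description goes here.",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": 4.6,
        "reviewCount": 12,
    },
    "review": [
        {
            "@type": "Review",
            "author": {"@type": "Person", "name": "A. Customer"},
            "reviewBody": "Consistent results across three runs.",
            "reviewRating": {"@type": "Rating", "ratingValue": 5},
        }
    ],
}

# Emit a <script> tag that a product page template could include.
print('<script type="application/ld+json">')
print(json.dumps(product, indent=2))
print("</script>")
```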