Duplicate content - how can I diagnose duplicate content from another domain before publishing pages?
-
Hi,
My company has signed a new distributor contract, and we are starting to sell products on our own webshop.
The industry in question is biotechnology, with over 1,000 products. Writing product descriptions from scratch would take many hours, so the plan is to rewrite the existing ones.
With permission from our contractors, we will import their product descriptions into our webshop. But I am concerned about being penalized by Google for duplicate content.
If we rewrite them we should be fine, I guess. But how can we be sure? Is there a good tool for comparing only text (because I don't want to publish the pages just to compare URLs)?
What else should we be aware of besides checking the product descriptions for duplicate content?
Duplicate content is a big issue for all of us; I hope these answers will be helpful to many.
Keep up the hard work, and thank you very much for your answers,
Cheers,
Dusan
-
Thank you again, Monica. Reviews will definitely be implemented. Good luck!
-
I think you should stay above 90% unique.
I would strongly encourage you to add user-generated content to the pages. Even rewriting the content to be entirely unique will not guarantee your pages can rank: the content will be written in unique words, but it will essentially say the same thing. You will need something that is uniquely valuable to the user.
No, you won't necessarily be penalized, but it will become harder to rank in branded searches.
-
Thank you Monica, really good answer!
Now, with copyscape.com, what percentage of uniqueness is all right?
We sell laboratory products for cell analysis, and the titles and descriptions of our products are highly scientific. The manufacturer is the only one who can write them, so rewriting is our only option.
However, I am not that scared of being penalized any more. Thank you very much!
-
I would add that if you are going to have user generated content, make sure there's a review process so it doesn't get spammed/abused.
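To make the "review process" idea concrete, here is a minimal sketch of a hold-for-approval moderation queue. The class names, the `BANNED_TERMS` list, and the auto-reject rule are all hypothetical illustrations, not any particular review platform's API:

```python
from dataclasses import dataclass, field

BANNED_TERMS = {"viagra", "free money"}  # hypothetical spam markers


@dataclass
class Review:
    author: str
    text: str
    approved: bool = False


@dataclass
class ModerationQueue:
    pending: list = field(default_factory=list)

    def submit(self, review: Review) -> None:
        # Auto-reject obvious spam; everything else waits for a human.
        if any(term in review.text.lower() for term in BANNED_TERMS):
            return
        self.pending.append(review)

    def approve(self, review: Review) -> Review:
        # A human moderator publishes the review and clears it from the queue.
        review.approved = True
        self.pending.remove(review)
        return review


queue = ModerationQueue()
queue.submit(Review("lab_user", "Pipette tips fit perfectly, fair price."))
print(len(queue.pending))  # the legitimate review is held for human approval
```

The key design point is simply that nothing goes live on submit; a review only becomes visible after an explicit approval step.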
-
Monica said basically everything I would, and probably a little better. A review system is extremely helpful in generating unique content, for a few reasons: first, you don't have to write it yourself (just moderate it); second, customers want to hear from other customers; third, the way a user describes a product may surface keyword opportunities you wouldn't have thought of on your own.
-
Hi Dusan,
I have a couple of suggestions for you. The first, to answer your question, is that copyscape.com will let you compare two pieces of content for free, and then it will tell you the percentage of uniqueness. This will be a good way to tell if your rewrites are adequate.
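If you want a rough local check before running anything through Copyscape, you can compare two versions of a description with Python's standard library. This is a plain character-sequence similarity, not Copyscape's algorithm, so treat the number as a quick sanity check rather than a verdict:

```python
from difflib import SequenceMatcher


def uniqueness_percent(original: str, rewrite: str) -> float:
    """Rough percentage of how unique the rewrite is versus the original.

    SequenceMatcher.ratio() returns a similarity score in [0, 1];
    uniqueness is the complement, expressed as a percentage.
    """
    similarity = SequenceMatcher(None, original.lower(), rewrite.lower()).ratio()
    return round((1 - similarity) * 100, 1)


# Hypothetical manufacturer text and an in-house rewrite of it.
manufacturer = "High-purity reagent for cell analysis, supplied in 10 ml vials."
rewrite = "A 10 ml vial of high-purity reagent designed for cell analysis workflows."

print(uniqueness_percent(manufacturer, rewrite))
```

Because nothing here touches the network, you can run it on drafts before the pages are ever published, which was Dusan's constraint.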
My second suggestion is to implement a review system that will stream user-generated content onto your pages. The duplicate content "penalty" you are referring to is really not a penalty in the strict sense. When it comes to ranking, Google will look at two sites with the same content and pick one to display. Usually the branded site will win in a branded search. There are many other factors, like PageRank and domain authority, that can influence which page is displayed, and the user query can influence that as well.
Having uniquely valuable content on your site, like user-generated comments and reviews, can offset your duplicate content issue. Rewriting the manufacturer's descriptions isn't really going to accomplish the goal of offering the searcher something they can't find anywhere else. In my opinion, just rewriting the content isn't enough of an advantage to beat out other sites. You have to offer something valuable that can only be found on your site. User-generated content is (in my opinion) the best content you can have. Every consumer reads reviews when they are available. They want to find out what regular people have to say about a product. Is it the right size, is the color consistent, how long did it last, is this a fair price? These are all questions that can be answered in a review system.
I added reviews to my ecommerce site about 6 months ago and have seen great success. 80% of my content is the same as 4 other sites, except my category pages. I write highly unique content for those pages, which helps me target long-tail and branded key terms. Then my product pages have the manufacturer's descriptions, tech specs, and warranty info, plus the user-generated content. It has been very successful whenever I have implemented it.