Percentage of duplicate content allowable
-
Can you have ANY duplicate content on a page or will the page get penalized by Google?
For example, if you used a paragraph of Wikipedia content as the definition/description of a medical term but wrapped it in unique content, is that OK, or will that land you in the Google/Panda doghouse?
If some level of duplicate content is allowable, is there a general rule-of-thumb ratio of unique to duplicate content?
Thanks!
-
I don't believe you have a problem if you have a bit of duplicate content. Google does not penalize you for duplicate content; it just doesn't award you points for it.
-
That sounds like something Google will hate by default. Your problem there is the ratio of page quantity to quality and uniqueness.
-
It's difficult to give exact numbers, as Google's algorithm is Google's closely guarded secret. It's safer to keep yourself out of trouble by creating completely unique content. Referring to your Wikipedia example, you can add something like "According to Wikipedia..." when quoting a definition, or include reference links whenever you copy content from another source.
Remember that Google doesn't just reward unique content; the content also has to be high quality. The article should offer something genuinely new and be well researched, so it shouldn't be 200 words or fewer. Google will weigh the quality of the whole article against the copied portion and then decide whether to treat the page as duplicate content.
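If it helps, here is a rough sketch of the kind of attribution markup I mean. It's only an illustration (the helper name and placeholder text are made up), not a format Google requires:

```python
def attributed_quote(quote_text, source_name, source_url):
    """Wrap a quoted definition in a blockquote with a visible attribution link.

    Purely illustrative: the function name and markup are one way to do it,
    not something Google mandates.
    """
    return (
        f'<blockquote cite="{source_url}">\n'
        f'  <p>{quote_text}</p>\n'
        f'  <footer>According to <a href="{source_url}">{source_name}</a></footer>\n'
        f'</blockquote>'
    )

# Example usage (placeholder text, not a real quote):
print(attributed_quote(
    "Quoted definition of the medical term goes here.",
    "Wikipedia",
    "https://en.wikipedia.org/wiki/Example",
))
```

Whether you mark it up this way or just add a plain link, the point is that the borrowed passage is clearly attributed and surrounded by your own text.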
-
We recently launched a large 3,500-page website that auto-generates a sentence after we plug statistical data into our database.
So the only unique content is a single sentence?
Within that sentence, many of the words will be common as well. Consider a simple site that offered the population for any given location: "The population of [California] is [13 million] people."
In the above example, only three words are unique. Maybe your pages are a bit more elaborate, but it seems to me those pages simply aren't worth indexing. What you can do is index the main page where users enter the location they want to learn about, but not each possible result page (i.e., the California page).
Either add significantly more content, or only index the main page.
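To make that concrete, here's a minimal sketch of the kind of templated page being described, assuming a simple Python setup (the names and the "13 million" figure come from the example above, not real data). Each generated result page carries a noindex robots meta tag, while the main lookup page you build separately simply omits it:

```python
# Minimal sketch of an auto-generated stats page (illustrative only).
RESULT_TEMPLATE = """<html>
<head>
  <title>Population of {place}</title>
  <meta name="robots" content="noindex, follow">
</head>
<body>
  <p>The population of {place} is {population} people.</p>
</body>
</html>"""


def render_result_page(place: str, population: str) -> str:
    # Every result page differs only in {place} and {population}, so the
    # noindex tag keeps these near-duplicate pages out of the index.
    return RESULT_TEMPLATE.format(place=place, population=population)


if __name__ == "__main__":
    print(render_result_page("California", "13 million"))
```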
-
We recently launched a large 3,500-page website that auto-generates a sentence after we plug statistical data into our database. All pages are relevant to users and provide more value than other results in the SERPs, but I think a penalty is in place, something the Farmer update may have detected and applied automatically against us.
I sent in a reconsideration request last week; the whole project is on hold until we get a response. I'm expecting a generic answer from them.
We are debating either writing more unique content for every page or adding more statistical data to run some interesting correlations. I feel the statistical data would be three times more beneficial to users, but unique content is what Google seeks and is the safer bet just to get us indexed properly.
-
We're currently observing a crumbling empire of websites with auto-generated content. Google is somehow able to judge how substantial your content is and devalue the page, and even the whole site, if it doesn't meet their criteria. This is especially damaging for sites that have, say, 10% great unique content while the other 90% of their pages are generated via tagging, browsable search, and variable-driven paragraphs of text.
Having citations is perfectly normal, but I would include a reference section just in case.
-
You can have some duplicate content in the manner you mentioned above. It is a natural and expected part of the internet that existing sources of information will be utilized.
There is no magic number that says "30% duplication is OK, but 31% is not." Google's algorithms are private and constantly changing. Use good sense to guide you as to whether your page is unique and offers value to users.
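There's no official threshold you can compute, but if you want a rough sanity check on how much of a page is borrowed, something like the sketch below can help. It's a crude character-overlap estimate for your own judgment only and bears no relation to whatever Google actually measures:

```python
import difflib


def borrowed_ratio(page_text: str, source_text: str) -> float:
    """Rough share of the page's text that overlaps the quoted source.

    A crude local heuristic for your own judgment only; it is not how
    Google evaluates duplicate content.
    """
    matcher = difflib.SequenceMatcher(None, page_text.lower(), source_text.lower())
    matched = sum(block.size for block in matcher.get_matching_blocks())
    return matched / max(len(page_text), 1)


# Example with placeholder text: original commentary wrapped around a quote.
quoted = "A paragraph copied from an outside source as the definition of the term."
page = (
    "Our own discussion of the condition, its symptoms, and treatment options. "
    + quoted
    + " More original commentary and practical guidance follows."
)
print(f"Roughly {borrowed_ratio(page, quoted):.0%} of the page matches the source.")
```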