Percentage of duplicate content allowable
-
Can you have ANY duplicate content on a page, or will the page get penalized by Google?
For example, if you used a paragraph of Wikipedia content as the definition/description of a medical term but wrapped it in unique content, is that OK, or will that land you in the Google/Panda doghouse?
If some level of duplicate content is allowable, is there a general rule-of-thumb ratio of unique to duplicate content?
Thanks!
-
I don't believe you have a problem if you have a bit of duplicate content. Google does not penalize you for duplicate content; it just doesn't award you any points for it.
-
That sounds like something Google will hate by default. Your problem there is the ratio of page quantity to quality and uniqueness.
-
It's difficult to give exact numbers, since the algorithm is Google's hidden treasure. The safest route is to create completely unique content. Referring to your Wikipedia example, you can introduce the copied definition with something like "According to Wikipedia ..." and add a reference link back to the source whenever you copy content from elsewhere (see the markup sketch at the end of this answer).
Remember that Google doesn't just value unique content; it also has to be high quality. The article should be genuinely original and well researched, not 200 words or fewer. Google will weigh the quality of the whole article against the copied portion and then decide whether to treat it as a duplicate-content page.
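To make that concrete, here is a minimal sketch of how an attributed excerpt could be marked up on the page. The URL, article title, and quoted text are placeholders, not a prescribed format:

<!-- Hypothetical example: excerpt quoted from Wikipedia with visible attribution -->
<blockquote cite="https://en.wikipedia.org/wiki/Some_medical_term">
  <p>[excerpted definition goes here]</p>
</blockquote>
<p>Definition according to
  <a href="https://en.wikipedia.org/wiki/Some_medical_term">Wikipedia</a>.</p>
<p>[your own unique commentary and analysis around the excerpt]</p>

The blockquote and the visible link don't remove the duplication, but they make the attribution obvious to users and keep the copied portion clearly separated from your original text.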
-
"We recently launched a large 3,500-page website that auto-generates a sentence after we plug statistical data into our database."
So the only unique content is a single sentence?
Within that sentence, many of the words would need to be common as well. Consider a simple site that offered the population of any given location: "The population of [California] is [13 million] people."
In the above example, only three words are unique. Maybe your pages are a bit more elaborate, but it seems to me those pages simply aren't worth indexing. What you can do is index the main page where users enter the location they want to learn about, but not each possible result page (e.g., the one for California).
Either add significantly more content, or only index the main page (a sketch of one way to do that follows below).
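If you go the second route, a common approach is a robots meta tag on every auto-generated result page, while the main entry page carries no such tag. A minimal sketch, assuming a hypothetical /population/california result page:

<!-- On each auto-generated result page, e.g. /population/california (hypothetical URL) -->
<!-- Tells crawlers not to index this page but still follow its links -->
<meta name="robots" content="noindex, follow">

<!-- The main entry page where users type in a location has no such tag,
     so it remains eligible for indexing. -->

The same effect can also be achieved with an X-Robots-Tag HTTP header if editing the page templates is awkward.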
-
We recently launched a large 3,500-page website that auto-generates a sentence after we plug statistical data into our database. All pages are relevant to users and provide more value than other results in the SERPs, but I think a penalty is in place; the Farmer update may have flagged us with a sort of automatic penalty.
I sent in a reconsideration request last week, and the whole project is on hold until we get a response. I'm expecting a generic answer from them.
We are debating either writing more unique content for every page or entering more statistical data to run some interesting correlations. I feel the statistical data would be three times more beneficial to users, but unique content is what Google seeks and is the safer bet just to get us indexed properly.
-
We're currently observing a crumbling empire of websites with auto-generated content. Google is somehow able to gauge how substantial your content is and devalue the page, and even the whole site, if it doesn't meet their criteria. This is especially damaging for sites that have, say, 10% great unique content while the other 90% of their pages are generated via tagging, browsable search, and variable-driven paragraphs of text.
Having citations is perfectly normal, but I would include a reference section just in case.
-
You can have some duplicate content in the manner you mentioned above. It is a natural and expected part of the internet that existing sources of information will be utilized.
There is no magic number that says "30% duplication is OK, but 31% is not." Google's algorithms are private and constantly changing. Use good sense to guide you as to whether your page is unique and offers value to users.