Percentage of duplicate content allowable
-
Can you have ANY duplicate content on a page or will the page get penalized by Google?
For example, if you used a paragraph of Wikipedia content as the definition/description of a medical term but wrapped it in unique content, is that OK, or will that land you in the Google/Panda doghouse?
If some level of duplicate content is allowable, is there a general rule-of-thumb ratio of unique to duplicate content?
Thanks!
-
I don't believe you have a problem if you have a bit of duplicate content. Google doesn't penalize you for duplicate content; it just doesn't award you any points for it.
-
That sounds like something Google will hate by default. Your problem there is the ratio of page quantity to quality and uniqueness.
-
It's quite difficult to give an exact figure, as the algorithm is Google's hidden treasure. It's safer to create completely unique content. Referring to your Wikipedia example, you can preface the definition with something like "According to Wikipedia..." or add reference links whenever you copy content from another source.
Remember that Google values not only unique content but high-quality content. The article should be genuinely original and well researched, not a thin piece of 200 words or fewer. Google will weigh the quality of the whole article against the copied portion and then decide whether to treat it as a duplicate-content page.
-
"We recently launched a large 3,500-page website that auto-generates a sentence after we plug in statistical data from our database."
So the only unique content is a single sentence?
Within that sentence, many of the words would be common to every page as well. Consider a simple site that offers the population of any given location: "The population of [California] is [13 million] people."
In the example above, only three words are unique per page. Maybe your pages are a bit more elaborate, but it seems to me those pages simply aren't worth indexing. What you can do is index the main page where users enter the location they want to learn about, but not each possible result (i.e. California).
Either add significantly more content or only index the main page.
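To see how thin that leaves each page, you can strip the shared template words out of a generated sentence and count what's left. This is only a rough sketch with a made-up template and data (nothing Google has published), but it illustrates the point:

```python
# Rough illustration (hypothetical template and data): how little of each
# auto-generated page is unique once the shared template words are removed.

TEMPLATE = "The population of {place} is {population} people"

PAGES = [
    {"place": "California", "population": "39 million"},
    {"place": "Texas", "population": "30 million"},
]

# Words every generated page shares, i.e. the template itself.
template_words = set(TEMPLATE.split()) - {"{place}", "{population}"}

for data in PAGES:
    sentence = TEMPLATE.format(**data)
    words = sentence.split()
    unique = [w for w in words if w not in template_words]
    share = len(unique) / len(words)
    print(f"{sentence!r}: {len(unique)} of {len(words)} words unique ({share:.0%})")
```

Whatever page furniture (navigation, footer, and so on) surrounds that sentence only pushes the unique share lower, which is why noindexing the result pages is the usual call.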
-
We recently launched a large 3,500-page website that auto-generates a sentence after we plug in statistical data from our database. All pages are relevant to users and provide more value than other results in the SERPs, but I think a penalty is in place; the Farmer update may have detected us and applied a sort of automatic penalty.
I sent in a reconsideration request last week, and the whole project is on hold until we get a response. I'm expecting a generic answer from them.
We are debating whether to write more unique content for every page or to enter more statistical data so we can run some interesting correlations. I feel the statistical data would be three times more beneficial to the user, but unique content is what Google looks for and the safer bet just to get us indexed properly.
-
We're currently observing a crumbling empire of websites with auto-generated content. Google is somehow able to gauge how substantial your content is and devalue the page, and even the whole site, if it doesn't meet their criteria. This is especially damaging for sites that have, say, 10% great unique content while the other 90% of their pages are generated via tagging, browsable search, and variable-driven paragraphs of text.
Having citations is perfectly normal, but I would include a references section just in case.
-
You can have some duplicate content in the manner you described. It is a natural and expected part of the internet that existing sources of information will be reused.
There is no magic number that says "30% duplication is OK, but 31% is not." Google's algorithms are private and constantly changing. Use good sense to judge whether your page is unique and offers value to users.
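If you want a rough, do-it-yourself feel for how much of a page overlaps with its source, one common technique is comparing word shingles (n-grams). The snippet below is purely illustrative, with made-up text and an arbitrary shingle length; it is not how Google measures duplication:

```python
# Rough duplicate-content estimate using word shingles (n-grams).
# Purely illustrative; Google's own detection is private and far more involved.

def shingles(text: str, n: int = 5) -> set:
    """Return the set of n-word shingles in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def copied_ratio(page: str, source: str, n: int = 5) -> float:
    """Fraction of the page's shingles that also appear in the source."""
    page_shingles = shingles(page, n)
    if not page_shingles:
        return 0.0
    return len(page_shingles & shingles(source, n)) / len(page_shingles)

# Hypothetical example: a page that quotes a definition and adds its own commentary.
definition = ("Hypertension is a long-term medical condition in which the blood "
              "pressure in the arteries is persistently elevated.")
page = (definition + " Our clinic sees this every day; here is what the numbers "
        "mean for your treatment options and your daily routine.")

print(f"About {copied_ratio(page, definition):.0%} of the page's text is copied.")
```

Even so, the answer above stands: there is no ratio that flips a switch, so treat any number like this as a sanity check, not a target.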