Duplicate Terms of Use and Privacy Policy, is it a problem?
-
Hi,
If I use the same terms of use and privacy policy content across my websites, does that amount to a duplicate content issue? Does it affect my websites in any way?
Regards
-
Duplicate content is one of many hundreds of factors. If you have a very well-crafted site, highly optimized and with a very strong inbound link profile, but only a couple of pages (ones that are not highly relevant to your primary topical focus) are duplicates, the potential negative impact on your overall rankings will be minimal.
This is true for most SEO factors: if any single factor has a flaw, but the flaw does not apply to the whole site, that single factor is going to have minimal impact on the site overall.
-
You can do almost anything you wish on a page tagged "noindex". You are telling the search engine bot to exclude the page from the search index, so the page should not affect your rankings.
The reason your site's number of pages is a factor is that your site is evaluated as a whole. If you have a basic site with 10 pages and 1 of them has duplicate content, then 10% of your site is affected, and that can influence how the search engine views your site. If your site hosted a forum with 10,000 pages, that one page would represent just 0.01% of your site, so it would have no real effect.
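For context, the "noindex" tag referred to here is the standard robots meta tag placed in a page's HTML head. A minimal sketch (standard HTML, nothing site-specific assumed):
<!-- inside the <head> of the page you want kept out of the index -->
<meta name="robots" content="noindex">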
-
Thanks for the helpful reply, Alan! Can you please explain this: "If it's only a few pages, sure, duplicate content there could have an impact." How do duplicate content issues vary between small and big sites? I was under the impression that the number of pages has no influence on duplicate content.
Is it okay to use the same privacy policy and terms of use across different websites as long as I noindex,follow them?
-
How big is your site? If it's only a few pages, sure, duplicate content there could have an impact. But in reality, I expect your site is not primarily made up of keyword phrases that either of those pages would be optimized for, and that you have more than a few pages. If so, any "negative" aspect would not be severe.
Having said that, it really is best to just use a robots meta tag set to noindex,follow (my preference, rather than blocking the pages completely in the robots.txt file).
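One reason the meta tag is often preferred over a robots.txt block: a Disallow rule stops the bot from fetching the page at all, so it never sees a noindex directive, and the URL can still surface in the index if other sites link to it. The tag itself is a single line in the page's head; the follow part tells bots to keep following the page's links so internal link value still flows:
<!-- exclude the page from the index, but keep following its links -->
<meta name="robots" content="noindex,follow">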
-
Thanks for the reply, Ryan! But if I don't block these pages via the robots.txt file or a noindex tag, will it affect my site negatively? I mean the site overall.
-
I would recommend blocking pages such as the privacy policy, terms of use, legal pages, etc. It is unlikely these pages would ever bring traffic to your site, and even if they did, it would not be the quality traffic you desire.
In robots.txt you can add:
Disallow: /pages/privacy/
(substitute your site's actual path for /pages/privacy/)
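As a fuller sketch covering all of the legal pages mentioned above, a complete rule group needs a User-agent line followed by one Disallow per path (the paths here are hypothetical; adjust them to wherever these pages live on your site):
User-agent: *
# hypothetical paths for the legal pages
Disallow: /pages/privacy/
Disallow: /pages/terms/
Disallow: /pages/legal/
Keep in mind that robots.txt only blocks crawling; as noted earlier in the thread, a blocked URL can still be indexed if other sites link to it, so the noindex,follow meta tag is often the safer choice for pages like these.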