Duplicate Terms of Use and Privacy Policy: is it a problem?
-
Hi,
If I use the same terms of use and privacy policy content across my websites, does that amount to a duplicate content issue? Does it affect my websites in any way?
Regards
-
Duplicate content is one of many hundreds of factors. If you have a very well crafted, highly optimized site with a very strong inbound link profile, and only a couple of pages (ones that are not highly relevant to your primary topical focus) are duplicated, the potential negative impact on your overall rankings will be minimal.
This is true for most SEO factors. If any single factor has a flaw, but the flaw does not apply to the whole site, that factor will have minimal impact on the site overall.
-
You can do almost anything you wish on a "noindex" tagged page. You are telling the search engine bot to exclude the page from the search index, so the page should not affect your ranking.
The reason your site's number of pages is a factor is that your site is evaluated as a whole. If you have a basic site with 10 pages and one of them has duplicate content, then 10% of your site is affected, and that can influence how a search engine views your site. If your site hosted a forum with 10,000 pages, that one page would represent just 0.01% of your site, so the impact would have no real effect.
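The proportions above reduce to a simple ratio; a trivial sketch for the two example site sizes:

```python
def affected_share(duplicate_pages: int, total_pages: int) -> float:
    """Fraction of a site made up of duplicate-content pages."""
    return duplicate_pages / total_pages

# One duplicate page on a 10-page site vs. a 10,000-page forum.
print(f"{affected_share(1, 10):.2%}")      # 10.00%
print(f"{affected_share(1, 10_000):.2%}")  # 0.01%
```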
-
Thanks for the helpful reply, Alan! Can you please explain this: "If it's only a few pages, sure, duplicate content there could have an impact." How do duplicate content issues vary between small and large sites? I was under the impression that the number of pages has no influence on duplicate content.
Is it okay to use the same privacy policy and terms of use across different websites as long as I noindex,follow them?
-
How big is your site? If it's only a few pages, sure, duplicate content there could have an impact. But in reality, I expect your site is not primarily made up of keyword phrases that either of those pages would be optimized for, and that you have more than a few pages. If so, any "negative" aspect would not be severe.
Having said that, it really is best to just use a robots meta tag set to noindex,follow (my preference, instead of blocking the pages completely in the robots.txt file).
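For reference, the meta tag described here goes in the page's `<head>`; a minimal sketch:

```html
<head>
  <!-- Ask search engines to drop this page from the index
       but still follow the links on it -->
  <meta name="robots" content="noindex,follow">
</head>
```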
-
Thanks for the reply, Ryan! But if I don't block it via the robots.txt file or a noindex tag, will it affect my site negatively? I mean the site overall.
-
I would recommend blocking pages such as privacy policy, terms of use, legal, etc. It is unlikely these pages would ever bring traffic to your site. Even if they did, it is not going to be the quality traffic you desire.
In robots.txt you can add:
User-agent: *
Disallow: /pages/privacy/
Substitute your own path for /pages/privacy/. (Note that a Disallow rule only takes effect under a User-agent line.)
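To sanity-check a rule like that before deploying it, Python's standard library can parse robots.txt text directly (the blocked path is the example above; the allowed path is a hypothetical placeholder):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content with the Disallow rule above.
rules = """User-agent: *
Disallow: /pages/privacy/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The disallowed path is blocked; other pages stay crawlable.
print(parser.can_fetch("*", "/pages/privacy/"))   # False
print(parser.can_fetch("*", "/products/widget"))  # True
```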