Duplicate Terms of Use and Privacy Policy, is it a problem?
-
Hi,
If I use the same terms of use and privacy policy content across my websites, does that amount to a duplicate content issue? Does it affect my websites in any way?
Regards
-
Duplicate content is one of many hundreds of factors. If you have a very well crafted site, highly optimized, and with a very strong inbound link profile, but only a couple pages (ones that are not highly relevant to your primary topical focus) are duplicate, the potential negative impact on your overall rankings will be minimal.
This is true for most SEO factors. If any single factor has a flaw, but the flaw does not apply to the whole site, that single factor is going to have minimal impact on the overall site.
-
You can do almost anything you wish on a "noindex" tagged page. You are telling the search engine bot to exclude the page from the search index, so the page should not affect your ranking.
The reason your site's number of pages is a factor is that your site is evaluated as a whole. If you have a basic site with 10 pages, and 1 of those pages has duplicate content, then 10% of your site is affected, and this can impact how the search engine views your site. If your site hosts a forum with 10,000 pages, then that 1 page would represent 0.01% of your site, so the impact would be negligible.
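The dilution argument above is simple arithmetic, sketched here as a hypothetical helper (not any real ranking formula):

```python
def duplicate_share(duplicate_pages: int, total_pages: int) -> float:
    """Fraction of a site affected by duplicate-content pages."""
    return duplicate_pages / total_pages

# 1 duplicate page on a 10-page site: 10% of the site is affected.
print(duplicate_share(1, 10))       # 0.1
# 1 duplicate page on a 10,000-page forum: only 0.01% is affected.
print(duplicate_share(1, 10_000))   # 0.0001
```

The same single page goes from a tenth of a small site to a rounding error on a large one.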
-
Thanks for the helpful reply Alan! Can you please explain this - "If it's only a few pages, sure, duplicate content there could have an impact". How do duplicate content issues vary between small and large sites? I was under the impression that the number of pages does not have any influence on duplicate content.
Is it okay to use the same privacy policy and terms of use across different websites as long as I noindex,follow them?
-
How big is your site? If it's only a few pages, sure, duplicate content there could have an impact. But in reality, I expect your site is not primarily made up of keyword phrases that either of those pages would be optimized for, and that you have more than a few pages. If so, any "negative" aspect would not be severe.
Having said that, it really is best to just use a robots meta tag set to noindex,follow (my preference, instead of blocking completely in the robots.txt file).
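The tag in question is a standard robots meta element in the page head. One quick way to verify a page actually carries it is to parse the HTML with Python's standard library; a minimal sketch (the sample markup is hypothetical):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr = dict(attrs)
            if attr.get("name", "").lower() == "robots":
                content = attr.get("content", "")
                self.directives += [d.strip().lower() for d in content.split(",")]

# Hypothetical privacy-policy page: excluded from the index but still crawlable.
page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
parser = RobotsMetaParser()
parser.feed(page)
print(parser.directives)  # ['noindex', 'follow']
```

Note that for noindex to be seen at all, the page must not also be blocked in robots.txt, since a blocked page is never fetched.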
-
Thanks for the reply Ryan! But if I don't block them via the robots.txt file or a noindex tag, will it affect my site negatively? I mean the overall site.
-
I would recommend blocking pages such as privacy policy, terms of use, legal, etc. It is unlikely these pages would ever bring traffic to your site. Even if they did, it is not going to be the quality traffic you desire.
In robots.txt you can add:
Disallow: /pages/privacy/
(substitute your own path for /pages/privacy/)
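You can sanity-check a rule like this with Python's urllib.robotparser before deploying it; a sketch, where example.com and the /pages/privacy/ path are placeholders for your own site:

```python
from urllib.robotparser import RobotFileParser

# The rule suggested above, plus the user-agent line robots.txt requires.
rules = [
    "User-agent: *",
    "Disallow: /pages/privacy/",
]

rp = RobotFileParser()
rp.parse(rules)

# The privacy page is blocked; ordinary pages remain crawlable.
print(rp.can_fetch("*", "https://example.com/pages/privacy/"))  # False
print(rp.can_fetch("*", "https://example.com/products/"))       # True
```

This only tests the matching behavior of the rule; it does not contact any server.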