Duplicate Terms of Use and Privacy Policy, is it a problem?
-
Hi,
If I use the same terms of use and privacy policy content across my websites, does it amount to a duplicate content issue? Does it affect my websites in any way?
Regards
-
Duplicate content is one of many hundreds of factors. If you have a very well crafted, highly optimized site with a very strong inbound link profile, and only a couple of pages (ones that are not highly relevant to your primary topical focus) contain duplicate content, the potential negative impact on your overall rankings will be minimal.
This is true for most SEO factors. If any single factor has a flaw, but it's not a flaw that applies to the whole site, that single factor is going to have minimal impact on the overall site.
-
You can do almost anything you wish on a "noindex" tagged page. You are telling the search engine bot to exclude the page from the search index, so the page should not affect your ranking.
The reason your site's number of pages is a factor is that your site is evaluated as a whole. If you have a basic site with 10 pages and one of them has duplicate content, then 10% of your site is affected, and that can influence how the search engine views your site. If your site hosted a forum with 10,000 pages, that one page would represent only 0.01% of your site, so it would have no real effect.
-
Thanks for the helpful reply Alan! Can you please explain this: "If it's only a few pages, sure, duplicate content there could have an impact". How do duplicate content issues vary between small and large sites? I was under the impression that the number of pages has no influence on duplicate content.
Is it okay to use the same privacy policy and terms of use across different websites as long as I noindex,follow them?
-
How big is your site? If it's only a few pages, sure, duplicate content there could have an impact. But in reality, I expect your site is not primarily made up of keyword phrases that either of those pages would be optimized for, and that you have more than a few pages. If so, any "negative" aspect would not be severe.
Having said that, it really is best to just use a robots meta tag set to noindex,follow (my preference, rather than blocking the pages completely in the robots.txt file).
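For reference, a minimal sketch of what that tag looks like in a page's HTML head (assuming a privacy policy page you want kept out of the index):

<head>
<!-- Keep this page out of the search index, but let crawlers follow its links -->
<meta name="robots" content="noindex,follow">
</head>

The tag goes on each page you want excluded; the rest of the site is unaffected.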
-
Thanks for the reply Ryan! But if I don't block it via the robots.txt file or a noindex tag, will it affect my site negatively? I mean the overall site.
-
I would recommend blocking pages such as privacy policy, terms of use, legal, etc. It is unlikely these pages would ever bring traffic to your site, and even if they did, it would not be the quality traffic you desire.
In robots.txt you can add
Disallow: /pages/privacy/
substitute your local path for /pages/privacy/
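As a sketch (the /pages/... paths are placeholders; substitute your own), note that a Disallow rule only takes effect inside a User-agent group, so a complete robots.txt entry might look like this:

User-agent: *
# Block crawling of the legal pages
Disallow: /pages/privacy/
Disallow: /pages/terms/

Keep in mind that a robots.txt Disallow stops crawling rather than guaranteeing removal from the index, which is why the noindex,follow meta tag mentioned above is often the safer choice for pages like these.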
Related Questions
-
Same content on site blog as a separate blog. Will unpublishing on one blog avoid duplicate content issues?
I just discovered my client was posting the same content as the site I'm working on for him on a separate blog. I don't want to run into duplicate content issues. Both are WordPress sites. Will it suffice to simply unpublish the duplicate entries on the other blog and leave the posts as drafts?
Content Development | locallyrank0
-
Duplicate content penalty
Hi there, I'd like to ensure I avoid a duplicate content penalty and could do with some advice. There is a popular blogger in my industry. I have agreed to add his blog to my website. He currently posts his blog on one of the popular free blogger platforms, and will continue to do this. The issue is that I will be posting duplicate content onto my site and I want to ensure that I do not trigger a Google penalty. Is there a simple way for me to inform Google of the original source of the content? My initial thoughts are: 1. Add a noindex to the Robots.txt file 2. Add a link at the beginning of the article pointing to the original source 3. Adding a rel=canonical tag in the header of each blog entry pointing to the original blog post which resides on a completely different domain. Thanks DBC
Content Development | DBC011
-
What is your take on using Google images for blogs
In your opinion, do you think it is okay to use Google images for your blog posts as long as it credits or includes a link to the site on which you located it? We tried looking at stock art for some license-free photos and had no luck in coming up with good pictures. What rules or guidelines do you follow when looking for pictures for your customers' blogs (or your own)? A lot of our posts have to do with "what to do in this city" or "where to go in this city". Do you think it's okay to use Google images for these types of posts?
Content Development | qlkasdjfw0
-
Use Of H1 Tags
I just have a quick question. I have seen a few people mention that the use of more than one H1 tag has little bearing on things. I am trying to sort out a site for a friend and it has two H1 tags. My issue is that it is ranking for a quite competitive keyword (I don't know how, as the whole site layout is a total mess), but in my efforts to clean up the site I am wondering if I should reduce the number of H1 tags to one or leave it as is for fear of breaking something. Thanks Paul
Content Development | propertyhunter0
-
Duplicate external links?
I have been guest posting at a variety of reputable blogs in my niche. I generally write once or twice a month and have a bio link with a link to my blog. I'm wondering if multiple links from the same domain (but different pages) helps, or if there are some diminishing returns here. Should I only be writing one post for them? Of course, there are other non-SEO benefits too, because these are reputable sites. But I'm wondering how this helps my SEO? Thanks in advance!
Content Development | JodiFTM0
-
301 Redirect & Duplicate Content
We currently have 16465 audiobook products presented at our Web store. 5411 of them are out-of-publication (OOP). Here's an example: Harry Potter Audiobook 2 : Harry Potter and the Chamber of Secrets - J.K. Rowling - cassette audiobook Many of the 5411 OOP products are duplicates and triplicates of one title but were offered on a different medium (cassette, CD or MP3 CD) or were a different type (abridged, unabridged, dramatized). The description (story-line) is the same for all. Because we know once a page gets on the Internet, it can live there for years, we decided to keep OOP product pages at our Web store to: Let those who may have searched for the product and clicked on a link to an OOP product's page know that it was no longer available. Invite them to explore our Web store. Let them know that although the product may not be available on cassette, CD or MP3 CD, it might be available as a digital download. We know that Google does NOT like duplicate content from one site to another and even within the same site. If we redirect all the 5411 pages to one OOP page, will this eliminate this duplicate content issue? The OOP page would explain that the title they were looking for is no longer available but that it might be available as a digital download.
Content Development | lbohen0
-
Duplicate Page Content WordPress blog with categories?
Just got a crawl report back from SEOmoz and it gives me lots of errors for "duplicate page content". Upon investigating, I notice this is because my WP blog is set up into categories, so the home page is almost identical to one of the category pages. None of my actual posts are the same, but the category pages have some overlap since the same post could show up in two or more categories. Is this a problem or can I just ignore this error? Anything I should be doing differently? Thanks!
Content Development | frankthetank20
-
Duplicate content on the homepage
Hello, SEOmoz is giving me an error about duplicated content on my site. When viewing the details it is showing the following as duplicated content: domain.co.uk/ domain.co.uk domain.co.uk/index.html Obviously these are the same page. Why is it seeing them as separate? Does anyone know how I can resolve this issue? Many thanks
Content Development | lcdesign0