Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Duplicate Content behind a Paywall
-
We have a website that is publicly visible. This website has content.
We'd like to take that same content, put it on another website, behind a paywall.
Since Google will not be able to crawl those pages behind the paywall, is there any risk to us doing this?
Thanks!
Mike
-
Hi Mike, just to be clear on what Thomas is suggesting, as I think he might be getting mixed up between noindex and robots.txt.
If you simply add noindex,nofollow to a bunch of pages, this could still get you in trouble. Noindex doesn't mean DO NOT CRAWL; it means DO NOT INDEX. There's a big difference.
If something has noindex, Google can still crawl that content, but they won't put it in the search results.
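As a quick illustration of the point above: noindex is just a meta tag (or an equivalent X-Robots-Tag HTTP header) on the page itself, so Googlebot can only obey it if it's allowed to fetch the page in the first place.

```html
<!-- Placed in the <head> of a page you want crawled but not indexed.
     Googlebot must be able to fetch the page to see this directive. -->
<meta name="robots" content="noindex, nofollow">
```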
The only way to completely make sure that Google won't crawl content is by blocking it in robots.txt or, in your case, putting it behind a username and password.
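For completeness, a robots.txt crawl block looks like the sketch below (the /members/ path is a hypothetical placeholder for the paywalled section). One caveat worth knowing: robots.txt blocks crawling, not indexing, so a blocked URL can still appear in results (without a snippet) if other sites link to it - which is why the login wall is the more watertight option here.

```text
# robots.txt at the site root - tells compliant crawlers not to fetch these URLs.
User-agent: *
Disallow: /members/
```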
So to answer your question: yes, it's fine as long as it's behind a login. Google can't penalize you for it, since they can't see it.
I hope this helps,
Craig
-
Make sure you set the pages to noindex,nofollow; then there's no risk of Google penalizing you for duplicate content whatsoever.
Build away. I hope I have helped you.
Sincerely,
Thomas