Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Duplicate Content behind a Paywall
-
We have a website that is publicly visible. This website has content.
We'd like to take that same content, put it on another website, behind a paywall.
Since Google will not be able to crawl the pages behind the paywall, is there any risk to us in doing this?
Thanks!
Mike
-
Hi Mike, just to be clear on what Thomas is suggesting, as I think he might be getting mixed up between noindex and robots.txt.
If you simply add noindex,nofollow to a bunch of pages, this could still get you in trouble. Noindex doesn't mean DO NOT CRAWL; it means DO NOT INDEX. There's a big difference.
If something has noindex, Google can still crawl that content, but they won't put it in the search results.
The only way to completely make sure that Google won't crawl content is to block it in robots.txt or, in your case, to put it behind a username and password.
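The crawling side of this distinction can be demonstrated with Python's standard-library robots.txt parser: a Disallow rule tells compliant crawlers not to fetch a path at all, whereas a noindex tag only takes effect on a page the crawler has already fetched. A minimal sketch (the example.com URLs and the /members/ path are hypothetical):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# A hypothetical robots.txt for the paywalled site:
rp.parse([
    "User-agent: *",
    "Disallow: /members/",
])

# Compliant crawlers will not even fetch the blocked path...
print(rp.can_fetch("Googlebot", "https://example.com/members/article"))  # False
# ...but everything else remains crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/public/article"))   # True
```

Note that robots.txt is advisory: it stops well-behaved crawlers, while a login wall stops everyone.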
So to answer your question: yes, it's fine as long as it's behind a login. Google can't punish you for content it can't see.
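To illustrate the point, here's a minimal sketch of a credential check on a paywalled article (the handler name, credentials, and response bodies are all hypothetical): any request without valid credentials, including one from a crawler, gets a 401 and never sees the duplicated content.

```python
import base64

# Hypothetical subscriber credentials, for illustration only.
VALID_AUTH = "Basic " + base64.b64encode(b"subscriber:secret").decode()

def serve_article(request_headers):
    """Return (status, body) for a paywalled article.

    Crawlers such as Googlebot send no Authorization header,
    so they receive a 401 and never see the article body.
    """
    if request_headers.get("Authorization") != VALID_AUTH:
        return "401 Unauthorized", "Please log in to read this article."
    return "200 OK", "Full article content (duplicated from the public site)."

# A crawler's request carries no credentials and is refused:
print(serve_article({}))
# A logged-in subscriber gets the content:
print(serve_article({"Authorization": VALID_AUTH}))
```

Because search engines only ever receive the 401 response, there is nothing for them to compare against the public site.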
I hope this helps,
Craig
-
Make sure you set the pages to noindex,nofollow and there's no risk of Google penalizing you for duplicate content whatsoever.
Build away. I hope I have helped you.
Sincerely,
Thomas
Related Questions
-
Reusing content on different ccTLDs
We have a client with many international locations, each of which has its own ccTLD domain and website, e.g. company-name.com, company-name.com.au, company-name.co.uk, company-name.fr, etc. Each domain/website only targets its own country, and the SEO aim is for each site to rank well only within its own country. We work for an individual country's operations, and the international head office wants to re-use our content on other countries' websites. While there would likely be some optimisation of the content for each region, there may be cases where it is re-used identically. We are concerned that this will cause duplicate content issues. I've read that the separate ccTLDs should indicate to search engines that content is aimed at the different locations - is this sufficient, or should we be doing anything extra to avoid duplicate content penalties? Or should we argue that they simply must not do this at all and develop unique content for each? Thanks, Julian
Content Development | Bc.agency
-
How to deal with lot of old content that doesn't drive traffic - delete?
Hi community, I hope someone can help me with this. We are migrating our e-commerce site next February, and I'm preparing the content migration. For a large part, exact copies of our product listing and product detail pages will be migrated.
However, we also have a lot of old blog content which, because of seasonality and trendiness, is outdated and doesn't drive traffic anymore. It is effectively worthless content, not only as a traffic driver but also in terms of internally driven traffic (both internal search and internal navigation), which is extremely low to non-existent. We have about 4,000+ blog posts, of which about 100 drive most of the traffic (mostly incited by e-mail and social campaigns, and by internal navigation promoted on important category landing pages during certain periods). Is it a bad signal to search engines to delete these old content pages, i.e. going from a content-rich to a content-poor site? Of course, I will migrate the top 100 traffic-earning pages and provide proper redirects to them.
Content Development | Marketing-Omoda
-
Images & Duplicate Content Issues
Here's a scenario for you: The site is running WordPress and the images are uploaded to the media section. You can set image attributes there such as the Description & Alt Tag. Let's say you'd like to reuse the same image in two different blog posts. The image keeps the same Description & Alt Tag associated with it in the media section. Would this be considered duplicate content? What would be the best practice in this case to reuse the same image in multiple posts?
Content Development | VicMarcusNWI
-
Multiple listings for the same product - how to avoid duplication?
Hi everyone, we are working on an e-commerce site that has three different physical stores, each of which holds different products in stock. A listing is created on the website for each product in each location, but if the same product is in all three stores, the images and description are duplicate content. We can't have one product page and tag locations to it, as this would require keeping two different stock systems updated, but is there a way to avoid the three pages being duplicates of each other? Many thanks for the advice!
Content Development | A_Q
-
Can We Publish Duplicate Content on Multi Regional Website / Blogs?
Today, I was reading Google's official article on multi-regional websites and the use of duplicate content. Right now, we are working on four different blogs for the following regions, and we're writing unique content for each blog. But I am thinking of using one piece of content for all four regional blogs.
USA: http://www.bannerbuzz.com/blog/
UK: http://www.bannerbuzz.co.uk/blog/
AUS: http://www.bannerbuzz.com.au/blog/
CA: http://www.bannerbuzz.ca/blog/
Let me give you a very clear idea of it. Recently, we published one article on the USA website: http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/ We want to publish this article on the UK, AUS & CA blogs without making any changes. I have read the following paragraph in Google's official guidelines, and it inspires me to make it happen. Which is the best solution for this?
Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers.
Content Development | CommercePundit
-
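The rel-alternate-hreflang guidance quoted in the question above translates to markup like the following on each regional version of the post. This is a minimal sketch: the article path comes from the question, but the en-* region codes are assumptions.

```html
<!-- Place the full set of alternates in the <head> of every regional version,
     including a self-referencing entry. -->
<link rel="alternate" hreflang="en-us" href="http://www.bannerbuzz.com/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-gb" href="http://www.bannerbuzz.co.uk/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-au" href="http://www.bannerbuzz.com.au/blog/choosing-the-right-banner-for-your-advertisement/" />
<link rel="alternate" hreflang="en-ca" href="http://www.bannerbuzz.ca/blog/choosing-the-right-banner-for-your-advertisement/" />
```

With these annotations in place, identical English content on the four ccTLDs should be treated as regional variants of one article rather than as competing duplicates.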
Duplicate Content
I have a service-based client who is interested in optimizing his website for all the services he provides, in all the locations he provides them in. For example:
Service 1, location 1
Service 1, location 2
Service 2, location 1
Service 2, location 2
He wants to essentially create an individual page for each of the above, but I'm concerned that he will be penalized for duplicate content. Each of the pages would have the keyword in the URL, page title, and within the main body of content. We would certainly alter the content somewhat, but I'm not sure how much of a difference this would make. Any thoughts or advice would be greatly appreciated.
Content Development | embracedarrenhughes
-
Is there a tool for measuring content freshness?
I.e., crawling a site to identify the last date of new or changed content? Thanks.
Content Development | PeterTroast
-
Duplicate external links?
I have been guest posting at a variety of reputable blogs in my niche. I generally write once or twice a month and have a bio link with a link to my blog. I'm wondering if multiple links from the same domain (but different pages) helps, or if there are some diminishing returns here. Should I only be writing one post for them? Of course, there are other non-SEO benefits too, because these are reputable sites. But I'm wondering how this helps my SEO? Thanks in advance!
Content Development | JodiFTM