Duplicate Content behind a Paywall
-
We have a website that is publicly visible. This website has content.
We'd like to take that same content, put it on another website, behind a paywall.
Since Google will not be able to crawl those pages behind the paywall, is there any risk to us in doing this?
Thanks!
Mike
-
Hi Mike, just to be clear on what Thomas is suggesting, as I think he might be getting mixed up between noindex and robots.txt.
If you simply add noindex,nofollow to a bunch of pages, this could still get you in trouble. Noindex doesn't mean DO NOT CRAWL, it means DO NOT INDEX. There's a big difference.
If something has noindex, Google can still crawl that content but they won't put it in the search results.
The only way to completely make sure that Google won't crawl content is by blocking it in robots.txt or, in your case, putting it behind a username and password.
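To make the crawl-vs-index distinction concrete, here is a minimal sketch (the /members/ path is a hypothetical example, not from Mike's site). A robots.txt rule stops Googlebot from fetching the URLs at all:

```
# robots.txt at the site root — blocks crawling:
# Googlebot never fetches anything under /members/
User-agent: *
Disallow: /members/
```

By contrast, noindex lives on the page itself, so the page must be crawled before the directive is even seen:

```html
<!-- On each page: still crawlable, but kept out of the search results -->
<meta name="robots" content="noindex,nofollow">
```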
So to answer your question: yes, it's fine as long as it's behind a login. Google can't punish you for it since they can't see it.
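For the login approach, a password prompt at the server level is enough. As a sketch, HTTP Basic Auth on Apache (the file paths here are assumptions — adjust for your server) returns a 401 to every visitor, Googlebot included, until credentials are supplied:

```
# .htaccess in the paywalled directory (Apache with mod_auth_basic enabled)
AuthType Basic
AuthName "Members Only"
# Hypothetical location for the password file created with the htpasswd tool
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```

Any server-side login scheme works the same way for this purpose; the point is that the content is never served to an unauthenticated request.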
I hope this helps,
Craig
-
Make sure you set the pages to nofollow and noindex; then there's no risk of Google penalizing you for duplicate content whatsoever.
Build away. I hope I have helped you.
Sincerely,
Thomas
Related Questions
-
Duplicate Content & Tags
I've recently added tags to my blog posts so that related blog posts are suggested to visitors. My understanding was that my robots.txt was handling duplicate content, so I thought it wouldn't be an issue, but after Moz crawled my site this week it reported 56 issues of duplicate content in my blog. I'm using Shopify, so I can edit the robots.txt file, but is my understanding correct that if there are 2 or more tags then they will be ignored? I've searched the Shopify documents and forum and can't find a straight answer. My understanding of SEO is fairly limited.
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
Content Development | | Tangled
-
Duplicate Blog Content
Hey Mozzers! I have a client who is a dentist with multiple offices, and he wants to use an identical blog post (including images, alt text, text, tags, pretty much everything) from one of his office's websites on his other office's website, to save time and mimic the success of the original content. Everything I've researched says this is a HUGE no-no, but I'd love to hear if anyone else has tried to do something like this and whether they were successful (implementing rel=canonical or a 301?). Also, if he is the owner of both sites and they both receive low traffic, will Google even notice? My biggest worry is that if I posted the content on his other site, identically, it would dilute the visibility of the original post, which has surpassed and is continuing to surpass our organic search goals... The main goal, though, is to drive traffic to BOTH sites via organic search using the same content. Would love to hear everyone's opinions on whether this is possible or unrealistic... Thanks! -D
Content Development | | Derrald
-
Why is content getting longer?
I find it odd that, with the way life is today -- the gotta-have-it-now, instant gratification, can't hold someone's attention for longer than 3 seconds -- Google wants content to be REALLY long. I've read articles saying content should be as long as 2,000 words per page. This just seems nuts to me. No one wants to read anymore. Look at how short Twitter posts are and how prevalent videos are now. Any thoughts?
Content Development | | SEOhughesm
-
Content Syndication Service
I'm curious to get the forum's opinion on content syndication services like SYNND. Has anyone tried them or any other content syndication networks? Does the forum have any recommendations for good-quality syndication networks that can be used to distribute content to quality sources?
Content Development | | SEO5Team
-
Marking our content as original, where the rel=author tag might not be applied
Hello, can anyone tell me if it is possible to protect text-type content without the rel=author tag? We host a business listing site where, apart from the general contact information, we have also started to write original, unique 800+ character contents for the suppliers where we expect visits, so rankings should increase. My issue is that this is a very competitive business, and content scraping is really an everyday practice. Of course, I would like to keep my original content, or at least mark it as mine for Google. The easiest way would be the author tag, but the problem is that I do not want our names and our photos to be assigned to these contents, because on the one hand we are not acknowledged content providers on our own (no bio and whatsoever), and on the other hand we provide contents for every sort of business, so just having additional links to our other contents might not help readers get what they want. I also really do not think that a photo of me could help increase the CTR from the SERP :) What we currently do is submit every major fresh content through URL submission in WMT, hoping that first indexing might help. We have only a handful of them within a day, so not more than 10. Yes, I could perhaps use absolute links, but this is not a feasible scenario in all cases, and about DMCA, as our programmer says, what you can see on the internet you can basically own. So finally, I do not mind our contents being stolen, as I can't possibly prevent this. I want, however, our original content to be recognized as ours by Google, even after the stealing is done. (Best would be an 'author tag for business', connected to our business Google+ page, but I am not aware that this function can be used this way.) Thank you in advance to all of you sharing your thoughts with me on the topic.
Content Development | | Dilbak
-
What is the best way to get around duplicate content when you are advertising exactly the same content on two different sites?
I am currently trying to improve exposure for an online degrees website, but the content for the degree program pages is exactly the same as on the company's main website. What would you suggest for getting around the duplicate content issue, as a lot of the curriculum content will obviously be the same for each module, etc.? Thanks
Content Development | | BeattieGroup
-
Duplicate content - 6 websites, 1 IP. Is the #1 site knocked down too?
Yes I know, running multiple websites on 1 IP isn't smart. 6 websites with duplicate content on 1 IP is even worse. It's a technical issue we can't solve quickly. Thing is, our #1 website, which has the highest DA and PR, was the first website with all this content. All the other websites we're running were launched a few months, and some a few years, later. All content was copied from the #1 website. I'd say the other websites would get knocked down by Google, because they duplicated the content. Google should see that our #1 website was the first to upload this content, therefore our #1 website should rank normally. Question is: what does Google think of duplicate content when all websites are on 1 IP? Is, or will, our #1 website get punished as well?
Content Development | | Webprint
-
Duplicate content?
I am not understanding this - I see a duplicate content warning. When I look into it I see these two URLs: http://search-engine-upgrade.com and http://search-engine-upgrade.com/default.asp (NOT a blog)
Content Development | | dcmike