Premium Content
-
Hey guys,
I'm working on a site that publishes hundreds of new pieces of content a day, and part of that content is only freely available for 30 days. After 30 days, the content is accessible only to premium users.
At that point, the page removes the content and replaces it with a log in / sign up option. Each page keeps the same URL and article title.
I have two concerns about this method:
- Is it healthy for the site to be removing tons of content from live pages and replacing it with a log in option?
- Should I worry about Panda for creating tons of pages with unique URLs but very similar source/content (the log in module and the text explaining that the article is only available to premium users)? The site is pretty big, so Google has some tolerance for things we can get away with.
Should I add a noindex attribute to those pages after 30 days? Even though it can take months until Google actually removes them from the index.
Is there a proper way to implement this kind of feature on sites that put content behind a log in after a period of time (First Click Free is not an option)?
Thanks guys, and I appreciate any help!
-
Can you show the beginning of the article after the 30 days are up? Similar to what New Scientist does: http://www.newscientist.com/article/mg22229720.600-memory-implants-chips-to-fix-broken-brains.html
-
I don't think "forcing" users to create an account, or pushing them to sign up, is the best option.
First, I think you may run into an issue down the road of having Google come back to re-index a page that is completely different or empty. As a solution, I would use JavaScript or another method to load the form you want the user to fill out, without a way to close it and view the page of content beneath (see the sketch below). Once they see the popup, have a link at the bottom of the box that states something light like:
"Oops! Looks like your 30-day trial subscription has run out. This content is only available to premium users. To sign up for a premium account, please fill out the form fields above. If you do not wish to upgrade your account at this time, click here to return to the home page."
Another option would be to allow restricted access to certain features, so that people actually want to sign up because it provides more value.
"The site is pretty big so google has some tolerance of things we can get away with it."
I doubt that. Google is not going to play favorites with a site because of its number of pages or unique URLs. Did someone at Google tell you this? If not, then no, no, no. Chances are you are going to have more non-registered or non-premium users trying to access your content than registered ones, which increases the number of pages carrying the removed or non-viewable content. Also, if you have that many pages, why risk losing them from the index?
-
One thing you can do is add an unavailable_after meta tag to the pages. This tells Google to drop the page from the index after a certain date. I don't think it is unhealthy; to be honest, it is the way a lot of sites work.
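For reference, this is roughly what that tag looks like; the date value below is illustrative (Google's documentation introduced it on a googlebot meta tag with an RFC 850 style date):

```html
<!-- Asks Google to drop this URL from its index after the given date/time.
     Set the date to 30 days after the article is published; the value
     shown here is just an example. -->
<meta name="googlebot" content="unavailable_after: 25-Aug-2014 15:00:00 EST">
```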
Also, for the thin content, you could send a 410 status code to anyone trying to open a page after it has gone behind the paywall, while at the same time displaying the login page that tells them what is happening. Google should take note of that.
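A minimal PHP sketch of that idea, assuming the article's publish date is available; `is_premium_user()`, `render_login_page()`, and the other helpers are hypothetical placeholders for whatever your platform provides:

```php
<?php
// Hypothetical helpers - swap in your platform's own functions.
$publishedAt = get_article_publish_date($articleId); // assumed lookup
$isExpired = (time() - $publishedAt) > 30 * 24 * 60 * 60; // older than 30 days?

if ($isExpired && !is_premium_user()) {
    http_response_code(410); // tell crawlers the free version is gone for good
    render_login_page();     // but still show visitors the premium sign-up prompt
    exit;
}

render_article($articleId);  // within 30 days, or a premium user: full content
?>
```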
Here is a video I saw posted yesterday on another question; around the 35-minute mark, Matt Cutts addresses this question.
Related Questions
-
Page with "random" content
Hi, I'm creating 300+ pages in the near future, on which the content will basically be as unique as it can be. However, on every refresh, including hits coming from a search engine referrer, I want the actual content, such as a listing of 12 businesses, to be displayed in a random order. So basically we've got 300+ "nearby" pages with unique content, and the overview of those listings, as I might call them, is displayed randomly on every hit. I've built an extensive script, and I disabled all caching for the PHP files behind these pages specifically; it works. But what about Google? The content of the pages will stay as it is; it's more that the listings are shuffled randomly to give every business listing a fair shot at a click, and so on. Anyone have experience with this? I've tried a few things in the past, like a "last update" month in the title via PHP, which sometimes isn't picked up very well.
Technical SEO | Vanderlindemedia
-
Duplicate content issue on Magento platform
We have a lot of duplicate pages (600 URLs) on our site (800 URLs total), which is built on the Magento e-commerce platform. We have the same products in a number of different categories to make it easy for people to choose which product suits their needs. If we enable the canonical fix in Magento, will it dramatically reduce the number of pages that are indexed? Surely with more pages indexed (even though they are duplicates) we get more search results visibility. I'm new to this particular SEO issue. What does the SEO community have to say on this matter? Do we go ahead with the canonical fix or leave it?
Technical SEO | PeterDavies
-
Can Googlebot crawl the content on this page?
Hi all, I've read Google's documentation on AJAX and JavaScript (https://support.google.com/webmasters/answer/174992?hl=en) and also this post: http://moz.com/ugc/can-google-really-access-content-in-javascript-really. I am trying to evaluate whether the content on this page, http://www.vwarcher.com/CustomerReviews, is crawlable by Googlebot. It appears not to be. I perused the sitemap and don't see any ugly AJAX URLs included, as Google suggests doing. Also, the page is definitely indexed, but it appears the content is only indexed via its original sources (Yahoo!, Citysearch, Google+, etc.). I understand why they are using this dynamic content: it looks nice to an end user and requires little to no maintenance. But is it providing them any SEO benefit? It appears to me that it would be far better to take these reviews and simply build them into the HTML. Thoughts?
Technical SEO | danatanseo
-
Duplicate Content Vs No Content
Hello! A question that has been thrown around a lot at our company is: "Is duplicate content better than no content?" We operate a range of online Flash game sites, most of which pull their games from a feed that includes each game's description. We have unique content written on the home page of each website, but aside from that, the game descriptions are the only text content on the site. We have been hit by both Panda and Penguin and are in the process of trying to recover from both. As part of this effort we are trying to decide whether to remove or keep the game descriptions. I figured the best way to settle the issue would be to ask here. I understand the best solution would be to replace the descriptions with unique content; however, that is a massive task when you've got thousands of games. So if you have to choose between duplicate content and no content, which is better for SEO? Thanks!
Technical SEO | Ryan_Phillips
-
API for testing duplicate content
Does anyone know a service, API, or PHP library that compares two (or more) pages and returns their similarity (level-3 shingles)? An API would be greatly preferred.
Technical SEO | Sebes
-
Blocking AJAX Content from being crawled
Our website has some pages whose content is shared from a third-party provider, implemented via AJAX. We don't want Google to crawl the third party's content, but we do want them to crawl and index the rest of the page. However, in light of Google's recent announcement about indexing this kind of content more effectively, I have some concern that we are at risk of that content being indexed. I have thought about X-Robots-Tag, but I'm concerned about implementing it on these pages because of the potential risk of Google not indexing the whole page. These pages get significant traffic for the website, and I can't risk that. Thanks, Phil
Technical SEO | AU-SEO
-
How do I get content to be indexed at the top?
I have a paragraph at the top of my homepage. I was told I could use CSS to make the content visually appear at the bottom of the page while it would still be indexed at the top of the page, giving it the same level of importance. Can anyone tell me how to do this?
Technical SEO | BradBorst
-
Multiple domain names with similar content
Hi, we've got multiple domains that point to the same website and the same content. The only difference is the currency and some text; you could say there is only about a 5% difference in each domain's content: http://www.redwrappings.com.au/ and http://www.redwrappings.com/. Will Google penalise us for having 95% similar content on each domain (they sell the same products but in different currencies)? We shouldn't really put a canonical link in, should we? Because 5% of the content is different, which means they are not identical. What would be the best solution if this is a problem? Thanks
Technical SEO | Essentia