Premium Content
-
Hey Guys
I'm working on a site that publishes hundreds of new pieces of content a day, and part of that content is only freely available for 30 days. After 30 days the content is only accessible to premium users.
After 30 days, the page removes the content and replaces it with a log in / sign up option. Each page keeps the same URL and article title.
I have 2 concerns about this method:
- Is it healthy for the site to be removing tons of content from live pages and replacing it with a login option?
- Should I worry about Panda for creating tons of pages with unique URLs but very similar source/content (the login module plus the text explaining that the article is only available to premium users)? The site is pretty big, so Google presumably gives us some tolerance for things we can get away with.
Should I add a noindex attribute to those pages after 30 days, even though it can take months until Google actually removes them from the index?
Is there a proper way to implement this kind of feature, where content goes behind a login after a period of time (First Click Free is not an option)?
Thanks Guys and I appreciate any help!
-
Can you show the beginning of the article after the 30 days are up, similar to what New Scientist does? http://www.newscientist.com/article/mg22229720.600-memory-implants-chips-to-fix-broken-brains.html
-
I don't think "forcing" users to create an account or sign up like this is the best option.
First, I think you may run into an issue in the future when Google comes back to reindex a page that is completely different or empty. As a solution, I would use JavaScript or another method to load the form you want the user to fill out, with no way to close it and view the content beneath. Once they see the popup, have a link at the bottom of the box that states something light like:
"Oops! Looks like your 30-day trial subscription has run out. This content is only available to premium users. To sign up for a premium account, please fill out the form fields above. If you do not wish to upgrade your account at this time, click here to return to the home page."
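The server-side half of that approach is just a date check. Here is a minimal sketch of the gating decision, with made-up names (`article_mode`, `FREE_WINDOW`) rather than anything from the poster's actual stack:

```python
from datetime import datetime, timedelta

# Hypothetical 30-day free window, matching the scheme described above.
FREE_WINDOW = timedelta(days=30)

def article_mode(published_at, is_premium_user, now=None):
    """Decide whether to render the full article or the paywall overlay."""
    now = now or datetime.utcnow()
    if is_premium_user or (now - published_at) <= FREE_WINDOW:
        return "full"
    return "paywall"
```

The template would then render the article normally in `"full"` mode and the non-dismissable signup overlay in `"paywall"` mode.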
Another option would be to allow restricted access to certain features, so that people actually want to sign up because it provides more value.
"The site is pretty big so google has some tolerance of things we can get away with it."
I doubt that. Google is not going to play favorites with a site because of its number of pages or unique URLs. Did someone at Google tell you this? If not, then no, no, no. Chances are you will have more non-registered or non-premium users trying to access your content than registered ones, which increases the number of pages showing the deleted or non-viewable content. Also, if you have that many pages, why risk losing them from the index?
-
One thing you can do is add an unavailable_after meta tag to the pages. This tells Google to drop the page from the index after a certain date. I don't think it is unhealthy; it is the way a lot of sites work, to be honest.
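A sketch of generating that tag from the publish date (the helper name is made up; Google accepts common date formats for unavailable_after, and an RFC 822-style stamp is used here):

```python
from datetime import datetime, timedelta

def unavailable_after_tag(published_at):
    # Point Googlebot at the end of the 30-day free window so the page
    # is dropped from the index once it goes premium-only.
    cutoff = published_at + timedelta(days=30)
    stamp = cutoff.strftime("%d %b %Y %H:%M:%S GMT")
    return f'<meta name="robots" content="unavailable_after: {stamp}">'
```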
Also, for the thin content: you should try to send a 410 status code to people trying to open the page after it goes behind the paywall, while still displaying the login page that tells them what is happening. Google should take note of that.
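In other words, the status code and the visible page are set independently. A minimal sketch of that idea, with a hypothetical helper and placeholder markup:

```python
def paywalled_response(is_premium_user):
    """Once the free window ends: premium users get 200 and the article,
    everyone else gets HTTP 410 with the login/upgrade page as the body."""
    if is_premium_user:
        return 200, "<article>...full content...</article>"
    body = ("<h1>This article is now premium-only</h1>"
            "<p>Log in or upgrade your account to keep reading.</p>")
    return 410, body
```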
Here is a video I saw posted yesterday on another question; around the 35-minute mark Matt Cutts addresses this exact question.