Premium Content
-
Hey Guys
I'm working on a site that publishes hundreds of new pieces of content a day, and part of that content is only freely available for 30 days. After 30 days the content is only accessible to premium users.
After 30 days, the page removes the content and replaces it with a log in / sign up option. Each page keeps the same URL and article title.
I have 2 concerns about this method:
- Is it healthy for the site to be removing tons of content from live pages and replacing it with a log in option?
- Should I worry about Panda, since this creates tons of pages with unique URLs but very similar source/content (the log in module and the text explaining that the article is only available to premium users)? The site is pretty big, so Google has some tolerance for things we can get away with.
Should I add a noindex attribute to those pages after 30 days, even though it can take months until Google actually removes them from the index?
Is there a proper way to implement this type of feature on sites that put content behind a log in after a period of time (First Click Free is not an option)?
Thanks guys, I appreciate any help!
-
Can you show the beginning of the article after the 30 days are up? Similar to what New Scientist does: http://www.newscientist.com/article/mg22229720.600-memory-implants-chips-to-fix-broken-brains.html
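If you go that route, here's a very rough sketch of what serving a teaser could look like, assuming an Express-style route and a made-up getArticle stub (the names are just for illustration, not any real CMS):

```typescript
import express from "express";

const app = express();
const THIRTY_DAYS_MS = 30 * 24 * 60 * 60 * 1000;

// Hypothetical data-layer stub: replace with however you actually load articles.
function getArticle(slug: string) {
  return {
    title: "Example article",
    paragraphs: ["Opening paragraph…", "Rest of the article…"],
    publishedAt: new Date("2014-06-01"),
  };
}

app.get("/articles/:slug", (req, res) => {
  const article = getArticle(req.params.slug);
  const expired = Date.now() - article.publishedAt.getTime() > THIRTY_DAYS_MS;

  // Same URL and title either way; expired articles show only the first
  // paragraph followed by the premium sign-up prompt.
  const visible = expired ? article.paragraphs.slice(0, 1) : article.paragraphs;
  const prompt = expired
    ? `<div class="paywall">The rest of this article is available to premium subscribers. Log in or sign up to keep reading.</div>`
    : "";

  res.send(`<h1>${article.title}</h1>${visible.map(p => `<p>${p}</p>`).join("")}${prompt}`);
});

app.listen(3000);
```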
-
I don't think this is the best way of "forcing" users to create an account or sign up.
First, I think you may run into an issue down the line when Google comes back to reindex a page that is completely different or empty. As a solution, I would use JavaScript or another method to load the form you want the user to fill out, without a way to close it and view the page of content beneath. Once they view the popup, have a link at the bottom of the box that states something light like:
"Oops! Looks like your 30 trial subscription has run out. This content is only available to premium users. To sign up for a premium account, please fill out the form fields above. If you do not wish to upgrade your account at this time, click here to return to the home page."
Another option would be to allow restricted access to certain features, so that people actually want to sign up because it provides more value.
"The site is pretty big so google has some tolerance of things we can get away with it."
I doubt that. Google is not going to play favorites with a site because of its number of pages or unique URLs. Did someone at Google tell you this? If not, then no, no, no. Chances are, you are going to have more non-registered or non-premium users trying to access your content than registered ones, thus increasing the number of pages that have the deleted or non-viewable content. Also, if you have that many pages, why risk losing them from the index?
-
One thing you can do is add an unavailable_after meta tag to the pages. This tells Google to drop the pages from the index after a certain date. I don't think it is unhealthy; it is the way a lot of sites work, to be honest.
As for the thin content, you could also send a 410 status code to people trying to open the page after it has gone behind the paywall, while at the same time displaying the login page that tells them what is happening. Google should take note of that.
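For what it's worth, here is a rough sketch of both ideas in one Express-style route, with made-up getArticle and renderLoginPage helpers; double-check the exact date format Google expects for unavailable_after against their documentation:

```typescript
import express from "express";

const app = express();
const THIRTY_DAYS_MS = 30 * 24 * 60 * 60 * 1000;

// Hypothetical stubs: swap in your real data layer and templates.
const getArticle = (slug: string) => ({
  title: "Example article",
  body: "<p>Full article body…</p>",
  publishedAt: new Date("2014-06-01"),
});
const renderLoginPage = (title: string) =>
  `<h1>${title}</h1><p>This article is now for premium users only. Please log in or sign up.</p>`;

app.get("/articles/:slug", (req, res) => {
  const article = getArticle(req.params.slug);
  const freeUntil = new Date(article.publishedAt.getTime() + THIRTY_DAYS_MS);

  if (Date.now() > freeUntil.getTime()) {
    // Behind the paywall: return 410 Gone, but still render the login page
    // so human visitors see what happened.
    res.status(410).send(renderLoginPage(article.title));
    return;
  }

  // During the free 30 days, tell Googlebot when to drop the page from the
  // index. Verify the date format Google accepts before relying on this.
  res.send(`<!doctype html>
    <html>
      <head>
        <meta name="googlebot" content="unavailable_after: ${freeUntil.toUTCString()}">
        <title>${article.title}</title>
      </head>
      <body>${article.body}</body>
    </html>`);
});

app.listen(3000);
```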
Here is a video I saw posted yesterday on another question; around the 35-minute mark, Matt Cutts addresses this question.