Premium Content
-
Hey guys,
I'm working on a site that publishes hundreds of new pieces of content a day, and part of that content is only freely available for 30 days. After 30 days, the content is only accessible to premium users.
At that point, the page removes the content and replaces it with a log in / sign up option. Each page keeps the same URL and article title.
I have two concerns about this method:
- Is it healthy for the site to remove tons of content from live pages and replace it with a login option?
- Should I worry about Panda for creating tons of pages with unique URLs but very similar source/content (the login module plus the text explaining that the article is only available to premium users)? The site is pretty big, so Google has some tolerance for what we can get away with.
Should I add a noindex attribute to those pages after 30 days, even though it can take months until Google actually removes them from the index?
Is there a proper way to implement this kind of time-delayed login wall on a site (First Click Free is not an option)?
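The 30-day cutoff itself is simple to model server-side; a minimal sketch (function and parameter names are hypothetical, not from the site in question) of deciding whether a page should show the article body or the sign-up wall:

```python
from datetime import datetime, timedelta

FREE_WINDOW = timedelta(days=30)  # articles are freely readable for 30 days

def is_paywalled(published_at: datetime, now: datetime, is_premium_user: bool) -> bool:
    """Return True when the page should show the login/sign-up wall
    instead of the article body."""
    if is_premium_user:
        return False  # premium users always see the content
    return now - published_at > FREE_WINDOW
```

The same check can also drive which meta tags and status code the page emits, so the crawler-facing and user-facing behavior stay in sync.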
Thanks Guys and I appreciate any help!
-
Can you show the beginning of the article after the 30 days are up? Similar to what New Scientist does: http://www.newscientist.com/article/mg22229720.600-memory-implants-chips-to-fix-broken-brains.html
-
I don't think "forcing" users to create an account, or pushing them to sign up this way, is the best option.
First, you may run into an issue down the road where Google comes back to reindex a page that is completely different or empty. As a solution, I would use JavaScript or another method to load the form you want the user to fill out, without a way to close it and view the content beneath. Once they see the popup, have a link at the bottom of the box that states something light like:
"Oops! Looks like your 30-day trial subscription has run out. This content is only available to premium users. To sign up for a premium account, please fill out the form fields above. If you do not wish to upgrade your account at this time, click here to return to the home page."
Another option would be to keep some features open while restricting others, so that people actually want to sign up because the premium account provides more value.
"The site is pretty big, so Google has some tolerance for what we can get away with."
I doubt that. Google is not going to play favorites with a site because of its number of pages or unique URLs. Did someone at Google tell you this? If not, then no, no, no. Chances are you will have more non-registered or non-premium users trying to access your content than registered ones, which increases the number of pages showing the removed or non-viewable content. And if you have that many pages, why risk losing them from the index?
-
One thing you can do is add an unavailable_after meta tag to the pages. This tells Google to drop the page from the index after a certain date. I don't think the approach is unhealthy; it is the way a lot of sites work, to be honest.
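For reference, unavailable_after is a robots meta tag whose value is a date; since the free window is known at publish time, the tag can be generated up front. A sketch (helper name is hypothetical; Google accepts several common date formats, RFC 822 used here):

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def unavailable_after_tag(published_at: datetime, free_days: int = 30) -> str:
    """Build a robots meta tag telling Googlebot to drop the page from
    the index once the free window ends."""
    drop_date = published_at + timedelta(days=free_days)
    return ('<meta name="googlebot" '
            f'content="unavailable_after: {format_datetime(drop_date)}">')

print(unavailable_after_tag(datetime(2014, 6, 1, tzinfo=timezone.utc)))
```

Note that, as mentioned above, removal is not instant: the tag is only honored when the page is next recrawled.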
As for the thin content, you could send a 410 status code to anyone trying to open the page once it is behind the paywall, while still displaying the login page that tells them what is happening. Google should take note of that.
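The 410 status and the login page are not mutually exclusive: a response can carry a 410 status and still render an HTML body for human visitors. A minimal WSGI sketch of that idea (the `article.expired` environ key is a hypothetical flag your routing layer would set, not a standard):

```python
def paywall_app(environ, start_response):
    """Serve the article with 200 while it is free; afterwards, serve the
    login/upgrade page with a 410 Gone status so crawlers get the removal
    signal while visitors still see an explanation."""
    expired = environ.get("article.expired", False)  # set upstream per article
    if expired:
        start_response("410 Gone", [("Content-Type", "text/html")])
        return [b"<h1>This article is now premium-only</h1>"
                b"<p>Log in or sign up to keep reading.</p>"]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<h1>Article</h1><p>Full content here.</p>"]
```

Whether Google treats a 410-with-body this way in practice is worth testing on a handful of pages first; the status code, not the body, is what signals removal.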
Here is a video I saw posted yesterday on another question; around the 35-minute mark, Matt Cutts addresses this question.