Membership/subscriber (customer-only) content and SEO best practice
-
Hello Mozzers, I was wondering whether there's any best-practice guidance out there on how to deal with membership/subscriber (existing-customer) only content on a website, from an SEO perspective.
A few SEOs have told me to make some of the content visible to Google for SEO purposes, yet I'm really not sure whether this is acceptable or manipulative, and I don't want to upset Google (or users, for that matter!)
Thanks in advance, Luke
-
I'd say it's mostly transferable, as plenty of content is found in both News and the main index. News is more of a service overlay that tries to better handle user expectations for frequency and speed of response on news items. Still, old news gets into the main index and is treated like content from almost any other site, so if you have a subscription-based model that aligns with what they recommend for more news-oriented sites, at least you're fitting into a form of what they outline.
-
Everything I could find was related to Google News, not the main index. Is it directly transferable? Especially given it's the *oldest* content that's going to end up behind the paywall in my example.
-
As an example, the New York Times does this by tracking how many full articles a user reads while allowing Googlebot full access to its articles. Sites that use this method apply a "noarchive" directive so articles can't be read from Google's cached copy, plus various forms of tracking to ensure users are counted correctly. Here are some thoughts on this and more from Google's side that might help you out: https://support.google.com/news/publisher/answer/40543. Cheers!
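To make the metered approach concrete, here's a minimal sketch of the decision logic (hypothetical, not the NYT's actual implementation): each anonymous visitor gets a quota of free article views per month (tracked in production via a signed cookie), while verified crawlers always receive the full article. The quota value and function names here are illustrative assumptions.

```python
# Hypothetical metered-paywall gate: anonymous visitors get a monthly
# quota of free views; verified crawlers always see the full article,
# which keeps the page Googlebot sees identical to what a subscriber
# (or a visitor under quota) can see -- the safe side of the cloaking line.

FREE_ARTICLES_PER_MONTH = 5  # illustrative quota, not a recommendation

def can_read_full_article(views_this_month: int, is_verified_crawler: bool) -> bool:
    """Return True if the requester should be served the full article body."""
    if is_verified_crawler:
        return True
    return views_this_month < FREE_ARTICLES_PER_MONTH

# The article template would also emit:
#   <meta name="robots" content="noarchive">
# so the meter can't be bypassed by reading Google's cached copy.
```

Verifying the crawler (e.g. by reverse-DNS lookup of the requesting IP, not just the User-Agent string) matters here, since anyone can claim to be Googlebot.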
-
Don't want to hijack this thread at all, but I was looking for something very similar and wonder if we're thinking of the same thing?
A blog wants to make its older content available only to premium members, but still keep a snippet of that content (perhaps the first few paragraphs, since the posts are quite long) visible to search engines, thus allowing traffic to arrive on the site from that content without necessarily being able to view all of it.
I saw that as being against the spirit of what Google wants to do, but was hoping for a little clarity on that. I wonder if the OP was thinking of something similar?
-
As Leonie states, the search engines are for public-facing content. If your site is completely private, then you'd be more interested in making sure it's not found by anyone other than members; however, it sounds like you have some aspects of the site that could be public or created to attract new members. Typically in these cases you pull small topical samples from the members' content to show prospective members and help articulate why membership is valuable. It may be a matter of having what is practically two sites: the public-facing, membership-recruitment site, and the private, non-indexed membership site. Cheers!
-
Hi, if your whole website is for members and behind a login and password, search engines can't index it, so it won't be visible to anyone other than your members.
If you want other people to find your website, you'll need a public part, which you can optimize for your users and the search engines.
The question is: do you want people other than your members to find the website? If yes, then you'll need content that search engines can find. If the answer is no, you can hide the whole website behind a login and password.
I manage a website where one part is only for members; that part is not optimized and sits behind a login and password. The rest of the site is public and needs to be found in the search engines, so that part is optimized for both on-page and off-page SEO.
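A minimal sketch of that public/private split (hypothetical, not my actual setup): requests without a session are redirected away from the members' area, and since crawlers can't log in, nothing behind the login can be fetched, let alone indexed. The path prefix and function names are illustrative assumptions.

```python
# Hypothetical access gate for a site with a public part and a
# members-only part: anonymous requests to the private area are
# redirected to the login page; everything else is served normally.

MEMBERS_PREFIX = "/members/"  # illustrative path for the private area

def handle_request(path: str, logged_in: bool):
    """Return (status_code, location_or_path) for an incoming request."""
    if path.startswith(MEMBERS_PREFIX) and not logged_in:
        # Crawlers never have a session, so the private area
        # is unreachable to them and stays out of the index.
        return (302, "/login")
    return (200, path)  # stand-in for rendering the page
```

As a belt-and-braces measure you could also send `X-Robots-Tag: noindex` on the login page itself, so the redirect target doesn't clutter the index either.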
Grtz, Leonie