Membership/subscriber (customer) only content and SEO best practice
-
Hello Mozzers, I was wondering whether there's any best practice guidance out there on how to deal with membership/subscriber (existing customer) only content on a website, from an SEO perspective - what is considered best practice?
A few SEOs have told me to make some of the content visible to Google for SEO purposes, yet I'm really not sure whether that's acceptable or whether it crosses into manipulative territory, and I don't want to upset Google (or users, for that matter!)
Thanks in advance, Luke
-
I'd say it's mostly transferable, as plenty of content is found in both News and the main index. News is more of a service overlay that attempts to better handle user expectations for frequency and speed of response when it comes to news items. Still, old news gets into the index and is treated like content from almost any other site, so if you have a subscription-based model that aligns with what they're recommending for more news-oriented sites, at least you're fitting into a form of what they outline.
-
Everything I could find was related to Google News, not the main index - is it directly transferable? Especially given it's the oldest content that's going to end up being paid for in my example.
-
As an example, the New York Times does this by tracking how many full articles a user reads while still allowing Googlebot full access to its articles. Sites that use this method apply a "noarchive" directive so the articles can't be read via Google's cached copies, and then use various forms of tracking to make sure users are being counted correctly. Here are some thoughts on this and more from Google's side that might help you out: https://support.google.com/news/publisher/answer/40543. Cheers!
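For illustration, a minimal sketch of the kind of robots meta tag such sites use so the full article can't simply be read from the search engine's cached copy (the title and markup here are placeholders, not the NYT's actual code):

<head>
  <title>Example subscriber-only article</title>
  <!-- Allow indexing, but keep the page out of the cached copy -->
  <meta name="robots" content="noarchive">
</head>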
-
Don't want to hijack this thread at all, but I was looking for something very similar and wonder if we're thinking of the same thing?
A blog wants to make its older content available only to premium members, but still keep a snippet of that content (perhaps the first few paragraphs, since the posts are quite long) visible to search engines - thus allowing traffic to arrive on the site from that content, but not necessarily to view all of it.
I saw that as being against the spirit of what Google wants to do, but was hoping for a little clarity on that. I wonder if the OP was thinking of something similar?
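For what it's worth, a minimal sketch of the paywalled-content structured data Google documents for exactly this "snippet visible, rest behind a paywall" setup, so the hidden portion isn't treated as cloaking (the class name and headline are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example premium post",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".premium-content"
  }
}
</script>

The free snippet sits outside the .premium-content element; the rest of the post sits inside it and is only rendered for logged-in premium members.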
-
As Leonie states, the search engines are for public-facing content. If your site is completely private, then you'd be more interested in making sure it's not found by anyone other than members; however, it sounds like you have some aspects of the site that could be public, or created to attract new members. Typically in these cases you pull small topical samples from the site that show how it benefits members and help articulate why membership is valuable. It may be a matter of having what is practically two sites: the public-facing, membership-recruitment site, and the private, non-indexed membership site. Cheers!
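A minimal sketch of how the private half of that two-site setup could be kept out of the index, assuming the member-only templates can output their own head section (in practice those pages would normally sit behind a login as well):

<head>
  <title>Members area - example page</title>
  <!-- Member-only pages: keep them out of search results entirely -->
  <meta name="robots" content="noindex, nofollow">
</head>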
-
Hi, if your whole website is for members and sits behind a login and password, search engines can't index the website, and thus it's not visible to anyone other than your members.
If you want other people to find your website, you'll need a public part, which you can optimize for your users and for the search engines.
The question is: do you want people other than your members to find the website? If yes, then you'll need content that search engines can find. If the answer is no, you can hide the whole website behind a login and password.
I manage a website of which one part is only for members. That part is not optimized and sits behind a login and password. The rest of the site is public and needs to be found in the search engines; this part is optimized for on-page and off-page SEO.
Grtz, Leonie
Related Questions
-
Kind of duplicate categories and custom taxonomy. Necessary, but bad for SEO?
Hello Everyone! I'm new here! My husband and I are working on creating a website: https://sacwellness.com. The site is an online therapist directory for the Sacramento, California area. Our problem is this: in WordPress our category system is being used for blog posts. Our theme is using a custom taxonomy system to categorize different therapist specialties, therapeutic approaches, etc. We've found ourselves in a position where our custom taxonomy and categories are near duplicates. For example, we have the blog categories ADHD counseling, Anxiety therapy, and Career counseling; our corresponding custom taxonomy/therapist categories are ADHD, Anxiety, and....(oops) Career counseling. My understanding is that Google doesn't see a difference between identically named categories and custom taxonomies and will simply choose one to rank and disregard the other, effectively leaving you competing against yourself. Is this true in a case like this? Can Google maybe understand the difference because of the custom taxonomy and/or URL paths? If this is a problem, is it OK to have near duplicates, like ADHD vs. ADHD counseling? This has been our solution so far....but now we're questioning it....derp x_x. I thought about tagging the categories with noindex, but I think the archive pages would be useful for people. Essentially we have 2 sets of archives for each keyword: one is for blog posts, and one is for therapists who work with that particular issue, along with the 6 most recent blog posts in that category. Because we are putting the 6 most recent blog posts at the bottom of the therapist pages, I feel like it wouldn't be as terrible of a loss if we had to noindex the category pages. ....what do you think? Thank you!
Intermediate & Advanced SEO | angelamaemae
-
Best SEO for table in mobile view
I'm wondering what the best way is to present a table for mobile view in terms of SEO. It's a complicated table (not simple rows & columns, but also colspans) which doesn't work with any responsive techniques I can find. I can offer different content for desktop / mobile, so desktop is OK. But what's the best way forward with Google for mobile? I could offer a jpg, or simply an explanation to revisit the page on desktop, but neither of those options seems particularly Google-friendly?
Intermediate & Advanced SEO | Ann64
-
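On the mobile-table question above, a minimal sketch of one option rather than serving a jpg or a "view on desktop" note: keep the same table in the HTML for all devices (so there's no cloaking concern) and let it scroll horizontally on small screens. The content below is a placeholder:

<!-- Same markup for desktop and mobile; colspans are preserved -->
<div style="overflow-x: auto; -webkit-overflow-scrolling: touch;">
  <table>
    <tr><th colspan="2">Plan</th><th>Price</th></tr>
    <tr><td>Basic</td><td>Monthly</td><td>$10</td></tr>
  </table>
</div>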
SEO Best Practices regarding Robots.txt disallow
I cannot find hard and fast direction on the following issue: it looks like the robots.txt file on my server has been set up to disallow "account" and "search" pages within my site, so I am receiving warnings from Google Search Console that URLs are being blocked by robots.txt (Disallow: /Account/ and Disallow: /?search=). Do you recommend unblocking these URLs? I'm getting a warning that over 18,000 URLs are blocked by robots.txt ("Sitemap contains urls which are blocked by robots.txt"). It seems that I wouldn't want that many URLs blocked? Thank you!!
Intermediate & Advanced SEO | jamiegriz
-
Faceted Navigation URLs Best Practices
Hi, we are developing new Products Pages with faceted filters. You can see it here: https://www.viatrading.com/wholesale-products/ We have a feature allowing users to Order By and Group By, which alters the order of all products. There will also be an option to view Products as a table, which will contain the same products but with a different design and maybe slightly different content for each product. All this will happen without changing the URL, https://www.viatrading.com/all/ Is this best practice? Thanks,
Intermediate & Advanced SEO | viatrading1
-
What is best practice for "Sorting" URLs to prevent indexing and for best link juice?
We are now introducing 5 links on all our category pages for different sorting options of the category listings. The site has about 100,000 pages, and with this change the number of URLs may go up to over 350,000. Until now Google has been indexing our site well, but I would like to prevent the "sorting" URLs from leading to less complete crawling of our core pages, especially since we are planning a further huge expansion of pages soon. Apart from blocking the parameter in Search Console (which did not really work well for me in the past to prevent indexing), what do you suggest to minimize indexing of these URLs, also taking into consideration link juice optimization? On a technical level the sorting is implemented in a way that reloads the whole page, for which there may be better options as well.
Intermediate & Advanced SEO | lcourse
-
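On the sorting-URLs question above, a minimal sketch of one common approach, assuming the sorted variants can emit their own head tags: point each sorted URL back at the unsorted category page with rel=canonical, or use noindex,follow if the variants must stay crawlable (the URLs below are placeholders):

<!-- On /category/widgets?sort=price_asc and similar sorted variants -->
<link rel="canonical" href="https://www.example.com/category/widgets/">

<!-- Alternative, if canonical doesn't fit: -->
<!-- <meta name="robots" content="noindex, follow"> -->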
Best practice for H1 on site without H1 - Alternative methods?
I have recently set up a men's style blog - the site is made up of articles pulled in from a CMS and I want to keep the design as clean as possible, so no text other than the articles. This makes it hard to get an H1 tag into the page - are there any solutions/alternatives that would be good for SEO? The site is http://www.iamtheconnoisseur.com/ Thanks
Intermediate & Advanced SEO | SWD.Advertising
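On the H1 question above, a minimal sketch of the usual workaround: let the article title pulled from the CMS double as the page's single H1 and style it to fit the clean design, rather than adding extra visible text (the class and title are placeholders):

<article>
  <!-- The CMS article title is the page's one H1; CSS keeps it consistent with the design -->
  <h1 class="article-title">Example article title from the CMS</h1>
  <p>Article body…</p>
</article>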
Does blocking foreign country IP traffic to my site hurt my SEO / US Google rankings?
I have a website that is only of interest to US visitors. 99% (at least) of AdSense income is from the US. But I'm getting constant attempts by hackers to log in to my admin account. I have countermeasures to combat that and am initiating others. But here's my question: I am considering not allowing any non-US, or at least any non-North American, traffic to the site via a WordPress plugin that does this. I know it will not affect my business negatively, directly. However, are there any ramifications of the Google bots from these blocked countries not being able to access my site? Does it affect the rankings of my site in US Google searches? At the very least I could block China, Russia and some Eastern European countries.
Intermediate & Advanced SEO | bizzer
Duplicate Content www vs. non-www and best practices
I have a customer who had prior help on his website and I noticed a 301 redirect in his .htaccess.

Rule for duplicate content removal: www.domain.com vs domain.com

RewriteCond %{HTTP_HOST} ^MY-CUSTOMER-SITE.com [NC]
RewriteRule (.*) http://www.MY-CUSTOMER-SITE.com/$1 [R=301,L,NC]

The result of this rule is that if I type MY-CUSTOMER-SITE.com in the browser, it redirects to www.MY-CUSTOMER-SITE.com. I wonder if this is causing issues in SERPs. If I have some inbound links pointing to www.MY-CUSTOMER-SITE.com and some pointing to MY-CUSTOMER-SITE.com, I would think that this rewrite isn't necessary, as it would seem that Googlebot is smart enough to know that these aren't two sites.

- Can you comment on whether this is a best practice for all domains?
- I've run a report for backlinks. If my thought is true that there are some pointing to www.MY-CUSTOMER-SITE.com and some to MY-CUSTOMER-SITE.com, is there any value in addressing this?

Intermediate & Advanced SEO | EnvoyWeb