Indexing content behind a login
-
Hi,
I manage a website within the pharmaceutical industry where only healthcare professionals are allowed to access the content. For this reason most of the content is behind a login.
My challenge is that we have a massive amount of interesting and unique content available on the site and I want the healthcare professionals to find this via Google!
At the moment, if a user tries to access this content they are prompted to register or log in. My question is: if I detect the Googlebot user agent and allow it to access and index the content, will this be classed as cloaking? I'm assuming it will.
If so, how can I get around this? We have a number of open landing pages, but we're limited in what indexable content we can put on them!
I look forward to all of your suggestions as I'm struggling for ideas now!
Thanks
Steve
-
Thanks everyone... It's not as restrictive as patient records. Basically, because of the way our health service works in the UK, we are not allowed to promote material about our medicines to patients; it must be restricted to HCPs only. If we are seen to be actively promoting to patients, we run the risk of a heavy fine.
For this reason we need to take steps to ensure that we only target this information at HCPs, and therefore we require them to register before they can access the content...
My issue is that HCPs may search for a brand that we supply, but we have to be very careful what brand information we provide outside of the login. The content we can include on landing pages therefore can't really be optimised for the keywords they are searching for! That's why I want the content behind the login indexed, but not easily available without registering...
It's a very difficult place to be!
-
I guess I was just hoping for that magic answer that doesn't exist! It's VERY challenging to optimise a site with these kinds of restrictions, but I get it: I just need to put what I can on the landing pages and optimise as best I can with the content I'm able to show!
We also have other websites aimed at patients where all the content is open, so I guess I'll just have to enjoy optimising those instead.
Thanks for all your input!
Steve
-
Steve,
Yes that would be cloaking. I wouldn't do that.
As Pete mentioned below, your only real options at this point are to make some of the content, or new content, available for public use. If you can't publish at least abstracts, then you'll have to invest in copywriting content that is legally available to the public, get traffic that way, and do your best to convert those visitors into subscribers.
-
Hi Steve
If it can legally only be viewed by health practitioners who are members of your site, then it seems to me you don't have an option: putting any of this content into the public domain via Google, by whatever method, will be deemed illegal by whichever body oversees it.
Presumably you also cannot publish short 250-word summaries of the content?
If not, then I think you need to create pages that are directly targeted at marketing the site to health practitioners. While these pages won't be able to contain the content you want Google to index, they can still contain general information and the benefits of becoming a subscriber.
Isn't that the goal of the site anyway, i.e. to be a resource for health practitioners? So, without being able to make the content public, you have to market to them through your SEO, or use some other form of direct or indirect marketing, to encourage them to visit the site and sign up.
I hope that helps,
Peter -
Thanks all... Unfortunately it is a legal requirement that the content is not made publicly available, but the challenge then is: how do people find it online?
I've looked at First Click Free and pretty much every other option I could think of, and I have yet to find a solution.
My only option seems to be to allow Googlebot through the authentication, which would let it index the content, but my concern is that this is almost certainly cloaking...
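For clarity about what's being discussed: below is a minimal sketch (not a recommendation) of the kind of user-agent gate Steve describes. The request dict and function names are illustrative, not from any real framework. Serving the full article to this check while human visitors get a login wall is precisely what Google's guidelines define as cloaking, and the check is also trivially spoofable.

```python
GOOGLEBOT_TOKENS = ("Googlebot",)

def is_googlebot(user_agent: str) -> bool:
    # Naive user-agent sniff: anyone who fakes the header
    # also gets the gated content.
    return any(token in user_agent for token in GOOGLEBOT_TOKENS)

def serve(request: dict) -> str:
    # Different content for the crawler than for users = cloaking.
    if is_googlebot(request.get("User-Agent", "")):
        return "full gated article"       # what Google would index
    return "register / login prompt"      # what every real visitor sees
```

A more robust (but equally non-compliant) variant would verify the crawler via reverse DNS; the fundamental policy problem remains the same either way.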
-
Try looking at "First Click Free" from Google:
https://support.google.com/webmasters/answer/74536?hl=en
I think this is along the lines of what you are looking for.
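Roughly, First Click Free means a visitor who lands on an article from a Google search result sees the full content on that first click, while direct visits stay gated. A simplified sketch of that shape, assuming a generic request/session dict; the daily allowance (Google's policy permitted capping free clicks per day) and the referrer match are deliberately simplified here:

```python
def serve_article(request: dict, session: dict) -> str:
    # A visitor arriving from a Google search result gets the full
    # article; direct or repeat visitors beyond the allowance hit the wall.
    referrer = request.get("Referer", "")
    from_google_search = referrer.startswith("https://www.google.")
    used = session.get("free_clicks_today", 0)
    if from_google_search and used < 3:
        session["free_clicks_today"] = used + 1
        return "full article"
    return "registration wall"
```

Note the referrer check only matches `www.google.*` for brevity; a real implementation would cover country-specific Google domains and mobile results as well.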
-
Hi Steve
As you already know, if a page is not crawlable it's not indexable. I don't think there is any way around this without changing the strategy of the site. You said, _"We have a number of open landing pages but we're limited to what indexable content we can have on these pages"._ Is that limitation imposed by a legal requirement or something like that, or by the site owners because they don't want to give free access?
If the marketing strategy for the site is to grow the membership, then, as it's providing a content service to its members, it has to give potential customers a sample of its wares.
I think there are two possible solutions.
(1) Increase the amount of free content available on the site, to give the search engines more content to crawl and make available to people searching, or
(2) Provide a decent-sized excerpt, say the first 250 words of each article, as a taster for potential customers, and put the site login at the "read more" point. That way you give the search engines something of decent length to get their teeth into, and it's also a big enough teaser to give potential customers an appetite to subscribe.
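Option (2) is easy to automate. A sketch of cutting the public teaser, assuming plain-text articles; the function name, 250-word figure (Peter's suggestion), and ellipsis are illustrative choices:

```python
def public_excerpt(article_text: str, word_limit: int = 250) -> str:
    # Return the first `word_limit` words as the publicly indexable
    # teaser; everything past it sits behind the "read more" login.
    words = article_text.split()
    if len(words) <= word_limit:
        return article_text
    return " ".join(words[:word_limit]) + " ..."
```

Pairing the excerpt with markup that tells Google the rest is paywalled (rather than hidden cloaking-style) keeps the approach within the guidelines.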
I hope that helps,
Peter