Showing Different Content To Members & Non-Members/Google and Cloaking Risk
-
How do we safely show one kind of content on a page to logged-in members (and Google) and another kind to logged-out visitors and non-members, without getting slammed for cloaking?
Right now we show Google everything on the page, but new visitors see only partial forum comments with a pitch to sign up and read the full comments. So far, we have not gotten into trouble for this.
The new idea is to show non-members a lot of marketing messages and one kind of navigation, and then, once they sign up and are logged in, show them different (or no) marketing messages and a different kind of navigation.
How do we stay out of trouble with this? Where is the cloaking line drawn? It's got me kinda nervous.
Thanks... Darcy
-
Wow...I didn't know this! Thanks Dirk for putting me in the 5000 Moz points club!
-
Hi Marie
Couldn't resist liking this - I noticed that you were only one like away from reaching the Moz Valhalla...
Congrats,
Dirk
-
I agree with Dirk. This sounds like cloaking. It would be best to show Google only the content that non-members can see.
If you show Google content that a non-member can't see, then that is cloaking and could get you penalized. Even if it doesn't get you penalized, it could get you into Panda trouble. Say I'm searching for something and a Google result shows me that your site has the answer to my query. I click through and realize I can only see that content if I'm a member. I don't want to become a member, so I click away and find another site to read. If enough users do this, it's a signal to Google (and likely to Panda) that readers don't like your site.
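A minimal, hypothetical sketch of the difference Marie is describing (the function and variable names here are made up for illustration): the risky version branches on the user agent, so Googlebot indexes content a logged-out human never sees; the safe version ignores the user agent, so Googlebot sees exactly what a logged-out visitor gets.

```python
def render_comments_cloaked(user_agent: str, logged_in: bool, comments: list) -> list:
    # Risky: Googlebot receives content that a logged-out human never sees.
    if logged_in or "Googlebot" in user_agent:
        return comments
    return comments[:2] + ["Register to read the full discussion"]

def render_comments_safe(user_agent: str, logged_in: bool, comments: list) -> list:
    # Safe: the decision ignores the user agent entirely, so Googlebot
    # indexes exactly the teaser that a logged-out visitor will see.
    if logged_in:
        return comments
    return comments[:2] + ["Register to read the full discussion"]
```

In the safe version the only signal used is the login state, which a crawler shares with any anonymous visitor, so bot and human always receive identical pages.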
-
Hi Darcy,
If you apply Google's strict definition, you are "inserting text or keywords into a page only when the User-agent requesting the page is a search engine, not a human visitor" - even if you don't do it with the intention of tricking search engines (the inserted text being the text that is invisible to non-registered users).
Is there a way to show the same content to both bots & humans, while still keeping the page
- attractive enough for search engines
- teasing enough for humans to register?
It's difficult to guess the level of risk you're running - but once you're penalised, the traffic drop is huge and recovery normally takes a long time (with no guarantee of full recovery).
rgds
Dirk
-
Hi Dirk,
Thanks for the response. Folks coming from Google do not see the full page that Google saw. They see a snippet of the comments and a pitch to log in or register to see the full comments (it's a forum). They don't see different content right now... they see less content, but what they do see is the same as what Google saw. Is that clearer?
Thanks... Darcy
-
Hi Darcy,
When people click on the results in Google - do they see the normal page (the one that Googlebot saw) or the version for the "new" users? If it's the second case, you are indeed cloaking according to Google's definition (https://support.google.com/webmasters/answer/66355).
If you're listed in Google News, you could participate in "First Click Free" (https://support.google.com/news/publisher/answer/40543?hl=en) - which basically allows you to hide your content behind a registration wall but still be indexed, as long as you allow at least 5 free pages (articles) per user per day.
Not all First Click Free participants play according to the rules (http://searchengineland.com/google-fails-enforce-first-click-free-203078) - but I guess your site isn't the Financial Times.
You could continue what you're doing now, but in my opinion you certainly run the risk of a penalty.
rgds,
Dirk