Client wants to show 2 different types of content based on cookie usage - potential cloaking issue?
-
Hi,
A client of mine has compliance obligations in their industry and has to show two different types of content to visitors.
Next year, they have to increase that to three different types of customer. Because the third type (customer-c) is very similar to one of the existing types (customer-b), their web development agency is suggesting changing the content based on cookies rather than creating a third section: if a user has identified themselves as customer-b, they'll be shown /customer-b/, but if they've identified themselves as customer-c, they'll see a different version of /customer-b/. In other words, the URL won't change, but the content on the page will, based on their cookie selection.
I'm uneasy about this from an SEO POV because:
- Google will only be able to see one version (/customer-b/ presumably), so it might miss out on indexing valuable /customer-c/ content,
- It makes sense to separate them into three URL paths so that Google can index them all,
- It feels like a form of cloaking - i.e. Google only sees one version, when two versions are actually available.
I've done some research but everything I'm seeing is saying that it's fine, that it's not a form of cloaking. I can't find any examples specific to this situation though. Any input/advice would be appreciated.
Note: The content isn't shown differently based on geography - i.e. these three customers would be within one country (e.g. the UK), which means that hreflang/geo-targeting won't be a workaround unfortunately.
-
Thanks Peter - I didn't know you could do that. I'll pass it on to the developers (who might already know, but wouldn't hurt to reinforce its importance).
-
Thanks Russ. I think the differences in content between the two versions will only be minor/superficial, so I guess the approach makes sense and shouldn't affect the SEO side of things too much.
-
You can safely return the same page with different content based on a cookie. Just don't forget to add a "Vary: Cookie" response header. This tells browsers and bots that the content differs depending on the cookie.
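To make the suggestion above concrete, here is a minimal sketch of cookie-based content switching with the Vary header set. The `customer_type` cookie name, the content strings, and the framework-free handler are assumptions for illustration, not the client's actual setup.

```python
# Minimal sketch (hypothetical cookie name and content): serve /customer-b/
# with different bodies depending on a customer-type cookie, and declare
# "Vary: Cookie" so caches and crawlers know the response depends on it.
CONTENT = {
    "customer-b": "<h1>Customer B information</h1>",
    "customer-c": "<h1>Customer C information</h1>",
}
DEFAULT_TYPE = "customer-b"  # what a cookieless visitor (or Googlebot) sees

def render_customer_page(cookies: dict) -> tuple[str, dict]:
    """Return (body, headers) for /customer-b/ based on the visitor's cookie."""
    customer_type = cookies.get("customer_type", DEFAULT_TYPE)
    body = CONTENT.get(customer_type, CONTENT[DEFAULT_TYPE])
    headers = {
        "Content-Type": "text/html; charset=utf-8",
        "Vary": "Cookie",  # tell intermediaries the response varies by cookie
    }
    return body, headers
```

Note that a crawler sending no cookie always gets the customer-b default, which is exactly the indexing limitation raised in the original question.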
-
I think this sounds perfectly fine. It is highly unlikely that you will see any problems from this; just don't expect to rank for content that is hidden behind cookie-based authentication. It might not be best practice in Google's eyes, but it isn't going to trigger any kind of penalty.
Related Questions
-
How to fix issues from 301s
Case: We are currently in the middle of a site migration from .asp to .net and Endeca PageBuilder, and from a homebrewed search provider to Endeca Search. We have migrated most of our primary landing pages and our entire e-commerce site to the new platforms. During the transition, approximately 100 of our primary landing pages were inadvertently 302ed to the new versions. Once this was caught they were immediately changed to 301s and submitted to Google's index through Webmaster Tools. We initially saw increases in visits to the new pages, but currently (approximately 3 weeks after the change from 302 to 301) we are experiencing a significant decline in visits.
Issue: My assumption is that many of the internal links (from pages which are now 301ed as well) to these primary landing pages still point to the old versions in Google's cache, and thus have not passed their importance and internal juice to the new versions. There are no navigational links or entry points to the old supporting pages left, and I believe this is what is driving the decline.
Proposed resolution: I intend to create a series of HTML sitemaps of the old (.asp) versions of all pages which have recently been 301ed. I will then submit these pages to Google's index (not as sitemaps, just normal pages) with the selection to index all linked pages. My intention is to force Google to pick up all of the 301s, thus reinforcing the authority channels we have set up.
Question 1: Is the assumption that the decline could be because of missed authority signals reasonable?
Question 2: Could the proposed solution be harmful?
Question 3: Will the proposed solution be adequate to resolve the issue?
Any help would be sincerely appreciated. Thank you in advance, David
Intermediate & Advanced SEO | FireMountainGems
-
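The "HTML sitemap of old URLs" proposed in the question above can be sketched in a few lines; the URLs and page title here are placeholders, not the real site's.

```python
# Sketch of the proposed fix: a plain HTML page linking to every old
# (now-301ed) .asp URL, so a crawler that fetches this page re-requests
# them and picks up the 301s. All URLs below are hypothetical.
from html import escape

def build_html_sitemap(old_urls, title="Legacy URL index"):
    """Render an HTML page with one link per old URL."""
    items = "\n".join(
        f'    <li><a href="{escape(u, quote=True)}">{escape(u)}</a></li>'
        for u in old_urls
    )
    return (
        "<!DOCTYPE html>\n<html>\n<head><title>"
        + escape(title)
        + "</title></head>\n<body>\n  <ul>\n"
        + items
        + "\n  </ul>\n</body>\n</html>"
    )

page = build_html_sitemap([
    "http://www.example.com/landing-1.asp",
    "http://www.example.com/landing-2.asp",
])
```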
Tabs and duplicate content?
We own this site http://www.discountstickerprinting.co.uk/ and I'm a little concerned: when I right-clicked the tab content section and chose "open in new tab", it went to a new page. For example, if you right-click on the price tab and open it in a new tab, you will end up at the URL http://www.discountstickerprinting.co.uk/#tabThree. Does this mean that our content is being duplicated onto another page? If so, what should I do?
Intermediate & Advanced SEO | BobAnderson
-
Need to move our highest-traffic content pages into a subdomain and want to minimize the loss of traffic - details inside!
Hi All! So the company that I work for owns two very strong domains in the information security industry. There are two separate sections, one on each site, that draw a ton of long-tail SEO traffic. On our corporate site we have a vulnerability database where people search for vulnerabilities to research and find out how to remediate them. On our other website we have an exploit database where people can look up exploits in order to see how to patch an attacker's attack path. We are going to move these into a super database under our corporate domain, and I want to ensure that we maintain the traffic or minimize the loss. The exploit database, which is currently on our other domain, yields about three quarters of the traffic to that domain. It is obviously OK if that traffic goes directly to this new subdomain. What are my options to keep our search traffic steady for this content? There are thousands and thousands of these vulnerabilities and exploits, so it would not make sense to 301 redirect all of them individually. What are some other options, and what would you do?
Intermediate & Advanced SEO | PatBausemer
-
Mobile Sitemap Issue
Hi there, I am having some difficulty with an error on Webmaster Tools. I'm concerned about a possible duplicate content penalty following the launch of my mobile site. I have attempted to update my sitemap to inform Google that a different mobile page exists in addition to the desktop page. I have followed Google's guidelines as outlined here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=34648
I'm having problems with my sitemap.xml file. Webmaster Tools is reporting that it is not able to read the file, and when I validate it I get an error stating that the 'Namespace prefix xhtml on link is not defined'. All I am trying to do is create a sitemap that uses rel="alternate" to inform Google that there is a mobile version of each specific page in addition to the desktop version. An instance of the code I am using is below:
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="gss.xsl"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.google.com/schemas/sitemap/0.84 http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd">
  <url>
    <loc>http://www.mydomain/info/detail/</loc>
    <xhtml:link rel="alternate"
                media="only screen and (max-width: 640px)"
                href="http://m.mydomain.com/info/detail.html"/>
    <lastmod>2013-02-01T16:03:48+00:00</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
Any help would be much appreciated. Thanks
Intermediate & Advanced SEO | DBC01
-
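For context on the error in the question above: "Namespace prefix xhtml on link is not defined" usually means the xhtml: prefix is used without an xmlns:xhtml declaration on the urlset element. A minimal standard-library sketch that emits the declaration; the domains are the question's own placeholders.

```python
# Sketch: generate a mobile-alternate sitemap entry with the xhtml namespace
# properly declared (the missing declaration is what typically triggers the
# "Namespace prefix xhtml on link is not defined" validator error).
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
XHTML_NS = "http://www.w3.org/1999/xhtml"

ET.register_namespace("", SITEMAP_NS)
ET.register_namespace("xhtml", XHTML_NS)

urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = "http://www.mydomain.com/info/detail/"
link = ET.SubElement(url, f"{{{XHTML_NS}}}link")
link.set("rel", "alternate")
link.set("media", "only screen and (max-width: 640px)")
link.set("href", "http://m.mydomain.com/info/detail.html")

# Serializing declares both namespaces on the root element.
xml_bytes = ET.tostring(urlset, encoding="UTF-8", xml_declaration=True)
```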
Remove content that is indexed?
Hi guys, I want to delete an entire folder of content that is already indexed. How can I explain to Google that the content no longer exists?
Intermediate & Advanced SEO | Valarlf
-
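One common way to answer the question above is to return 410 Gone for everything under the deleted folder, which signals permanent removal more strongly than 404. A minimal path-based sketch; the folder name is a placeholder.

```python
# Hedged sketch: serve 410 Gone for any path under a removed folder so
# crawlers drop those URLs from the index. The folder name is hypothetical.
REMOVED_PREFIXES = ("/old-folder/",)

def status_for(path: str) -> int:
    """Return 410 for paths under a removed folder, else 200."""
    if any(path.startswith(prefix) for prefix in REMOVED_PREFIXES):
        return 410
    return 200
```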
Is it allowed to have different alt text on the same image on different pages?
Hi, I have images that match several different keywords, and I wondered if I can give them different alt text based on the page on which they are displayed, or will Google be angry with me? Thanks
Intermediate & Advanced SEO | BeytzNet
-
Duplicate Content
Hi everyone, I have a .co.uk domain in the UK and the same site in Ireland on a .ie domain. The only differences are the prices and maybe different banners. The .ie site pulls all of its content from the .co.uk domain. Is this classed as content duplication? I've had problems in the past in which Google struggles to index the website. At the moment the site appears completely fine in the UK SERPs, but for Ireland I just have the title and domain appearing in the SERPs, with no extended title or description, because of the confusion I caused Google last time. Does anybody know a fix for this? Thanks
Intermediate & Advanced SEO | royb
-
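Unlike the single-country scenario in the main question, the UK/IE duplicate above is the case hreflang was designed for: the same English content targeted at different countries. A hedged sketch that generates the reciprocal annotations; the domains and path are placeholders.

```python
# Hypothetical sketch: emit reciprocal hreflang <link> tags for the same page
# on the .co.uk and .ie storefronts, so the two versions are treated as
# localized alternates rather than duplicates. Domains are placeholders.
ALTERNATES = {
    "en-gb": "http://www.example.co.uk",
    "en-ie": "http://www.example.ie",
}

def hreflang_tags(path: str) -> str:
    """Return hreflang link tags pointing at every localized alternate of `path`."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{root}{path}" />'
        for lang, root in sorted(ALTERNATES.items())
    )
```

Each country's page would carry both tags, so every version points at itself and its sibling.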
301 Redirect or Canonical Tag or Leave Them Alone? Different Pages - Similar Content
We currently have 3 different versions of our State Business-for-Sale listings pages. The versions are:
Version 1 (preferred): http://www.businessbroker.net/State/California-Businesses_For_Sale.aspx
Title = California Business for Sale Ads - California Businesses for Sale & Business Brokers - Sell a Business on Business Broker
Version 2: http://www.businessbroker.net/Businesses_For_Sale-State-California.aspx
Title = California Business for Sale | 3124 California Businesses for Sale | BusinessBroker.net
Version 3: http://www.businessbroker.net/listings/business_for_sale_california.ihtml
Title = California Businesses for Sale at BusinessBroker.net - California Business for Sale
While the page titles and meta data are a bit different, the bulk of the page content (the listings rendered) is identical. We were wondering if it would make good sense to either (A) 301 redirect Versions 2 and 3 to the preferred Version 1 page, or (B) put canonical tags on Versions 2 and 3 labeling Version 1 as the preferred version. We have this issue for all 50 U.S. states -- I've mentioned California here, but the same applies from Alabama through Wyoming. Given that there are 3 different flavors and all are showing up in the search results -- some on the same first page of results, which is probably a good thing for now -- should we do a 301 redirect or a canonical tag on Versions 2 and 3? Seems like with Google cracking down on duplicate content, it might be wise to be proactive. Any thoughts or suggestions would be greatly appreciated! Thanks. Matt M
Intermediate & Advanced SEO | MWM3772
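Option (B) in the question above amounts to one extra head tag on Versions 2 and 3 of each state page. A minimal sketch with the state name parameterized; the URL pattern is taken from the question itself.

```python
# Sketch of option (B): every state-page variant declares the preferred
# Version 1 URL as its canonical. URL pattern is from the question above.
def canonical_tag(state: str) -> str:
    """Return the canonical link tag pointing at the preferred Version 1 URL."""
    url = f"http://www.businessbroker.net/State/{state}-Businesses_For_Sale.aspx"
    return f'<link rel="canonical" href="{url}" />'
```

The same template covers all 50 states, which is part of the appeal of the canonical route: the three flavors can keep resolving while consolidating their signals onto Version 1.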