Client wants to show 2 different types of content based on cookie usage - potential cloaking issue?
-
Hi,
A client of mine has compliance requirements in their industry and has to show two different types of content to visitors, each currently served from its own URL section.
Next year, they have to increase that to three different types of customer. Because the third type (customer-c) is very similar to one of the existing types (customer-b), rather than creating a third section their web development agency is suggesting changing the content based on cookies: if a user has identified themselves as customer-b, they'll be shown /customer-b/, but if they've identified themselves as customer-c, they'll see a different version of /customer-b/. In other words, the URL won't change, but the content on the page will change based on their cookie selection.
I'm uneasy about this from an SEO POV because:
- Google will only be able to see one version (/customer-b/ presumably), so it might miss out on indexing valuable /customer-c/ content,
- It makes sense to separate them into three URL paths so that Google can index them all,
- It feels like a form of cloaking - i.e. Google only sees one version, when two versions are actually available.
I've done some research, but everything I've found says it's fine and isn't a form of cloaking. I can't find any examples specific to this situation though. Any input/advice would be appreciated.
Note: The content isn't shown differently based on geography - i.e. these three customers would be within one country (e.g. the UK), which means that hreflang/geo-targeting won't be a workaround unfortunately.
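For clarity, the agency's proposal amounts to something like the sketch below. The cookie name (`customer_type`) and the variant strings are hypothetical, assumed for illustration, not the client's actual implementation:

```python
# One URL (/customer-b/) whose body depends on a "customer_type" cookie.
# The cookie name and variant labels are assumptions for illustration.

def select_content(cookies: dict) -> str:
    """Return the page variant served at /customer-b/ for a visitor's cookies."""
    customer_type = cookies.get("customer_type", "customer-b")  # default variant
    if customer_type == "customer-c":
        return "customer-c variant of /customer-b/"
    return "customer-b variant of /customer-b/"
```

Note that a crawler with no cookies would always fall through to the default branch, which is exactly the indexing concern above: the customer-c variant is never fetchable at a URL of its own.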
-
Thanks Peter - I didn't know you could do that. I'll pass it on to the developers (who might already know, but wouldn't hurt to reinforce its importance).
-
Thanks Russ. I think the differences in content between the two will only be minor/superficial, so I guess the approach makes sense and shouldn't affect the SEO side of things too much.
-
You can safely return the same page with different content based on a cookie. Just don't forget to add "Vary: Cookie" to the response headers. This tells browsers, caches, and bots that the content at this URL differs depending on the cookie.
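As a minimal sketch of what that looks like server-side (the cookie name and body strings are assumptions; a real app would do this in its framework's response layer):

```python
from http.cookies import SimpleCookie


def build_response(cookie_header: str):
    """Build (status, headers, body) for the shared URL, varying on the Cookie header."""
    cookies = SimpleCookie(cookie_header)
    is_customer_c = (
        "customer_type" in cookies and cookies["customer_type"].value == "customer-c"
    )
    body = "customer-c content" if is_customer_c else "customer-b content"
    headers = {
        "Content-Type": "text/html; charset=utf-8",
        # Signals to caches and crawlers that responses at this URL
        # differ based on the Cookie request header.
        "Vary": "Cookie",
    }
    return 200, headers, body
```

Without the Vary header, an intermediate cache could serve one customer's variant to the other.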
-
I think this sounds perfectly fine. It is highly unlikely that you will see any problems from this, just don't expect to rank for content that is hidden behind a cookie-based selection. It might not be best practice in Google's eyes, but it isn't going to trigger any kind of penalty.