Different version of site for "users" who don't accept cookies considered cloaking?
-
Hi
I've got a client with lots of content that is hidden behind a registration form - if you don't fill it out you cannot proceed to the content. As a result the content is not being indexed. No surprises there.
They are only doing this because they feel it is the best way of capturing email addresses, not because they actually need to "protect" the content.
Currently users arriving on the site will be redirected to the form if they have not had a "this user is registered" cookie set previously. If the cookie is set then they aren't redirected and get to see the content.
I am considering changing this logic so that users are redirected to the form only if they accept cookies but don't have the "this user is registered" cookie. The idea is that search engines, which don't return cookies, would then not be redirected and would index the full site rather than the dead-end form.
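The proposed logic can be sketched as a single decision function (a minimal sketch - the function and flag names are hypothetical, and detecting cookie acceptance would typically mean setting a test cookie and checking whether it comes back on the next request):

```typescript
// Hypothetical sketch of the proposed redirect decision.
// acceptsCookies: did a previously-set test cookie come back?
// isRegistered:   is the existing "this user is registered" cookie present?
function shouldRedirectToForm(acceptsCookies: boolean, isRegistered: boolean): boolean {
  // Only cookie-accepting, non-registered visitors are sent to the form.
  // Crawlers that never return cookies fall through to the content.
  return acceptsCookies && !isRegistered;
}
```

Under this rule, a crawler and a cookie-rejecting human get exactly the same treatment, which is the basis of the "not a special case for search engines" argument.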
From the client's perspective this would mean only a very few non-registered visitors would "avoid" the form, yet search engines are arguably not being treated as a special case.
So my question is: would this be considered cloaking/put the site at risk in any way?
(They would prefer to not go down the First Click Free route as this will lower their email sign-ups.)
Thank you!
-
Yeah - that would work. Well, it should work if done the right way.
-
I'm thinking that a JavaScript pop-up might achieve the same result with lower risk, especially if the indexed content is visible underneath the pop-up.
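The overlay approach boils down to checking for the registration cookie client-side and, if it's absent, layering the form over content that is still fully present in the HTML. A minimal sketch (cookie name and overlay hook are hypothetical):

```typescript
// Return true if a cookie with the given name appears in a Cookie header
// string (the format of document.cookie: "name=value; name2=value2").
function hasCookie(cookieHeader: string, name: string): boolean {
  return cookieHeader
    .split(/;\s*/)
    .some((pair) => pair.split("=")[0] === name);
}

// In the page itself, something like:
//   if (!hasCookie(document.cookie, "registered")) { showRegistrationOverlay(); }
// The content stays in the DOM underneath, so crawlers index it normally.
```

Because nothing is server-side here, every visitor (and every bot) receives identical HTML; only the overlay differs, which is why this is arguably lower risk than a redirect.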
-
Hi,
You can actually cap FCF at X visits per user per day by dropping a cookie. Otherwise, what you are proposing is potentially a bit dodgy - if a human tester visits the site and gets a different experience from the bot, you might be at risk. I doubt you will get found out, but if you want to go pure white hat, you need to follow the rules. Your call really.
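The capped-FCF idea is just a per-day counter stored in a cookie that expires at midnight. A minimal sketch, assuming a hypothetical counter cookie (names are illustrative, not any real API):

```typescript
// Hypothetical sketch: increment a per-day free-view counter and decide
// whether this view is still within the daily cap. The caller would read
// the current count from a cookie that expires at midnight, then write
// back the returned count.
function nextFreeView(
  currentCount: number | null, // null when the counter cookie is absent
  dailyCap: number
): { allowed: boolean; count: number } {
  const count = (currentCount ?? 0) + 1;
  return { allowed: count <= dailyCap, count };
}
```

When `allowed` is false the visitor is shown the registration form; the cookie's midnight expiry resets the counter each day.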
A
-
Hi. Thanks but I don't want to use FCF if I can help it.
-
You can also use Google First Click Free to let Google index the site - it's really easy to set up and run. I suggest you use this; I did it at a previous company and it works so well it's not funny.
More info here:
http://googlewebmastercentral.blogspot.com/2008/10/first-click-free-for-web-search.html