Different version of site for "users" who don't accept cookies considered cloaking?
-
Hi
I've got a client with lots of content that is hidden behind a registration form - if you don't fill it out, you cannot proceed to the content. As a result, the content is not being indexed. No surprises there.
They are only doing this because they feel it is the best way of capturing email addresses, not because they actually need to "protect" the content.
Currently, users arriving on the site are redirected to the form if they have not previously had a "this user is registered" cookie set. If the cookie is set, they aren't redirected and get to see the content.
I am considering changing this logic so that users are only redirected to the form if they accept cookies but don't have the "this user is registered" cookie. The idea being that search engines would then not be redirected and would index the full site rather than the dead-end form.
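A minimal sketch of that redirect decision, as a pure function over the request's `Cookie` header. The cookie names `probe` (a test cookie the server would set on every response to detect cookie support) and `registered` (set after the form is completed) are assumptions for illustration, not the client's actual names:

```javascript
// Decide whether a request should be redirected to the registration form.
// "probe" and "registered" are hypothetical cookie names: the server sets
// "probe" on every response, so a client that sends it back demonstrably
// accepts cookies. Clients that send neither cookie (bots, or humans who
// refuse cookies) fall through to the content.
function shouldRedirectToForm(cookieHeader) {
  const cookieNames = new Set(
    (cookieHeader || "")
      .split(";")
      .map((pair) => pair.split("=")[0].trim())
      .filter(Boolean)
  );
  const acceptsCookies = cookieNames.has("probe"); // probe came back: cookies work
  const isRegistered = cookieNames.has("registered");
  // Redirect only clients that accept cookies but are not yet registered.
  return acceptsCookies && !isRegistered;
}
```

So a crawler, which typically sends no cookies at all, would never carry `probe` and would always see the content.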
From the client's perspective, this would mean only very few non-registered visitors would "avoid" the form, yet search engines are arguably not being treated as a special case.
So my question is: would this be considered cloaking/put the site at risk in any way?
(They would prefer not to go down the First Click Free route, as it would lower their email sign-ups.)
Thank you!
-
Yeah - that would work. Well, it should work if done the right way.
-
I'm thinking that a JavaScript pop-up might achieve the same result and be lower risk, especially if the indexed content is visible underneath the pop-up.
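A rough sketch of that alternative: the content stays in the served HTML (so crawlers index it) and a client-side overlay gates it for unregistered visitors. The `registered` cookie name and overlay markup are assumptions for illustration:

```javascript
// Returns true when the (hypothetical) "registered" cookie is absent,
// i.e. the visitor should see the registration overlay.
function overlayNeeded(cookieHeader) {
  return !cookieHeader
    .split(";")
    .some((pair) => pair.trim().startsWith("registered="));
}

// In the browser, show the overlay on load. The guard keeps the helper
// above testable outside a DOM.
if (typeof document !== "undefined" && overlayNeeded(document.cookie)) {
  const overlay = document.createElement("div");
  overlay.id = "register-overlay";
  overlay.style.cssText =
    "position:fixed;inset:0;background:rgba(0,0,0,.6);z-index:9999";
  // ...append the registration form into the overlay here...
  document.body.appendChild(overlay);
}
```

Since the content is in the DOM either way, bots and humans get the same page, which is what keeps this approach lower risk than serving different responses.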
-
Hi,
You can actually cap FCF at X number of visits per user per day by dropping a cookie. Otherwise, what you are proposing is potentially a bit dodgy - if a human tester visits the site and gets a different experience to the bot, you might be at risk. I doubt you will get found out, but at the same time, if you want to go pure white hat, then you need to follow the rules. Your call really.
A
-
Hi. Thanks, but I don't want to use FCF if I can help it.
-
You can also use Google First Click Free to let it index the site - really easy to set up and run. I suggest you use this; I did it at a previous company and it works so well it's not funny.
More info here:
http://googlewebmastercentral.blogspot.com/2008/10/first-click-free-for-web-search.html