Different version of site for "users" who don't accept cookies considered cloaking?
-
Hi
I've got a client with lots of content hidden behind a registration form - if you don't fill it out, you cannot proceed to the content. As a result, it is not being indexed. No surprises there.
They are only doing this because they feel it is the best way of capturing email addresses, rather than because they need to "protect" the content.
Currently, users arriving on the site are redirected to the form if they have not previously had a "this user is registered" cookie set. If the cookie is set, they aren't redirected and get to see the content.
I am considering changing this logic to redirect users to the form only if they accept cookies but haven't got the "this user is registered" cookie. The idea is that search engines would then not be redirected and would index the full site rather than the dead-end form.
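The proposed logic could be sketched roughly like this (a minimal, framework-agnostic sketch; the cookie names `accepts_cookies` and `registered` are placeholders, and the accepts-cookies check assumes a throwaway test cookie is set on the first response):

```python
def should_redirect_to_form(cookies: dict) -> bool:
    """Return True if this request should be sent to the registration form.

    `cookies` is the request's cookie dict. We assume a previous response
    set a test cookie named "accepts_cookies"; if it never comes back,
    the client (e.g. a search engine bot that ignores cookies) is treated
    as cookie-less and allowed straight through to the content.
    """
    accepts_cookies = "accepts_cookies" in cookies
    is_registered = "registered" in cookies
    # Only redirect visitors who demonstrably accept cookies
    # but have not registered yet.
    return accepts_cookies and not is_registered

# Cookie-less visitor (or bot): sees the content
print(should_redirect_to_form({}))                                           # False
# Accepts cookies but not registered: sent to the form
print(should_redirect_to_form({"accepts_cookies": "1"}))                     # True
# Registered user: sees the content
print(should_redirect_to_form({"accepts_cookies": "1", "registered": "1"}))  # False
```

In other words, the gate keys off observed cookie behaviour rather than user-agent sniffing, which is why the bot isn't being singled out explicitly.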
From the client's perspective this would mean only very few non-registered visitors would "avoid" the form, yet search engines are arguably not being treated as a special case.
So my question is: would this be considered cloaking/put the site at risk in any way?
(They would prefer to not go down the First Click Free route as this will lower their email sign-ups.)
Thank you!
-
Yeah - that would work. Well it should work if done the right way.
-
I'm thinking that a JavaScript pop-up might achieve the same result and be lower risk, especially if the indexed content is visible underneath the pop-up.
-
Hi,
You can actually cap FCF at X visits per user per day by dropping a cookie. Otherwise, what you are proposing is potentially a bit dodgy - if a human tester visits the site and gets a different experience from the bot, you might be at risk. I doubt you will get found out, but at the same time, if you want to go pure white hat, you need to follow the rules. Your call really.
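The visit-cap idea could look something like this (a rough sketch only; the cookie name `fcf_count` and the cap of 5 are illustrative, not anything Google specifies):

```python
FCF_DAILY_CAP = 5  # X free clicks per user per day (example value)

def next_fcf_state(cookies: dict) -> tuple:
    """Decide whether to allow a free click and what counter value to
    write back in the response cookie.

    `cookies` is the request's cookie dict. Returns a tuple of
    (allow_free_click, new_count). The counter cookie would be set
    with an expiry at the end of the day so the cap resets daily.
    """
    count = int(cookies.get("fcf_count", "0"))
    if count < FCF_DAILY_CAP:
        return True, count + 1   # still under the cap: free click
    return False, count          # over the cap: show the registration form

print(next_fcf_state({}))                   # (True, 1)  - first visit today
print(next_fcf_state({"fcf_count": "5"}))   # (False, 5) - cap reached
```

Note this relies on the cookie surviving between requests, so a visitor who clears cookies effectively resets their own cap.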
A
-
Hi. Thanks but I don't want to use FCF if I can help it.
-
You can also use Google First Click Free to let it index the site - really easy to set up and run. I suggest you use this; I did it at a previous company and it works so well it's not funny.
More info here:
http://googlewebmastercentral.blogspot.com/2008/10/first-click-free-for-web-search.html