Different version of site for "users" who don't accept cookies considered cloaking?
-
Hi
I've got a client with lots of content hidden behind a registration form - if you don't fill it out, you cannot proceed to the content. As a result, it is not being indexed. No surprises there.
They are only doing this because they feel it is the best way of capturing email addresses, not because they actually need to "protect" the content.
Currently, users arriving on the site are redirected to the form if they have not previously had a "this user is registered" cookie set. If the cookie is set, they aren't redirected and get to see the content.
I am considering changing this logic to redirect users to the form only if they accept cookies but don't have the "this user is registered" cookie. The idea is that search engines would then not be redirected and would index the full site rather than the dead-end form.
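The decision that logic boils down to can be sketched like this (function, cookie and path names are illustrative, not from the client's actual site; cookie acceptance is proven by a test cookie set on a prior response):

```javascript
// Sketch of the proposed redirect logic. The key idea: only visitors who
// demonstrably accept cookies but lack the "registered" cookie are sent to
// the form. Cookie support is proven by a test cookie set on the first
// response: if it comes back on a later request, the browser accepts
// cookies. Search engine bots typically won't return it, so they fall
// through to the content.
function decideResponse(cookies) {
  if (cookies.registered === "1") {
    return { action: "serve-content" };             // registered user
  }
  if (cookies.cookieTest === "1") {
    return { action: "redirect", to: "/register" }; // accepts cookies, not registered
  }
  // No test cookie returned: a first visit, cookies rejected, or a bot.
  // Serve the content and (re)attempt to set the test cookie.
  return { action: "serve-content", setCookie: { cookieTest: "1" } };
}
```

Note that bots and cookie-refusing humans take exactly the same code path here, which is the basis for the "arguably not a special case" argument below.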
From the client's perspective, this would mean only the very few non-registered visitors who reject cookies would "avoid" the form, yet search engines are arguably not being treated as a special case.
So my question is: would this be considered cloaking/put the site at risk in any way?
(They would prefer not to go down the First Click Free (FCF) route, as this will lower their email sign-ups.)
Thank you!
-
Yeah - that would work. Well, it should work if done the right way.
-
I'm thinking that a JavaScript pop-up might achieve the same result and be lower risk, especially if the indexed content is visible underneath the pop-up.
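A rough sketch of that approach (markup, class names and the form action are illustrative): the article stays in the served HTML, so it is crawlable, and unregistered visitors merely get an overlay rendered on top of it.

```javascript
// Illustrative sketch: the article is always present in the markup (and thus
// indexable); only the overlay varies by registration state.
function buildPage(articleHtml, isRegistered) {
  const overlay = isRegistered
    ? ""
    : '<div class="overlay"><form action="/register">' +
      '<input type="email" name="email"><button>Register</button></form></div>';
  return "<main>" + articleHtml + "</main>" + overlay;
}
```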
-
Hi,
You can actually cap FCF at X number of visits per user per day by dropping a cookie. Otherwise, what you are proposing is potentially a bit dodgy - if a human tester visits the site and gets a different experience from the bot, you might be at risk. I doubt you will get found out, but at the same time, if you want to go pure white hat, then you need to follow the rules. Your call really.
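The capping mechanic could look something like this (a minimal sketch; the 5-per-day figure and the cookie shape are illustrative choices, not anything Google mandates):

```javascript
// Sketch of capping free clicks per day with a cookie. The cookie carries
// the current date and a view counter; the counter resets when the date
// rolls over, and the registration form appears once the cap is hit.
const FREE_CLICKS_PER_DAY = 5;

function shouldShowForm(cookie, today) {
  // cookie looks like { date: "2011-05-01", count: 3 }; null on first visit
  if (!cookie || cookie.date !== today) {
    return { show: false, cookie: { date: today, count: 1 } }; // new day, reset
  }
  if (cookie.count >= FREE_CLICKS_PER_DAY) {
    return { show: true, cookie }; // cap reached: show the registration form
  }
  return { show: false, cookie: { date: today, count: cookie.count + 1 } };
}
```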
A
-
Hi. Thanks but I don't want to use FCF if I can help it.
-
You can also use Google First Click Free to let it index the site - really easy to set up and run. I suggest you use this; I did it at a previous company and it works so well it's not funny.
More info here:
http://googlewebmastercentral.blogspot.com/2008/10/first-click-free-for-web-search.html
Related Questions
-
Review rich snippet isn't appearing anymore
Hi, we have recently lost the review rich snippet for a product that used to consistently show it for a long time. On top of that, we have also lost the breadcrumb. The markup hasn't changed on these pages, and the GWT structured data testing tool doesn't show any anomalies. A few weeks back, we deployed a new property that lists reviews from the first site without actually being under the same domain. Could that be an issue - might the reviews have been considered plagiarism somehow? Is there a way for us to confirm this theory? What other factors may have led us to lose those rich snippets? Thanks
Technical SEO | mattam
-
Should I disavow links from pages that don't exist any more
Hi. I'm doing a backlink audit on two sites, one with 48k and the other with 2M backlinks. Both are very old sites, and both have tons of backlinks from old pages and websites that don't exist any more, but these backlinks still exist in the Majestic Historic index. I cleaned up the obvious useless links and passed the rest through Screaming Frog to check whether those old pages/sites even exist. There are tons of link-sending pages that return 0, 301, 302, 307, 404 etc. status errors. Should I consider all of these pages as bad backlinks and add them to the disavow file? Just a clarification: I'm not talking about 301-ing a backlink to a new target page. I'm talking about the origin page generating an error when pinged, e.g. originpage.com/page-gone sends a link to mysite.com/product1. Screaming Frog pings originpage.com/page-gone and returns a status error. Do I add originpage.com/page-gone to the disavow file or not? Hope I'm making sense 🙂
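Whatever the verdict on which statuses count as "bad", turning the crawl export into disavow lines is mechanical; a sketch (the record shape and status filter are illustrative - which statuses to include is exactly the judgment call being asked about):

```javascript
// Sketch: turn Screaming Frog-style results ({ url, status }) into disavow
// file lines, disavowing at the domain level. Status 0 stands in for
// "no response", as in the question.
function buildDisavowLines(links, statusesToDisavow) {
  return links
    .filter(l => statusesToDisavow.includes(l.status))
    .map(l => "domain:" + new URL(l.url).hostname);
}
```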
Technical SEO | IgorMateski
-
The use of tabs on product pages: do or don't?
Does Google have any trouble reading content in tabs? The content is not loaded by Ajax and is already in the page source code. As I'm checking some big e-commerce websites (amazon.com, for example), they get rid of the tabs and put the different pieces of content below each other. Is this better for SEO purposes? But what about user experience? For users, I think it is easier to navigate by tabs than to scroll a long page. What do you guys think about this issue?
Technical SEO | wilcoXXL
-
Help!!! Website won't index after taking it over from another IT Company
Hi, a while back we took over a website that was built in WordPress. We rebuilt it on another platform and switched the servers over whilst retaining the same domain. I had access to the old GA account, but so did the old IT company, so I created a new GA account and used that in the new website pages. Recently we found the website had been blacklisted (before we took it over), and now, after being crawled a lot, only 2 pages have been indexed (over a 2-month period). We have submitted a reconsideration request (to relist the website) but have had no movement. Just wondering: would having an old, active account that was still linked to their old website affect our Google listing? Would dropping the old GA tracking code/script into the site and deleting the new account enable Google to index? Also, there is ample content, metadata and descriptions on the site. I welcome any help on this, please!
Technical SEO | nimblerdigital
-
Walking into a site I didn't build, easy way to fix this # indexing problem?
I recently joined a team with a site that has a) no great content and b) hardly any search traffic. I looked, and all their URLs are built this way: a normal-looking link that isn't actually a new page but a # fragment, like /#content-title. And it has no h1 tag. The page doesn't refresh. My initial thought is to gut the site and rebuild it in WordPress, but first I have to ask: is there a way to make a site with /#/ content loading friendly to search engines?
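One option short of a rebuild is serving each piece of content at a real path and using the History API (pushState) to keep the no-refresh behaviour; the URL mapping itself is trivial. A sketch, assuming the content can actually be served at those paths:

```javascript
// Illustrative sketch: map an old fragment URL to a crawlable path URL,
// e.g. /#content-title -> /content-title. The server must then actually
// respond at the new path for this to help search engines.
function hashToPath(url) {
  const [base, fragment] = url.split("/#");
  return fragment ? base + "/" + fragment : url;
}
```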
Technical SEO | andrewhyde
-
Does using Google Loader's ClientLocation API to serve different content based on region hurt SEO?
Does using Google Loader's ClientLocation API to serve different content based on region hurt SEO? Is there a better way to do what I'm trying to do?
Technical SEO | Ocularis
-
Ecommerce Site, URLs, Canonicals and Tracking Referral Traffic
I'm very, very new to ecommerce websites that employ many different URLs to track referral traffic. I have a client that has 18 different URLs that land on the home page in order to track traffic from different referral sources. For example:
http://erasedisease.com/?ref=abot - tracks traffic from an affiliate source
http://erasedisease.com/?ref=FB01 - tracks traffic from a FB ad
http://erasedisease.com/?ref=sas&SSAID=289169 - tracks more affiliate traffic
...and the list goes on and on. My first question is: do you think this could hinder our Google rankings? The SEOmoz crawl doesn't show any duplicate content errors, so I guess that's good. I've just been reading a lot about canonical URLs and ecommerce sites, but I'm not sure if this is a situation where I'd want to use some kind of canonical plugin for this WordPress website or not. Any advice would be greatly appreciated. Thanks so much!!
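The usual fix for this pattern is a rel="canonical" tag pointing every tracked variant at the clean URL; computing that canonical is just a matter of stripping the tracking parameters. A sketch (parameter names taken from the examples above; any others would need adding):

```javascript
// Sketch: strip known tracking parameters so rel="canonical" can point
// every referral variant at the same clean page.
const TRACKING_PARAMS = ["ref", "SSAID"];

function canonicalUrl(rawUrl) {
  const u = new URL(rawUrl);
  TRACKING_PARAMS.forEach(p => u.searchParams.delete(p));
  return u.toString();
}
```

Non-tracking parameters (pagination, filters, etc.) pass through untouched, which is usually what you want.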
Technical SEO | Linwright
-
Removing a site from Google's index
We have a site we'd like to have pulled from Google's index. Back in late June, we disallowed robot access to the site through the robots.txt file and added a robots meta tag with "noindex, nofollow" directives. The expectation was that Google would eventually crawl the site and remove it from the index in response to those tags. The problem is that Google hasn't come back to crawl the site since late May. Is there a way to speed up this process and communicate to Google that we want the entire site out of the index, or do we just have to wait until it's eventually crawled again?
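One likely snag with the setup described: a robots.txt disallow stops Googlebot from fetching the pages at all, so it never gets to see the noindex meta tag. A common pattern is to allow crawling and send the noindex signal instead (sketched below as a generic config fragment, not site-specific):

```
# robots.txt - allow crawling so the noindex directive can actually be seen
User-agent: *
Allow: /

# then on every page, either the meta tag:
#   <meta name="robots" content="noindex, nofollow">
# or the equivalent HTTP response header:
#   X-Robots-Tag: noindex, nofollow
```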
Technical SEO | issuebasedmedia