Different version of site for "users" who don't accept cookies considered cloaking?
-
Hi
I've got a client with lots of content hidden behind a registration form: if you don't fill it out, you cannot proceed to the content. As a result, it is not being indexed. No surprises there.
They are only doing this because they feel it is the best way of capturing email addresses, rather than the fact that they need to "protect" the content.
Currently, users arriving on the site are redirected to the form if they have not previously had a "this user is registered" cookie set. If the cookie is set, they aren't redirected and get to see the content.
I am considering changing this logic to redirect users to the form only if they accept cookies but haven't got the "this user is registered" cookie. The idea is that search engines (which don't return cookies) would then not be redirected, and would index the full site rather than the dead-end form.
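A minimal sketch of the decision logic being proposed, assuming a "test cookie" is set on a visitor's first request to detect cookie acceptance (all names here are illustrative, not from any real implementation):

```javascript
// Hypothetical gating logic. On the first request we set a "testcookie";
// on later requests, if "testcookie" came back, the client accepts cookies.
// Only cookie-accepting, non-registered visitors are sent to the form.
function decideAction(cookies) {
  if (cookies.registered === "1") {
    return "show-content"; // registered user, no redirect
  }
  if (cookies.testcookie === "1") {
    return "redirect-to-form"; // accepts cookies but not registered
  }
  // Bots and cookie-less clients fall through and see the content
  return "show-content-and-set-testcookie";
}
```

The key property is that search engine crawlers aren't a named special case: anything that doesn't return cookies, human or bot, takes the same path.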
From the client's perspective, this would mean only the very few non-registered visitors who reject cookies would "avoid" the form, yet search engines are arguably not being treated as a special case.
So my question is: would this be considered cloaking/put the site at risk in any way?
(They would prefer to not go down the First Click Free route as this will lower their email sign-ups.)
Thank you!
-
Yeah, that would work. Well, it should work if done the right way.
-
I'm thinking that a JavaScript pop-up might achieve the same result with lower risk, especially if the indexed content is visible underneath the pop-up.
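A rough sketch of that approach, assuming the registration form lives at a URL like `/register` (hypothetical): the page is served with the full content in the HTML, and a script injects an overlay on top, so crawlers and users receive the same document.

```javascript
// Hypothetical overlay markup builder. The content stays in the page
// source; the overlay is layered on top client-side.
function buildOverlayHtml(formUrl) {
  return [
    '<div id="reg-overlay" style="position:fixed;inset:0;',
    'background:rgba(0,0,0,.6);z-index:9999;">',
    '<iframe src="' + formUrl + '" title="Registration"></iframe>',
    '</div>'
  ].join('');
}
```

On page load, this string would be appended to `document.body` for visitors who lack the registration cookie.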
-
Hi,
You can actually cap FCF at X visits per user per day by dropping a cookie. Otherwise, what you are proposing is potentially a bit dodgy: if a human tester visits the site and gets a different experience from the bot, you might be at risk. I doubt you will get found out, but at the same time, if you want to go pure white hat, then you need to follow the rules. Your call really.
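The per-day cap could be tracked with a single counter cookie, along these lines (the cookie format and the limit of 3 are assumptions for illustration, not a stated FCF rule):

```javascript
// Hypothetical daily free-click counter. Cookie value looks like
// "2024-01-15:2" -> the date and the number of free clicks used that day.
var DAILY_LIMIT = 3; // assumed cap

function nextFcfState(cookieValue, today) {
  var parts = (cookieValue || "").split(":");
  var date = parts[0];
  var count = date === today ? (parseInt(parts[1], 10) || 0) : 0; // reset on a new day
  if (count >= DAILY_LIMIT) {
    return { allow: false, cookie: today + ":" + count }; // cap reached, show the form
  }
  return { allow: true, cookie: today + ":" + (count + 1) }; // grant a free click
}
```

Each request reads the cookie, calls this, writes the returned cookie back, and gates the content on `allow`.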
A
-
Hi. Thanks but I don't want to use FCF if I can help it.
-
You can also use Google First Click Free to let it index the site. It's really easy to set up and run. I suggest you use this; I did it at a previous company and it works so well it's not funny.
More info here:
http://googlewebmastercentral.blogspot.com/2008/10/first-click-free-for-web-search.html