Honeypot Captcha - rated as "cloaked content"?
-
Hi guys,
In order to get rid of the very old-school captcha on our contact form at troteclaser.com, we would like to use a honeypot captcha.
The idea is to add a field that is hidden from human visitors but likely to be filled in by spam bots. That way we can sort out all those spam contact requests.
More details on "honeypot captchas":
http://haacked.com/archive/2007/09/11/honeypot-captcha.aspx
Any idea whether this single cloaked field will have a negative SEO impact? Or is there another alternative to keep out those spam bots?
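To make the idea concrete, here is a rough sketch of what such a honeypot check could look like (a minimal illustration only, assuming a Node/Express backend and a made-up field name "website"; not our actual form):

```typescript
// Minimal honeypot sketch (illustrative only): the contact form gets one
// extra field that humans never see, e.g. hidden via CSS in the markup:
//
//   <div style="display:none" aria-hidden="true">
//     <label for="website">Leave this field empty</label>
//     <input type="text" id="website" name="website" autocomplete="off">
//   </div>
//
// Bots that blindly fill every input will also fill it; humans won't.
import express from "express";

const app = express();
app.use(express.urlencoded({ extended: true }));

app.post("/contact", (req, res) => {
  // "website" is a made-up field name used only for this example.
  const honeypot = String(req.body.website ?? "").trim();

  if (honeypot !== "") {
    // The hidden field was filled in, so treat the request as spam.
    // Respond as if everything worked, giving the bot no feedback.
    res.status(200).send("Thanks for your message.");
    return;
  }

  // A real inquiry: hand it off to the normal contact handling
  // (e-mail, CRM, etc.), omitted here.
  res.status(200).send("Thanks for your message.");
});

app.listen(3000);
```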
Greets from Austria,
Thomas
-
Just in case anyone stumbles across this topic:
We started using honeypot captchas in 2011 and it has really paid off. Not only did we get rid of the old captchas, but the honeypot also keeps out 99.99% of all bot inquiries and spam.
-
Hey Casey,
Thanks for the reply. Will have this tested soon. Really looking forward to getting rid of that captcha.
Regards,
Thomas
-
Hi Thomas,
I've done some research on this, and you will be fine using this technique; Google won't give you any problems for it. Check out my post on the honeypot technique: http://www.seomoz.org/blog/captchas-affect-on-conversion-rates. The technique works quite well, blocking about 98% of spam.
Casey
-
Hi Keri,
Those are users without Java support.
Does that mean that JavaScript is not an issue then?
-
Thomas, double-check whether that stat is for users without Java or users without JavaScript.
-
Good point, thanks.
As 15% of our visitors don't have Java, this won't work out.
We're actually trying to get rid of the captcha to increase our conversion rate, which is why the honeypot version is so appealing.
-
You won't see any SEO impact; just think about all the forms with JS interaction on big sites.
One easy solution is to submit the form only via an AJAX POST. It's very effective, but you won't be able to receive contact requests from visitors who have JavaScript disabled. Still, it may be a good alternative.
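Roughly, such an AJAX-only submission could look like this (a minimal sketch; the element IDs and the /contact endpoint are just assumptions for the example):

```typescript
// Minimal sketch of an AJAX-only contact form (illustrative only).
// The element IDs and the /contact endpoint are assumptions for this example.
// Because submission only happens from this script, bots that do not execute
// JavaScript cannot post the form; neither can visitors with JS disabled.
const form = document.querySelector<HTMLFormElement>("#contact-form");
const statusBox = document.querySelector<HTMLElement>("#form-status");

if (form) {
  form.addEventListener("submit", async (event) => {
    event.preventDefault(); // block the browser's default (non-JS) submission

    const response = await fetch("/contact", {
      method: "POST",
      body: new FormData(form),
    });

    if (statusBox) {
      statusBox.textContent = response.ok
        ? "Thanks for your message."
        : "Sorry, something went wrong. Please try again.";
    }
  });
}
```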
Otherwise, you can use reCAPTCHA: http://www.google.com/recaptcha
It's free and easy to set up, works well against bots, and is accessible to everyone!