Excluding Googlebot From A/B Test - Acceptable Sample Size To Negate Cloaking Risk?
-
My company uses a proprietary A/B testing platform. We are testing an entirely new experience on our product pages, but it is not optimized for SEO, so the testing framework will not show the challenger recipe to search bots. Given that, to avoid any risk of cloaking, what is an acceptable sample size (or percentage) of traffic to funnel into this test?
-
Here are Google's official recommendations for website testing. According to them, no amount of cloaking is okay: serving search bots a different version of a page than users see is cloaking regardless of what percentage of traffic enters the test. Try one of the other methods Google suggests instead, such as hosting the variant at its own URL with a rel="canonical" pointing back to the original, using 302 (not 301) redirects into the test, and running the experiment only as long as necessary.
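To illustrate the no-cloaking approach Google's guidelines describe: assign every visitor, bots included, to a bucket with the same deterministic logic, and have the variant URL declare a canonical back to the original. This is only a minimal sketch; the URLs and function names are hypothetical, not the proprietary platform's API.

```python
import hashlib

# Hypothetical URLs for illustration only.
CONTROL_URL = "https://example.com/product"
VARIANT_URL = "https://example.com/product-test"

def bucket(visitor_id: str, variant_share: float = 0.5) -> str:
    """Deterministically assign a visitor to 'control' or 'variant'.
    The same logic runs for every visitor -- search bots included --
    so no user-agent-based cloaking is involved."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    # Map the first 8 hex digits to [0, 1) and compare to the traffic share.
    score = int(digest[:8], 16) / 0x100000000
    return "variant" if score < variant_share else "control"

def canonical_tag_for(url: str) -> str:
    """A variant page should point rel=canonical at the original URL so
    Google consolidates indexing signals there (per Google's testing
    guidelines); the control page needs no extra tag."""
    if url == VARIANT_URL:
        return f'<link rel="canonical" href="{CONTROL_URL}">'
    return ""
```

Because the bucket is a hash of the visitor ID rather than a coin flip, the same visitor sees the same recipe on every request, which is what makes it safe to let bots flow through the test like everyone else.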
Related Questions
-
Negative SEO - Spammy Backlinks By Competitor
Hi everyone, someone has generated more than 22k spam backlinks (on bad keywords) pointing at my domain. Will it hurt my website's SEO ranking? It is currently ranking at the top. How can I remove all the spammy backlinks, and how can I identify the particular competitor who did this?
White Hat / Black Hat SEO | HuptechWebseo
-
Help! Is this what is called "cloaking"?
A friend asked me to look at her website. I ran it through Screaming Frog and BAM: instead of the 4 pages I was expecting, it returned HUNDREDS. 99.9% of them are for cheap viagra and pharmaceuticals. I asked her if she was selling viagra, which is fine, I don't judge, but she swears she isn't. http://janeflahertyesq.com I ran it through Google with site:janeflahertyesq.com and sure enough, if you click on some of those results, they take you to Canadian pharmacies selling half-priced blue pills. a) Is this cloaking? If not, what is going on? b) More importantly, how do we get rid of those hundreds of pages / get them de-indexed? She's stumped and scared. Any help would be greatly appreciated. Thank you all in advance and for the work you do.
White Hat / Black Hat SEO | TeamPandoraBeauty
-
What is the difference between Positive Impact, No Impact, Negative Impact, and Extremely Negative Impact in terms of a Google update like Panda or Penguin?
White Hat / Black Hat SEO | dotlineseo
-
Separate Servers for Humans vs. Bots with Same Content Considered Cloaking?
Hi, we are considering using separate servers for when a bot vs. a human lands on our site, to prevent overloading our servers. Just wondering: is this considered cloaking if the content remains exactly the same for both the bot and the human, but is served from different servers? And if it isn't considered cloaking, will it affect the way our site is crawled, or hurt rankings? Thanks
White Hat / Black Hat SEO | | Desiree-CP0 -
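If both servers really do return identical markup, this is ordinary load balancing rather than cloaking; the thing worth guarding is content parity between the two backends. A minimal sketch of a parity check, assuming you can fetch the same page body from each backend (function names are hypothetical, not from any particular tool):

```python
import hashlib

def content_fingerprint(body: bytes) -> str:
    """Hash a response body so two backends can be compared cheaply."""
    return hashlib.sha256(body).hexdigest()

def same_content(human_body: bytes, bot_body: bytes) -> bool:
    """Cloaking risk comes from *content* differences, not from which
    server answered; byte-identical responses fingerprint identically."""
    return content_fingerprint(human_body) == content_fingerprint(bot_body)
```

Running a check like this periodically against both backends would give evidence that bots and humans are being served the same thing.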
Negative SEO
How do I identify if somebody is building negative links to my site? If I look at who is linking to my site, I suddenly see an unrelated website linking to it: http://plastische-chirurgie-borsten.be/ The URL translates to "plastic-surgery-breasts". The site is full of links. Could this be an attempt at negative SEO? How can I see the effect of such links? Should I disavow this link? Kind regards
White Hat / Black Hat SEO | nono_1974
-
Identifying a Negative SEO Campaign
Hi, a friend/client's site has recently dropped 2-3 pages (from an average #2-#3 position on page 1 over the last few months) for a primary target keyword. He suspects a negative SEO campaign, hence asked me to look into it. I checked on Removeem and the keyword does not generate a red (or even a pink) result. I looked at Ahrefs and Majestic SEO: backlinks and referring domains have dropped over the period the keyword dropped, so I presume I can be sure it's not a negative campaign, since that would show the opposite pattern (as per articles like this: http://moz.com/blog/to-catch-a-spammer-uncovering-negative-seo)? The site also has very few sitewide backlinks. The keyword is a 3-word phrase with 2 of those words in the domain and brand name, so I presume such keywords are relatively safe from negative SEO campaigns anyway. I would have presumed the backlink/referring-domain drop explains the ranking drop, but the site is still in the first field of view of page 1 for the other key phrases, 2 of whose 3 words match the affected key phrase (and also appear in the domain/brand name), so I would have thought those would have dropped too if this were a negative campaign. Also, many of the anchor texts in the disappeared backlinks are for one of the other partial-match variant key phrases that are still at the top of page 1. Anchor text is at 4.35% for the affected keyword according to Majestic SEO. I'm pretty confident from the above that I can conclude no negative SEO campaign has occurred, nor any other type of penalty, and that this is probably just a 'wobble' at Google that may well right itself shortly. I would appreciate feedback from others that I'm concluding correctly, just for confirmation. Many thanks, Dan
White Hat / Black Hat SEO | Dan-Lawrence
-
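Tools like Majestic derive a figure such as that 4.35% anchor-text share from a backlink export. A rough sketch of the calculation, assuming each exported row has an `anchor` field (the field name is an assumption, not Majestic's actual schema):

```python
from collections import Counter

def anchor_text_share(backlinks: list) -> dict:
    """Given a backlink export (each row a dict with an 'anchor' field),
    return each anchor text's share of total backlinks as a percentage.
    A single anchor suddenly spiking well above the rest is one of the
    classic signals of a negative SEO link campaign."""
    counts = Counter(row["anchor"].strip().lower() for row in backlinks)
    total = sum(counts.values())
    return {anchor: round(100 * n / total, 2) for anchor, n in counts.items()}
```

Re-running this on exports taken before and after the ranking drop would show whether the affected keyword's anchor share actually moved.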
Possibilities of Negative Co-Citation and/or Co-Occurrence?
Knowing how co-citation and co-occurrence function, or how we speculate that they function, it seems there could be several ways that competitors could associate negative words and phrases with the sites they compete with. This could also be disastrous for reputation management: someone could associate negative terms with a person or business without linking to them, and it could do harm. Does this make sense? Is this possible, or are there safeguards in place?
White Hat / Black Hat SEO | Atlanta-SMO
-
Disqus integration and cloaking
Hey everyone, I have a fairly specific question on cloaking and whether our integration with Disqus might be viewed as cloaking. Here is the setup. We have a site that runs on Drupal, and we would like to convert comment handling to Disqus for our users' convenience. However, when JavaScript is disabled, the nice comment system and all of the comments from Disqus disappear. This obviously isn't good for SEO, but the user experience with Disqus is far better than the native comment system. So here is how we are addressing the problem. With Drupal we can sync comments between the native comment system and Disqus. When a user has JavaScript enabled, the containing div for the native comment system is set to display:none, hiding the submission form and all of its content, and the comments are instead displayed through the Disqus interface. However, when JavaScript is not enabled, the native comment form and the comments are available to the user. Could this be considered cloaking by Google? I know they do not like hidden divs, but it should be almost exactly the same content being displayed to the user (depending on when the last sync was run). Thanks for your thoughts, and if anyone has familiarity with a better way to integrate Drupal and Disqus, I am all ears. Josh
White Hat / Black Hat SEO | prima-253509