Excluding Googlebot From an A/B Test - Acceptable Sample Size To Negate Cloaking Risk?
-
My company uses a proprietary A/B testing platform. We are testing an entirely new experience on our product pages, but it is not optimized for SEO, and the testing framework will not show the challenger recipe to search bots. Given that, to avoid any risk of cloaking, what is an acceptable sample size (or percentage) of traffic to funnel into this test?
-
Here are Google's official recommendations for website testing. According to them, no amount of cloaking is okay; the risk doesn't shrink with a smaller sample, so there is no safe percentage of traffic to funnel into a user-agent-gated test. Try using one of the other methods they suggest instead.
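For context, those guidelines boil down to: serve the variant to everyone (bots included), point each variant URL back at the original with rel="canonical", use 302 rather than 301 redirects when sending users into a variant, and run the experiment only as long as needed. A minimal sketch of the canonical tag on a variant page, with hypothetical URLs:

    <!-- On the variant page (e.g. /product-x?variant=b): point search engines at the original -->
    <link rel="canonical" href="https://www.example.com/product-x">

Since no user-agent sniffing is involved, there is no cloaking risk at any traffic split.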
Related Questions
-
What would be the best course of action to nullify the negative effects of our website's content being duplicated (negative SEO)?
Hello, everyone. About 3 months ago I joined a company that manufactures transportation and packaging items. Once I started digging into the website, I noticed that a lot of its content had been "plagiarized". I use quotes because it really was not: the company seems to have been hit with a negative SEO campaign last year in which its content was taken and posted across at least 15 different websites. Literally every page on the website had the same problem, and some of the content was even company specific (going as far as using the company's very unique name). In all my years of working in SEO and marketing I have never seen anything at this scale. Sure, there are always spammy links here and there, but this seems very deliberate. In fact, some of the duplicate content was posted on legitimate websites that may have been hacked or compromised (some examples include charity websites). I am wondering if there is anything I can do besides contacting the webmasters of these websites and nicely asking for removal of the content? Or does duplicate content not hold as much weight as it used to, especially since our content was posted years before the duplicates started popping up? Thanks,
White Hat / Black Hat SEO | Hasanovic
-
Wanna see Negative SEO?
One of my clients got hit with negative SEO in the past few days. Check it out in Ahrefs. The site is www.thesandiegocriminallawyer.com. Any advice on what, if anything, I should do? A Google disavow? Thanks.
White Hat / Black Hat SEO | mrodriguez1440
-
Can I 301 redirect old URLs to staging URLs (ex. staging.newdomain.com) for testing?
I will temporarily remove a few pages from my old website and redirect them to the new domain, but on its staging subdomain. Once the redirects have served their purpose, I will remove the redirect rules from my .htaccess and bring the removed pages back live. Thanks in advance!
White Hat / Black Hat SEO | esiow2013
-
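A note on the question above: a 301 signals a permanent move, so for a redirect you plan to remove, a 302 is the safer status code. A minimal .htaccess sketch, assuming Apache's mod_alias and hypothetical paths:

    # Temporary (302) redirect while the page lives on staging; remove when done
    Redirect 302 /old-page https://staging.newdomain.com/old-page

Keep the staging host itself out of the index as well (HTTP authentication is the usual way), or redirecting indexed URLs at an open staging site can get the staging copy indexed.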
Creating a duplicate site for testing purposes - can it hurt the original site?
Hello, We are soon going to upgrade the CMS to the latest version along with new functionality; the process may take four to six weeks. We need to work on the live server, so the plan is to take an exact replica of the site, move it to a test domain (still on the live server), and block Google, Bing, and Yahoo in robots.txt: User-agent: Google Disallow: /, User-agent: Bing Disallow: /, User-agent: Yahoo Disallow: /. We will then upgrade the CMS, add the new functionality, test the entire structure, check URLs using Screaming Frog or Xenu, and finally configure the site on the original domain. The upgrade and new tools may take 1 to 1.5 months. The concern: despite blocking Google, Bing, and Yahoo through the user-agent disallows, can the URLs still be crawled by the search engines? If yes, it may hurt the original site, since it will read as an entire duplicate. Or is there an alternate way around this? Many thanks
White Hat / Black Hat SEO | Modi
-
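A note on the robots.txt in the question above: "Google", "Bing", and "Yahoo" are not the tokens those crawlers match on (they look for Googlebot, Bingbot, and Slurp), and a blanket block is simpler anyway. A minimal sketch for the test domain only:

    # robots.txt on the test domain only - never deploy this to the live site
    User-agent: *
    Disallow: /

Bear in mind that robots.txt only blocks crawling, not indexing: a disallowed URL that picks up links can still appear in results. HTTP authentication on the test domain is the more reliable way to keep the duplicate out of the index entirely.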
Yet another Negative SEO attack question.
I need help reconciling two points of view on spammy links. On one hand, Google seems to say, "Don't build spammy links to your website; it will hurt your ranking." We've seen the consequences of this in the Penguin update, where those who built bad links got whacked. After Penguin, there was lots of speculation about negative SEO attacks, to which Google responded, "We're smart enough to detect a negative SEO attack": http://youtu.be/HWJUU-g5U_I So it seems Google is saying, "Build spammy links to your own website in an attempt to game rank and you'll be penalized; build spammy links to a competitor's website and we'll detect it and not let it hurt them." To me, it doesn't seem like Google can have it both ways. Really, I don't understand why Competitor A doesn't just go to Fiverr and buy a boatload of crappy exact-match anchor links to Competitor B in an attempt to hurt them. Sure, Competitor B can disavow those links, but that still takes time and effort, and the analysis needed could be daunting for an unsophisticated webmaster. Your thoughts? Can Google have their cake and eat it too?
White Hat / Black Hat SEO | ExploreConsulting
-
Negative SEO impacting client rankings - How to combat negative linking?
I have a client that has been losing rankings for the key term "sell gold" in Google AU. While doing some investigating, I realized that we have been receiving links from bad neighborhoods such as porn sites, bogus .edu sites, and pharmaceutical sites. We have identified this as negative SEO and have moved forward with disavowing the links in Google. However, I would like to know what other measures can be taken to combat this type of negative SEO linking. Any suggestions would be appreciated!
White Hat / Black Hat SEO | dancape
-
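For the disavow step mentioned in the two questions above: the file Google's disavow tool accepts is plain text, one URL or domain: entry per line, with # comments. A minimal sketch with hypothetical domains:

    # Link networks identified in the link audit
    domain:spammy-links-example.com
    domain:bad-neighborhood-example.net
    # A single page rather than a whole domain
    http://hacked-example.org/spun-article.html

Disavowing at the domain: level is usually preferable when an entire site is spam, since any new URLs from it are covered automatically.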
Is there such a thing as white hat cloaking?
We are near the end of a site redesign and have come to find out it's built in JavaScript and not search engine friendly. Our IT team's fix is to serve crawlable content to Googlebot and other bots based on their user agents. I told them this is cloaking and I'm not comfortable with it. They said that, based on their research, it's an acceptable way to cloak as long as the content is pretty much the same; about 90% of the content will be identical between what a regular user sees and what is served to Googlebot. Does anyone have experience with this? Are there any recent articles or best practices on it? Thanks!
White Hat / Black Hat SEO | CHECOM
-
Negative SEO on my website with paid +1's
Hi guys, I need some advice. Some scumbag played me quite well with paid +1's on two of my articles, and now I have a problem.
http://sr.stateofseo.com/seo-vesti/google-implementacija-ssl-protokola-not-provided-problem/
http://sr.stateofseo.com/napredni-seo/najnovije-promene-google-panda-algoritma/
They are both translated articles (written originally by me on the same website). I noticed those +1's (476 on both articles) when my website received a penalty for the keyword "SEO" on Google.rs (Serbian Google); I'm now on the 11th page. Other keywords still rank just fine. Not cool, right? Now, I think there are two possible solutions: the first is to remove my internal link that points to my homepage with the "SEO" anchor and hope for the best; the second is to completely delete those two articles and wait for Google to reindex the website and hopefully lift the penalty. Do you guys have any other ideas how I can fix this, remove or disavow those +1's, or somehow explain to the Google crew / algorithm that I'm just a humble SEO without any evil thoughts? 🙂 Thank you in advance.
White Hat / Black Hat SEO | Fastbridge