How to hold a variable constant for an A/B test
-
For example, let's say you want to A/B test a title tag change. You are hoping to identify whether the title tag change increases CTR. But position is always fluctuating a bit, and that affects CTR too. So I'm interested in how you could hold position constant in order to isolate the change in CTR that is due to the title tag change. Does anyone know of resources/tools/tutorials for how to do this?
It's been... a very long time since I took statistics (-: I have access to Excel, MS Access, and RStudio.
-
Thanks for the reply. I'll check out the links you included.
I thought it might be helpful to show an example of what got me thinking along these lines. This is a study about A/B testing of title tag changes. They don't say they accounted for position fluctuation; maybe they didn't, but it seems like you'd have to in order to have meaningful results.
-
I don't see a way of doing this accurately. With organic rank fluctuations, seasonality, and competition, it's near impossible to "hold" a position. However, if you rank for a specific keyword with little fluctuation and get decent traffic, you can establish that as your baseline. Once the updated title appears in the SERPs, begin monitoring with Search Console, Analytics, etc., and watch for any meaningful jumps. Some links for additional information: Finding the ROI of Title Tag Changes Using Google's CausalImpact R Package, SEO Split Testing, and A Beginner's Guide to A/B Testing: Effective SEO Landing Pages. Would love to hear if anyone else has a better way.
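To make the "hold position constant" part concrete: besides CausalImpact, one common approach is regression adjustment — include average position as a covariate so the treatment coefficient estimates the CTR change net of ranking drift. A minimal sketch in Python with made-up daily numbers (the data and effect sizes are hypothetical; the same model runs in R or even Excel):

```python
import numpy as np

# Hypothetical daily data for one page: average SERP position and CTR,
# before (treatment=0) and after (treatment=1) a title tag change.
position  = np.array([3.1, 2.9, 3.4, 3.0, 2.8, 2.6, 2.9, 2.7])
ctr       = np.array([0.041, 0.044, 0.037, 0.042, 0.055, 0.059, 0.052, 0.056])
treatment = np.array([0, 0, 0, 0, 1, 1, 1, 1])

# OLS: ctr = b0 + b1*treatment + b2*position
# b1 estimates the CTR lift attributable to the title change,
# holding average position constant; b2 captures how CTR falls
# as position number rises.
X = np.column_stack([np.ones_like(position), treatment, position])
coefs, *_ = np.linalg.lstsq(X, ctr, rcond=None)
b0, b1, b2 = coefs
print(f"estimated CTR lift from title change: {b1:.4f}")  # roughly +0.009 here
```

Note the raw before/after CTR difference in this toy data is about +0.014, but part of that came from the position improving; the regression attributes only the remainder to the title change. With real data you'd want more days and standard errors (e.g. statsmodels) before trusting the estimate.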
Related Questions
-
Schema Markup Validator vs. Rich Results Test
I am working on a schema markup project. When I test the schema code in the Schema Markup Validator, everything looks fine, no errors detected. However, when I test it in the Rich Results Test, a few errors come back. What is the difference between these two tests? Should I trust one over the other?
Intermediate & Advanced SEO | Collegis_Education
-
Redirect wordpress from /%post_id%/%postname%/ to /blog/%postname%/
Hi, what is the code to redirect a WordPress blog from site.com/%post_id%/%postname%/ to site.com/blog/%postname%/? We are moving the site to a new server and a new URL structure. Thanks in advance.
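No answer is attached here, but for anyone landing on this question: the usual approach is a regex 301 in .htaccess, placed before the standard WordPress rules. A hedged sketch, assuming Apache with mod_rewrite and purely numeric post IDs (test on a staging copy first):

```apache
# Match a leading numeric post-ID segment and 301 to the /blog/ structure.
RewriteEngine On
RewriteRule ^[0-9]+/([^/]+)/?$ /blog/$1/ [R=301,L]
```

You'd also change the permalink setting in WordPress to /blog/%postname%/ so internal links generate the new structure.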
Intermediate & Advanced SEO | Taiger
-
Geo-Targeted Sub-Domains & Duplicate Content/Canonical
For background, the sub-domain structure here is inherited and committed to due to tech restrictions with some of our platforms. The brand I work with is splitting out their global site into regional sub-sites (not too relevant, but this is in order to display seasonal product in different hemispheres and to link to stores specific to the region). All sub-domains except EU will be geo-targeted to their relevant country. Regions and sub-domains for reference: AU - Australia, CA - Canada, CH - Switzerland, EU - all Eurozone countries, NZ - New Zealand, US - United States. This will be done with WordPress multisite. The setup allows publishing content on one 'master' sub-site and then deciding which other sub-sites to 'broadcast' to. Some content is specific to a sub-domain/region, so there's no duplicate issue and we can set the sub-site version as canonical. However, some content will appear on all sub-domains: au.example.com/awesome-content/ and nz.example.com/awesome-content/ Now, the first question is: since these domains are geo-targeted, should I just have them all canonical to the version on that sub-domain? Or should I still signal the duplicate content with one canonical version? Essentially the top-level example.com exists as a site only for publishing purposes - if a user lands on the top-level example.com/awesome-content/ they are given a pop-up to select a region and redirected to the relevant sub-domain version. So I'm also unsure whether I want that content indexed at all?? I could make the top-level example.com versions of all content be the canonical that all others point to, and rely on geo-targeting to have the right links show in the right search locations. I hope that's kind of clear?? Obviously I find it confusing and therefore hard to relay! Any feedback at all gratefully received. Cheers, Steve
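For what it's worth, the usual pattern for same-language regional duplicates is self-referencing canonicals plus hreflang annotations, rather than canonicalizing every region to one version. A sketch using the question's hypothetical URLs (not an answer from this thread):

```html
<!-- On au.example.com/awesome-content/ : self-canonical plus alternates,
     so each regional version can rank in its own locale. -->
<link rel="canonical" href="https://au.example.com/awesome-content/" />
<link rel="alternate" hreflang="en-au" href="https://au.example.com/awesome-content/" />
<link rel="alternate" hreflang="en-nz" href="https://nz.example.com/awesome-content/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/awesome-content/" />
```

Each regional page would carry the full set of alternate links (they must be reciprocal), with its own URL as the canonical.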
Intermediate & Advanced SEO | SteveHoney
-
Should I literally delete all the articles I published in 2010/2011?
We became a charity in December and redirected everything from resistattack.com to resistattack.org. Both sites weren't up at the same time; we just switched over. However, GWT still shows the .com as a major backlinker to the .org. Why? More importantly, our site just got hit for the first time by an "unnatural link" penalty according to GWT. Our traffic dropped 70% overnight. This appeared shortly after a friend posted a sitewide link from his site that suddenly sent 10,000 links to us. I figured that was the problem, so I asked him to remove the links (he has) and submitted a reconsideration request. Two weeks later, Google refused, saying: "We've reviewed your site and we still see links to your site that violate our quality guidelines. Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes." We haven't done any "SEO link building" for two years now, but we used to publish a lot of articles to EzineArticles and iSnare back in 2010/2011. They were picked up and linked from hundreds of spammy sites, of course, none of which we had anything to do with. They are still being taken and new backlinks created. I just downloaded the latest backlinks from GWT and it's a nightmare of crappy article sites. Should I delete everything from EZA/iSnare and close my account? Or just wait longer for the 10,000 links to be crawled and removed from my friend's site? What do I need to do about the spammy article sites? Disavow tool or just ignore them? Any other tips/tricks?
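Since the question mentions the disavow tool, the file format it expects is simple. A sketch with hypothetical domains (disavowing at the domain level is the common choice for scraped-article spam, since such sites link from many pages):

```
# disavow.txt — uploaded via Google's disavow links tool
# One entry per line; "domain:" covers every URL on that site.
domain:spammy-article-scraper.example
domain:another-article-farm.example
https://mixed-quality-site.example/one-bad-page.html
```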
Intermediate & Advanced SEO | TellThemEverything
-
E-Commerce site - How do I geo-target towns/cities/states if there aren't any store locations?
Site = e-commerce. Products = clothing (no apparel that can be location-specific, like sports gear where you can do location-specific team gear (NBA, NFL, etc.)). Problems = a. no storefront b. I don't want to do any sitewides (footers, sidebars, etc.) because of the Penguin update. Question = How do you geo-target these category pages and product pages? Ideas = a. reviews with clients' locations b. blog posts with clients' images wearing the apparel, plus location descriptions and keywords, that also link back to that category or product page (images geo-targeted, tags, and description) c. ? Thanks in advance!
Intermediate & Advanced SEO | Cyclone
-
Internal Search / Faceted Navigation
Hi there, I'm working on an e-learning site with the following content pages: main page, category pages, course pages, author pages, and tag pages. We will also have an internal search for users to search by keyword for courses, authors, and categories. Is it still recommended to "noindex, follow" internal search results and disallow them in robots.txt? Or for a site like this, is it better to use faceted navigation? It seems that faceted navigation is mostly for e-commerce sites. What is the latest thinking on SEO best practices for internal search result pages?
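One detail worth flagging either way: a robots.txt Disallow and a meta noindex don't combine — if the search URLs are blocked in robots.txt, crawlers never fetch the page and so never see the noindex tag. The common sketch is to leave internal search results crawlable and tag them instead:

```html
<!-- On internal search result pages (must NOT also be blocked in
     robots.txt, or the tag is never seen): -->
<meta name="robots" content="noindex, follow">
```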
Intermediate & Advanced SEO | mindflash
-
Domain w/ Identical Content to Site we are Optimizing
Hi Guys, We've been optimizing a client's site for about a year or so now, and on a call the other day the client brought up that he owns and operates another site that's marketing the same product, but to a different audience (we work on the direct-to-consumer side; this is a distributor-focused site), with the same exact content as the site we are optimizing. Obviously this is a major duplicate content issue and we need to get it resolved very quickly. We have already recommended to the client that we rewrite content, but this is where my question comes in - which site should we rewrite the content on? The site we are optimizing is the more important of the two. While we still want the other site to hold rankings, we don't want to end up accidentally optimizing the other site, wherein the site we are working on full time suffers a loss when a "competing" site creates completely new content and may accidentally end up ranking higher than the site we are focusing on full time. As links also play a role, would that be a KPI to look at here in determining which site gets new content and which does not? In this scenario, what would you guys recommend? Just want to make sure I'm dotting all my I's and crossing my T's here. Many thanks to all in advance, Mike
Intermediate & Advanced SEO | Havas_Disco
-
Should we block URLs like this - domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 - within the robots.txt?
I've recently added a campaign within the SEOmoz interface and received an alarming number of errors - ~9,000 on our e-commerce website. This site was built in Magento, and we are using search-friendly URLs; however, most of our errors were duplicate content/titles due to URLs like: domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 and domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=4. Is this hurting us in the search engines? Is rogerbot too good? What can we do to cut off bots after the ".html?"? Any help would be much appreciated 🙂
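One hedged sketch for the ".html?" cut-off question: robots.txt wildcard patterns can block crawling of the sorted/filtered variants while leaving the clean category URL crawlable. The parameter names below are taken from the question's example URLs; note that Disallow stops crawling but not indexing of already-discovered URLs, so it's commonly paired with canonical tags on the variants pointing at the clean .html page:

```
User-agent: *
# Block sorted/filtered duplicates of category pages
Disallow: /*?*dir=
Disallow: /*?*order=
Disallow: /*?*price=
```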
Intermediate & Advanced SEO | MonsterWeb28