App Store Optimization - Google Play Console A/B Testing: Should I Optimize for Active Devices or Users?
-
I can't seem to find any ASO-focused communities to ask this question in, and since the Moz community has been so helpful, I thought I'd try it here.
I've been running A/B tests on feature graphics in the Google Play Store. The results are segmented into active devices and users, and I've been tracking them daily for about a month and a half. The data I've recorded shows the active-devices segment trending positive while the users segment trends negative. Google Play Console itself chooses to display scaled installs for active devices.
When we do A/B Tests on Google Play Console, should we choose the winner based on the active devices or users?
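Whichever segment you pick, it's worth checking that the difference you're seeing is significant at all before declaring a winner. A minimal sketch, assuming you can read per-variant installer and audience counts out of the experiment report (the counts below are hypothetical, and a two-proportion z-test is one common choice, not the method Play Console itself uses):

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's install rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts: (installers, store-listing visitors) per variant.
z = two_proportion_z(conv_a=480, n_a=10_000, conv_b=540, n_b=10_000)
print(round(z, 2))  # |z| > 1.96 is roughly significant at the 95% level
```

If neither segment clears significance, a month and a half of divergent daily readings may just be noise, and the safest answer is to keep the test running rather than choose between the two metrics.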
Related Questions
-
Show Unique User Goal Completions in Analytics Instead of Totals
Hello Everyone, Currently within Google Analytics, I have a mostly unfiltered view and several goals set up. One of my goals tracks downloads of an eBook (set up to track visits to the 'success' page of the download). At the moment (as I understand it), that tracks "TOTAL" goal completions. So, if the same user/person downloads the eBook twice, I will see 2 goal completions. What I'm trying to figure out is how to generate a report or view that tells me how many "unique" users have downloaded the eBook. Is this possible to do? And can I do it using past data, or do I have to set up a new filter that would only track unique users going forward? Thanks in advance.
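Out of the box, GA's goal reports only give totals, but if you can get hit-level data out (e.g. via an export that includes a client-ID dimension), the deduplication itself is trivial. A sketch with hypothetical field names and records, just to show the total-vs-unique distinction:

```python
# Hypothetical hit-level export: one row per pageview, keyed by client ID.
hits = [
    {"clientId": "A", "page": "/ebook/success"},
    {"clientId": "A", "page": "/ebook/success"},  # same person, 2nd download
    {"clientId": "B", "page": "/ebook/success"},
    {"clientId": "C", "page": "/pricing"},
]

# Total completions counts every success-page hit; unique users counts
# distinct client IDs that ever reached the success page.
total_completions = sum(1 for h in hits if h["page"] == "/ebook/success")
unique_users = len({h["clientId"] for h in hits if h["page"] == "/ebook/success"})
print(total_completions, unique_users)  # 3 total vs 2 unique
```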
Conversion Rate Optimization | Elite-Rob0
-
Server-Side A/B Testing - Okay for SEO?
Hey Moz Community! I've been digging into the differences between server-side testing and client-side testing and had a generic question. Is it safe to run server-side A/B testing? For example, if I want to split test the home page of a site and show 50% of my traffic one home page, and show 50% of my traffic a completely different (read: new template, new content, new CTAs, etc.) home page, are there any implications for SEO and organic search? I've spent about five hours researching, and from what I can find, A/B testing is acceptable as long as you don't show Googlebot different content or run A/B tests on Googlebot. Matt Cutts, head of Webspam at Google, has stated that A/B testing does not impact search rankings: "A/B or split testing or other forms of testing web sites is okay by Google as long as you don't test GoogleBot or don't treat GoogleBot differently." The biggest concern for SEO is cloaking, so from my understanding, for server-side testing, you'd need to do user-agent-based redirection so that Googlebot (or any search bot) gets the normal version of the home page. The bots shouldn't be part of the test. Technically that is cloaking, but intention-wise, we're not trying to be sneaky. I've also read through this article about experimentation from Google developers here. Am I missing anything here, or is there a definitive answer? If we serve a "B" as a different site for user testing, just exclude Googlebot by user-agent and we're good? THANKS!
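Whether bots should be excluded at all is exactly the open question here, but mechanically the user-agent gating the post describes is tiny. A rough framework-free sketch (the bot list is an illustrative assumption, not exhaustive, and in production you'd pin the bucket to a cookie so a visitor doesn't flip variants between requests):

```python
import random

# Illustrative, non-exhaustive list of crawler user-agent fragments.
BOT_SIGNATURES = ("googlebot", "bingbot", "duckduckbot", "yandexbot")

def is_bot(user_agent):
    ua = (user_agent or "").lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def choose_variant(user_agent, bucket=None):
    """Return which homepage template to render for this request."""
    if is_bot(user_agent):
        return "home_control.html"  # bots always see the normal page
    # Real traffic is split 50/50; pass `bucket` explicitly for testing.
    bucket = random.random() if bucket is None else bucket
    return "home_control.html" if bucket < 0.5 else "home_variant.html"

print(choose_variant("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # home_control.html
```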
Conversion Rate Optimization | andrewmeyer0
-
Customer journey / customer drop off
Hi All, I would like to understand how visitors navigate through my site and find out where the main drop-off areas are (i.e. what pages / sections of the site users leave on). I will then be segmenting by mobile, tablet, new visitor, returning etc. to see how the various subsets of users behave. To do this I generally do the following: Identify the main sections of the (ecomm) site: homepage, category pages, product pages, cart, checkout 1, checkout 2, checkout x, payment confirmation. For each section above I either use a segment to isolate that section of the site, by regex or a simple page selector, apply it to the Audience >> Overview report and record the resulting session count, OR I filter the Behaviour >> Site Content >> All Pages report to isolate the various site sections and record unique pageviews. I then plot these figures horizontally under a heading for each section of the site, representing a flow between the pages, with a calculation showing the difference between each section, which represents user drop-off. Hope that makes sense. What I am interested to know is: do you have any better suggestions for the process laid out above? Do you see any issues with this process?
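Once the per-section session counts are recorded, the drop-off arithmetic between consecutive steps is trivial to script rather than maintain by hand. A sketch with made-up numbers standing in for the counts pulled from the reports above:

```python
# Hypothetical session counts per funnel step, in visit order.
funnel = [
    ("homepage",     50_000),
    ("category",     32_000),
    ("product",      21_000),
    ("cart",          6_500),
    ("checkout",      3_900),
    ("confirmation",  2_600),
]

# Drop-off = fraction of sessions that do not continue to the next step.
drops = []
for (step, count), (_, next_count) in zip(funnel, funnel[1:]):
    drop = 1 - next_count / count
    drops.append((step, drop))
    print(f"{step:>12}: {drop:6.1%} drop-off to next step")
```

The biggest percentage in the output points at the section to investigate first (here the product-to-cart step, which is typical for e-commerce).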
Conversion Rate Optimization | datarat1
-
Google Experiments issues
Hey fellow Mozzers! We're in the middle of working on CRO for a client of ours and we were going to try out Google Experiments for the first time. Following the click path described on the Google help page should take me to an option that says 'Create Experiment', but this isn't coming up - do we need to create the various URLs before it will let us set up an experiment, or has the system changed now? I've found in the past Google has a habit of updating its interfaces but not the advice on using the affected tools, so I'm wondering if I'm looking at old information. Secondly, does the Experiment show a user the same variation of the page throughout their journey? What I mean by that is: their site currently has a real mish-mash of page styles, and I think a large part of boosting their conversions is probably down to ensuring consistency across the site, as it currently feels like you're bouncing around different sites. But will this issue be made worse by running Experiments - i.e., will I enter the site on Variation A, click a link and be shown the next page in Variation B, and so on? If so, are we better off running the experiment on one page at a time and using what we've learnt to inform the next page? Thanks for any help you can give! Nick.
Conversion Rate Optimization | themegroup1
-
Are website optimization and conversion rate optimization roughly the same thing?
This is mostly a semantics question, but I also want to check that I have a basic understanding of the two concepts. Are the two terms more or less interchangeable or are there any crucial differences? I always thought of website optimization as the complementary partner to SEO. While the ultimate goal of SEO is getting people TO your website, website optimization is focused on refining your website so that those people STAY on your website. When I think of conversion rate optimization, I'd imagine that's pretty much the same goal. Refining a website so that more people stay and ultimately convert (buy something, subscribe to a newsletter, etc). Is my understanding of one (or both) of them flawed, or is it six of one, half a dozen of the other? Thanks!
Conversion Rate Optimization | BrianAlpert780
-
What company/person do you recommend for improving conversion rates on landing pages?
Hello! I'm looking for specific recommendations of companies with whom you have been "tickled pink at fantastic conversion improvement results!" I'm the customer for this service. 🙂 I seek a page-level review of a specific area of my site, with specific recommendations for improving conversion rates as the output. I want to avoid paying for a "generic best practices report" with one or two ideas for my site sprinkled in. I realize that I will be called on to provide proper input into the process, and I am ready to do my share with the level of detail required. So, if you have an enthusiastic recommendation, please share. Details are appreciated. Most important, please tell me why this company made you very happy. I sincerely look forward to SEOMozzer input. Thank you kindly, Loren
Conversion Rate Optimization | groovykarma0
-
Which A/B test software do you prefer? How come?
I am about to start some research on software for experimental design on websites, so I'd really like to know: which A/B test software do you prefer, and how come?
Conversion Rate Optimization | ThomasHgenhaven1
-
How accurate is the Geo-Targeting of Google and Bing/Yahoo PPC ads?
I have a client who serves a local market and who has had trouble in the past with people outside her service area clicking on her ads. She asked how accurate the geo-targeting option is. I know it's not possible to resolve the location of IP addresses with complete accuracy, but I was wondering if there are any recent statistics out there on how accurate or inaccurate geo-targeting is. I did some quick searches but did not see any current numbers. Many Thanks!
Conversion Rate Optimization | JKuly0