Google User Click Data and Metrics
-
Assuming that Google is using click data from users to calculate rankings (bounce rate, time on site, task completion, etc.), where does Google get the data, especially from browsers that aren't Chrome?
-
That was an example with GA. I believe they use dwell time and next or subsequent searches for this.
They can't control for shopping cart abandonment and similar issues on individual sites, so they use benchmarks against other sites. If your metrics are above average for your industry, that's great; if your metrics are weak, you're in trouble. You can see benchmarking in Google Analytics, so whatever you do, try to beat those benchmarks. For example, I just saw that some of my sites have 1.40 pages/session vs. 2.99 in the benchmark, and my session duration is 1:32 vs. 2:19 in the benchmark.
Similar metrics apply in PPC too - you need to be above average to get better positions, prices, and conversions.
I know this explanation can sound a little messy... but this is the question all SEO specialists are thinking about these days. If you knew the answers, you could become a millionaire and retire quickly.
-
But how does Google measure form completions or purchases for rankings?
Again, I'm not talking about Google analytics. We use it heavily for our ecommerce sites. I know how the UA tracking code works. Google claims that they don't use GA data for rankings, and I would tend to believe them.
-
That's tricky. There are lots of theories about Analytics, Chrome, AI, RNNs, etc. Of course, there's also a lot of speculation!
BUT here Josh Bachynski explains that task completion is correlated with user metrics - time on session, bounce rate, and average pages per session. There are others too - please note the subsequent searches in my previous answer. So in theory, sites with better time on site and fewer bounces are considered high quality. You can also check Josh's other videos on YouTube, where he explains this many times.
One of the easiest ways to track task completion is to add goals and/or event tracking in Analytics. Goals can be all sorts of things - a contact form filled, a lead form filled, a software download, a whitepaper request, a signup form, a video played, etc. Events can be things like comments viewed, gallery viewed, video stopped, etc. Then you can see how many of your visitors complete tasks and how many trigger events. This is for your own assurance that they're actually on the page and doing something there.
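For context, here is a minimal sketch of what that kind of event tracking typically looks like with the analytics.js (Universal Analytics) snippet the thread mentions. The element selectors and the category/action/label strings are made-up examples, not required names; goals would then be configured on top of these events in the GA admin.
```typescript
// Minimal sketch of Universal Analytics event tracking (analytics.js).
// Assumes the standard UA snippet has already defined the global ga() queue.
declare function ga(
  command: 'send',
  hitType: 'event',
  eventCategory?: string,
  eventAction?: string,
  eventLabel?: string,
  eventValue?: number
): void;

const video = document.querySelector<HTMLVideoElement>('#promo-video');
video?.addEventListener('play', () => {
  // Fires an Event hit that shows up under Behavior > Events,
  // and can also back a goal ("visitor played the promo video").
  ga('send', 'event', 'Video', 'play', 'promo-video');
});

const form = document.querySelector<HTMLFormElement>('#contact-form');
form?.addEventListener('submit', () => {
  // A form submission can be tracked the same way, or with a
  // destination goal on the thank-you page instead.
  ga('send', 'event', 'Lead', 'submit', 'contact-form');
});
```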
The trick is that Google will only use SERP visitors and their metrics. I can have a site with 20k daily visitors from Facebook/Twitter and only 200 from the Google SERP. I'm not saying those 20k visitors are bad, but they will be almost useless for this kind of click test. Things would be different if we had 20k daily from the SERP and 200 from Facebook/Twitter.
So, whatever you do, when you receive SERP traffic, keep it on the site. That is the higher priority for better ranking.
-
Thanks for the answer. Spot on.
There's been a lot of speculation on "task completion" and how it relates to ranking. If completing a task is a purchase on an ecommerce site, how is Google measuring it? Is it only through Chrome or by some other means?
How does Google measure when someone completes a form?
Is that possible, or is Google just checking to make sure that the cart and the form work correctly? Was that the point of the "Zombie" update?
-
If you remember, until about 5 years ago all URLs in the SERP were unencrypted, and a lot of tools used this to capture "keywords" and link them to pages. After Google introduced encrypted search in 2010, they rolled it out over a few years, and today the only way to see keywords is in Search Console. Of course, the official reason for encryption is "to improve your search quality and to provide better service". The original text can be seen here. Please note "provide better service" there. This is the tricky part!
So imagine that you search for moz. Here is the actual URL I can see right now:
https://www.google.bg/search?q=moz&ie=utf-8&oe=utf-8&gws_rd=cr&ei=4wNVVpnZBYGoUZiXh4AG
You can clearly see the keyword there in ?q=moz. Now, the first result is Moz.com, and its URL is:
https://www.google.bg/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&uact=8&ved=0ahUKEwjenMnUq6rJAhXIRBQKHXuVCcUQFggfMAA&url=https%3A%2F%2Fmoz.com%2F&usg=AFQjCNHNW83KUfvLcZOMILlYW49NobxUig&sig2=nOVvQ05KIPrGB3XFAFmIGg
As you can clearly see, the keyword isn't there anymore; everything comes as encrypted data (ved, usg, sig2). This /url link is actually a redirector that counts your click on a specific result and position. Now, if I click on the 1st result and land on Moz.com, I can scroll down and decide "this isn't the Moz I'm looking for", so within a short time (a few seconds) I return to the SERP. That is the actual "dwell time" and a bounce back to the SERP. It's a negative signal, because it shows Google, with human verification, that the result it returned in first place wasn't right. Now, back on the same SERP, I can see Moz on Wikipedia:
https://www.google.bg/url?sa=t&rct=j&q=&esrc=s&source=web&cd=19&cad=rja&uact=8&ved=0ahUKEwjenMnUq6rJAhXIRBQKHXuVCcUQFghhMBI&url=https%3A%2F%2Fen.wikipedia.org%2Fwiki%2FMoz_(marketing_software)&usg=AFQjCNGCgqmsKNIdaZdGrbugf8bJk6NhTg&sig2=jS-vt68NFtD5YhgSV4lTGw
If I click this and never return to the SERP, that gives Google enough to calculate a bounce rate for this site (counting only returns to the SERP) and to credit Wikipedia with some "goal completion". The time until my next search can be used to calculate "time on site". And since all searches are encrypted, Google knows when a specific user searches for something and when they make a new search based on the results already returned. An example is "Napoleon". This could be anything - the French emperor, a movie, a cake, a drink, and so on. So now I can do a subsequent search, "Napoleon height". This is an example of how one search can give me enough information to do another, refined search. Another good example is "32nd US president", after which I can type "franklin d roosevelt height".
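To make the structure of those /url redirect links concrete, here is a small sketch that simply parses the Moz.com redirect from above with the standard URL API. The long ved/usg/sig2 tokens are shortened here purely for readability; the point is that only the destination and opaque click-tracking parameters are present, with no keyword.
```typescript
// Sketch: inspecting Google's /url click-tracking redirect.
// Token values are truncated ("...") for readability in this example.
const redirect = new URL(
  'https://www.google.bg/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1' +
  '&ved=0ahUKEwj...&url=https%3A%2F%2Fmoz.com%2F&usg=AFQjCNH...&sig2=nOVvQ05...'
);

console.log(redirect.searchParams.get('q'));   // "" - the keyword is gone
console.log(redirect.searchParams.get('url')); // "https://moz.com/" - percent-decoded destination
console.log(redirect.searchParams.get('ved')); // opaque token tied to this result and position
```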
This was explained much better in the closing MozCon 2015 presentation, "SEO in a Two Algorithm World":
http://www.slideshare.net/randfish/onsite-seo-in-2015-an-elegant-weapon-for-a-more-civilized-marketer
and you should see it. A few tests with terrific results are also shown inside.
-
I guess I should have phrased the question a little differently. This is not related to Google Analytics.
When I do a Google search, Google is able to track my actions, and is probably using the data as a ranking factor. Josh Bachynski did a Whiteboard Friday on it.
https://moz.com/blog/panda-41-google-leaked-dos-and-donts-whiteboard-friday
How is Google able to track user actions after they click on a SERP listing? Where are they getting their data?
-
Here is a good explanation.