Google User Click Data and Metrics
-
Assuming that Google is using click data from users to calculate rankings (bounce rate, time on site, task completion, etc.) where does Google get the data, especially from browsers that aren't Chrome?
-
That was just an example with GA. I believe they use dwell time and next/subsequent searches for this.
They can't fight things like shopping cart abandonment directly, so they keep benchmarks to compare sites against. If your metrics are above average for your industry, great; if your metrics are weak, you're in trouble. You can see this benchmarking in Google Analytics. So whatever you do, just try to beat those benchmark metrics. For example, I've just seen that some of my sites have 1.40 pages/session vs. 2.99 in the benchmark, and a session duration of 1:32 vs. 2:19 in the benchmark.
Similar metrics exist in PPC too: you need to be above average to get better positions, prices, and conversions.
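That benchmark check can be sketched in a few lines. The numbers are the ones from my example above; the metric names and dictionary layout are made up purely for illustration:

```python
# Hypothetical metric names; values taken from the example above (1:32 = 92 s, 2:19 = 139 s).
site = {"pages_per_session": 1.40, "avg_session_duration_sec": 92}
benchmark = {"pages_per_session": 2.99, "avg_session_duration_sec": 139}

def compare_to_benchmark(site, benchmark):
    """Return, per metric, whether the site meets or beats the industry benchmark."""
    return {metric: site[metric] >= benchmark[metric] for metric in benchmark}

print(compare_to_benchmark(site, benchmark))
# In this example both metrics trail the benchmark, which is the "in trouble" case.
```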
I know this explanation may sound a little messy, but this is the question all SEO specialists are thinking about these days. If you knew the answers, you could become a millionaire and retire quickly.
-
But how does Google measure form completions or purchases for rankings?
Again, I'm not talking about Google Analytics. We use it heavily for our ecommerce sites, and I know how the UA tracking code works. Google claims they don't use GA data for rankings, and I tend to believe them.
-
That's tricky. There are lots of theories about Analytics, Chrome, AI, RNNs, etc. Of course, there's plenty of speculation too!
BUT here Josh Bachynski explains that task completion is correlated with user metrics: time on session, bounce rate, and average pages per session. There are others too; note the subsequent-search point in my previous answer. So in theory, sites with longer time on site and lower bounce rates are considered higher quality. You can also check Josh's other videos on YouTube, where he explains this many times.
One of the easiest ways to track task completion is to add goals in Analytics and/or event tracking. Goals can be anything: a contact form filled, a lead form filled, a software download, a whitepaper request, a signup form, a video played, etc. Events can be things like: comments viewed, gallery viewed, video stopped, etc. Then you can see how many of your visitors complete tasks and trigger events. This gives you your own assurance that they're actually inside the page and doing something there.
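A rough sketch of what a task-completion rate looks like once you have goal data. The session records and goal names here are hypothetical; real data would come from your Analytics goals or event reports:

```python
# Hypothetical session records: each session carries flags for the goals it fired.
sessions = [
    {"contact_form_filled": True,  "video_played": False},
    {"contact_form_filled": False, "video_played": True},
    {"contact_form_filled": False, "video_played": False},
]

def completion_rate(sessions, goal):
    """Fraction of sessions in which the given goal fired."""
    if not sessions:
        return 0.0
    return sum(1 for s in sessions if s.get(goal)) / len(sessions)

print(completion_rate(sessions, "contact_form_filled"))  # 1 of 3 sessions completed it
```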
The trick is that Google will only use SERP visitors and their metrics. I can have a site with 20k daily visitors from Facebook/Twitter and only 200 from the Google SERP. I'm not saying those 20k visitors don't count, but they're almost useless for the click test. Things would be very different with 20k daily from the SERP and 200 from Facebook/Twitter.
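The SERP-only filtering could be sketched like this; the session records and referrer labels are invented for illustration, the point being that social traffic is excluded before any quality metric is computed:

```python
# Hypothetical sessions tagged by traffic source; only SERP traffic feeds the "click test".
sessions = [
    {"referrer": "google_serp", "bounced": False},
    {"referrer": "facebook",    "bounced": True},
    {"referrer": "google_serp", "bounced": True},
    {"referrer": "twitter",     "bounced": False},
]

def serp_bounce_rate(sessions):
    """Bounce rate computed over SERP-referred sessions only; None if there are none."""
    serp = [s for s in sessions if s["referrer"] == "google_serp"]
    if not serp:
        return None
    return sum(s["bounced"] for s in serp) / len(serp)

print(serp_bounce_rate(sessions))  # only the two SERP sessions count; one bounced
```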
So whatever you do, once you receive SERP traffic, keep it on the site. That's the higher priority for better ranking.
-
Thanks for the answer. Spot on.
There's been a lot of speculation on "task completion" and how it relates to ranking. If completing a task is a purchase on an ecommerce site, how is Google measuring it? Is it only through Chrome or by some other means?
How does Google measure when someone completes a form?
Is that possible, or is Google just checking that the cart and the form work correctly? Was that the point of the "Zombie" update?
-
If you remember, until about five years ago all URLs in the SERP were unencrypted, and a lot of tools used this to capture keywords and link them to pages. Google introduced encrypted search in 2010 and rolled it out over the following few years; today the only way to see keywords is in Search Console. Of course, the encryption is "to improve your search quality and to provide better service." The original text can be seen here. Please note the "provide better service" part. That's the tricky bit!
So imagine that you search for moz. Here is the actual URL I can see right now:
https://www.google.bg/search?q=moz&ie=utf-8&oe=utf-8&gws_rd=cr&ei=4wNVVpnZBYGoUZiXh4AG
You can definitely see the keyword there in ?q=moz. Now the first result is Moz.com, and its URL is:
https://www.google.bg/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&uact=8&ved=0ahUKEwjenMnUq6rJAhXIRBQKHXuVCcUQFggfMAA&url=https%3A%2F%2Fmoz.com%2F&usg=AFQjCNHNW83KUfvLcZOMILlYW49NobxUig&sig2=nOVvQ05KIPrGB3XFAFmIGg
As you can clearly see, the keyword is gone; everything now comes as encrypted data (ved, usg, sig2). This /url link is actually a redirector that counts your click on a specific result and position.
Now if I click the first result, land on Moz.com, scroll down, and decide "this isn't the Moz I'm looking for," I'll return to the SERP within a few seconds. That is the actual "dwell time" and a bounce back to the SERP. It's a negative signal, because it shows Google, with human verification, that the result it returned in first place wasn't the right one. Back on the same SERP, I can also see Moz on Wikipedia:
https://www.google.bg/url?sa=t&rct=j&q=&esrc=s&source=web&cd=19&cad=rja&uact=8&ved=0ahUKEwjenMnUq6rJAhXIRBQKHXuVCcUQFghhMBI&url=https%3A%2F%2Fen.wikipedia.org%2Fwiki%2FMoz_(marketing_software)&usg=AFQjCNGCgqmsKNIdaZdGrbugf8bJk6NhTg&sig2=jS-vt68NFtD5YhgSV4lTGw
If I click this and never return to the SERP, that gives Google enough to calculate a bounce rate for the site (counting only returns to the SERP) and to credit Wikipedia with some "goal completion." The time until my next search can be used to calculate "time on site."
And since all searches are encrypted, Google knows when a specific user searches for something and when they make a new search based on results already returned. An example is "Napoleon." That could be anything: the French emperor, a movie, a cake, a drink, and so on. So now I can do a subsequent search for "Napoleon height." That's how one search can give Google enough information to interpret the next, refined search. Another good example is "32 us president," followed by "franklin d roosevelt height."
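Two mechanics from the walkthrough above can be sketched concretely: pulling the real destination out of the /url redirector with standard URL parsing, and classifying a click by dwell time. Both are illustrative sketches; the 30-second threshold is purely an assumption, since Google has never published one:

```python
from urllib.parse import urlparse, parse_qs

def destination(redirect_url):
    """Extract the real destination from a Google /url redirector link."""
    return parse_qs(urlparse(redirect_url).query)["url"][0]

def classify_click(click_ts, return_ts=None, threshold_sec=30):
    """Label a SERP click by dwell time (the threshold is a guess, not Google's)."""
    if return_ts is None:
        return "no-return"           # never came back to the SERP
    dwell = return_ts - click_ts
    return "short-click" if dwell < threshold_sec else "long-click"

# A shortened version of the redirector link from the example above
# (most encrypted parameters omitted for readability).
redirect = ("https://www.google.bg/url?sa=t&rct=j&source=web&cd=1"
            "&url=https%3A%2F%2Fmoz.com%2F")
print(destination(redirect))        # the moz.com landing page
print(classify_click(100, 105))     # back in the SERP after 5 s: negative signal
print(classify_click(100))          # never returned: strongest positive signal
```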
This was explained much better in the closing MozCon 2015 presentation, "SEO in a Two Algorithm World":
http://www.slideshare.net/randfish/onsite-seo-in-2015-an-elegant-weapon-for-a-more-civilized-marketer
and you should see it. A few tests with terrific results are also shown inside.
-
I guess I should have phrased the question a little differently. This is not related to Google Analytics.
When I do a Google search, Google is able to track my actions and is probably using that data as a ranking factor. Josh Bachynski did a Whiteboard Friday on it.
https://moz.com/blog/panda-41-google-leaked-dos-and-donts-whiteboard-friday
How is Google able to track user actions after they click on a SERP listing? Where are they getting their data?
-
Here is a good explanation.