Does Google Analytics Adjusted Bounce Rate Lead to an Increase in Average Time per Visitor?
-
Hello,
I just recently implemented adjusted bounce rate on one of the websites that I track via Google Analytics. (http://searchenginewatch.com/article/2322974/How-to-Implement-Adjusted-Bounce-Rate-ABR-via-Google-Tag-Manager-Tutorial)
Since doing so, my bounce rate has obviously gone down significantly, to nearly half of what it used to be, but I've also noticed an increase in the average time per visitor. In fact, the increase in average time per visitor began the same day I adjusted the bounce rate.
Has this happened to anyone else?
Can someone please explain why/how this may occur?
-
You are correct: adding code to a page to 'adjust' the bounce rate can affect your 'average time per visitor' statistic.
This is because of how Google measures the time spent on a page...
Normally, if a user opens one page and then does not visit any more pages on your site, it will count as a bounce (even if the user remained on that page browsing for 10 minutes). This is because only one call is made to Google Analytics, when the page is opened. No call is made to Google Analytics when the page is closed.
So normally, the 'time on page' is calculated by taking the timestamp of when the current page is opened and comparing it to the timestamp of when the next page on your site is opened. The difference between the two is your 'time on (previous) page'.
So what happens when a user only opens one page on your site and leaves (bounces)? That visit is counted as 0 seconds (even if the user was on the site for 10 minutes), which drags down the average visit time across all visits.
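A quick worked illustration of that timestamp arithmetic, using made-up hit times:

```typescript
// Made-up pageview timestamps purely for illustration.
const pageAOpened = new Date('2014-03-01T10:00:00Z').getTime();
const pageBOpened = new Date('2014-03-01T10:03:20Z').getTime();

// Time on page A = timestamp of the next pageview minus timestamp of page A.
const timeOnPageA = (pageBOpened - pageAOpened) / 1000; // 200 seconds

// If page B is never opened (a single-page visit), there is no second
// timestamp, so Google Analytics records the visit as 0 seconds.
const timeOnBouncedVisit = 0;

console.log(timeOnPageA, timeOnBouncedVisit); // 200 0
```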
What happens when you add the 'adjusted bounce rate' code to your page is that a second call is made to the Google Analytics server after x seconds, letting Google know that the user has in fact remained on the page for an extended period of time. A whole batch of those 0-second (bounced) sessions will now be recorded as longer sessions, based on the time between the two timestamps.
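For illustration, here is a minimal sketch of that delayed second hit, assuming the plain Universal Analytics (analytics.js) snippet rather than the Google Tag Manager setup described in the linked tutorial; the 30-second threshold and the event category/action names are arbitrary examples.

```typescript
// Minimal adjusted-bounce-rate sketch (assumes analytics.js / Universal
// Analytics is already loaded on the page; the linked tutorial uses
// Google Tag Manager instead, but the idea is the same).

// analytics.js exposes a global `ga` command queue; declare it for TypeScript.
declare function ga(
  command: 'send',
  hitType: 'event',
  eventCategory: string,
  eventAction: string
): void;

const BOUNCE_THRESHOLD_MS = 30 * 1000; // arbitrary example: 30 seconds

window.setTimeout(() => {
  // This second, interaction-type hit tells Google Analytics the visitor is
  // still on the page, so the session is no longer counted as a bounce and
  // the gap between the pageview hit and this hit is credited as time on page.
  ga('send', 'event', 'Adjusted Bounce Rate', 'Stayed longer than 30 seconds');
}, BOUNCE_THRESHOLD_MS);
```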
The more 'one page only' visits you have on your site, the more potential this has to skew your average session time.
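To see how much a handful of one-page visits can move the number, here is a small worked example with made-up session durations (five sessions, three of them single-page visits):

```typescript
// Made-up session durations (in seconds) purely for illustration.
// Before ABR, the three single-page visits are recorded as 0-second sessions;
// after ABR, the timer event gives each of them a measurable duration.
const beforeAbr = [120, 180, 0, 0, 0];
const afterAbr = [120, 180, 45, 90, 200];

const average = (xs: number[]): number =>
  xs.reduce((sum, x) => sum + x, 0) / xs.length;

console.log(average(beforeAbr)); // 60  -> reported average session time before
console.log(average(afterAbr)); // 127 -> reported average after, same visitors
```

Same visitor behaviour, but the reported average roughly doubles, which matches the jump you saw the day the change went live.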
On a side note, this will also affect the last page visited in multi-page sessions, as normally Google would not know how much time was spent on the last page of the site either.
Related Questions
-
Google's Search Algorithm update to 'Local Snack Pack'
Hi there - I was wondering if anyone else has noticed a big shift in the Google Local 'snack pack' in the past 48 hours? We have noticed a big change in clients' results - specifically today. Has anyone else noticed any changes, or perhaps have data on possible changes? I am aware of this update: https://www.seroundtable.com/big-google-search-algorithm-ranking-update-29953.html but perhaps there may have been another update since. Any input would be much appreciated! Phil.
Algorithm Updates | Globalgraphics
-
What are the technical details (touchpoints) of a website gathered by Google?
Hi all, Google crawls all the webpages and gathers content for indexing and ranking. Besides this general info, what are all the other possible technical details Google gathers about a website to rank, penalise, or optimise it in the SERPs? Like IP address, DNS server, etc. Please share your knowledge and ideas on this. Thanks
Algorithm Updates | vtmoz
-
Consistent drop every time after ranking well for a few days: Same experience?
Hi all, We have been facing this ranking fluctuation issue for over a year. Every time we make changes for better optimisation, we improve in rank but eventually drop after a few days. Most of the changes we employed are on-page: page loading, fixing broken links & redirects, page title optimisation, etc. We can see the ranking improvement for our main keywords and related keywords for a while, but eventually we drop within a week. I wonder if someone has faced the same issue and has any thoughts on this scenario? Thanks
Algorithm Updates | vtmoz
-
404s in Google Search Console and javascript
At the end of April, we made the switch from http to https and I was prepared for a surge in crawl errors while Google sorted out our site. However, I wasn't prepared for the surge in impossibly incorrect URLs and partial URLs that I've seen since then. I have learned that as Googlebot grows up, he (or she) is now attempting to read more javascript and will occasionally try to parse out and "read" a URL in a string of javascript code where no URL is actually present. So, I've "marked as fixed" hundreds of bits like /TRo39, category/cig, etc., etc.... But they are also returning hundreds of otherwise correct URLs with a .html extension when our CMS system generates URLs with a .uts extension, like this: https://www.thompsoncigar.com/thumbnail/CIGARS/90-RATED-CIGARS/FULL-CIGARS/9012/c/9007/pc/8335.html
when it should be:
https://www.thompsoncigar.com/thumbnail/CIGARS/90-RATED-CIGARS/FULL-CIGARS/9012/c/9007/pc/8335.uts
Worst of all, when I look at them in GSC and check the "linked from" tab it shows they are linked from themselves, so I can't backtrack and find a common source of the error. Is anyone else experiencing this? Got any suggestions on how to stop it from happening in the future? Last month it was 50 URLs, this month 150, so I can't keep creating redirects and hoping it goes away. Thanks for any and all suggestions!
Liz Micik
Algorithm Updates | LizMicik
-
Am I the only one experiencing this Google SERP problem?
I perform Google searches every single day, sometimes several times in a day. These searches have nothing to do with being a marketer--they're simply as a consumer, researcher, person who needs a question answered, or in other words: a typical person. For about the past month or so, I have been unsuccessful at finding what I'm looking for on the first try EVERY SINGLE TIME. Yes, I mean it--every single time. I'm left either going all the way to the third page, clicking dozens of results and returning to the SERPs, or having to start over with a differently worded query. This is far too often to be a coincidence. Has this been happening to anyone else? I know there was a recent significant algorithm update, right? I always look at algorithm updates through the eyes of an SEO, but I'm currently looking at it through the eyes of an average searcher, and I'm frustrated! It's been like trying to find something on Bing!
Algorithm Updates | UnderRugSwept
-
Help on Page Load Time
I'm trying to track the page load time of visits on my site, but GA only tells me that it's equal to zero, and the page load sample is always zero too. I've done some research and found that GA is supposed to track page load time automatically. Isn't that right?
Algorithm Updates | ivan.precisodisso
-
Does Google have the worst site usability?
Google tells us to make our sites better for our readers, which we are doing, but do you think Google has horrible site usability? For example, in Webmaster Tools, I'm always being confused by their changes and the way they just drop things. In the HTML suggestions area, they don't tell you when the data was last updated, so the only way to tell is to download the files and check. In the URL removals area, they used to show you the URLs they had removed. Now that is gone, and the only way you can check is to try adding one. We don't have any URL parameters, so any parameters are the result of some other site tacking stuff onto the end of our URL, and there is no way to tell them that we don't have any parameters and that they should ignore them all. Also, they add new parameters they find at the end of the list, so the only way to check is to click through to the end of the list.
Algorithm Updates | loopyal
-
Shortened Title in Google Places/Local Results in SERPs
I've been doing some local SEO lately and noticed something today. When I do a search for "State/town name Cat Toys", I see the title tag of the website in the local results as opposed to the business name. I'm happy they are showing up above the normal results, but I wonder if having the brand name at the end of the site title impacts clicks. For example: Site name: New Hampshire Cat Toys and Accessories | Cats R Us But in the Places results the title is cut short because they show the address, so all searchers see is: New Hampshire Cat Toys and.... Do you think branding is especially important in local results? Or less important? I could hear arguments for both sides. I realize the site URL is shown in green below the title, but it's not the same as having a brand in the title portion. It also looks like some of the competition has just their name show up as opposed to their website title. Is this something I can fix in Google Places, or is it something Google does on its own? Cheers, Vinnie
Algorithm Updates | vforvinnie