Adjusted Bounce Rate
-
Hi
I've been looking at analysing bounce rate in more depth, and I wondered what people's views on Adjusted Bounce Rate are? I've been reading this article: http://searchenginewatch.com/sew/how-to/2322974/how-to-implement-adjusted-bounce-rate-abr-via-google-tag-manager-tutorial
Is it worth implementing this? Or is it just as useful to look at time on page instead of bounce rate?
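For reference, the technique in that article boils down to firing a GA event on a timer. A minimal sketch, assuming plain analytics.js rather than the Tag Manager recipe the article uses (the `ga` stub below just stands in for the real tracking snippet so the sketch runs anywhere):

```javascript
// Stand-in for the analytics.js command queue; on a real page `ga`
// is defined by the Google Analytics snippet.
var hits = [];
function ga() { hits.push(Array.prototype.slice.call(arguments)); }

// Adjusted Bounce Rate: after `delayMs` on the page, fire an interaction
// event, so a single-page visit that lasts at least that long no longer
// counts as a bounce.
function adjustBounceRate(delayMs) {
  setTimeout(function () {
    ga('send', 'event', 'Engagement', 'time-on-page');
  }, delayMs);
}

// On a real page you would arm this once on load, e.g.:
// adjustBounceRate(30000); // 30 seconds, as in the article
```

The category and action names here are invented for illustration; pick whatever makes sense in your own reports.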
-
I've only just seen this
Thank you! I'll try and get to grips with User Flow; I need to dedicate some time to analysing the data.
Becky
-
Hi
Thank you for the reply. I have looked at User Flow, but I tend to get a bit lost in the amount of data when trying to find exactly what I need.
Can you segment and filter this by landing page?
I can see the drop-offs, but not the drop-off for new users - or is this report based on new users only?
Thank you!
-
Hi Becky,
You are correct - normally if a tag is fired it won't be counted as a bounce (unless you set the event as non-interaction, e.g. "nonInteraction: true" - check https://support.google.com/analytics/answer/1033068#NonInteractionEvents)
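To illustrate the difference - a sketch assuming Universal Analytics / analytics.js field names, with the category, action and label values invented for the example (the `ga` stub replaces the real snippet so it runs standalone):

```javascript
// Stand-in for the analytics.js command queue; on a real page `ga`
// comes from the Google Analytics snippet.
var hits = [];
function ga() { hits.push(Array.prototype.slice.call(arguments)); }

// An ordinary event counts as an interaction: a session containing
// only the pageview plus this event is NOT a bounce.
ga('send', 'event', 'Video', 'play');

// A non-interaction event is recorded but ignored for bounce rate:
// the session can still count as a bounce.
ga('send', 'event', 'Scroll', '25%', { nonInteraction: true });
```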
Dirk
-
Amazing thanks!
-
Picking up on Dirk saying:
I prefer to know if people scroll to the end of the page (so I assume they have read the article) rather than just put an arbitrary time to fire an event.
This was shared the other day - it's a way of pulling scroll-depth data into your Google Analytics reports. Incredibly useful:
-
Thanks. For me, I think I want to know which pages people find useful and which ones they don't, but with ecommerce it's a bit more difficult.
My overall goal is to provide content the user wants to see on product pages.
On that last point: I thought that when you add code to fire an event after someone has been on a page for X amount of time, a visit to only that page won't be counted as a bounce, because the event has fired?
I'll read up on the ecommerce tracking too thanks!
-
It can be useful - it depends on what you want to know. If you don't implement either of them, the time on site will not be correct, as no time on site is calculated for bounced visits.
Personally - I prefer to know if people scroll to the end of the page (so I assume they have read the article) rather than just picking an arbitrary time to fire an event. Both approaches will make the time measurement on your site more accurate, and both will reduce the bounce rate.
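The scroll-to-end approach can be sketched roughly like this - illustrative only, assuming plain analytics.js, with invented category/label names and a `ga` stub standing in for the real tracking snippet:

```javascript
// Stand-in for the analytics.js command queue.
var hits = [];
function ga() { hits.push(Array.prototype.slice.call(arguments)); }

// Fire one interaction event per depth threshold the reader crosses,
// so "read to the end" shows up in GA and the visit is not a bounce.
var sentThresholds = {};
function trackScrollDepth(scrolledPct) {
  [25, 50, 75, 100].forEach(function (threshold) {
    if (scrolledPct >= threshold && !sentThresholds[threshold]) {
      sentThresholds[threshold] = true;
      ga('send', 'event', 'Reading', 'scroll depth', threshold + '%');
    }
  });
}

// In a browser you would wire it to scrolling, e.g.:
// window.addEventListener('scroll', function () {
//   var pct = 100 * (window.pageYOffset + window.innerHeight) /
//             document.documentElement.scrollHeight;
//   trackScrollDepth(pct);
// });
```

Each threshold fires at most once per page, so repeated scrolling doesn't inflate the event counts.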
I think it's certainly useful for e-commerce - but then I would rather use enhanced e-commerce tracking.
I don't really understand what you mean with "I thought that if you took into account the time spent on page, and set these parameters in analytics, that it wouldn't in fact be counted as a bounce?" - could you explain?
Dirk
-
Hi Dirk,
Thanks for your response. So are you saying Adjusted Bounce Rate is also not beneficial?
I thought that if you took into account the time spent on page, and set these parameters in analytics, that it wouldn't in fact be counted as a bounce?
I'll also look into the content tracking you mentioned - is this also useful for ecommerce? I'm not always expecting people to scroll right to the end of pages.
Thanks
-
Time on page has the same issue - suppose somebody visits your site, spends 10 minutes reading an article & then goes to another site. It will be counted as a bounced visit - but even worse - the 10 minutes spent on your site will not be measured in Analytics (check http://cutroni.com/blog/2012/02/29/understanding-google-analytics-time-calculations/)
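The arithmetic behind that, as described in the linked post, is that Analytics only knows the timestamps of the hits it receives: time on a page is the gap to the next hit, so the exit page gets no time at all. A small sketch of that calculation (illustrative, not GA's actual code):

```javascript
// Sketch of GA's time-on-page arithmetic: each page's time is the gap
// between its pageview hit and the next hit. With no further hits,
// the exit page contributes zero seconds.
function timeOnPages(pageviews) {
  // pageviews: [{ page: '/a', t: <seconds since visit start> }, ...]
  return pageviews.map(function (hit, i) {
    var next = pageviews[i + 1];
    return { page: hit.page, seconds: next ? next.t - hit.t : 0 };
  });
}

// Ten minutes reading a single article, then leaving: GA sees one hit,
// records 0 seconds on site, and counts the visit as a bounce.
var visit = timeOnPages([{ page: '/article', t: 0 }]);
// visit[0].seconds === 0
```

Firing any engagement event (timer or scroll) adds a second hit, which is what makes the time measurable.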
This is one of the advantages of the Advanced Content tracking - it gives a better measure of what people are doing on your site. For me, the fact that the bounce rate decreases isn't the big win - the better time measurement on site & the ability to check the interaction (do they scroll to the end?) are the things that bring the benefit.
If you don't want to use the tag manager - you can also do this with the normal tracking code: http://cutroni.com/blog/2014/02/12/advanced-content-tracking-with-universal-analytics/ (Cutroni is the Analytics Advocate @Google)
Dirk