What is an acceptable bounce rate?
-
0% is of course the best case and 100% would be the worst case, but what would be considered average? How do you address this subject with your clients?
-
I use www.blizzardmetrics.com to analyze bounce rates from a benchmark standpoint. Here is some data from November 2016 for 135 websites:
- Overall bounce rate: 39.7%
- Mobile: 43.1%
- Tablet: 53.7%
- Desktop: 39.7%
- Organic search: 38.4%
- Direct type-in: 59.9%
- Referrals: 28.7% (something fishy here; it was 54.1% in October and 69.4% in Nov '16)
- Email: 61.4%
- Social: 35.3%
This data is dynamic: if you head over to Blizzardmetrics and add your site, all the numbers will update. If you are an agency and add a bunch of websites, you can look at just your websites, or all websites. You can also categorize by industry.
-
I have worked on well over 250 websites, and most sites I have worked on have a bounce rate from 15% to 50%, with one as low as 3%. But instead of speculating about what is good or not, why not let Google's own analytics guru tell us his POV? According to Avinash Kaushik: "It is really hard to get a bounce rate under 20%, anything over 35% is cause for concern, 50% (above) is worrying." A low bounce rate indicates that visitor engagement on your site is good. There you have it. To add some statistical research, here are some findings from rocketfuel.com, whose numbers are staggered a little differently but roughly along the same lines as Avinash's: "As a rule of thumb, a bounce rate in the range of 26 to 40 percent is excellent. 41 to 55 percent is roughly average. 56 to 70 percent is higher than average, but may not be cause for alarm depending on the website. Anything over 70 percent is disappointing for everything outside of blogs, news, events, etc." See more at http://www.gorocketfuel.com/the-rocket-blog/whats-the-average-bounce-rate-in-google-analytics
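Those two rules of thumb can be turned into a quick triage helper. This is a rough sketch based on the Rocketfuel ranges quoted above; the label for rates under 26% (where double-firing tracking code is a common culprit) is my own caveat, not from the quote:

```python
def classify_bounce_rate(rate_pct: float) -> str:
    """Bucket a site-wide bounce rate using the Rocketfuel rule-of-thumb ranges."""
    if rate_pct < 26:
        return "suspiciously low - verify your tracking setup"
    elif rate_pct <= 40:
        return "excellent"
    elif rate_pct <= 55:
        return "roughly average"
    elif rate_pct <= 70:
        return "higher than average"
    else:
        return "disappointing (unless it's a blog/news/event site)"

# The 39.7% overall benchmark from the Blizzardmetrics data lands in "excellent":
print(classify_bounce_rate(39.7))
```

As both quotes stress, these buckets are only a starting point; a "high" number can be fine if the page's purpose is being met, as later answers point out.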
-
Thanks for the great advice. It is greatly appreciated.
James Gonzales
-
Once you drill down to keywords, pages, time on page, and % exit, I think you'll probably see pretty quickly where you can address some things that will help.
I think you're right on the money with keeping your content fresh.
Good luck.
-
Thanks for the help on this. I went back and looked at our history and noticed that on each site we own, our bounce rate increased as traffic to our sites increased. Early on, our best site had a bounce rate of 36%, and a year later we are at 41%. Our product is real estate. I may need to refresh the content on our landing pages, or maybe it is just the increase in traffic and the quality of inbound links that have affected this metric.
-
Depends on the purpose of the page too.
E.g., if the call to action of a page is to call a phone number, then a high bounce rate is acceptable because the page's purpose is being met.
Bounce rate is a great metric for measuring improvements and calls to action. Try to get it lower by all means, but there's no silver bullet or magic number with bounce rate.
As David mentioned, a bounce could mean the elimination of an unqualified lead; either way, it's quality over quantity in most cases. Good luck.
-
Sorry, but there's no one-size-fits-all answer for what is acceptable and unacceptable. You should take the intent of the site into consideration when determining what is acceptable. Let me give you an example:
One of our landing pages had a bounce rate of 58%. This was problematic because the landing page was designed to generate leads. In essence, we only had a shot at converting the 42% of traffic that didn't bounce when they hit the page. Of those, we were converting about 6% into leads. For that particular product, we converted 5 out of every 100 leads generated, and the average lifetime value of a client was about $3K.
Long story short, it was worth our time to deal with the higher bounce rate because the potential value of each lead was rather substantial. So take it on a case-by-case basis, but remember that it's been said Google takes bounce rate into consideration as a ranking factor.
-
I would suggest a couple of things.
First of all I would suggest that bounce rate could be compared to a pulse. Over time, you'll discover an acceptable bounce rate (pulse) for a particular site and those rates may vary from site to site. An acceptable site bounce rate for us is about 50-55%. If the rate pushes toward 60%, it tells me there is something going on that I need to investigate more deeply.
If you're in ecommerce, product feeds will affect your bounce rate and you'll need to identify products that adversely inflate your bounce rate and address accordingly.
Secondly, bounce rate also applies to individual pages (which in turn affects the site-wide rate). It's relatively easy to identify pages that are affecting bounce rate. I know which pages on our site will have a higher bounce rate than others. If there is something I can do to reduce the bounce rate for a page, I do it.
Having said all that, I would guess that an acceptable bounce rate is between 45% and 65%, with a rate in the 50s being realistic.
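When hunting for the pages that drag the site-wide rate up, it helps to weight each page's bounce rate by its entrance traffic: a 90% bounce rate on a page nobody lands on matters less than a 50% rate on your busiest entry page. A minimal sketch, assuming you've exported per-page entrances and bounce rates from your analytics tool (the page data here is made up for illustration):

```python
def worst_offenders(rows, top_n=5):
    """Rank pages by bounced entrances (entrances x bounce rate), since a
    high-bounce page only hurts in proportion to the traffic it receives."""
    scored = [
        (r["page"], float(r["entrances"]) * float(r["bounce_rate"]))
        for r in rows
    ]
    return sorted(scored, key=lambda t: t[1], reverse=True)[:top_n]

# Hypothetical export in lieu of real analytics data:
pages = [
    {"page": "/",            "entrances": 5000, "bounce_rate": 0.40},
    {"page": "/blog/post-1", "entrances": 800,  "bounce_rate": 0.85},
    {"page": "/products",    "entrances": 2000, "bounce_rate": 0.55},
]
print(worst_offenders(pages))
```

Here the homepage tops the list despite having the lowest bounce rate, because its volume makes it the largest source of bounced sessions; that's where a fix moves the site-wide "pulse" the most.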