Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
What is an acceptable bounce rate?
-
0% is of course the best case and 100% would be the worst, but what is considered average? How do you address this subject with your clients?
-
I use www.blizzardmetrics.com to analyze bounce rates from a benchmark standpoint. Here is some data from November 2016 for 135 websites: overall bounce rate 39.7%, mobile 43.1%, tablet 53.7%, desktop 39.7%, organic search 38.4%, direct type-in 59.9%, referrals 28.7% (something fishy here; it was 54.1% in October and 69.4% in Nov 16), email 61.4%, and social 35.3%.
This data is dynamic: if you head over to Blizzardmetrics and add your site, all the numbers will update. If you are an agency and add a bunch of websites, you can look at just your websites, or at all websites. You can also categorize by industry.
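For anyone wanting to reproduce benchmarks like these from raw analytics data, the underlying arithmetic is simple: a bounce is a single-page session, and a segment's bounce rate is bounces divided by total sessions. A minimal sketch (the session data here is invented for illustration):

```python
from collections import defaultdict

def bounce_rates(sessions):
    """Compute bounce rate per segment.

    Each session is a (segment, pageviews) pair; a bounce is a
    session with exactly one pageview.
    """
    totals = defaultdict(int)
    bounces = defaultdict(int)
    for segment, pageviews in sessions:
        totals[segment] += 1
        if pageviews == 1:
            bounces[segment] += 1
    return {seg: bounces[seg] / totals[seg] for seg in totals}

# Hypothetical sessions: (traffic source, pages viewed)
sessions = [
    ("organic", 1), ("organic", 3), ("organic", 2), ("organic", 1),
    ("email", 1), ("email", 1), ("email", 4),
]
for seg, rate in bounce_rates(sessions).items():
    print(f"{seg}: {rate:.1%}")  # organic: 50.0%, email: 66.7%
```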
-
I have worked on well over 250 websites, and most have a bounce rate from 15% to 50% (one as low as 3%). But instead of speculating about what is good or not, why not let Google's own analytics guru tell us his point of view? According to Avinash Kaushik: "It is really hard to get a bounce rate under 20%, anything over 35% is cause for concern, 50% (above) is worrying." A low bounce rate indicates that visitor engagement on your site is good. There you have it. To add some statistical research, here are findings from rocketfuel.com, which staggers its numbers a little differently but roughly along the same lines: "As a rule of thumb, a bounce rate in the range of 26 to 40 percent is excellent. 41 to 55 percent is roughly average. 56 to 70 percent is higher than average, but may not be cause for alarm depending on the website. Anything over 70 percent is disappointing for everything outside of blogs, news, events, etc." See more at: http://www.gorocketfuel.com/the-rocket-blog/whats-the-average-bounce-rate-in-google-analytics
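Those Rocketfuel bands are easy to turn into a quick triage helper. A sketch that simply encodes the quoted thresholds; the labels and the below-26% "verify tracking" caveat are my additions, not part of either source:

```python
def classify_bounce_rate(rate_pct):
    """Label a bounce rate (in percent) using the Rocketfuel
    rule-of-thumb bands quoted above."""
    if rate_pct < 26:
        return "unusually low (worth verifying tracking)"
    if rate_pct <= 40:
        return "excellent"
    if rate_pct <= 55:
        return "roughly average"
    if rate_pct <= 70:
        return "higher than average"
    return "disappointing (outside blogs/news/events)"

print(classify_bounce_rate(47.5))  # roughly average
```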
-
Thanks for the great advice. It is greatly appreciated.
James Gonzales
-
Once you drill down to keywords, pages, time-on-page, and % exit, I think you'll see pretty quickly where you can address some things that will help.
I think you're right on the money with keeping your content fresh.
Good luck.
-
Thanks for the help on this. I went back and looked at our history and noticed that on each site we own, the bounce rate increased as traffic increased. Early on our best site had a bounce rate of 36%, and a year later we are at 41%. Our product is real estate. I may need to refresh the content on our landing pages, or maybe it is just the increase in traffic, and the quality of the inbound links driving it, that has affected this metric.
-
Depends on the purpose of the page too.
E.g., if the call to action of a page is to call a phone number, then a high bounce rate is acceptable because the purpose is being met.
Bounce rate is a great metric to measure improvements and calls to action. Try and get it lower by all means, but there's no silver bullet with bounce rate or magic number.
As David mentioned, a bounce could mean the elimination of an unqualified lead; either way, it's quality over quantity in most cases. Good luck.
-
Sorry, but there's no one-size-fits-all answer regarding what is acceptable and unacceptable. You should be taking into consideration the intent of the site to help determine what is acceptable. Let me give you an example:
One of our landing pages had a bounce rate of 58%. This was problematic because the landing page was designed to generate leads. In essence, we only had a shot at converting the 42% of traffic that didn't bounce. Of those, about 6% became leads. For that particular product, we converted 5 out of every 100 leads generated, and the average lifetime value of a client was about $3K.
Long story short, it was worth our time to deal with the higher bounce rate because the potential value of each lead was substantial. So take it on a case-by-case basis, but remember that it's been said Google takes bounce rate into consideration as a ranking factor.
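For what it's worth, the funnel arithmetic in that example can be written out explicitly. A quick sketch using the numbers quoted above (the 48% "improved" scenario is hypothetical):

```python
# Numbers quoted in the example above
visitors = 100
bounce_rate = 0.58        # 58% of landing-page traffic bounces
lead_rate = 0.06          # ~6% of non-bouncing visitors become leads
close_rate = 5 / 100      # 5 of every 100 leads become clients
lifetime_value = 3000     # average client LTV, about $3K

leads = visitors * (1 - bounce_rate) * lead_rate
value_per_visitor = leads * close_rate * lifetime_value / visitors
print(f"expected value per visitor: ${value_per_visitor:.2f}")  # $3.78

# Hypothetical scenario: cutting the bounce rate by ten points
improved = (1 - 0.48) * lead_rate * close_rate * lifetime_value
print(f"at a 48% bounce rate: ${improved:.2f} per visitor")  # $4.68
```

The point of writing it out is that every percentage point shaved off the bounce rate flows straight through to revenue per visitor, which is what made the fix worth prioritising.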
-
I would suggest a couple of things.
First of all, I would suggest that bounce rate can be compared to a pulse. Over time you'll discover an acceptable bounce rate (pulse) for a particular site, and those rates may vary from site to site. An acceptable site-wide bounce rate for us is about 50-55%. If the rate pushes toward 60%, it tells me something is going on that I need to investigate more deeply.
If you're in ecommerce, product feeds will affect your bounce rate and you'll need to identify products that adversely inflate your bounce rate and address accordingly.
Secondly, bounce rate also applies to individual pages (which in turn affect the site-wide rate). It's relatively easy to identify the pages that are affecting your bounce rate. I know which pages on our site will have a higher bounce rate than others. If there is something I can do to reduce the bounce rate for a page, I do it.
Having said all that, I would guess that an acceptable bounce rate is between 45% and 65%, with something in the 50s being realistic.
Related Questions
-
Any Tips for Reviving Old Websites?
Hi, I have a series of websites that have been offline for seven years. Do you have any tips that might help restore them to their former SERP glory? Nothing about the sites themselves has changed since they went offline: same domains, same content, only a different server. What has changed is the SERP landscape. I've noticed that competitive terms these sites used to rank on the first page for now have far more results. I have also noticed that some terms return what look like thesaurus-style, similar-language results from traditionally more authoritative websites instead of the exact phrase searched for. This concerns me because I could see a less relevant page outranking me just because it is on a .gov domain with similar vocabulary, even though the result is not what people searching for the term are most likely looking for. The sites have also lost numerous backlinks but still have some really good ones.
Intermediate & Advanced SEO | CopBlaster.com
-
How to get back links with higher rank ?
Hi all, these days I am finding new ways of creating backlinks. Could anyone tell me how to get backlinks from higher-DA sites?
Intermediate & Advanced SEO | mozentution
-
Organization Schema/Microformat for a content/brand website | Travel
Hi, one of our clients has a website specific to a place, e.g. California tourism, on which they publish local tourism information, blogs, and other useful content. I want to understand how useful it is to publish Organization schema on such a website mentioning the actual organization, which in this case is a travel agency. Or would any other schema fit better for such websites?
Intermediate & Advanced SEO | ds9.tech
-
Click Through Rate on Password Protected Pages
Hi Moz community, I have a website with a large database of 800+ important pages, and I want Google to know when people visit and stay on these pages. However, these pages are only accessible once people create an account with a password and sign in. I know that since these pages are password protected, Google doesn't index them, but when our visitors stay a while browsing the database, does that data get included in our CTR and bounce rate by Google? It is really important for SEO purposes that Google knows people are staying on our site for a while, so I wanted to know whether CTR gets measured even though these pages aren't crawled. Thanks for the help!!
Intermediate & Advanced SEO | danstern
-
Would you rate-control Googlebot? How much crawling is too much crawling?
One of our sites is very large: over 500M pages. Google has indexed 1/8th of the site, and they tend to crawl between 800k and 1M pages per day. A few times a year, Google will significantly increase their crawl rate, overnight hitting 2M pages per day or more. This creates big problems for us, because at 1M pages per day Google is consuming 70% of our API capacity, and the API overall is at 90% capacity. At 2M pages per day, 20% of our page requests are 500 errors. I've lobbied for an investment/overhaul of the API configuration to allow for more Google bandwidth without compromising user experience. My tech team counters that it's a wasted investment, as Google will crawl to our capacity, whatever that capacity is. Questions to enterprise SEOs:
- Is there any validity to the tech team's claim? I thought Google's crawl rate was based on a combination of PageRank and the frequency of page updates. This suggests there is some upper limit, which we perhaps haven't reached, but at which crawling would stabilize once reached.
- We've asked Google to rate-limit our crawl rate in the past. Is that harmful? I've always looked at a robust crawl rate as a good problem to have. Is 1.5M Googlebot API calls a day desirable, or something any reasonable enterprise SEO would seek to throttle back?
- What about setting a longer refresh rate in the sitemaps? Would that reduce the daily crawl demand? We could increase it to a month, but at 500M pages Google could still have a ball at the 2M pages/day rate. Thanks
Intermediate & Advanced SEO | lzhao
-
Google Analytics: how to filter out pages with low bounce rate?
Hello here, I am trying to find out how I can filter pages in Google Analytics according to their bounce rate. Here is what I am doing now: 1. I work inside the Content > Site Content > Landing Pages report. 2. Once there, I click the "advanced" link to the right of the filter field. 3. There, I set it to "include" "Bounce Rate" "Greater than" "0.50", which should show me which pages have a bounce rate higher than 0.50... but instead I get the following warning on the graph: "Search constraints on metrics can not be applied to this graph". I am afraid I am using the wrong approach... any ideas are very welcome! Thank you in advance.
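One workaround if the UI keeps refusing metric filters on that report: export the Landing Pages data and filter it outside Google Analytics. A minimal sketch using only the standard library; the export here is invented, and the column names are assumed to match the downloaded CSV:

```python
import csv
import io

# Hypothetical export of the GA Landing Pages report; in practice this
# would come from the CSV file GA lets you download from the report.
export = io.StringIO(
    "Landing Page,Bounce Rate\n"
    "/,0.38\n"
    "/pricing,0.62\n"
    "/blog/post-1,0.81\n"
    "/contact,0.44\n"
)

rows = list(csv.DictReader(export))

# Keep only pages bouncing above 50%, worst first
high_bounce = sorted(
    (r for r in rows if float(r["Bounce Rate"]) > 0.50),
    key=lambda r: float(r["Bounce Rate"]),
    reverse=True,
)
for r in high_bounce:
    print(r["Landing Page"], r["Bounce Rate"])
```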
Intermediate & Advanced SEO | fablau
-
Duplicate content on ecommerce sites
I just want to confirm something about duplicate content. On an eCommerce site, if the meta titles, meta descriptions, and product descriptions are all unique, yet a big chunk at the bottom (featuring "why buy with us" etc.) is copied across all product pages, would each page be penalised, or not indexed, for duplicate content? Does the whole page need to be a duplicate for this to be a concern, or would this large chunk of text, bigger than the product description, have an effect on the page? If this is a problem, what are some ways around it? Because the content is quite powerful and is relevant to all products... Cheers,
Intermediate & Advanced SEO | Creode