Bounce Back or Bounce Through
-
Bounce rate is defined, as I understand it, as 'single-page visits to a site divided by total visits to the site'. It could be argued that a well-designed site might effectively vector people on to other sites (I generally use Wikipedia this way, for instance). On the other hand, a site that bounces people back to where they came from may be genuinely poor. So, the questions:
Is the bounce rate really calculated in the stated way by Google?
Is it used, as far as we know, as a metric for the search engine?
What should we do to mitigate the effects of this poor metric?!
thanks,
Mike
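The definition quoted above is just a ratio, which can be sketched as a quick calculation. This is a minimal illustration of the formula only; the visit data below is hypothetical and not from any real analytics export.

```python
# Sketch of the bounce-rate formula: single-page visits / total visits.

def bounce_rate(sessions):
    """Fraction of visits that viewed exactly one page."""
    if not sessions:
        return 0.0
    single_page = sum(1 for pages_viewed in sessions if pages_viewed == 1)
    return single_page / len(sessions)

# Each number is the count of pages viewed during one visit (hypothetical).
visits = [1, 3, 1, 2, 1, 5, 1, 1, 4, 1]
print(f"Bounce rate: {bounce_rate(visits):.0%}")  # 6 of 10 visits -> 60%
```

Note that this counts any single-page visit as a bounce, regardless of how long the visitor stayed, which is part of why the metric can be misleading for sites that answer a query on one page.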
-
Actually, bounce rate would be of concern to search engines, at least for visits that originate from the search engine. The SEs want users to have a good experience, and if a user clicks on a result and then comes right back to the results page, the SEs may infer that the user did not have a good experience with that result, and that perhaps a different result should be shown for that query.
-
Thanks, yes. It looks from this as if the experts think Google is doing what we would hope and not taking account of bounce-through. Although there may of course be good reasons for a site not wanting bounce-through either (as EGOL notes), it shouldn't be a concern for the search engines.
-
As far as I'm aware, Google will use your 'bounce back' rate (whereby users return to the search results page straight away) as a search metric, as this could indicate whether the site is relevant for that specific search query. This was mentioned in the 2011 SEO Ranking Factors Report.
Hope that helps
-
If search engines are using this data, they are certainly only using it to compare sites competing for the same or similar keywords.
A high bounce rate can be bad, or it can be "normal". It would be bad if your site is offensive (and people run away); it can be bad if your site has irrelevant content for the query; it can be bad if your site has thin content; you can probably think of more.
It can be normal if you have a dictionary site and the searcher finds the word, gets the definition and leaves happily.
THE IMPORTANT THING TO DO... I believe that everyone should be working to reduce their bounce rate, and any webmaster should be able to find improvements.
The best way to do it is to have relevant links, obviously placed, on every page. For example, on the dictionary site your goal should be to have linked words within the definition, links to related words adjacent to the definition, and links to a few enticing articles along the side.
On an article site you can place links within the text to related articles, a "recommended" box of links beside the article, and even a few enticing links to "popular" or "related" articles where everyone will see them.
Try to reduce your bounce rate by improving your site and making your relevant content visible on every page.
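To act on the advice above, a webmaster first needs to know which landing pages bounce the most. This is a minimal sketch, assuming you can export session records as (landing page, pages viewed) pairs from your analytics tool; the page paths and figures below are hypothetical.

```python
from collections import defaultdict

def bounce_by_landing_page(sessions):
    """Compute the bounce rate for each landing page from
    (landing_page, pages_viewed) records."""
    totals = defaultdict(int)
    bounces = defaultdict(int)
    for landing_page, pages_viewed in sessions:
        totals[landing_page] += 1
        if pages_viewed == 1:
            bounces[landing_page] += 1
    return {page: bounces[page] / totals[page] for page in totals}

# Hypothetical session export.
sessions = [
    ("/define/serendipity", 1),
    ("/define/serendipity", 1),
    ("/define/serendipity", 2),
    ("/articles/word-origins", 4),
    ("/articles/word-origins", 1),
]

# List the worst-bouncing pages first: candidates for better internal links.
ranked = sorted(bounce_by_landing_page(sessions).items(),
                key=lambda item: item[1], reverse=True)
for page, rate in ranked:
    print(f"{page}: {rate:.0%}")
```

Pages at the top of that list are where the related-links and "recommended" boxes described above are likely to pay off most.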