Shouldn't a Lower Bounce Rate Correlate with a Greater Click-Through Rate for a Website?
-
Greetings:
I run a real estate website in New York City with about 650 pages, of which 330 are property listing pages. About 250 of those listing pages contain fewer than 150 words of content.
In late August I set about 250 of the listing pages that generated the least traffic (generally those with the least content) to "noindex, follow". Google has since removed those pages from its index.
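For anyone following along, a "noindex, follow" setting is typically applied with a meta robots tag in each page's head. A generic sketch (not Alan's actual markup):

```html
<!-- Placed in the <head> of each thin listing page.
     noindex: ask search engines to drop the page from their index.
     follow:  still crawl the links on the page and pass them value. -->
<meta name="robots" content="noindex, follow">
```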
The site's overall bounce rate has dropped from about 69% to about 64% since the removal of these low-quality listing pages.
However, the click-through rate has not improved and is stuck at about 2.2 pages per visitor.
Shouldn't the click-through rate improve if the bounce rate goes down? Am I missing something?
Also, is a lower bounce rate something that Google will take into account when calculating rank?
Thanks, Alan
-
Good idea; however, according to Google Webmaster Tools (under Google Index > Index Status), the number of indexed pages has been dropping. It is down by 120, about half of the 250 pages we set to "noindex, follow" on August 20th. I suspect it may actually be down a bit more, as the Webmaster Tools figures may lag.
I just can't explain why pageviews per visitor have not increased if the bounce rate is down. The bounce rate decreased from about 69% in August to 63% in September, which means that 37% of visitors are staying on the site instead of 31%, a significant relative improvement (about 18%). I would think this would translate into more pageviews per visitor, but it has not: pageviews per session were 2.38 in August and 2.18 in September. This seems impossible.
Thanks, Alan
-
Hey Alan,
You said that you made these changes in late August. Could it be that Google hasn't updated this number in the week or so since you made the changes? It seems odd that the number would stay exactly the same.
-
Hi Alan,
I assume you mean pageviews per visit rather than click-through rate, since you mentioned 2.2 pages/visit.
Pageviews per visit is the average number of pages viewed per session, while bounce rate is the percentage of single-page visits.
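To make the distinction concrete, here is a minimal sketch of how the two metrics are computed from session data (the pageview counts are made up for illustration):

```python
# Hypothetical pageview counts for ten sessions.
sessions = [1, 1, 1, 1, 1, 1, 3, 5, 2, 6]

# Bounce rate: share of sessions that viewed exactly one page.
bounce_rate = sum(1 for s in sessions if s == 1) / len(sessions)

# Pageviews per visit: average pages viewed across all sessions.
pages_per_visit = sum(sessions) / len(sessions)

print(f"Bounce rate:     {bounce_rate:.0%}")
print(f"Pageviews/visit: {pages_per_visit:.2f}")
```

Note that the two numbers summarize different parts of the same distribution: bounce rate only counts how many sessions stopped at one page, while pageviews per visit also depends on how deep the non-bounce sessions go.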
Normally the answer is yes: a lower bounce rate would usually correlate with higher average pageviews per visit. However, the correlation is not that strong, since many factors are involved (traffic sources, landing pages, search keywords, and so on).
So the assumption here is that you are getting the same quality of visitors but with fewer single-page visits, which would normally increase pageviews per visit. However, a bounce rate moving from 69% to 64% is not that big a difference, and I'm not sure what the sample size is for these visits.
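In fact, both numbers can fall at the same time. Here is a quick arithmetic sketch using the figures from the thread (treating each bounce as exactly one pageview, which is an approximation):

```python
def nonbounce_depth(bounce_rate, pages_per_session):
    """Average pageviews of non-bounce sessions implied by the two
    reported metrics, assuming each bounce counts as one pageview:

        pages_per_session = bounce_rate * 1 + (1 - bounce_rate) * depth
    """
    return (pages_per_session - bounce_rate) / (1 - bounce_rate)

# Figures reported in the thread: 69% bounce / 2.38 pages in August,
# 63% bounce / 2.18 pages in September.
august = nonbounce_depth(0.69, 2.38)
september = nonbounce_depth(0.63, 2.18)
print(f"Implied depth of non-bounce sessions, August:    {august:.2f}")   # ~5.45
print(f"Implied depth of non-bounce sessions, September: {september:.2f}") # ~4.19
```

Under these assumptions, non-bounce visitors went from roughly 5.5 pages each in August to roughly 4.2 in September: fewer visitors bounced, but the ones who stayed viewed fewer pages each, so bounce rate and pageviews per session can both drop together.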
I would recommend checking which landing pages have high pageviews per visit and focusing your marketing efforts there; you should then see an increase in average pageviews/visit.
With regard to bounce rate affecting rankings: this would only apply to the bounce rate of your organic traffic, since Google cannot actually determine your overall website bounce rate (or at least they claim they don't use Analytics data). So make sure your top organic landing pages are well optimized for their target terms, with proper calls to action to reduce bounces there.
Hope this was helpful.
Have a great day,
Moe