Google Analytics: how to filter out pages with low bounce rate?
-
Hello there,
I am trying to find out how I can filter out pages in Google Analytics according to their bounce rate.
The way I am doing it now is the following:
1. I am working inside the Content > Site Content > Landing Pages report
2. Once there, I click the "advanced" link to the right of the filter field.
3. I set the filter to "include" "Bounce Rate" "Greater than" "0.50", which should show me which pages have a bounce rate higher than 0.50%... but instead I get the following warning on the graph:
"Search constraints on metrics can not be applied to this graph"
I am afraid I am using the wrong approach... any ideas are very welcome!
Thank you in advance.
-
Thank you Mark! Yes, I knew about that option and it helps a great deal!
I appreciate your help. Thanks!
-
Thank you Lynn, and yes, sorry, I meant 50%, not 0.50% (I got confused with the conversion rate). You're right, I hadn't noticed that the data under the graph was actually updated; it looks like it is just the graph that doesn't show that kind of filtered data (too bad!).
Thank you again, I appreciate your help!
-
I would also look into sorting the data by the metrics you want, and using weighted sort instead of the default sort. Weighted sort takes other metrics into account as well - so when you sort by bounce rate, it doesn't just put pages with a 100% bounce rate at the top (a page with only one visit is skewed that way), but gives you a much better idea of which pages are getting real visits and still performing poorly.
You can read more about weighted sort here on the GA blog - http://analytics.blogspot.co.il/2010/08/introducing-weighted-sort.html
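Google doesn't spell out the exact formula, but the gist is an "estimated true value": the fewer visits a page has, the more its bounce rate gets pulled toward the site-wide average before sorting. Here is a rough Python sketch of that idea - the blending formula and the numbers are illustrative assumptions, not GA's actual calculation:

```python
# Illustrative approximation of weighted sort: shrink each page's bounce rate
# toward the site-wide average in proportion to how little traffic it has.
# The blending formula is an assumption for illustration, not GA's actual one.

def weighted_bounce_rate(page_visits, page_bounce_rate, avg_visits, site_bounce_rate):
    """Estimate a 'true' bounce rate that discounts low-traffic pages."""
    weight = page_visits / (page_visits + avg_visits)  # 0..1, grows with traffic
    return weight * page_bounce_rate + (1 - weight) * site_bounce_rate

pages = [
    {"path": "/landing-a", "visits": 1,    "bounce_rate": 100.0},  # one skewed visit
    {"path": "/landing-b", "visits": 2400, "bounce_rate": 68.0},
    {"path": "/landing-c", "visits": 900,  "bounce_rate": 74.0},
]

avg_visits = sum(p["visits"] for p in pages) / len(pages)
site_bounce_rate = 55.0  # site-wide average, a made-up figure for the example

# Sort worst-first by the weighted estimate instead of the raw bounce rate.
for p in sorted(
    pages,
    key=lambda p: weighted_bounce_rate(p["visits"], p["bounce_rate"], avg_visits, site_bounce_rate),
    reverse=True,
):
    est = weighted_bounce_rate(p["visits"], p["bounce_rate"], avg_visits, site_bounce_rate)
    print(f'{p["path"]}: raw {p["bounce_rate"]}%, weighted {est:.1f}%')
```

The single-visit page with a 100% bounce rate ends up below the high-traffic pages, which is exactly what weighted sort does for you in the report.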
Hope this helps,
Mark
-
Hi Fabrizio,
If you put 50 in the 'bounce rate greater than' box instead of 0.5, the table shown below the graph gives you the data you want (only bounce rates over 50%, which I think is what you are after, right?). I guess the graph cannot show this filtered data, although you can click the 'select a metric' link next to the visits dropdown at the top left of the graph and add bounce rate, to see the average bounce rate alongside visits per day if that helps with getting a baseline.
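If you end up wanting this cut of data regularly, you can also pull it outside the UI with the Core Reporting API and apply the metric filter there. A minimal Python sketch follows - the view ID and the service-account key file are placeholders you would swap for your own, and it assumes API access is already set up for the account:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder values: swap in your own view ID and service account key file.
KEY_FILE = "service-account.json"
VIEW_ID = "ga:12345678"

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/analytics.readonly"]
)
analytics = build("analytics", "v3", credentials=creds)

# Landing pages with a bounce rate above 50%, worst first.
response = analytics.data().ga().get(
    ids=VIEW_ID,
    start_date="30daysAgo",
    end_date="today",
    metrics="ga:sessions,ga:bounceRate",
    dimensions="ga:landingPagePath",
    filters="ga:bounceRate>50",
    sort="-ga:bounceRate",
    max_results=100,
).execute()

for path, sessions, bounce_rate in response.get("rows", []):
    print(f"{path}\t{sessions} sessions\t{float(bounce_rate):.1f}% bounce")
```

The filters parameter accepts metric conditions like ga:bounceRate>50, so this returns the same table the advanced filter produces, just without the graph limitation.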