What Is the Bounce Rate of a Single-Page Website?
-
Hi All,
I just want to clear up some of my confusion regarding bounce rate.
- Does bounce rate depend upon time? If yes, then how?
- What will be the bounce rate for a single-page website?
- Will a single-page website have the same bounce rate and exit rate?
-
In this case they target the timeout period, which is a bit different from time on site. The timeout period is in most cases (for example, in Google Analytics) 30 minutes. When you're inactive for about 30 minutes, a visit will count as a new visit instead of a continuation of your former visit.
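The 30-minute inactivity rule Martijn describes can be sketched as a small function. This is a hypothetical illustration of the rule only; Google Analytics' actual session handling is internal to its trackers.

```javascript
// Sketch of the session-timeout rule: if the gap between two hits
// exceeds 30 minutes of inactivity, the second hit starts a new
// session (and so a new "visit"). Timestamps are in milliseconds.
var SESSION_TIMEOUT_MS = 30 * 60 * 1000;

function startsNewSession(lastHitTime, currentHitTime) {
  return currentHitTime - lastHitTime > SESSION_TIMEOUT_MS;
}

// Example: a hit 31 minutes after the previous one opens a new session,
// while a hit 10 minutes later belongs to the same session.
console.log(startsNewSession(0, 31 * 60 * 1000)); // true
console.log(startsNewSession(0, 10 * 60 * 1000)); // false
```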
-
Thanks Martijn,
Can you please elaborate on the concept of a session in bounce rate?
Per Wikipedia: "The bounce rate for a single page is the number of visitors who enter the site at a page and leave within the specified timeout period without viewing another page, divided by the total number of visitors who entered the site at that page. In contrast, the bounce rate for a web site is the number of web site visitors who visit only a single page of a web site per session divided by the total number of web site visits."
In the above definition, a timeout period is mentioned, so here again my point is:
If bounce rate doesn't depend upon time, then shouldn't the bounce rate be the same whether the visitor bounces back within a single session or after the session expires?
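The Wikipedia definition quoted above boils down to a simple ratio, which can be written out as a function. The numbers in the example are made up for illustration.

```javascript
// Bounce rate per the quoted definition: visitors who entered at a
// page and left without viewing another page, divided by the total
// number of visitors who entered at that page, as a percentage.
function bounceRate(singlePageEntries, totalEntries) {
  if (totalEntries === 0) return 0; // no entries, no bounce rate
  return (singlePageEntries / totalEntries) * 100;
}

// Example: 40 of 200 visitors who landed on a page left without
// viewing another page, giving that page a 20% bounce rate.
console.log(bounceRate(40, 200)); // 20
```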
-
- No, bounce rate does not depend upon time.
- I have no clue ;-), this fully depends on the market you're in, what the page looks like, and dozens of other aspects.
- Good question; I would say: yes.
-
Bounce rate is the percentage of visits that view only one page before exiting a site. You can change the way Analytics calculates bounce rate, e.g. by not counting a visit as a bounce if the visitor stays for 20 seconds, or scrolls down within a certain time period.
Check this blog post about advanced content tracking: http://cutroni.com/blog/2012/02/21/advanced-content-tracking-with-google-analytics-part-1/
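A minimal sketch of that "adjusted bounce rate" idea: fire a Google Analytics event once the visitor has stayed on the page for 20 seconds, so an engaged single-page visit no longer counts as a bounce. This assumes the standard analytics.js tracking snippet has already defined `ga()`; the delay and the event category/action names are arbitrary examples, not anything from the linked post.

```javascript
// Adjusted bounce rate sketch: after 20 seconds on the page, send a
// GA event so the visit no longer counts as a bounce. Assumes the
// standard analytics.js snippet has already defined ga(); the delay
// and event names below are arbitrary examples.
var ENGAGEMENT_DELAY_MS = 20 * 1000;

function markEngaged() {
  if (typeof ga === 'function') {
    // An interaction event (the analytics.js default) resets the
    // bounce status of the current session.
    ga('send', 'event', 'Engagement', 'time-on-page');
  }
}

// Only arm the timer in a browser context.
if (typeof window !== 'undefined') {
  setTimeout(markEngaged, ENGAGEMENT_DELAY_MS);
}
```

A scroll-based variant would call the same `markEngaged` from a scroll listener instead of a timer.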