JavaScript to manipulate Google's bounce rate and time on site?
-
I was referred to this "awesome" solution to high bounce rates.
It is supposed to "fix" bounce rates and lower them through this simple script. Supposedly, when the bounce rate goes way down, rankings dramatically increase (an interesting claim, but not my question).
I don't know JavaScript, but simply adding a script to the footer and watching everything fall into place seems a bit iffy to me.
Can someone with experience in JS help me by explaining what this script does?
I think it manipulates what it reports to GA, but I'm not sure. The idea was to place it in the footer of the page, then sit back and watch the dollars fly in.

-
Stephen,
Thanks for the explanation - I just had a client ask me about this script. Based on your explanation, this script will change your bounce rate. This is because once the event is triggered, the visit will no longer be considered a bounce, even if the user only visits one page. So it's an artificial/false decrease in bounce rate, not a "fix" as others claim.
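For anyone curious about the mechanics, here's a rough sketch of the distinction (assuming the legacy ga.js _gaq syntax of that era, not the original script): _trackEvent accepts an optional non-interaction flag, and only when that flag is set does the event leave bounce calculation alone.

```javascript
// Hypothetical comparison, assuming the legacy ga.js API of the time.
// _trackEvent(category, action, opt_label, opt_value, opt_noninteraction)
var _gaq = _gaq || []; // queue defined by the standard ga.js tracking snippet

// 1) Default (interaction) event: once this fires, a single-page visit
//    is no longer reported as a bounce.
_gaq.push(['_trackEvent', 'Time', 'Log', '0:10']);

// 2) Non-interaction event: the same data is recorded, but a single-page
//    visit still counts as a bounce because the fifth argument is true.
_gaq.push(['_trackEvent', 'Time', 'Log', '0:10', 0, true]);
```

So a site that genuinely wanted the timing data without the flattering bounce numbers could use the second form.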
I wrote a short blog post on this (and referenced your description)!
~Adam
-
Thanks for the encouragement, Martin.
As it turns out, with the help of the two previous answers, the script is actually based on a valid adjustment that might genuinely help some people with their reports. What my client thought, though, was that it was a quick and easy way to get more traffic. The article they found claimed it would dramatically change the results in GA and then directly affect their site's ranking in the SERPs.
They had "proof" in the form of some GA screenshots, so I needed more information on what the script actually does. I was able to let my client know exactly what this was and to recommend against doing it unless there was a problem in their GA reports that they wanted fixed.

Thanks again for your reply.
-
Don't do it - just improve your content. You know it's wrong to try to cheat the system. Think about what would happen if you were banned from the results.
Look, I don't mean to be harsh, but I always balance risks against rewards. In this situation, the risk is too high.
-
Thanks for that link.
The site my client referred me to (linked in the previous reply) was manipulating the way it reported its results. The closer I looked, the more I realized it was just a small spike that went right back down. Knowing them, they probably just paid a bunch of people to visit the site.
This stuff is annoying and gives us SEOs a bad name.

-
The code was from this site: http://millionairevolution.com/cut-bounce-rate-by-80/. Looking at the dates and the analytics shown on the page, this is nothing more than a misrepresentation of the facts and data.
I knew Google doesn't use data from GA, but the graph seemed to contradict that, and I didn't know exactly what the script was doing.
-
First, Google Analytics reporting does not, to my knowledge, influence SERP rankings. Altering the data collected through Google Analytics should not affect SEO indicators.
Second, this is from here: http://briancray.com/posts/time-on-site-bounce-rate-get-the-real-numbers-in-google-analytics/
Once this code is installed, your site will update Google Analytics every 10 seconds under the Event Category "Time", the Event Action "Log", and the Event Value will be based on the pattern of 0:10, 0:20, 0:30, 0:40, 0:50, 1:00, 1:10, etc.
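For context, a minimal sketch of the kind of ten-second heartbeat that post describes, written against the legacy ga.js _gaq syntax of that era (the structure and variable names are illustrative assumptions, not the original code; what the post calls the event "value" is, in _trackEvent terms, the event label):

```javascript
// Minimal sketch of a "time on site" heartbeat (legacy ga.js _gaq syntax assumed).
// Every 10 seconds it records an event: Category "Time", Action "Log",
// with a label that walks through 0:10, 0:20 ... 0:50, 1:00, 1:10, and so on.
(function () {
  var seconds = 0;
  window.setInterval(function () {
    seconds += 10;
    var minutes = Math.floor(seconds / 60);
    var rest = seconds % 60;
    var label = minutes + ':' + (rest < 10 ? '0' : '') + rest;
    // _gaq is the async queue created by the standard ga.js tracking snippet.
    if (window._gaq) {
      _gaq.push(['_trackEvent', 'Time', 'Log', label]);
    }
  }, 10000);
})();
```

The ten-second interval and the "Time"/"Log" names are arbitrary choices; the pattern above simply mirrors the 0:10, 0:20 sequence described.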
The script does not change your bounce rate, it just gives you additional information.
-
You're correct that it's a GA hack. Avoid it.
Google has publicly stated that they don't use your site-specific GA metrics to influence organic search rankings. In other words, they're not taking data from your GA profile and feeding it to the Search Quality team to determine whether your site should rank better or worse. They have MANY better ways to accurately track anonymous user interactions with sites at scale (e.g., Chrome).
The only thing you'll accomplish with this code is turning all of your own internal metrics into garbage. Accurate metrics are important: if your bounce rate is high, knowing that allows you to take action to improve your site and reduce it.
The more people who stay on your site beyond a single pageview, the more money your business is likely to make. Improve your bounce rate to improve the profitability of your website, not for some supposed correlation between bounce rate and organic search rankings.