Help on Page Load Time
-
I'm trying to track the page load time of visits to my site, but GA reports it as zero, and the page load sample is always zero too.
I've done some research and found that GA is supposed to track page load time automatically. Is that right?
-
I would definitely make sure you look at what Google says about this: https://developers.google.com/analytics/devguides/collection/gajs/methods/gaJSApiBasicConfiguration#gat.GA_Tracker._setSiteSpeedSampleRate
First, check that your Google Analytics code is the newest version, the asynchronous tracking code; it does make a difference in speed.
If you want to track your website's loading speed from certain areas, or get a general idea of what you can do to speed it up, I strongly recommend http://tools.pingdom.com/fpt/ or http://www.uptrends.com/aspx/free-html-site-page-load-check-tool.aspx. Both will let you check the site's load time from different locations in the United States and around the world.
If you want your site to load faster than it does now and you're using WordPress, I would recommend a specialist host such as WPengine.com or http://page.ly/. If you are running any other kind of website, you can use a content delivery network; http://www.akamai.com/ does a great job, and I also use http://www.limelight.com/website-application-acceleration/.
For a more complete look at your website's load speed and analytics, I would recommend Adobe's Omniture http://www.omniture.com/en/. They are obviously more expensive than the free suite from Google, but I believe you will find that you get what you pay for.
I also want to mention KISSmetrics analytics. They are only $30 a month and will let you track the speed of particular individuals. Here is some information on the subject from their blog, http://blog.kissmetrics.com/speed-is-a-killer/, and on their main page you can sign up for a free month trial: https://www.kissmetrics.com/new_feature
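For reference, here is a minimal sketch of the asynchronous snippet with the Site Speed sampling call in place. UA-XXXXX-X is a placeholder for your own property ID, and 10 is just an example sample rate, not a required value:
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']);   // placeholder: replace with your property ID
_gaq.push(['_setSiteSpeedSampleRate', 10]); // must run before _trackPageview
_gaq.push(['_trackPageview']);
(function() {
  var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
})();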
Here is Google's advice on what to do:
_setSiteSpeedSampleRate()
_setSiteSpeedSampleRate(sampleRate)
Defines a new sample set size for Site Speed data collection. By default, a fixed 1% sampling of your site visitors make up the data pool from which the Site Speed metrics are derived. If you have a relatively small number of daily visitors to your site, such as 100,000 or fewer, you might want to adjust the sampling to a larger rate. This will provide increased granularity for page load time and other Site Speed metrics. (See Site Speed in the Help Center for details about the Site Speed reports.)
The _setSiteSpeedSampleRate() method must be called prior to _trackPageview() in order to be effective. Analytics restricts Site Speed collection hits for a single property to the greater of 1% of visitors or 10K hits per day in order to ensure an equitable distribution of system resources for this feature.
Note: We strongly encourage sites with greater than 1 million hits per day to keep their sample selection set to the default 1% rate. Setting the sample rate higher will not increase the number of hits collected beyond that daily limit.
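To make the sampling math concrete (my own arithmetic, not Google's): with 100,000 daily visitors, the default 1% rate yields roughly 1,000 Site Speed samples per day, while a 10% rate would target about 10,000, right at the 10K daily cap. Smaller sites gain real granularity by raising the rate; very large sites simply hit the cap.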
Async Snippet (recommended)
_gaq.push(['_setSiteSpeedSampleRate', 5]);
_gaq.push(['_trackPageview']);
Traditional (ga.js) Snippet
pageTracker._setSiteSpeedSampleRate(5);
pageTracker._trackPageview();
Parameters
Number sampleRate
Value between 0 and 100 defining the percentage of visitors to your site that will be measured for Site Speed purposes. For example, a value of 5 sets the Site Speed collection sample to 5%.
I hope I was of help to you and wish you luck with this.
Sincerely,
Thomas Zickell
QuiZick Internet Marketing
-
Thank you, but it seems that GA is not tracking page load time at all.
I use the new asynchronous tracking code, and it's supposed to do this automatically, right?
Sometimes it shows one or two results, but almost everything is zero.
It has me confused.
-
The new tracking code sets this up by default, but if yours is an older code it will need to be enabled. From Google's documentation:
Previous versions of Site Speed required a tracking code change to add _trackPageLoadTime. Sites with the deprecated call will still collect speed data at the 10% sampling rate. However, this call will be ignored in the future, and the sample rate will fall to the default 1%. Consider updating your tracking code to use _setSiteSpeedSampleRate() to set a higher sampling rate.
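In practice the update is a one-line swap in the async snippet; a minimal sketch, where the rate of 10 is an example and not a required value:
// Deprecated opt-in, ignored in the future:
// _gaq.push(['_trackPageLoadTime']);
// Current approach: set a sample rate before tracking the pageview:
_gaq.push(['_setSiteSpeedSampleRate', 10]); // example: measure 10% of visits
_gaq.push(['_trackPageview']);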
Hope this helps