Help on Page Load Time
-
I'm trying to track the page load time of visits on my site, but GA reports it as zero, and the page load sample count is always zero too.
From the research I've done, GA is supposed to track page load time automatically, isn't it?
-
I would definitely make sure you look at what Google says here: https://developers.google.com/analytics/devguides/collection/gajs/methods/gaJSApiBasicConfiguration#gat.GA_Tracker._setSiteSpeedSampleRate
Also make sure your Google Analytics code is the newest version, the asynchronous tracking code; it does make a difference in speed.
If you want to track your website's loading speed from certain areas, or get a general idea of what you can do to speed it up, I strongly recommend http://tools.pingdom.com/fpt/ or http://www.uptrends.com/aspx/free-html-site-page-load-check-tool.aspx. Both will let you check the site's load time from different areas in the United States and around the world.
If you want your site to load faster than it does now and you're using WordPress, I would recommend a different host, somebody like WPengine.com or http://page.ly/. If you are using any other form of website, you can use a content delivery network; somebody like http://www.akamai.com/ does a great job, and I also use http://www.limelight.com/website-application-acceleration/.
For a more complete look at your website's load speed and analytics, I would recommend Adobe's Omniture (http://www.omniture.com/en/). They are obviously more expensive than the free suite from Google, but I believe you will see that you get what you pay for. I also want to bring up KISSmetrics analytics: they are only $30 a month and will allow you to track the speed of particular visitors. Here is a bit of information on the subject from their blog, http://blog.kissmetrics.com/speed-is-a-killer/, and on their main page you can sign up for a free month trial: https://www.kissmetrics.com/new_feature
Here is Google's advice on what to do:
_setSiteSpeedSampleRate(sampleRate)
Defines a new sample set size for Site Speed data collection. By default, a fixed 1% sampling of your site visitors make up the data pool from which the Site Speed metrics are derived. If you have a relatively small number of daily visitors to your site, such as 100,000 or fewer, you might want to adjust the sampling to a larger rate. This will provide increased granularity for page load time and other Site Speed metrics. (See Site Speed in the Help Center for details about the Site Speed reports.)
The _setSiteSpeedSampleRate() method must be called prior to _trackPageview() in order to be effective. Analytics restricts Site Speed collection hits for a single property to the greater of 1% of visitors or 10K hits per day in order to ensure an equitable distribution of system resources for this feature.
Note: We strongly encourage sites with greater than 1 million hits per day to keep their sample rate set to the default 1%. Raising the sample rate will not increase the number of Site Speed hits collected beyond the limit described above.
Async Snippet (recommended)
_gaq.push(['_setSiteSpeedSampleRate', 5]); _gaq.push(['_trackPageview']);
Traditional (ga.js) Snippet
pageTracker._setSiteSpeedSampleRate(5); pageTracker._trackPageview();
Parameters
Number sampleRate
A value between 0 and 100 that defines the percentage of visitors to your site that will be measured for Site Speed purposes. For example, a value of 5 sets the Site Speed collection sample to 5%.
I hope this was of help to you, and I wish you luck with this.
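Purely as an illustration (this combined snippet is not from your site, and UA-XXXXX-Y is a placeholder property ID), here is a minimal sketch of a full asynchronous snippet with the sample rate call placed before _trackPageview:

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']);      // placeholder web property ID
_gaq.push(['_setSiteSpeedSampleRate', 10]);    // sample 10% of visits for Site Speed
_gaq.push(['_trackPageview']);                 // must come after the sample rate call

(function() {
  // standard asynchronous ga.js loader
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();

If your snippet already matches this pattern, the zeros you are seeing may simply be the small default 1% sample rather than a tracking problem.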
Sincerely,
Thomas Zickell
QuiZick Internet Marketing
-
Thank you, but it seems that GA is not tracking page load time at all.
I use the new asynchronous tracking code; it's supposed to do this automatically, right?
Sometimes it shows one or two results, but almost everything is zero.
It's really confusing me.
-
The new tracking code sets this up by default, but if you are running an older version of the code it needs to be enabled explicitly. From Google's documentation:
Previous versions of Site Speed required a tracking code change to add _trackPageLoadTime. Sites with the deprecated call will still collect speed data at the 10% sampling rate. However, this call will be ignored in the future, and the sample rate will fall to the default 1%. Consider updating your tracking code to use _setSiteSpeedSampleRate() to set a higher sampling rate.
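As a rough sketch only (the property ID below is a placeholder, and nothing here comes from your site), this is what the old call looks like next to the current approach:

// Old (deprecated) call: still collected at roughly a 10% sample,
// but Google says it will be ignored in the future.
//   _gaq.push(['_trackPageLoadTime']);

// Current approach: set the sample rate explicitly, before the pageview call.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']);      // placeholder web property ID
_gaq.push(['_setSiteSpeedSampleRate', 10]);    // e.g. sample 10% of visits
_gaq.push(['_trackPageview']);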
Hope this helps