Site speed not being reported accurately?
-
We keep a close eye on site speed, and Google's Webmaster Tools is saying that we're really, really slow (on the order of 5-15 seconds per page). But the site NEVER feels that slow, and lots of other tools put us in the 3-5 second range. Further, we've implemented literally 100% of Google's suggestions; all that's left are ad units, which now render using Google's async ad loader, further reducing time to interactivity.
Could Google be dinging us in search results for this? Here's an example page that they said loaded in 200+ seconds (!?!)
http://hark.com/clips/kwkdqqtzsg-terran-nuclear-launch-detected
Thanks!
-
I'll always suggest improving site speed, but make sure you look at the ROI you can get out of it. If you spend 40 dev hours increasing your site speed and you see no increase in rankings, that's not good in my eyes. I'd work with your network systems guy to get an accurate reading of how fast or slow your site really is before investing a bunch of dev time.
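One way to get that ground-truth number before committing dev hours is to time the site yourself over repeated runs rather than trusting any single dashboard. A minimal sketch (the helper name, run count, and the idea of wrapping a page fetch in a callable are illustrative assumptions, not anything from this thread):

```python
import statistics
import time

def summarize_timings(fn, runs=5):
    """Run fn repeatedly and report wall-clock timing stats.

    Look at the median as well as the mean: a couple of slow runs
    (or a couple of slow visitors) can drag the mean far above what
    a typical user actually experiences.
    """
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()  # in practice: a full-page fetch of the URL under test
        timings.append(time.perf_counter() - start)
    return {
        "median": statistics.median(timings),
        "mean": statistics.mean(timings),
        "max": max(timings),
    }

# For a real check, fn could be something like:
#   lambda: urllib.request.urlopen("http://example.com/").read()
```

Comparing your own median against what Webmaster Tools reports gives you a sanity check before anyone spends those 40 hours.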
-
Only negatively evaluated in search results. We've seen a definite flattening of some terms, and we are trying to figure out why - speed seems to be the only thing that would/could have changed.
-
Negatively evaluated by Google or someone else? My guess is that they use that number along with other user feedback from the SERPs, such as whether people bounce from your page quickly. I've worked with sites that have horrible metrics in GWT, and they still ranked very high.
-
I guess my concern is not the number - I'm MORE concerned with our site being negatively evaluated because of it. Any thoughts here?
-
As Keri said, those results are based on people who use the Google Toolbar. If you are looking for a more accurate reading of site speed, I would install the new Google Analytics site speed tag which will start tracking your site speed in Google Analytics and isn't based just on people who use the toolbar.
I've found that the speed displayed in Webmaster Tools can vary widely and is something that is still very beta for Google. I personally look at it once in a while but never report it to anyone since I don't trust it.
Casey
-
It says "highly accurate", with thousands of data points. Is that how it works - off the Google Toolbar?
The problem is that we're a flash site, in order to play audio, and we have ads. There's just nothing we can do to get around that.
-
In that tool there's a note about accuracy, based on how many data points were collected - what does it say for your site? To the best of my knowledge, their site speed report is based on visitors who have the Google Toolbar installed. If you have only a few toolbar users visiting, and they're all on dialup, you could get skewed results.
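The skew described above is easy to reproduce: with only a handful of toolbar samples, a couple of dialup visitors can dominate the average. A quick illustration with made-up load times (these are hypothetical numbers, not Hark's real data):

```python
import statistics

# Hypothetical page-load times in seconds: most visitors are fast,
# but two dialup users report huge numbers.
samples = [2.1, 2.4, 2.8, 3.0, 3.2, 45.0, 60.0]

mean = statistics.mean(samples)      # dragged way up by the two outliers
median = statistics.median(samples)  # close to the typical visitor

print(f"mean={mean:.1f}s  median={median:.1f}s")
# The mean lands in the double digits while the median stays around 3s -
# roughly the gap between what GWT and the other tools are reporting.
```

If Webmaster Tools is averaging a small, slow-connection sample, this is exactly how a site that feels fine can show up as 5-15 seconds (or worse).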