Site speed not being reported accurately?
-
We keep a close eye on site speed, and Google's Webmaster Tools says we're really slow (on the order of 5-15 seconds per page). But the site NEVER feels that slow, and plenty of other tools put us in the 3-5 second range. Further, we've implemented literally all of Google's suggestions; the only remaining heavy elements are ad units, which now render via Google's async ad loader, further reducing time to interactivity.
Could Google be dinging us in search results for this? Here's an example page that they said loaded in 200+ seconds (!?!)
http://hark.com/clips/kwkdqqtzsg-terran-nuclear-launch-detected
Thanks!
-
I'll always suggest improving site speed, but make sure you look at the ROI you can get out of it. If you spend 40 dev hours increasing your site speed and see no increase in rankings, that's not good in my eyes. I'd work with your network systems guy to get an accurate picture of how fast or slow your site really is before investing a bunch of dev time.
-
Negatively evaluated in search results, specifically. We've seen a definite flattening on some terms, and we're trying to figure out why - speed seems to be the only thing that could have changed.
-
Negatively evaluated by Google or by someone else? My guess is that they use that number along with other user feedback from the SERPs, such as whether people bounce from your page quickly. I've worked with sites that have horrible metrics in GWT, and they still ranked very high.
-
I guess my concern is not the number - I'm MORE concerned with our site being negatively evaluated because of it. Any thoughts here?
-
As Keri said, those results are based on people who use the Google Toolbar. If you are looking for a more accurate reading of site speed, I would install the new Google Analytics site speed tag which will start tracking your site speed in Google Analytics and isn't based just on people who use the toolbar.
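For reference, here's a minimal sketch of what enabling that looked like with the classic async (ga.js) tracking snippet of the time - the property ID is a placeholder you'd swap for your own:

```javascript
// Classic async Google Analytics (ga.js) snippet with the Site Speed
// tag enabled. 'UA-XXXXXX-Y' is a placeholder property ID.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXX-Y']);
_gaq.push(['_trackPageview']);
// This is the line that turns on the Site Speed report: it samples
// load times from your real visitors, not just toolbar users.
_gaq.push(['_trackPageLoadTime']);

// Standard async loader (only meaningful in a browser).
if (typeof document !== 'undefined') {
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
           '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
}
```

Because the loader is async, adding the extra `_gaq.push` call doesn't block rendering.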
I've found that the speed displayed in Webmaster Tools can vary widely and is something that is still very beta for Google. I personally look at it once in a while but never report it to anyone since I don't trust it.
Casey
-
It says "highly accurate", with thousands of data points. Is that how it works - off the Google Toolbar?
The problem is that we're a Flash site (in order to play audio), and we have ads. There's just nothing we can do to get around that.
-
In that tool, there's a note about accuracy, which depends on the number of data points - what does it say for your site? To the best of my knowledge, the site speed report is based on visitors who have the Google Toolbar installed. If only a few toolbar users visit your site, and they're all on dial-up, you could get skewed results.
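To illustrate that sampling concern with made-up numbers: a tiny, biased sample can produce exactly the kind of 200-second averages described above, even when typical visitors load the page in a few seconds.

```javascript
// Toy illustration (hypothetical numbers): how a handful of
// slow-connection toolbar users can skew an average load time.
function averageLoadTime(samplesMs) {
  if (samplesMs.length === 0) return 0;
  var total = samplesMs.reduce(function (sum, t) { return sum + t; }, 0);
  return total / samplesMs.length;
}

// Most broadband visitors load the page in ~3 seconds...
var broadband = [2800, 3100, 2950, 3300];
console.log(averageLoadTime(broadband)); // 3037.5 ms

// ...but if the only toolbar users sampled happen to be on dial-up:
var dialupSample = [45000, 60000, 52000];
console.log(averageLoadTime(dialupSample)); // ~52333 ms
```

With only three sampled visitors, one unrepresentative connection type dominates the reported figure.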