Page Speed - What tool to use?
-
I am looking for a good tool to measure page speed. Any tools out there that you recommend?
-
Site24x7 is quite good - http://www.site24x7.com/tools.html
It's more of an all-round website monitor, but one of its built-in features is response-time measurement.
-
The first thing I would look at is cache time. I've seen that metric come out differently across a lot of the website speed tests.
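If you want to check cache lifetimes yourself rather than trust how each tool reports them, the `Cache-Control` header is the place to look. Here's a minimal sketch using only Python's standard library (the function names and example URL are just for illustration):

```python
import re
import urllib.request

def parse_max_age(cache_control):
    """Extract the max-age value (in seconds) from a Cache-Control header."""
    match = re.search(r"max-age=(\d+)", cache_control or "")
    return int(match.group(1)) if match else None

def check_cache_headers(url):
    """Fetch a URL and report how long clients may cache the response."""
    with urllib.request.urlopen(url) as resp:
        return parse_max_age(resp.headers.get("Cache-Control", ""))

# e.g. check_cache_headers("https://www.example.com")
print(parse_max_age("public, max-age=86400"))  # 86400, i.e. one day
```

A short cache lifetime (or no `Cache-Control` at all) is one common reason the same page scores differently across speed-testing tools.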
-
Nice tool, and interesting results. I tested with Pingdom and the site scored 98/100, but under Google's PageSpeed Insights it scored 55/100.
Thanks for the link. Now to find out the reason for the discrepancy.
-
That is a good one. The irony is that I already use Pingdom to monitor my sites. LOL. Thanks for the info.
-
I've found PageSpeed Insights (https://developers.google.com/speed/pagespeed/insights) to be very helpful. It offers suggestions on how to improve the performance of your site, based on the potential for improvement. By benchmarking your site against others in your industry (or top performers in other industries), you can compare not only the overall PageSpeed Score, but you can identify how the suggestions differ.
This enables you to identify possible competitive advantages and disadvantages. For instance, if you receive a high-priority suggestion that your competitors have already implemented, you may be at a disadvantage; the severity of that disadvantage depends on the resources required to implement the suggestion.
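For anyone who wants to script these comparisons across their own site and competitors rather than run them by hand, PageSpeed Insights also exposes a public JSON API. A minimal sketch using the standard library (the helper names are mine, and the response layout shown is the v5 `lighthouseResult` shape; check the API docs before relying on it):

```python
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def performance_score(psi_response):
    """Convert the 0-1 Lighthouse performance score to the familiar 0-100 scale."""
    raw = psi_response["lighthouseResult"]["categories"]["performance"]["score"]
    return round(raw * 100)

def fetch_psi(url, strategy="mobile"):
    """Query the PageSpeed Insights v5 API for a URL (mobile or desktop strategy)."""
    query = urllib.parse.urlencode({"url": url, "strategy": strategy})
    with urllib.request.urlopen(f"{PSI_ENDPOINT}?{query}") as resp:
        return json.load(resp)

# Live usage would be: performance_score(fetch_psi("https://www.example.com"))
# Below, a hypothetical response dict stands in for a real API call:
sample = {"lighthouseResult": {"categories": {"performance": {"score": 0.55}}}}
print(performance_score(sample))  # 55
```

Running the same loop over a list of competitor URLs gives you the benchmark table described above without clicking through the web UI for each site.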
-
tools.pingdom.com/ is one of my favorites!