What are the best website reporting tools for speed and errors?
-
Hi, I have just made some changes to my site, www.in2town.co.uk, by adding some redirects, and I want to know what effect this has had.
I have been using tools such as Pingdom Tools and http://gtmetrix.com, but they always give me different timing reports and different advice, so I do not know the true speed of my site. I want to know how to check the true speed of my site from the UK, and how to check for errors so I can improve it.
Any advice on which tools to use would be great.
-
Thank you, Alex, I will have a look now. I cannot believe that different tools are bringing back different results.
-
The loading speed of any website will vary.
I use the following to test speed:
- tools.pingdom.com/fpt/ gives a good visual breakdown (similar to GTMetrix)
- www.websiteoptimization.com/services/analyze/ gives some good recommendations
- Network tab on Google Chrome Developer Tools (Ctrl+Shift+I on a PC)
- there are also various plugins available like YSlow and Speed Tracer
Also check out this article: http://www.seomoz.org/blog/15-tips-to-speed-up-your-website
Are there any errors in particular you want to spot? http://validator.w3.org/ will pick up HTML validation errors, Xenu Link Sleuth will find crawl errors (as will the SEOmoz crawler, and Google and Bing's Webmaster Tools).
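If you want a rough sanity check alongside those tools, you can time page fetches yourself. This is only a sketch: it measures how long the raw HTML takes to download from your own location, not full page rendering with images and scripts, which is what Pingdom and GTmetrix measure. Repeating the fetch a few times shows why single-shot tools disagree with each other.

```python
import time
import urllib.request

def time_fetch(url, attempts=3):
    """Fetch a URL several times and report min/average download time.

    Network jitter means any single measurement can be misleading, which
    is one reason different testing tools return different numbers.
    """
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # download the full response body
        timings.append(time.perf_counter() - start)
    return {"min": min(timings), "avg": sum(timings) / len(timings)}

# Example (hypothetical usage against your own site):
# stats = time_fetch("http://www.in2town.co.uk")
# print(f"min: {stats['min']:.2f}s  avg: {stats['avg']:.2f}s")
```

The minimum time is usually the most repeatable number; the gap between min and average gives you a feel for how much variance to expect from any speed-testing tool.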
-
Hi Diane, you can also try GTmetrix. It is a free tool that gives you very detailed information on your site speed, along with recommended fixes.
-
Thanks for that, I will have a look now.
-
Hi Diane,
For errors I prefer the W3C Validator: http://validator.w3.org/
And for page speed, Google PageSpeed Insights: https://developers.google.com/speed/pagespeed/insights
Try them out and let me know...
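PageSpeed Insights can also be queried programmatically. This is a minimal sketch assuming the public PageSpeed Insights API endpoint; the exact response structure may differ between API versions, so treat the `fetch_psi_score` field path as an assumption to verify against the API docs.

```python
import json
import urllib.parse
import urllib.request

# Public PageSpeed Insights API endpoint (check the current docs for the
# latest version and any API-key requirements).
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_request(url, strategy="desktop"):
    """Build a PageSpeed Insights API request URL for the given page."""
    params = urllib.parse.urlencode({"url": url, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{params}"

def fetch_psi_score(url, strategy="desktop"):
    """Call the API and extract the overall performance score.

    Assumes the v5 response layout, where the score lives under
    lighthouseResult -> categories -> performance -> score.
    """
    with urllib.request.urlopen(build_psi_request(url, strategy)) as resp:
        data = json.load(resp)
    return data["lighthouseResult"]["categories"]["performance"]["score"]

# Example (hypothetical usage, requires network access):
# print(fetch_psi_score("http://www.in2town.co.uk", strategy="mobile"))
```

Running it for both `desktop` and `mobile` strategies is useful, since the two reports often flag different issues.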