What are the best website reporting tools for speed and errors?
-
Hi, I have just made some changes to my site by adding some redirects, and I want to know what effect this has had on my site www.in2town.co.uk
I have been using tools such as Pingdom Tools and http://gtmetrix.com, but they always give me different time reports and different advice, so I do not know the true speed of my site. I would like to know how to check the true speed of my site in the UK and how to check for errors so I can improve it.
Any advice on which tools to use would be great.
-
Thank you Alex, I will have a look now. I cannot believe that different tools bring back different results.
-
The loading speed of any website will vary from test to test and between testing locations, so different tools will rarely report exactly the same figures.
I use the following to test speed:
- tools.pingdom.com/fpt/ gives a good visual breakdown (similar to GTMetrix)
- www.websiteoptimization.com/services/analyze/ gives some good recommendations
- the Network tab in Google Chrome Developer Tools (Ctrl+Shift+I on a PC)
- there are also various browser plugins available, like YSlow and Speed Tracer
Also check out this article: http://www.seomoz.org/blog/15-tips-to-speed-up-your-website
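If you want a quick, scriptable check alongside these tools (for example, to confirm the redirects you just added resolve cleanly), here is a minimal Python sketch using the requests library. It is only a rough single-sample check of the HTML document itself, not a full page load, and the URL is simply the one from the question:

```python
import time
import requests

url = "http://www.in2town.co.uk"  # the site from the question

start = time.monotonic()
response = requests.get(url, timeout=30)
total = time.monotonic() - start

# Show the redirect chain -- handy for checking newly added redirects.
for hop in response.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final:", response.status_code, response.url)

# response.elapsed roughly covers the time until the response headers arrived;
# `total` also includes downloading the HTML body. Neither includes images,
# CSS or JS, which the browser-based tools above do measure.
print(f"Headers after {response.elapsed.total_seconds():.2f}s, "
      f"HTML downloaded in {total:.2f}s ({len(response.content)} bytes)")
```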
Are there any errors in particular you want to spot? http://validator.w3.org/ will pick up HTML validation errors, and Xenu Link Sleuth will find crawl errors (as will the SEOmoz crawler, and Google's and Bing's Webmaster Tools).
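For a rough idea of what a crawl-error check actually does, here is a bare-bones sketch that fetches a single page, extracts its links and flags any that fail or return an error status. It is only an illustration; Xenu and the Webmaster Tools crawlers do this across the whole site, and the page URL here is just assumed from the question:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
import requests

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and not href.startswith(("#", "mailto:", "javascript:")):
                self.links.append(href)

page = "http://www.in2town.co.uk"  # page from the question
parser = LinkCollector()
parser.feed(requests.get(page, timeout=30).text)

# Check each unique link and report anything that errors or fails outright.
for href in sorted(set(parser.links)):
    target = urljoin(page, href)
    try:
        status = requests.head(target, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        status = f"failed ({exc.__class__.__name__})"
    if not (isinstance(status, int) and status < 400):
        print(status, target)
```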
-
Hi Diane, you can also try GTmetrix. It is a free tool that gives you very detailed and accurate information on your site speed and recommended fixes.
-
Thanks for that, I will have a look now.
-
Hi Diane,
For errors I prefer the W3C Validator: http://validator.w3.org/
And for page speed, Google's PageSpeed Insights: https://developers.google.com/speed/pagespeed/insights
Try them out and let me know...
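If anyone wants to pull those PageSpeed results into a script rather than reading them off the web page, here is a minimal sketch against the PageSpeed Insights API. It assumes the current v5 endpoint and its response fields, which may differ from what the tool exposed when this thread was written, and regular use requires an API key:

```python
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "http://www.in2town.co.uk", "strategy": "desktop"}

data = requests.get(API, params=params, timeout=60).json()

# The Lighthouse performance score is reported on a 0.0-1.0 scale.
lighthouse = data["lighthouseResult"]
score = lighthouse["categories"]["performance"]["score"]
print(f"Performance score: {score * 100:.0f}/100")

# Each audit carries a title describing the recommended fix; list the
# ones that scored poorly.
for name, audit in lighthouse["audits"].items():
    if audit.get("score") is not None and audit["score"] < 0.9:
        print(f"- {audit['title']} (score {audit['score']})")
```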