Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Best tools for an initial website health check?
-
Hi,
I'd like to offer free website health checks (basic audits) and am wondering what tools other people use for this? It would be good to use something that presents the data well. Moz is great, but it gets expensive if I want to offer these to many businesses in the hope of taking on just a few as clients and doing a full manual audit for them.
So far I've tried seositecheckup.com (though it only checks a single page), metaforensics.io and mysiteauditor.
Thanks!
-
Sounds like a good idea. I'm just trying to provide basic data on a potential client's site - traffic, rankings, main errors and things that are working for them. Some tools actually do a pretty good job of this, but only seem to cover the homepage rather than the entire site. I guess I'll have to use multiple tools and hope the effort is worth it, as it's a free service.
Do you use SEO profiler?
-
Hi John,
Do you use SEMrush? I've been thinking of signing up. Do you know if you can run unlimited reports? Screaming Frog is great and is what I'll use for thorough audits, but I'm just trying to generate lead-generation reports for potential new clients.
thanks
-
I like Screaming Frog and SEMrush as well. WebsiteGrader.com is good in a pinch too. SEO Profiler has a nice set of tools that have come a long way in the last two years. I think this is one of those areas where we all have a different opinion as to what works best, which would explain why there are so many options!
I feel like I have tons of great data... I just can't find a good way to display it graphically so that it's clear to the client and helps me use the research to convert prospects. Maybe we should come up with a killer report template we can all use?
-
James
For a free report I would start with website.grader.com and grab a couple of screenshots - Rand backed it a few weeks ago. For a very low-cost report I would use semrush.com.
Hope that helps.
-
Good point.
-
Screaming Frog gives you all the data you want. Tools built to produce a sleek report usually don't give the full picture. It's the issues you draw out yourself that make the difference.
-
So really it's best to use Screaming Frog and extract that info for a website review, rather than trying to find software that scores the overall 'health' of a site and produces a report?
thanks for your answer by the way!
-
I second ScreamingFrog. Nothing else comes close.
-
Hi James,
For page load, network and speed tests I have used Pingdom.com in the past. They recently moved more features behind a paywall, but it's a nice set of tools for basic tests. There is still some free stuff you can use at tools.pingdom.com.
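If you just want a rough number before reaching for Pingdom, a one-request load-time check is easy to sketch in Python using only the standard library. This is a simplification of what a real speed test does (no asset waterfall, no render timing); the URL is a placeholder for whatever page you want to test:

```python
import time
import urllib.request

def time_page_load(url, timeout=15):
    """Fetch a URL once and return (bytes_received, elapsed_seconds)."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read()  # include download time in the measurement
    return len(body), time.monotonic() - start
```

Run it a few times and average, since a single request is noisy.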
For a quick SEO pass, man, I love Screaming Frog! You can quickly identify header errors, long titles and descriptions, missing h1 tags and much more. When I do quick general audits to troubleshoot problems posted on this board, it's my go-to.
Hope this helps,
Don
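The per-page checks Don mentions (long titles, missing descriptions, missing h1 tags) can be sketched with Python's standard-library HTML parser. This is just an illustration of the kind of thing Screaming Frog automates across a whole crawl; the 60-character title threshold is an assumption for the example, not Screaming Frog's exact rule:

```python
from html.parser import HTMLParser

class QuickAudit(HTMLParser):
    """Collect basic on-page signals: title text, h1 count, meta description."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1_count = 0
        self.has_description = False
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = {k: (v or "") for k, v in attrs}
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            if attrs.get("content", "").strip():
                self.has_description = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def quick_page_check(html):
    """Return a list of issue strings for one page's HTML."""
    parser = QuickAudit()
    parser.feed(html)
    issues = []
    title = parser.title.strip()
    if not title:
        issues.append("missing <title>")
    elif len(title) > 60:
        issues.append("title too long (%d chars)" % len(title))
    if not parser.has_description:
        issues.append("missing meta description")
    if parser.h1_count == 0:
        issues.append("missing <h1>")
    return issues
```

Feed it the HTML of each crawled page and you get a crude per-page issue list - the hard part a real crawler adds is discovering and fetching every URL on the site.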