What do you use for site audits?
-
What tools do you use for conducting a site audit? I need to do an audit on a site, and the SEOmoz web crawler and on-page optimization reports will take days, if not a full week, to return any results.
In the past I've used other tools that I could run on the fly, and they would return broken links, missing heading tags, keyword density, server information, and more.
Curious as to what you all use and what you'd recommend using in conjunction with the Moz tools. For context, the kind of quick on-the-fly check I mean could be sketched in a few lines of script; see the example below.
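This is only an illustration, not any particular tool: it fetches a single page, flags a missing h1, computes a rough keyword density, and prints the server header. The URL and keyword are placeholders.

```python
# Minimal on-the-fly page check: missing h1, keyword density, server header.
# Illustrative sketch only; the URL and keyword below are placeholders.
import re
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"   # placeholder
keyword = "site audit"             # placeholder

resp = requests.get(url, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

# Heading check: warn if the page has no h1
if not soup.find_all("h1"):
    print("WARNING: no <h1> tag found")

# Rough keyword density: keyword occurrences / total words in visible text
text = soup.get_text(separator=" ").lower()
words = re.findall(r"[a-z0-9']+", text)
occurrences = text.count(keyword.lower())
density = (occurrences / len(words) * 100) if words else 0
print(f"Keyword density for '{keyword}': {density:.2f}%")

# Server information from the response headers
print("Server header:", resp.headers.get("Server", "not reported"))
print("Status code:", resp.status_code)
```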
-
I use the following tools:
- Xenu - identifies broken links
- GSite Enterprise Crawler - identifies on page issues
- Google Cache, Google Webmaster Tools - find crawling issues
- Scritch - finds server/platform type
- Ahrefs, Majestic, OSE - for link diagnostics
- SEO Book Bulk Server Header Tool - checks status codes and response headers for a list of URLs in bulk (a rough script version of this kind of check is sketched below)
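If you ever need to script that last kind of check yourself, a rough sketch (placeholder URLs, not the SEO Book tool itself) looks something like this:

```python
# Minimal bulk status/header check over a list of URLs.
# Illustrative sketch only; the URLs are placeholders.
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/contact/",
]

for url in urls:
    try:
        # HEAD keeps it lightweight; fall back to GET if a server rejects HEAD
        resp = requests.head(url, allow_redirects=True, timeout=10)
        if resp.status_code == 405:
            resp = requests.get(url, allow_redirects=True, timeout=10)
        print(f"{url}  {resp.status_code}  server={resp.headers.get('Server', '-')}")
    except requests.RequestException as exc:
        print(f"{url}  request failed: {exc}")
```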
-
Hi Anthony,
I use a combination of tools for audits. SEOmoz is great for client-facing reports and for tracking issues over time. The downside is that you don't have that "on-demand" capability to crank out a full audit the instant you need it.
For on-demand audits, I use Screaming Frog, which is free for up to 500 URLs and $99 for an unlimited license - it's worth every penny, and it returns a full range of technical SEO data, which you can export and manipulate in Excel.
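If you'd rather script the filtering instead of doing it in Excel, here's a rough sketch. It assumes the export is saved as internal_all.csv with columns like "Address", "Status Code", and "Title 1"; check the headers in your actual export and adjust.

```python
# Filter a Screaming Frog CSV export for common issues.
# Assumes columns named "Address", "Status Code", and "Title 1";
# adjust names to match your actual export (older exports may need
# skiprows=1 if the first line is a report title rather than headers).
import pandas as pd

df = pd.read_csv("internal_all.csv")  # placeholder filename

# Pages returning client or server errors
errors = df[df["Status Code"] >= 400][["Address", "Status Code"]]
print("Error pages:")
print(errors.to_string(index=False))

# Pages with a missing or empty title tag
no_title = df[df["Title 1"].fillna("").str.strip() == ""]
print("\nPages with missing titles:")
print(no_title["Address"].to_string(index=False))
```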
-
Although it has many limitations, I use http://marketing.grader.com periodically. It's fast and covers the basics.
Related Questions
-
Unsolved: Bug in site crawl analysis - 308 redirect flagged as temporary
Hi, we have some 308 redirects on our website, which are permanent redirects, but the site crawler is currently flagging them as temporary. (Screenshot attached: Screenshot 2022-02-10 14.24.26.png)
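For reference, HTTP 308 is defined as a permanent redirect (RFC 7538), so a "temporary" flag does look like a crawler bug. A quick way to confirm the status code a URL actually returns is a sketch like the one below (the URL is a placeholder):

```python
# Confirm the raw redirect status code a URL returns, without following it.
# The URL is a placeholder.
import requests

url = "https://www.example.com/old-page"
resp = requests.get(url, allow_redirects=False, timeout=10)

print("Status code:", resp.status_code)           # expect 308 for this case
print("Location:", resp.headers.get("Location"))  # where the redirect points
# 301 and 308 are permanent redirects; 302 and 307 are temporary.
```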
-
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder?

I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites. I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication.

I am in the process of re-setting up my campaigns to get better data than I have been getting. I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better: all on the same day, making sure I've set it to include www or not for what ranks, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what all the options are to get good data in light of the 50,000-page crawl limit.

Here is an example of what I mean. To see how EPA is doing in searches related to air quality, ideally I'd track all of EPA's web presence. www.epa.gov has 560,000 pages. If I put in www.epa.gov for a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get?

www.epa.gov/air has only 1,450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely, and I get a complete picture of this air-focused sub-folder. But (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true: I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air quality sites, and maybe those competitors count toward the 50,000-page crawl limit and would get me up to the limit? How do the competitors you choose figure into the crawl limit?)

Any opinions on what I should do in general in this kind of situation? The small sub-folder vs. the full humongous site, or is there some other way to go here that I'm not thinking of?
-
How do you get your website recrawled with Moz without waiting for a week?
My initial crawl was screwed up because of a nofollow that needed to be removed. I would like Moz to recrawl the site right away so I can find any other errors.
-
Is it still best practice to optimize your site with geographic long tail keywords?
Since Google is tailoring search results to user IP and location, is it still best practice to optimize your site titles etc. with long-tail geographic keywords? For example, instead of optimizing a page for "dentist in West Palm Beach, Florida", search users who are IN West Palm Beach can just search "dentists" and a list of local dentists will be displayed (both in the local listings AND organic search listings). I'd love to see Rand cover this on a Whiteboard Friday!
-
How often does SEOmoz crawl the site? Can you force a crawl at a specific time?
-
How can I locate the links on my site that are causing 404 errors?
In the 404 error report, there is no way to find which page on my site contains the broken link. How can I find them? Help! Matt
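One tool-agnostic way to trace this is a small crawl that records the source page for every broken link it finds. The sketch below is only illustrative; the start URL and page limit are placeholders.

```python
# Small same-site crawl that records which page each broken (404) link appears on.
# Rough illustrative sketch; the start URL and page limit are placeholders.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

start_url = "https://www.example.com/"   # placeholder
max_pages = 50                           # keep the sketch small

host = urlparse(start_url).netloc
to_visit, seen = [start_url], set()
broken = []  # (source page, broken link, status code)

while to_visit and len(seen) < max_pages:
    page = to_visit.pop()
    if page in seen:
        continue
    seen.add(page)
    try:
        resp = requests.get(page, timeout=10)
    except requests.RequestException:
        continue
    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urljoin(page, a["href"]).split("#")[0]
        if urlparse(link).netloc != host:
            continue  # only check internal links
        try:
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            continue
        if status == 404:
            broken.append((page, link, status))
        elif link not in seen:
            to_visit.append(link)

for source, link, status in broken:
    print(f"{source} -> {link} ({status})")
```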
-
How can I download all the inbound links to my site?
Hi, from Site Explorer I want to download all the inbound links pointing to my site: www.comm100.com/livechat/. But it seems that there is a limitation: I can only download a portion of the inbound links. What should I do? Is it because my account is a free trial? Do I have to pay for it? Please help. Thanks,
-
Will Open Site Explorer ever show the number of links over time?
OSE is an amazing tool, but do you guys at SEOmoz have any plans to develop it so we can track the number of links over time? I need to demonstrate to clients how the link building is going, and this would be a great quick report to see how many links were found on a given day, month, or year. A bit like MajesticSEO's backlink history graph, but better 🙂