What do you use for a site audit?
-
What tools do you use for conducting a site audit? I need to do an audit on a site, and the SEOmoz web crawler and on-page optimization reports can take days, if not a full week, to return any results.
In the past I've used other tools that I could run on the fly; they would return broken links, missing heading tags, keyword density, server information, and more.
Curious as to what you all use, and what you may recommend using in conjunction with the Moz tools.
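For reference, the kind of on-the-fly check described above can be approximated in a few lines of Python. This is a minimal sketch, assuming the third-party requests and beautifulsoup4 packages are installed; the URL is a placeholder:

```python
# A minimal on-the-fly page check: HTTP status, <title>, missing <h1>,
# and the Server header. Assumes the third-party "requests" and
# "beautifulsoup4" packages; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"
response = requests.get(url, timeout=10)
soup = BeautifulSoup(response.text, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else None
h1_tags = soup.find_all("h1")

print(f"Status code:   {response.status_code}")
print(f"Title:         {title!r}")
print(f"H1 count:      {len(h1_tags)}")
print(f"Server header: {response.headers.get('Server', 'not reported')}")
if not h1_tags:
    print("Warning: no <h1> tag found")
```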
-
I use the following tools:
- Xenu - identifies broken links
- GSite Enterprise Crawler - identifies on page issues
- Google Cache, Google Webmaster Tools - find crawling issues
- Scritch - finds server/platform type
- Ahrefs, Majestic, OSE - for link diagnostics
- SEO Book Bulk Server Header Tool - checks HTTP response headers in bulk (see the sketch below)
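A rough stand-in for the bulk server header check, assuming the third-party requests package (the URLs are placeholders):

```python
# A rough stand-in for a bulk server header check: send a HEAD request
# to each URL and report the status code and Server header.
# Assumes the third-party "requests" package; the URLs are placeholders.
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/about/",
]

for url in urls:
    try:
        response = requests.head(url, timeout=10, allow_redirects=False)
        server = response.headers.get("Server", "not reported")
        print(f"{url} -> {response.status_code} ({server})")
    except requests.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
```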
-
Hi Anthony,
I use a combination of tools for audits. SEOmoz is great for client-facing reports and tracking issues over time. The downside is that you don't have that "on-demand" capability to crank out a full audit the instant you need it.
For on-demand audits, I use Screaming Frog, which is free for up to 500 URLs and $99 for an unlimited license. It's worth every penny and returns a full range of technical SEO data, which you can export and manipulate in Excel.
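If Excel feels heavy, the same export can be sliced with a short script. A sketch, assuming the third-party pandas package; the filename and column names are assumptions, so adjust them to match your version's export:

```python
# Filtering a crawl export for problem URLs outside Excel.
# Assumes the third-party "pandas" package and an export named
# "internal_all.csv" with "Address" and "Status Code" columns; both the
# filename and column names are assumptions, so adjust them to match
# your export (some versions add an extra header row: use skiprows=1).
import pandas as pd

crawl = pd.read_csv("internal_all.csv")

broken = crawl[crawl["Status Code"] >= 400]
redirects = crawl[crawl["Status Code"].between(300, 399)]

print(f"{len(broken)} broken URLs, {len(redirects)} redirecting URLs")
broken[["Address", "Status Code"]].to_csv("broken_links.csv", index=False)
```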
-
Although it has many limitations, I use http://marketing.grader.com periodically. It's fast and covers the basics.
Related Questions
-
How to deal with auto-generated pages on our site that are considered thin content
Hi there, wondering how to deal with the 300+ pages on our site that are auto-generated and considered thin content. Here is an example of those pages: https://app.cobalt.io/ninp0 The pages are auto-generated when a new security researcher joins our team and are then filled in by each researcher with specifics about their personal experience. Additionally, there is a fair amount of dynamic content on these pages that updates with certain activities. These pages are also getting flagged as not having a canonical tag; however, they are technically different pages, just with very similar elements. I'm not sure I would want to put a canonical tag on them, as some of them have decent page authority and I think could be contributing to our overall SEO health. Any ideas on how I should deal with this group of similar but not identical pages?
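As an aside, whether each of these pages declares a self-referencing canonical (which would be compatible with keeping them as distinct pages) can be spot-checked directly. A minimal sketch, assuming the third-party requests and beautifulsoup4 packages:

```python
# Spot-check whether a page declares a canonical URL and whether it is
# self-referencing. Assumes the third-party "requests" and
# "beautifulsoup4" packages; the URL is one of the profile pages
# mentioned above.
import requests
from bs4 import BeautifulSoup

url = "https://app.cobalt.io/ninp0"
response = requests.get(url, timeout=10)
soup = BeautifulSoup(response.text, "html.parser")

canonical = soup.find("link", rel="canonical")
if canonical is None:
    print("No canonical tag found")
else:
    href = canonical.get("href")
    print(f"Canonical: {href}")
    print("Self-referencing" if href == url else "Points elsewhere")
```

A self-referencing canonical on each page would address the "missing canonical" flag without consolidating the pages into one another.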
-
SEO impact of redirecting a high-ranking mirror site to the main website
During an SEO audit for a client, I noticed that they had over a dozen duplicate websites that are carbon copies of the main website. This was done via the CMS platform and DNS. One of the mirror sites has about 400 indexed pages, a Moz DA of 42, and 137k external equity-passing links; a full metrics comparison is attached. I originally planned on using rel="canonical" on the mirror site, but the CMS vendor has never even heard of it and is refusing to implement it in the header. My only other option is one-to-one 301 redirects. Since the mirror site ranks well, and even competes with the main domain for some positions on the first page of the SERPs, what will the impact be after the redirects? Is doing 301s still the best option? Thanks!
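Once one-to-one 301s are in place, the mapping can be spot-checked with a short script. A sketch, assuming the third-party requests package; the domains and paths are placeholders for the mirror and main sites:

```python
# Spot-check a one-to-one 301 mapping from a mirror domain to the main
# domain. Assumes the third-party "requests" package; the domains and
# paths are placeholders for the sites described above.
import requests

MIRROR = "https://mirror.example.com"
MAIN = "https://www.example.com"
paths = ["/", "/products/", "/contact/"]

for path in paths:
    response = requests.get(MIRROR + path, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    ok = response.status_code == 301 and location == MAIN + path
    print(f"{MIRROR + path}: {response.status_code} -> "
          f"{location or '(no Location header)'} [{'OK' if ok else 'CHECK'}]")
```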
-
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder?

I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites; I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication.

I am in the process of re-setting up my campaigns to get better data than I have been getting. I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better: all on the same day, making sure I've set them to include www or not for what ranks, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what the options are for getting good data in light of the 50,000-page crawl limit.

Here is an example of what I mean. To see how the EPA is doing in searches related to air quality, ideally I'd track all of the EPA's web presence. www.epa.gov has 560,000 pages; if I put in www.epa.gov for a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get?

www.epa.gov/air has only 1,450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely and I'll get a complete picture of this air-focused sub-folder. But (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true: I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air quality sites, and maybe those competitors count toward the 50,000-page crawl limit and would get me up to the limit? How do the competitors you choose figure into the crawl limit?)

Any opinions on which I should do in general in this kind of situation? The small sub-folder vs. the full humongous site, or is there some other way to go here that I'm not thinking of?
-
Add-to-cart redirect using a 302
I am getting a list of crawl errors in Moz because I am using a 302 redirect when people click on an item using the quick-view add to cart, e.g. http://copyfaxes.com/cart/quickadd?partno=4061 will redirect them to the view-shopping-cart page. Is this wrong? Should this be a 301 redirect? There is no link juice to pass. Thanks
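Which status code the endpoint actually returns can be confirmed without a full crawl. A minimal sketch, assuming the third-party requests package:

```python
# Confirm which status code the quick-add endpoint returns. A 302
# (temporary) is generally considered appropriate for cart actions like
# this, since there is no ranking equity to consolidate. Assumes the
# third-party "requests" package; the URL is the endpoint quoted above.
import requests

url = "http://copyfaxes.com/cart/quickadd?partno=4061"
response = requests.get(url, allow_redirects=False, timeout=10)

print(f"Status:   {response.status_code}")   # 301 = permanent, 302 = temporary
print(f"Location: {response.headers.get('Location')}")
```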
-
How to track data from old site and new site with the same URL?
We are launching a new site within the next 48 hours. We have already purchased the 30-day trial, and we will continue to use this tool once the new site is launched. Just looking for some tips and/or best practices so we can compare the old data vs. the new data moving forward. Thank you in advance for your response(s).
-
What user agent is used by the SEOmoz crawler?
We have a pretty tight robots.txt file in place that only allows the major search engines. I do not want to block SEOmoz.org from being able to crawl the site, so I want to make sure its user agent is allowed.
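For what it's worth, Moz's crawler identifies itself as rogerbot, and whether a robots.txt whitelist admits a given agent can be tested with Python's standard library alone. A sketch (the site URL is a placeholder):

```python
# Test whether a robots.txt whitelist admits specific crawlers, using
# only the standard library. "rogerbot" is the user agent Moz's crawler
# has used; the site URL is a placeholder.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

for agent in ["Googlebot", "bingbot", "rogerbot"]:
    allowed = parser.can_fetch(agent, "https://www.example.com/")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```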
-
Is there a way to see what keywords users of my site are using to find it online?
Since Google Analytics no longer shows the keywords used by people to find a site online, does the SEOmoz toolset provide something to show this data?
-
Fetch as Googlebot for sites you don't own?
I've used the "Fetch as Googlebot" tool in Google Webmaster Tools to submit links from my site, but I was wondering if there is any type of tool or submission process like this for submitting links from other sites that you do not own. The reason I ask: I worked for several months to get a website to accept my link as part of their dealer locator tool. The link to my site was published a few months ago; however, I don't think Google has found it, and the reason could be that you have to type in your zip code to get the link to appear. This is the website I am referencing: http://www.ranchhand.com/dealers.php?zip=78070&radius=20 (my website is www.rangeroffroad.com). Is there any way for Google to index the link? Any ideas?