What do you use for a site audit?
-
What tools do you use for conducting a site audit? I need to do an audit on a site, and the SEOmoz web crawler and on-page optimization reports will take days, if not a full week, to return any results.
In the past I've used other tools that I could run on the fly; they would return broken links, missing heading tags, keyword density, server information, and more.
Curious as to what you all use and what you might recommend using in conjunction with the Moz tools.
-
I use the following tools:
- Xenu - identifies broken links
- GSite Enterprise Crawler - identifies on page issues
- Google Cache, Google Webmaster Tools - find crawling issues
- Scritch - finds server/platform type
- Ahrefs, Majestic, OSE - for link diagnostics
- SEO Book Bulk Server Header Tool
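If you want a quick, scriptable stand-in for the bulk server-header check, a minimal Python sketch using only the standard library might look like this. (The user-agent string and the ok/redirect/broken buckets are my own illustrative choices, not something any of the tools above exposes.)

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def fetch_status(url, user_agent="Mozilla/5.0 (audit-sketch)"):
    """Return the HTTP status code for a URL, or None on connection failure.
    Note: urlopen follows redirects by default, so this reports the final hop;
    a stricter header tool would inspect the first response directly."""
    req = Request(url, method="HEAD", headers={"User-Agent": user_agent})
    try:
        with urlopen(req, timeout=10) as resp:
            return resp.status
    except HTTPError as e:
        return e.code   # 4xx/5xx responses still carry a status code
    except URLError:
        return None     # DNS failure, timeout, refused connection

def group_by_health(results):
    """Bucket (url, status) pairs into ok / redirect / broken for reporting."""
    buckets = {"ok": [], "redirect": [], "broken": []}
    for url, status in results:
        if status is None or status >= 400:
            buckets["broken"].append(url)
        elif 300 <= status < 400:
            buckets["redirect"].append(url)
        else:
            buckets["ok"].append(url)
    return buckets
```

Feed `group_by_health` the output of `fetch_status` over your URL list and you get the same at-a-glance broken-link summary the desktop tools give you.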
-
Hi Anthony,
I use a combination of tools for audits. SEOmoz is great for client-facing reports and for tracking issues over time. The downside is that you don't have that "on-demand" capability to crank out a full audit the instant you need it.
For on-demand audits, I use Screaming Frog, which is free for up to 500 URLs and $99 for an unlimited license. It's worth every penny and returns a full range of technical SEO data, which you can export and manipulate in Excel.
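Once you've exported the crawl to CSV, you don't even need Excel for the common checks. A small Python sketch could flag missing or overlong titles, for example (the `Address` and `Title 1` column names and the 65-character limit are assumptions here, so adjust them to match your actual export):

```python
import csv
import io

def flag_title_issues(csv_text, max_len=65):
    """Scan an exported crawl CSV and flag missing or overlong page titles."""
    issues = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        title = (row.get("Title 1") or "").strip()
        if not title:
            issues.append((row["Address"], "missing title"))
        elif len(title) > max_len:
            issues.append((row["Address"], "title too long"))
    return issues

# Tiny illustrative export with one good page and one missing title:
sample = """Address,Title 1
https://example.com/,Home
https://example.com/about,
"""
print(flag_title_issues(sample))  # [('https://example.com/about', 'missing title')]
```

The same pattern extends to meta descriptions, status codes, or any other exported column.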
-
Although it has many limitations, I use http://marketing.grader.com periodically. It's fast and covers the basics.
Related Questions
-
Our crawler was not able to access the robots.txt file on your site.
Good morning. Yesterday, Moz gave me an error saying it wasn't able to find our robots.txt file. However, this is a new occurrence; we've used Moz and its crawling ability many times before, and I'm not sure why the error is happening now. I validated that the redirects and our robots page are operational and that nothing in our robots.txt is disallowing Roger. Any advice or guidance would be much appreciated. https://www.agrisupply.com/robots.txt Thank you for your time. -Danny
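One quick way to rule out a rules problem (as opposed to a fetch problem, like a 5xx or firewall block on the robots.txt request itself) is to run the file's contents through Python's standard-library robots.txt parser and ask what a given user-agent is allowed to crawl. This example uses a made-up robots.txt, not the real agrisupply.com one:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: everything allowed except /checkout/.
robots_txt = """\
User-agent: *
Disallow: /checkout/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("rogerbot", "https://www.example.com/products/"))      # True
print(rp.can_fetch("rogerbot", "https://www.example.com/checkout/cart"))  # False
```

If the parser says rogerbot is allowed but the crawler still reports it can't access the file, the problem is likely in how the file is served (redirects, status codes, bot blocking), not in the rules themselves.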
Moz Pro | Danny_Gallagher
-
On Page Grader is not working on a specific site
Hello.
Moz Pro | livedigm
When I try to use On Page Grader on a specific site, I get an error message:
"Page Optimization Error
There was a problem loading this page. Please make sure the page is loading properly and that our user-agent, rogerbot, is not blocked from accessing this page."
example: https://www.livedigm.com The site's robots.txt settings are fine, and I don't think there's any blocking factor, but On Page Grader cannot crawl the site.
The campaign crawler works well on the site; only On Page Grader is failing. What should I change in my server or site settings so that it can crawl my site?
I'm using WordPress on a Cloudways / DigitalOcean (Singapore) server. Thank you.
-
Cleaning Up Bad 301 External Links From Old Site
A relatively new site I'm working on has been hit really hard by Panda, due to over-optimization of 301'd external links from an old site, whose anchor text includes exact keyword phrases. Prior to the Panda update, all of these 301 redirects worked like a charm, but now the 301s from the old URL are killing the new site, because all the hypertext links include exact keyword matches.
A couple of weeks ago, I took the old site completely down and removed the .htaccess file, removing the 301s and in effect breaking all of these bad links. Consequently, if one were to type the old URL, you'd be directed to the domain registrar, not redirected to the new site. My hope is to eliminate most of the bad links, which are mostly on spammy sites that aren't worth being linked from, and that these links will eventually disappear from Google.
My concern is that this might not work, because Google won't re-index these links; once they're indexed, they'll be there forever. That fear leads me to conclude I should hedge my bets and just disavow these sites using the disavow tool in WMT. IMO, the disavow tool is an action of last resort, because I don't want to call attention to myself, since this site doesn't have a manual penalty inflicted on it. Any opinions or advice would be greatly appreciated.
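While waiting on a decision, it can help to verify exactly what the old URLs return now, without letting the client follow redirects, so a lingering 301 shows up as a 301. A rough standard-library sketch (the function names and labels are illustrative; this is not a Moz or Google feature):

```python
import http.client
from urllib.parse import urlparse

def first_hop(url, timeout=10):
    """Return (status, Location header) for the FIRST response only,
    without following redirects. Requires network access to run."""
    parts = urlparse(url)
    Conn = (http.client.HTTPSConnection if parts.scheme == "https"
            else http.client.HTTPConnection)
    conn = Conn(parts.netloc, timeout=timeout)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")

def describe(status):
    """Human-readable label for the audit spreadsheet."""
    if status is None:
        return "no response"
    if status in (301, 308):
        return "permanent redirect (still passing signals)"
    if status in (302, 307):
        return "temporary redirect"
    if status >= 400:
        return "dead (link equity dropped)"
    return "resolves directly"
```

Running `first_hop` over the old URLs and labeling the results with `describe` confirms whether the redirects are truly gone before you reach for the disavow tool.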
Moz Pro | alrockn
-
Get into Google : New Sites
I have a brand new website; it was created 10 days ago. How long would it take for it to show up in search results? I understand that since the site is new, there are no sites sending it backlinks. Also, I have optimized the page for my keyword "xyz" and it received an A grade. The site does not appear even in the top 50 results. Please help me out. It is a one-page web application that needs to drive traffic to survive.
Moz Pro | dl_s
-
In Open Site Explorer, is it possible to use wildcards?
If I have a section on my website called /lists/ with articles in it, can I use wildcards in Open Site Explorer to find how many backlinks all the articles in that section have, and ideally which pages are most linked to? Something like www.example.com/lists/* to give the number of backlinks to all articles in that section and show which are most linked to. It would be a great feature to have! Cheers, Siimon
Moz Pro | SimonCh
-
Open Site Explorer listing criteria?
While checking link data for a potential client who has given me access to their Google Webmaster Tools account, I've noticed that Google lists over 150,000 "Links to Your Site," yet OSE reports only a tiny fraction of that number.
Moz Pro | G-DC
I have manually verified that there are in fact direct links from the sites that are listed in Webmaster Tools but not in OSE. Does OSE discard some links? Is there a reporting lag? Does anyone know why there would be such a large discrepancy?