Tools necessary for a technical audit of a website that has penalties and needs remediation?
-
I am being tested for a job interview and need to prove and/or disprove that a website has issues. I am familiar with Moz tools, but I'm not sure of the procedure for this request, and I am not finding anything online. The client will give me a website and I will conduct the audit.
What tools would you use?
What exactly should I be looking for?
What are some obvious fixes?
WHERE CAN I LEARN MORE?
-
Joseph,
Yes, GWT = Google Webmaster Tools
In GA, you will want to take a look at traffic levels, bounce rate, pageviews, time on page, etc. If any of these metrics drops sharply (over a couple of days), you can be fairly sure something is going on from a penalization standpoint. That, or the GA tracking code may have been tampered with during a site redesign. The way to tell the tracking code is the culprit is that your rankings stay relatively high while your measured traffic drops drastically. We had a client go through this just this week - it's a small problem to fix, but it can make your heart race when you think it's a penalty.
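To make "a sharp drop over a couple of days" concrete, here is a minimal sketch for scanning a daily GA metric you've exported (the numbers and the 50% threshold are hypothetical - tune both for the site you're auditing):

```python
# Flag sudden drops in a daily GA metric (e.g., sessions).
def flag_sudden_drops(daily_values, window=3, drop_threshold=0.5):
    """Return the day indices where the average of a `window`-day span
    falls below `drop_threshold` times the preceding window's average."""
    flags = []
    for i in range(2 * window, len(daily_values) + 1):
        prior = daily_values[i - 2 * window:i - window]
        recent = daily_values[i - window:i]
        prior_avg = sum(prior) / window
        recent_avg = sum(recent) / window
        if prior_avg > 0 and recent_avg < drop_threshold * prior_avg:
            flags.append(i - window)  # index where the drop window begins
    return flags

# Hypothetical daily sessions: steady, then a cliff on day 5.
sessions = [1200, 1150, 1250, 1180, 1220, 400, 380, 390]
print(flag_sudden_drops(sessions))  # -> [5]
```

A gradual algorithmic decline won't trip a threshold like this, which is exactly the point: a cliff suggests a penalty or broken tracking code, while a slow slide suggests something else.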
GWT will inform you of any penalization taken against your website (if it is manual). If it is algorithmic, your only real warning will be your ranking and traffic drops. GWT is also fairly good for security issues, but these may not be called out directly. A great way to determine whether a site is at risk or hacked is to check the link profile - if there is an unnaturally large number of incoming spammy links, there are good odds you are the target of a negative SEO attack, or the site has been hacked and is being used for spam. Use a site: search to determine whether new pages are being created on your site and what they are targeting.
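To put numbers on "an unnaturally large number of incoming spammy links," here is a sketch that flags months with a suspicious spike in new referring domains, assuming a hypothetical (month, domain) export from a backlink tool like Ahrefs or Majestic (the 3x spike factor is arbitrary):

```python
from collections import Counter

def flag_link_spikes(backlinks, spike_factor=3):
    """Flag months where the count of new referring domains exceeds
    spike_factor times the average of all preceding months."""
    per_month = Counter(month for month, _domain in backlinks)
    months = sorted(per_month)
    flagged = []
    for i, month in enumerate(months[1:], start=1):
        prior_avg = sum(per_month[m] for m in months[:i]) / i
        if per_month[month] > spike_factor * prior_avg:
            flagged.append(month)
    return flagged

# Hypothetical export: ~10 new domains/month, then 90 spammy ones.
links = [("2015-01", f"d{i}.com") for i in range(10)] + \
        [("2015-02", f"e{i}.com") for i in range(12)] + \
        [("2015-03", f"spam{i}.biz") for i in range(90)]
print(flag_link_spikes(links))  # -> ['2015-03']
```

A flagged month is only a starting point - you would still eyeball the actual domains to judge whether the spike is spam, a negative SEO attack, or legitimate press coverage.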
These are worst-case scenarios, so I don't know if you will be tested on them. More likely you will have to make adjustments to some basic on-site ranking factors like H1s or title tags.
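For those basic on-site checks, a short standard-library script can surface title and H1 problems on a page before you reach for a full crawler. This is a minimal sketch; the 60-character title limit is a common rule of thumb, not an official cutoff:

```python
from html.parser import HTMLParser

class OnPageAuditor(HTMLParser):
    """Collect the <title> text and count <h1> tags in raw HTML."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    parser = OnPageAuditor()
    parser.feed(html)
    issues = []
    if not parser.title.strip():
        issues.append("missing title")
    elif len(parser.title) > 60:
        issues.append("title may be truncated in SERPs (>60 chars)")
    if parser.h1_count == 0:
        issues.append("missing H1")
    elif parser.h1_count > 1:
        issues.append("multiple H1s")
    return issues

print(audit("<html><head><title>Widgets</title></head>"
            "<body><h1>A</h1><h1>B</h1></body></html>"))  # -> ['multiple H1s']
```

In practice you'd feed this the HTML of each key landing page and extend the checks (meta description, canonical tag, etc.).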
Feel free to touch base any time if you need additional tips - you can PM me anytime.
Best of luck!
-
In GA - traffic drop-offs; in fact, any large changes in any metrics.
GWT - manual actions, system messages, security issues
-
Dmitrii! Thanks very much!
Any pointers and/or specific areas to pay attention to in GA and GWT? Joe
-
Rob, Thanks very much for this!
GWT is Google Webmaster Tools?
Also, Any pointers and/or specific areas to pay attention to in GA and GWT?
-
Hi.
My belief is that for ANY good TECHNICAL audit you will need access to Google Analytics and GWT - because it's TECHNICAL, and these tools will tell you everything about penalizations, etc.
Yes, you can use Moz, but if you don't have access to historical rankings data, there is not much use in trying to determine anything about penalizations.
The rest is covered by other comments - Ahrefs/Majestic/OSE for the backlink profile, Moz Rank Tracker for current rankings, WebPageTest and PageSpeed Insights for loading times.
And the most important tool of all - your own head. Just look at the website. If it has been penalized, there is a good reason for it, and you'd be able to see it just by looking at the site's architecture, content, etc.
-
Hi Joseph,
I will give you a list of the tools I use/have used for website audits, and you can pick and choose what you think might be useful:
On-Site - These will help you with on-site penalties (Panda)
1. SEMrush (http://www.semrush.com/)
Beautiful on-site auditing tool. It shows you the site from the perspective of Googlebot and outlines all issues, with the affected URLs included, so you know where to go and what to fix with minimal effort. It is a monthly subscription service and also incorporates keyword-tracking software, among other things. Deliverables include PDFs and CSVs of data.
2. SEO Powersuite SiteAuditor (www.seopowersuite.com)
This is a one-time cost piece of software. It is less flexible and in-depth than SEMrush in my opinion, but it gets the job done. The primary selling point is that SEO Powersuite provides you with multiple SEO tools for a one-time fee.
Off-Site (Link Profile) - These will help you with off-site penalties (Penguin)
1. Majestic (https://majestic.com/)
Great link-profile tool that allows you to conduct link audits of a website. I use this tool to develop competition analysis reports, and to determine the value of links, whether they are worth keeping, what links I might pursue from competitors, etc. It is a monthly service and you can compare your client's site to others in their niche/industry directly from an off-site perspective. Reports are in CSV format.
2. Ahrefs (https://ahrefs.com)
This is a more in-depth tool that many of my co-workers prefer compared to Majestic. They perform much the same service, but Ahrefs has a more extensive database. It is a more "techy" option than Majestic, which is geared more for client presentation.
You will also want to make sure that they are providing you with all the tools you need to conduct the technical audit - GA access, GWT access, etc.
Hope this helps and let me know if you need any further pointers!
Rob
-
Moz tools will show you all of the crawl issues (which include crawl errors, duplicate content, meta tag errors, robots.txt errors, and a lot more), spammy links, on-page SEO, bad reviews, and many other problems.
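For the robots.txt errors specifically, Python's standard library can verify whether key URLs are accidentally blocked. A minimal sketch with a hypothetical robots.txt (in a real audit you would fetch the site's live file instead):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Spot-check a few URLs that matter for the audit.
for url in ["https://example.com/products/widget",
            "https://example.com/admin/login",
            "https://example.com/search?q=widgets"]:
    allowed = rp.can_fetch("*", url)
    print(url, "->", "crawlable" if allowed else "blocked")
```

Running a list of the site's money pages through a check like this quickly catches the classic mistake of a Disallow rule left over from staging.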
I'm not sure what else they would be looking for. Those tools will, more than likely, bring up plenty of errors; then you just need to learn/explain how to fix them.