Suggestions on Link Auditing a 70,000 URL list?
-
I have a website with nearly 70,000 incoming links; it's a somewhat large site that has been online for 19 years.
The rate I was quoted for a link audit from a reputable SEO professional was $2 per link, and clearly I don't have $140,000 to spend on a link audit!
I was thinking of asking you guys for a tutorial that is the Gold Standard for link-auditing checklists and doing it myself. But then I thought it might be easier to shorten the list by knocking out all the "obviously good" links first. My only concern is that I need to be 100% certain they are good links.
Is there an "easiest approach" to take for shortening this list, so I can give it to a professional to handle the rest?
-
Hi! - I wrote this guide a few years ago on penalty recovery which may help you as it contains a lot of methods around auditing the links - https://moz.com/blog/ultimate-guide-to-google-penalty-removal
If we were approaching a site with 70k URLs, we'd take the following steps:
- Pull all the URLs into a Spreadsheet
- Split the URLs into domains
- Filter the URLs and search for common spammy words, e.g. 'Link', 'Best', 'Free', 'Cheap', 'Dir', 'SEO', etc. (mark as spam accordingly)
- Run contact finding across all URLs using a tool such as URL Profiler with Whois Lookups
- Filter by contact name and find duplicates (mark as spam accordingly)
- Filter by website type and mark as spam accordingly
- Manually check remaining links
By working through the list domain by domain, you'll rule out thousands of spammy links very quickly, though 70k will ultimately take a few solid days of work.
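The keyword-filter and domain-grouping steps above can be sketched in a few lines of Python. This is a minimal illustration, assuming the backlinks have already been exported to a plain list of URLs; the spam-word list and sample URLs are illustrative, not a definitive set.

```python
# Group backlink URLs by domain and flag domains containing common
# spammy words, as a first triage pass before manual review.
from urllib.parse import urlparse
from collections import defaultdict

SPAM_WORDS = ("link", "best", "free", "cheap", "dir", "seo")

def triage(urls):
    """Group URLs by domain; flag any domain containing a spammy word."""
    by_domain = defaultdict(list)
    for url in urls:
        domain = urlparse(url).netloc.lower()
        by_domain[domain].append(url)
    flagged = {d: us for d, us in by_domain.items()
               if any(w in d for w in SPAM_WORDS)}
    return by_domain, flagged

urls = [
    "http://best-free-links.example/page1",
    "http://news.example.org/story",
    "http://cheap-seo-dir.example/listing",
]
by_domain, flagged = triage(urls)
print(len(by_domain), len(flagged))  # prints: 3 2
```

Flagged domains still deserve a quick eyeball before being marked as spam; keyword matching alone will produce false positives (e.g. a legitimate "bestbuy" domain).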
Hope this helps,
Lewis
-
Have you looked at www.monitorbacklinks.com? It's a good tool.
-
Hello,
Although it's important to do a link audit if you feel you have been penalized, for some sites a link audit isn't necessary. That said, if you feel you need one, there are a few options. Ideally, you would go through each link and review how it may be impacting your site, but often site owners don't have the time to do this.
- Review obvious links - Grab 50-100 links at a time and do a quick glance at each one to determine whether it belongs on a list of potentially bad links. This way you can quickly rule out links you know are not hurting your rankings, and over time you can slowly work through your list and pin down which links are bad.
- Focus on spam analysis - Run your site through Moz's Open Site Explorer and review the spam analysis. You won't catch every single link this way, but you can get an idea of which links are lower quality.
- Look into other companies - $2 per link is quite high, and there are other companies out there that will do a link audit, removal, and disavow for much less. If you would like a quote please contact us. Look into multiple options, don't get sold on just what one place tells you.
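A spam-score first pass like the one described above can be automated once you have a link export. This is a hedged sketch: the column names ("URL", "Spam Score") and threshold are assumptions about what a tool's CSV export might look like, not Moz's documented schema.

```python
# Read a backlink export CSV and keep only rows at or above a
# spam-score threshold, producing a shortlist for manual review.
import csv
import io

SAMPLE = """URL,Spam Score
http://good.example/page,2
http://spammy-links.example/dir,65
http://borderline.example/post,30
"""

def high_spam(csv_text, threshold=30):
    """Return URLs whose spam score meets or exceeds the threshold."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["URL"] for row in reader
            if int(row["Spam Score"]) >= threshold]

suspects = high_spam(SAMPLE)
print(suspects)  # the 65 and 30 rows only
```

In practice you would open the real export with `open(path)` instead of the inline sample, and tune the threshold to how aggressive you want the shortlist to be.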
Hope this is helpful, if you have any additional questions please feel free to ask.
Chris
-
$2 per link is very expensive when you are looking at so many, especially as a big part of this can be automated (hint: this should cost you no more than about $5-$10k if outsourced).
Linda has given you some good tips there, but I do agree that you need to tread carefully because you can often go too far and end up jumping out of the frying pan and into the fire.
It really does help to first gather all of the links from as many sources as you can and, as already mentioned, create your de-duped list. Depending on who you speak to at this point, there are different ways to go through the data and start to segment the links into those you know are dangerous, those that are perhaps a bit of a grey area, and those that are safe.
Cheers,
Andy
-
I concentrate on the "most normal or typical sites will not need to use this tool" part, myself. (Though it sounds like you may not fall into that category.)
So then it's back to downloading as comprehensive a list of links as you can by using various sources and looking them over. (Also, in the past I have used LinkResearchTools to get an overview--it isn't cheap but it is a lot less than $140,000.)
-
Yes. We have confirmed with Sucuri that there was a concerted, intentional spam campaign against our site in 2013 that has since destroyed our rankings. Though Google hasn't given us any warnings, Sucuri had us on a blacklist because of it, and was kind enough to remove us without any cost or obligation on our part to sign up. They also provided us with a list of some of the most offending links so I could disavow them.
With up to 70,000 total, I am confident there are more, and to be honest, I see no reason to "leave some". Or leave any. I believe Google's warning should focus on this part: "...if used incorrectly". That means ... simply use it correctly. And disavow bad links, period. That's my take at least.
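Once the bad links are identified, building the actual disavow file is mechanical. This sketch uses Google's documented disavow format (one `domain:` entry per line, `#` for comments); the domain list itself is illustrative.

```python
# Turn a list of flagged domains into a disavow file in the format
# accepted by Google's disavow links tool.
flagged_domains = ["spammy-links.example", "cheap-seo-dir.example"]

lines = ["# Disavow file generated from link audit"]
# Sort and de-dupe so the file is stable across regenerations.
lines += [f"domain:{d}" for d in sorted(set(flagged_domains))]
disavow = "\n".join(lines) + "\n"
print(disavow)
```

Disavowing at the `domain:` level covers every URL on that domain, which is usually what you want for sites that are spammy throughout; individual URLs can be listed bare instead when only part of a domain is a problem.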
-
First, are you sure you need a link audit? Google is pretty good at ignoring regular spammy links that get picked up over time by large sites, as they say in their "Disavow backlinks" help page.
If you think there is a cause for concern, Moz's own Open Site Explorer can give you a list of incoming links that includes a spam score for those links, which can be used as a first pass.
The general drill for a manual link audit is to find all of the links you can (search console, moz, ahrefs, majestic, etc.) and create a de-duped list. From there, the "definitely good links" are usually easy to spot--you will recognize them from your industry or from other authoritative sources. And you will probably recognize the spammy "Get Rich/Viagra" backlinks as well. (If you sort your list by domain, it is easier to pick them out as a group.)
The rest are the ones to look at more closely.
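The "gather everything, de-dupe, sort by domain" drill above can be sketched as a merge of the per-tool exports. The lists here stand in for real exports from Search Console, Ahrefs, Majestic, etc., each reduced to plain URLs.

```python
# Merge backlink exports from several tools, de-dupe, and sort by
# domain so links from the same site group together for review.
from urllib.parse import urlparse

search_console = ["http://a.example/1", "http://b.example/2"]
ahrefs = ["http://b.example/2", "http://c.example/3"]
majestic = ["http://a.example/1", "http://c.example/4"]

merged = sorted(set(search_console + ahrefs + majestic),
                key=lambda u: (urlparse(u).netloc, u))
print(len(merged))  # 4 unique URLs, grouped by domain
```

Sorting on `(domain, url)` is what makes the spammy clusters jump out: one glance at a domain usually tells you whether all of its links can be marked good or bad at once.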
But as I said to start, unless you think you are being penalized, tread lightly when it comes to disavowals.
To quote from Google [about disavowal]:
"This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool."