Suggestions on Link Auditing a 70,000 URL list?
-
I have a website with nearly 70,000 incoming links, since it's a somewhat large site that has been online for 19 years.
The rate I was quoted for a link audit from a reputable SEO professional was $2 per link, and clearly I don't have $140,000 to spend on a link audit!
I was thinking of asking you guys for a tutorial that is the gold standard for link-auditing checklists and doing it myself. But then I thought maybe it's easier to shorten the list by knocking out all the "obviously good" links first. My only concern is that I need to be 100% certain they are good links.
Is there an "easiest approach" to take for shortening this list, so I can give it to a professional to handle the rest?
-
Hi! - I wrote this guide a few years ago on penalty recovery which may help you as it contains a lot of methods around auditing the links - https://moz.com/blog/ultimate-guide-to-google-penalty-removal
If we were to approach a project with 70k URLs, we'd take the following steps:
- Pull all the URLs into a Spreadsheet
- Split the URLs into domains
- Filter the URLs and search for common spammy words, e.g. 'Link', 'Best', 'Free', 'Cheap', 'Dir', 'SEO', etc. (mark as spam accordingly)
- Run contact finding across all URLs using a tool such as URL Profiler with Whois Lookups
- Filter by contact name and find duplicates (mark as spam accordingly)
- Filter by website type and mark as spam accordingly
- Manually check remaining links
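The first few steps above can be sketched without a spreadsheet at all. This is only a rough illustration, not a substitute for a proper tool: the input list and the keyword set are assumptions, and substring matching on these words will throw up false positives you'd still review by hand.

```python
# Rough sketch of the triage steps: group backlink URLs by domain and
# flag any URL containing a common spammy keyword. The URL list below
# is a made-up example; in practice you'd load your backlink export.
from urllib.parse import urlparse
from collections import defaultdict

SPAM_WORDS = ("link", "best", "free", "cheap", "dir", "seo")

def triage(urls):
    """Return {domain: [(url, flagged_as_spam), ...]}."""
    by_domain = defaultdict(list)
    for url in urls:
        domain = urlparse(url).netloc.lower()
        flagged = any(word in url.lower() for word in SPAM_WORDS)
        by_domain[domain].append((url, flagged))
    return by_domain

urls = [
    "http://cheap-directory.example/page1",
    "http://industry-news.example/article",
]
for domain, entries in triage(urls).items():
    for url, flagged in entries:
        print(domain, "SPAM?" if flagged else "ok", url)
```

Working per domain rather than per URL is what makes this scale: one verdict often covers hundreds of sitewide links from the same source.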
By working through the list by domain, you'll rule out thousands of spammy links very quickly, though 70k will ultimately take a few solid days of work.
Hope this helps,
Lewis
-
Have you looked at www.monitorbacklinks.com? It's a good tool.
-
Hello,
Although it's important to do a link audit if you feel you have been penalized, for some sites a link audit isn't necessary. That said, if you feel you need a link audit, there are a few options. Ideally, you would go through each link and review it to see how it may be impacting your site, but often site owners don't have the time to do this.
- Review obvious links - Grab 50-100 links at a time and take a quick glance at each one to determine whether it belongs on a list of potentially bad links. This way you can quickly set aside links you know are not hurting your rankings. Over time you can slowly work through your list and hammer out which links are bad.
- Focus on spam analysis - Run your site through Moz's Open Site Explorer and review the spam analysis. You're not going to get every single link here, but you can get an idea of which links are lower quality.
- Look into other companies - $2 per link is quite high, and there are other companies out there that will do a link audit, removal, and disavow for much less. If you would like a quote please contact us. Look into multiple options, don't get sold on just what one place tells you.
Hope this is helpful, if you have any additional questions please feel free to ask.
Chris
-
$2 per link is very expensive when you are looking at so many, especially as there is a big part of this that can be automated (hint: This should cost you no more than about $5-$10k if outsourced).
Linda has given you some good tips there, but I do agree that you need to tread carefully because you can often go too far and end up jumping out of the frying pan and into the fire.
It really does help to first gather all of the links from as many sources as you can and as already mentioned, create your de-dupe list. Depending on who you speak to at this point, there are different ways to go through the data and start to segment the links into those you know that are dangerous, those that are perhaps a bit of a grey area, and those that are safe.
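Building that de-dupe list from multiple sources is mechanical enough to script. A minimal sketch, assuming each tool's export has been reduced to one URL per line (file names and normalization rules here are my own assumptions, not any tool's output format):

```python
# Merge backlink exports from several tools into one de-duplicated list.
# Normalization (lowercase, strip trailing slash) is a simplifying
# assumption; real audits may also want to strip "www." or URL params.
def merge_exports(*url_lists):
    seen = set()
    merged = []
    for urls in url_lists:
        for url in urls:
            key = url.strip().lower().rstrip("/")
            if key and key not in seen:
                seen.add(key)
                merged.append(url.strip())
    return merged

# e.g. one list per source: Search Console, Ahrefs, Majestic...
combined = merge_exports(
    ["http://Example.com/page/", "http://other.example/a"],
    ["http://example.com/page"],
)
print(len(combined))  # duplicates across sources collapse to one entry
```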
Cheers,
Andy
-
I concentrate on the "most normal or typical sites will not need to use this tool" part, myself. (Though it sounds like you may not fall into that category.)
So then it's back to downloading as comprehensive a list of links as you can by using various sources and looking them over. (Also, in the past I have used LinkResearchTools to get an overview--it isn't cheap but it is a lot less than $140,000.)
-
Yes. We have confirmed with Sucuri that there was a concerted, intentional spam campaign against our site in 2013 that has since destroyed our rankings. Though Google hasn't given us any warnings, Sucuri had us on a blacklist because of it, and was kind enough to remove us without any cost or obligation on our part to sign up. They also provided us with a list of some of the most offending links so I could disavow them.
With up to 70,000 total, I am confident there are more, and to be honest, I see no reason to "leave some". Or leave any. I believe Google's warning should focus on this part: "...if used incorrectly". That means ... simply use it correctly. And disavow bad links, period. That's my take at least.
-
First, are you sure you need a link audit? Google is pretty good at ignoring regular spammy links that get picked up over time by large sites, as they say in their "Disavow backlinks" help page.
If you think there is a cause for concern, Moz's own Open Site Explorer can give you a list of incoming links that includes a spam score for those links, which can be used as a first pass.
The general drill for a manual link audit is to find all of the links you can (search console, moz, ahrefs, majestic, etc.) and create a de-duped list. From there, the "definitely good links" are usually easy to spot--you will recognize them from your industry or from other authoritative sources. And you will probably recognize the spammy "Get Rich/Viagra" backlinks as well. (If you sort your list by domain, it is easier to pick them out as a group.)
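The sort-by-domain pass can also be done as a quick count, so domains with many sitewide links surface at the top of the review queue. A small sketch under the same assumption of a de-duped URL list (the example URLs are invented):

```python
# Count backlinks per linking domain so repeat sources can be judged
# once as a group instead of URL by URL.
from urllib.parse import urlparse
from collections import Counter

def links_per_domain(urls):
    return Counter(urlparse(u).netloc.lower() for u in urls)

counts = links_per_domain([
    "http://spammy.example/a",
    "http://spammy.example/b",
    "http://trade-journal.example/post",
])
for domain, n in counts.most_common():
    print(domain, n)
```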
The rest are the ones to look at more closely.
But as I said to start, unless you think you are being penalized, tread lightly when it comes to disavowals.
To quote from Google [about disavowal]:
"This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool."