Better to Remove Toxic/Low Quality Links Before Building New High Quality Links?
-
Recently, an SEO audit from a reputable SEO firm identified almost 50% of the incoming links to my site as toxic, 40% as suspicious, and only 5% as good quality. The SEO firm believes it is imperative to remove the links from the toxic domains.
Should I remove the toxic links before building new ones? Or should we first work on building new links before removing the toxic ones?
My site has only 442 subdomains with links pointing to it. I am concerned that there may be a drop in rankings if the links from the toxic domains are removed before new quality ones are in place.
For a bit of background, my site has a Moz Domain Authority of 27 and a Moz Page Authority of 38. It receives about 4,000 unique visitors per month through organic search. About 150 of the subdomains that link to my site have a Majestic Citation Flow of zero and a Majestic Trust Flow of zero, so they are pretty low quality. However, I don't know whether I am better off removing them first or building new quality links before I disavow more than a third of the links to the site.
Any ideas?
Thanks,
Alan -
It's difficult to say whether these links are helping or hindering at the moment. By the sounds of it, the report the SEO agency ran was a Link Detox report, and the results are purely automated. While I have a lot of faith in the tools from LRT, I wouldn't be 100% confident in the results, as some of the links may be false positives. However, those will most likely account for only a small proportion.
Ultimately, there is a potential problem even if you haven't received a warning yet, so I would look to remove as many of these links as possible, as soon as possible.
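If outreach to the offending webmasters fails and you end up filing a disavow with Google, bear in mind it's just a plain text file: one URL or `domain:` entry per line, with `#` for comments. A minimal example (the domain names here are placeholders, not real flagged sites):

```text
# Whole domains flagged as toxic in the audit
domain:spammy-directory.example
domain:link-farm.example
# A single bad URL, rather than disavowing its whole domain
http://www.forum.example/profile/some-old-profile-link
```

Disavowing a whole domain is usually safer than listing individual URLs, since link networks tend to link from many pages.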
-
Well,
If, as you say, a lot of the links have a Majestic Citation Flow of zero and a Majestic Trust Flow of zero, then I would not worry about removing them affecting your DA or PA. They pass no trust.
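As a side note, you don't need to eyeball all 442 referring domains by hand: if you export the Majestic data to CSV, a few lines of script can pull out the zero/zero domains and write them straight into disavow format. A rough sketch (the column names Domain, CitationFlow and TrustFlow are assumptions about the export layout; adjust them to match your actual file):

```python
import csv

def zero_flow_domains(csv_path):
    """Collect referring domains whose Majestic Citation Flow and
    Trust Flow are both zero (likely disavow candidates)."""
    candidates = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["CitationFlow"]) == 0 and int(row["TrustFlow"]) == 0:
                candidates.append(row["Domain"])
    return candidates

def write_disavow(domains, out_path="disavow.txt"):
    """Write the candidates in Google's plain-text disavow file format."""
    with open(out_path, "w") as f:
        f.write("# Domains with zero Citation Flow and zero Trust Flow\n")
        for d in domains:
            f.write(f"domain:{d}\n")
```

Review the resulting list manually before uploading anything; zero/zero scores flag likely junk, but a new legitimate site can also score zero simply because Majestic hasn't crawled it much yet.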
I would definitely get rid of the sites you think are earning you a penalty. If you need to file a reconsideration request to remove a manual penalty, Google will want to see that you made an effort to clean up the bad link profile, not just that you added some new links.
Likewise, if your site is under a manual penalty, the new links you build may not pass their full value to your site.
I would spend a few weeks cleaning up the bad links, then after that split your time half and half between building new high-quality links and removing the remaining bad ones.
Hope that helps.
-
I think it really does not matter, but personally I would focus on cleaning up your link profile first, then on producing quality content to earn natural links and social shares.
You can reach out to other webmasters to build links, but honestly, if you have a manual penalty, I would focus my time and effort on getting the penalty lifted first.