I want to Disavow some more links - but I'm only allowed one .txt file?
-
Hey guys,
Wondering if you good people could help me out on this one?
A few months back (June 19) I disavowed some links for a client by uploading a .txt file listing the offending domains.
However, I've recently noticed some more dodgy-looking domains linking to my client's site, so I went about creating a new disavow list.
When I went to upload this new list I was informed that I would be replacing the existing file.
So, my question is, what do I do here?
Make a new list with both the old and the new domains I plan on disavowing and replace the existing one?
Or: just replace the existing .txt file with the new one, on the basis that Google has already recognised the links I disavowed before?
-
Cheers, Tom.
Exactly the answer I needed!
-
Hi Matthew
You want to add to your current list. So you'll want to upload a file that contains everything you previously disavowed in addition to the new sites you want to disavow.
It's probably worth putting in a description line (lines starting with # are treated as comments in disavow files), like:
# These entries were uploaded on 19/09/2013 following a further link audit
domain:badsite.com
badsite2.com/badpage
(A domain: line disavows every link from that domain; a bare URL disavows just that one page.)
And so on. Showing progressive evidence of action taken is always a good sign, I feel.
If you uploaded the new file without the old links, it would for all intents and purposes "de-disavow" those links, so you want to keep them in there.
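If you're keeping the lists as separate files, here's a minimal sketch of one way to merge them (the file names are hypothetical): it keeps every old entry, appends only the genuinely new ones, and stamps the result with a dated comment.

from datetime import date

def load_entries(path):
    # Read a disavow file, skipping blank lines and "#" comment lines
    with open(path, encoding="utf-8") as f:
        stripped = (line.strip() for line in f)
        return [line for line in stripped if line and not line.startswith("#")]

old_entries = load_entries("disavow-june.txt")  # hypothetical: the earlier upload
new_entries = load_entries("disavow-new.txt")   # hypothetical: the new audit

# Keep every old entry and append only the new ones, preserving order
merged = list(dict.fromkeys(old_entries + new_entries))

with open("disavow-combined.txt", "w", encoding="utf-8") as f:
    f.write(f"# Combined disavow list, updated {date.today():%d/%m/%Y}\n")
    f.write("# following a further link audit\n")
    f.write("\n".join(merged) + "\n")

Upload the combined file and the replacement loses nothing.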
Hope that helps.
Related Questions
-
Links Not Detected by Moz, Ahrefs, GSC - Are These Quality Links?
Our SEO provider has been creating content (6 blog posts per month as well as building page write-ups) and has been promoting that content. Several links per month have been created as a result of this effort, many of them from commercial real estate publications. I am concerned that the quality of these links is not high enough to improve our ranking. Most of the links do not appear in Ahrefs, Google Search Console, or Moz. Is this a red flag that these links are weak?

Ranking and traffic on the site have improved considerably since this provider began the project in April of 2019. They have been writing about 30 pages about New York City commercial buildings each month, in addition to 4 short blog posts and 2 extremely well-researched and authoritative blog posts. My concern is that the links are not of sufficient quality to result in increased ranking, and that the improvement in ranking is due solely to the addition of new content rather than the creation of these links. Basically, that I am incurring the ongoing cost of a link building campaign with little to no benefit. That being the case, I would shift resources to creating and improving content rather than developing links with little value.

A sample of the links is below. I would greatly appreciate some feedback as to whether these are in fact helpful to the domain authority, reputation, and ranking of our website.

https://patch.com/new-york/bayside/bayside-queens-priciest-area-retail-office-space-study
https://qns.com/story/2019/12/04/these-commercial-streets-in-queens-were-among-the-most-expensive-in-2019/
https://patch.com/new-york/brooklyn/flatbush-ave-priciest-retail-spot-outside-manhattan-study
http://thejewishvoice.com/2019/12/07/nycs-most-expensive-commercial-streets-neighborhoods-in-2019-would-surprise-you/
https://atalyst.com/investment-banking-interview-metro-manhattan/

Thanks,
Alan
Intermediate & Advanced SEO | Kingalan1
-
Google WMT/Search Console showing thousands of links in "Internal Links"
Hi, one of our blog posts shows thousands of internal links in Search Console, yet it lists only 2 pages it is actually linked from. How can it have so many internal links? I don't see any. Thanks, Satish
Intermediate & Advanced SEO | vtmoz
-
What is the future of link building? Any link building experts here?
Hey everyone, it's Muhammad Umair Ghufran. I have one question about link building. As far as I know, Google loves quality content, but link building still ranks some low-quality websites, right? So, what is the future of link building? Please explain in depth, with references, for better understanding. Thanks. Regards, Muhammad Umair Ghufran
Intermediate & Advanced SEO | muhammadumairghufran
-
Same-Page Anchor Links vs. Internal Links (Cannibalisation)
Hey Mozzers, I have a very long article page that supports several of my sub-category pages. It has sub-headings that link out to the relevant pages. However, the article is very long, and to make it easier to find the relevant section I was debating adding in-page anchor links in a bullet list at the top of the page for quick navigation:

PAGE TITLE
Keyword 1
Keyword 2
etc.

<a name="Keyword1"></a> Keyword 1 Content
<a name="Keyword2"></a> Keyword 2 Content

Because of the way my predecessor wrote this article, its section headings are the same as the sub-categories they link out to and boost (not ideal, but an issue I will address later). What I wondered is whether having the in-page anchors would confuse the SERPs, because they would be linking with the same keyword. My worry is that by increasing the usability of the article this way I also confuse the SERPs: first I tell them that this section of my page talks about keyword 1, then from within that article I tell them that a different page entirely is about the same keyword. Would linking like this confuse the SERPs, or are in-page anchor links looked upon and dealt with differently?
Intermediate & Advanced SEO | ATP
-
Disavow first (and link removal outreach second) as a tactic?
I need to remove/disavow hundreds of domains due to an algorithmic penalty. Has anyone disavowed first and done the outreach second as a tactic? The reasons I was considering this are as follows: most of the links are from spammy websites that are unlikely to have monitored accounts or available contact details; my business is incredibly seasonal, only being easily profitable for half of the year; and the season starts next month, so the window of opportunity to get it done is small. If there's a Penguin update before I get it done, it could be very bad news. Any thoughts would be much appreciated. (Incidentally, if you are interested, I also posted about it here: http://moz.com/community/q/honest-thoughts-needed-about-link-building-removal)
Intermediate & Advanced SEO | Coraltoes77
-
Why is my site's 'Rich Snippets' information not being displayed in SERPs?
We added hRecipe microformat data to our site in April and then migrated to the Schema.org Recipe format in July, but our content is still not being displayed as rich snippets in search engine results. Our pages validate okay in the Google Rich Snippets Testing Tool. Any idea why they are not being displayed in SERPs? Thanks.
Intermediate & Advanced SEO | Techboy
-
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to preserve crawl budget and so speed the rate at which Google could get our millions of pages back into the index, by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us:

http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions

Specifically, I'm concerned that a) we're blocking the flow of link juice, and b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'. Thoughts? Kurus
Intermediate & Advanced SEO | kurus
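For anyone weighing the same trade-off, here's a minimal sketch, using Python's standard-library robots.txt parser, of how restrictions like the ones described above can be sanity-checked. The rules and URL patterns are assumptions for illustration, not the poster's actual configuration.

from urllib.robotparser import RobotFileParser

# Hypothetical rules blocking paginated and sorted variants of search results
robots_txt = """
User-agent: *
Disallow: /search/page/
Disallow: /search?sort=
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The first page of results stays crawlable; deeper pages and sort variants
# do not. Note: urllib.robotparser does plain prefix matching and does not
# support the * wildcards that Googlebot understands.
for url in [
    "https://example.com/search",
    "https://example.com/search/page/2",
    "https://example.com/search?sort=price",
]:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", verdict)

Running a list of representative URLs through a check like this makes it easier to see exactly what a given set of restrictions gives up before deciding whether the crawl-budget saving is worth it.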