I want to Disavow some more links - but I'm only allowed one .txt file?
-
Hey guys,
Wondering if you good people could help me out on this one?
A few months back (June 19) I disavowed some links for a client by uploading a .txt file listing the offending domains.
However, I've recently noticed some more dodgy-looking domains pointing at my client's site, so I went about creating a new disavow list.
When I went to upload this new list I was informed that I would be replacing the existing file.
So, my question is, what do I do here?
Make a new list with both old and new domains that I plan on disavowing and replace the existing one?
Or just replace the existing .txt file with a new file containing only the new domains, because Google has already recognised that I disavowed those older links?
-
Cheers, Tom.
Exactly the answer I needed!
-
Hi Matthew
You want to add to your current list, so you'll want to upload a file that contains everything you previously disavowed plus the new sites you want to disavow.
It's probably worth putting in a description line (comment lines in the disavow file start with #), like:
# These links were disavowed on 19/09/2013 following a further link audit
domain:badsite.com
badsite2.com/badpage
And so on. Showing progressive evidence of action taken is always a good sign, I feel.
If you uploaded the new file without the old links, for all intents and purposes it would "de-disavow" those links, so you wanna keep them in there.
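If you keep your old and new lists as local .txt files, a small script can merge and de-duplicate them before you upload the combined version. This is just a rough sketch rather than anything official; the file names are made up, and it assumes one domain:/URL entry per line as per the disavow format:

```python
# Rough sketch: merge an old and a new disavow list into one combined file.
# The file names below are placeholders - use whatever you actually called them.
from datetime import date

def read_entries(path):
    """Return the non-comment, non-blank lines from a disavow file."""
    entries = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#"):
                entries.append(line)
    return entries

old_entries = read_entries("old_disavow.txt")
new_entries = read_entries("new_disavow.txt")

# Keep the original order but drop duplicates so nothing is listed twice.
combined = list(dict.fromkeys(old_entries + new_entries))

with open("combined_disavow.txt", "w", encoding="utf-8") as f:
    f.write(f"# Combined disavow list, updated {date.today():%d/%m/%Y}\n")
    for entry in combined:
        f.write(entry + "\n")
```

Upload the combined file as the replacement and the old entries stay covered.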
Hope that helps.
Related Questions
-
Is link equity / link juice lost to a blocked URL in the same way that it is lost to a nofollow link?
Hi, if there is a link on a page that goes to a URL that is blocked in robots.txt, is the link juice lost in the same way as when you add nofollow to a link on a page? Any help would be most appreciated.
Intermediate & Advanced SEO | Andrew-SEO
Disavow File and SSL Conversion Question
Moz Community, so we have a website that we are moving to SSL. It has been 4 years since we submitted our disavow file to Google via GWT. Since we are moving to SSL, I understand Google looks at this as a new site. We went through our backlinks and realized that many of the domains we are currently disavowing are no longer active (after 4 years this is expected). Is it OK to create a new disavow file for the new profile in GWT (the SSL version of our site)? Also, is it OK if the new disavow file doesn't include URLs we previously disavowed on the non-https version? Some links in the old disavow file were disavowed but shouldn't have been, and we also found new links we want to disavow. Thanks, QL
Intermediate & Advanced SEO | QuickLearner
Webpage has bombed outside of Top 50 for search term in one week. What's the cause?
I've been monitoring the performance of some pages via the email Moz sends every week, and until this week two pages that I've managed to get ranking have ranked between 20 and 23 for their specific terms. However, in today's email one of the pages has bombed out of the top 50 for its search term while the other page has remained unaffected. What could be the cause of this? I've looked at Google Webmaster Tools for an indication of a penalty of some sort, but there is nothing glaringly obvious. I have no messages on there, and I haven't bought a load of spam links at all. What else could I check?
Intermediate & Advanced SEO | mickburkesnr
Same page Anchor Links vs Internal Link (Cannibalisation)
Hey Mozzers, I have a very long article page that supports several of my sub-category pages. It has sub-headings that link out to the relevant pages. However, the article is very long, and to make it easier to find the relevant section I was debating adding in-page anchor links in a bullet list at the top of the page for quick navigation:
PAGE TITLE
Keyword 1
Keyword 2
etc.
<a name="Keyword1"></a> Keyword 1 Content
<a name="Keyword2"></a> Keyword 2 Content
Because of the way my predecessor wrote this article, its section headings are the same as the sub-categories they link out to and boost (not ideal, but an issue I will address later). What I wondered is whether having the in-page anchors would confuse the SERPs, because they would be linking with the same keyword. My worry is that by increasing the usability of the article this way I also confuse the SERPs: first I tell them that this section on my page talks about keyword 1, then from within that article I tell them that a different page entirely is about the same keyword. Would linking like this confuse the SERPs, or are in-page anchor links looked at and dealt with differently?
Intermediate & Advanced SEO | ATP
I'm updating content that is out of date. What is the best way to handle it if I want to keep the old content as well?
So here is the situation. I'm working on a site that offers "Best Of" Top 10 list-type content. They have a list that ranks very well but is out of date. They'd like to create a new list for 2014 but keep the old list live. Ideally the new list would replace the old list in search results. Here's what I'm thinking, but let me know if you think there's a better way to handle this: put a "View New List" banner on the old page, make sure all internal links point to the new page, and add a rel=canonical tag on the old list pointing to the new list. Does this seem like a reasonable way to handle this?
Intermediate & Advanced SEO | jim_shook
OSE Confusion on 'External' Links
Hello all, I am still very new to this but am starting to get a grasp of things in the SEO world; there are still a few things that I just don't get yet. For example, I've been trying to find a great strategy for link building, and what better way than looking at existing SEO companies? So I did a quick search on a website (http://www.opensiteexplorer.org/links?site=www.springer-marketing.co.uk) and tried to look at all of the external incoming links, using a filter of Followed+301, Only External, and all subdomains. But about 20 of the links for this site are coming from itself. Now, I'm not an expert, but presumably you can't just give yourself strong links? Is this some kind of trick? How or why would somebody do this? Mind Blows, Paul
Intermediate & Advanced SEO | Paul_Tovey
Starting Over with a new site - Do's and Don'ts?
After six months, we've decided to start over with a new website. Here's what I'm thinking. Please offer any constructive do's or don'ts if you see that I'm about to make a mistake. Our original site (call it mysite.com), we have come to the conclusion, is never going to make a comeback on Google. It seems to us a better investment to start over than to simply keep hoping. Quite honestly, we're freakin' tired of trying to fix this. We don't want to screw with it any more. We are creative people, and would much rather be building a new race car than trying to overhaul the engine in the old one. We have the matching .net domain, mysite.net, which has been aged about 6 years with some fairly general content on a single page. There are zero links to mysite.net, and it was really only used by us for FTP traffic -- nothing in the SERPs for mysite.net. Mysite.net will be a complete redesign. All content and images will be totally redone. Content will be new, excellent writing, unique, and targeted. Although the subject matter will be similar to mysite.com, the content, descriptions, keywords, and images will all be brand spankin' new. We will have a clean slate to begin the long, painful link building process. We will put in the time and bite the bullet until mysite.net rules Google once again. We'll change the URL in all of our AdWords campaigns to mysite.net. My questions are:
1. Mysite.com still gets some OK traffic from Bing. Can I leave mysite.com substantially intact, or does it need to go?
2. If I have "bad links" pointing to mysite.com/123.html, what would happen if I 301 that page to mysite.net/abc.html? Does the "bad link juice" get passed on to the clean site? It would be a better experience for users who know our URL if they could be redirected to the new site.
3. Should we put mysite.net on a different server in a different, clean IP block? Or doesn't it matter? We're willing to spend for the new server if it would help.
4. What have I forgotten?
Cheers, all
Intermediate & Advanced SEO | DarrenX
Best solution to get mass URLs out of the search engines' index
Hi, I've got an issue where our web developers have made a mistake on our website by messing up some URLs. Because our site works dynamically (i.e. the URLs generated on a page are relative to the current URL), the problem URLs linked out to more problem URLs, effectively replicating an entire website directory under the problem URLs. This has put tens of thousands of URLs into the search engines' indexes which shouldn't be there. So say, for example, the problem URLs are like www.mysite.com/incorrect-directory/folder1/page1/. It seems I can correct this by doing the following:
1. Use robots.txt to disallow access to /incorrect-directory/*
2. 301 the URLs one-to-one, like this:
www.mysite.com/incorrect-directory/folder1/page1/
301 to:
www.mysite.com/correct-directory/folder1/page1/
3. 301 the URLs to the root of the correct directory, like this:
www.mysite.com/incorrect-directory/folder1/page1/
www.mysite.com/incorrect-directory/folder1/page2/
www.mysite.com/incorrect-directory/folder2/
301 to:
www.mysite.com/correct-directory/
Which method do you think is the best solution? I doubt there is any link juice benefit from 301'ing the URLs, as there shouldn't be any external links pointing to the wrong URLs.
Intermediate & Advanced SEO | James77
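Purely to illustrate the difference between options 2 and 3, here is a rough Python sketch that builds both redirect maps from the example paths above. It's only a sketch; the Apache-style Redirect lines it prints are just one possible way to apply the mapping, depending on your server setup:

```python
# Example problem URLs, taken from the question above (illustrative only).
problem_urls = [
    "/incorrect-directory/folder1/page1/",
    "/incorrect-directory/folder1/page2/",
    "/incorrect-directory/folder2/",
]

# Option 2: one-to-one 301s that keep the rest of the path intact.
one_to_one = {
    url: url.replace("/incorrect-directory/", "/correct-directory/", 1)
    for url in problem_urls
}

# Option 3: everything 301s to the root of the correct directory.
to_root = {url: "/correct-directory/" for url in problem_urls}

# Emit Apache-style redirect lines for whichever mapping you pick.
for mapping in (one_to_one, to_root):
    for src, dest in mapping.items():
        print(f"Redirect 301 {src} {dest}")
    print()
```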