I want to Disavow some more links - but I'm only allowed one .txt file?
-
Hey guys,
Wondering if you good people could help me out on this one?
A few months back (June 19) I disavowed some links for a client by uploading a .txt file listing the offending domains.
However, I've recently noticed some more dodgy-looking domains linking to my client's site, so I went about creating a new "Disavow List".
When I went to upload this new list, I was informed that I would be replacing the existing file.
So, my question is, what do I do here?
Do I make a new list with both the old and new domains I plan on disavowing and replace the existing one?
Or do I just replace the existing .txt file with the new file, on the basis that Google has already recognised the older links as disavowed?
-
Cheers Tom.
Exactly the answer I needed!
-
Hi Matthew
You want to add to your current list, so you'll want to upload a single file containing everything you previously disavowed plus the new sites you want to disavow.
It's probably worth putting in a description line (comment lines in a disavow file start with #), like:
# These files were uploaded on 19/09/2013 following a further link audit
domain:badsite.com
badsite2.com/badpage
And so on. Showing progressive evidence of action taken is always a good sign, I feel.
If you uploaded the new file without the old links, it would, for all intents and purposes, "de-disavow" those links, so you'll want to keep them in there.
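If it helps, here's a rough Python sketch of how you could merge the two lists locally before uploading. The filenames are just placeholders for wherever you've saved your own copies, and it assumes the standard disavow format (one domain: or URL entry per line, # for comments):

from pathlib import Path

# Hypothetical local copies of the two lists - rename these to match your own files.
old_file = Path("disavow_june.txt")       # what's currently uploaded in Search Console
new_file = Path("disavow_new.txt")        # the extra domains from the latest audit
merged_file = Path("disavow_merged.txt")  # the combined file to upload as the replacement

seen = set()
merged = []
for source in (old_file, new_file):
    for raw_line in source.read_text(encoding="utf-8").splitlines():
        entry = raw_line.strip()
        if not entry:
            continue
        # Keep comment lines (starting with #) so the audit notes are preserved.
        if entry.startswith("#"):
            merged.append(entry)
            continue
        # Skip any domain:/URL entry that's already in the merged list.
        if entry.lower() in seen:
            continue
        seen.add(entry.lower())
        merged.append(entry)

merged_file.write_text("\n".join(merged) + "\n", encoding="utf-8")
print(f"Wrote {len(merged)} lines to {merged_file}")

You'd then upload the merged file, so the replacement still contains everything disavowed back in June plus the new domains.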
Hope that helps.
-
Related Questions
-
Over-optimizing Internal Linking: Is this real and, if so, what's the happy medium?
I have heard a lot about having a solid internal linking structure so that Google can easily discover pages and understand your page hierarchies and correlations and equity can be passed. Often, it's mentioned that it's good to have optimized anchor text, but not too optimized. You hear a lot of warnings about how over-optimization can be perceived as spammy: https://neilpatel.com/blog/avoid-over-optimizing/ But you also see posts and news like this saying that the internal link over-optimization warnings are unfounded or outdated: https://www.seroundtable.com/google-no-internal-linking-overoptimization-penalty-27092.html So what's the tea? Is internal linking overoptimization a myth? If it's true, what's the tipping point? Does it have to be super invasive and keyword stuffy to negatively impact rankings? Or does simple light optimization of internal links on every page trigger this?
Intermediate & Advanced SEO | SearchStan
-
I'm stumped!
I'm hoping to find a real expert to help out with this. TL;DR: Our visibility in search has started tanking and I cannot figure out why.
The whole story: In fall of 2015 I started working with Convention Nation (www.conventionnation.com). The client is trying to build a resource for convention and tradeshow attendees that would help them identify the events that will help them meet their goals (learning, networking, sales, whatever). They had a content team overseas that spent their time copy/pasting event information into our database. At the time, I identified several opportunities to improve SEO:
Create and submit a sitemap
Add meaningful metas
Fix crawl errors
On-page content uniqueification and optimization for most visible events (largest audience likely to search)
Regular publishing and social media
Over nine months, we did these things and saw search visibility, average rank and CTR all double or better. There was still one problem, and that is created by our specific industry. I'll use a concrete example: MozCon. This event happens once a year and there are enough things that are the same about it every year (namely, the generalized description of the event, attendees and outcomes) that the 2015 page was getting flagged as a duplicate of 2016. The event content for most of our events was pretty thin anyway, and much of it was duplicated from other sources, so we implemented a feature that grouped recurring events. My thinking was that this would reduce the perception of duplicate or obsolete content and links and provide a nice backlink opportunity. I expected a dip after we deployed this grouping feature, that's been consistent with other bulk content changes we've made to the site, but we are not recovering from the dip. In fact, our search visibility and traffic are dropping every week. So, the current state of things is this:
Clean crawl reports: no errors reported by Moz or Google
Moz domain authority: 20; Spam score 2/17
We're a little thin on incoming links, but steady growth in both social media and backlinks
Continuing to add thin/duplicate content for unique events at the rate of 200 pages/mo
Adding solid, unique strategic content at the rate of 15 pages/mo
I just cannot figure out where we've gone astray. Is there anything other than the thin/copied content that could be causing this? It wasn't hurting us before we grouped the events... What could possibly account for this trend? Help me, Moz Community, you're my only hope! Lindsay
Intermediate & Advanced SEO | LindsayDayton
-
Google's 'related:' operator
I have a quick question about Google's 'related:' operator when viewing search results. Is there a reason why a website doesn't produce related/similar sites? For example, if I use the related: operator for my site, no results appear: https://www.google.com/#q=related:www.handiramp.com The site has been around since 1998. The site also has two good relevant DMOZ inbound links. Any suggestions on why this is and any way to fix it? Thank you.
Intermediate & Advanced SEO | ecomteam_handiramp.com
-
Disavow Links & Paid Link Removal (discussion)
Hey everyone, We've been talking about this issue a bit over the last week in our office, and I wanted to extend the idea out to the Moz community and see if anyone has some additional perspective on it. Let me break down the scenario: We're in the process of cleaning up the link profile for a new client, which contains many low-quality SEO-directory links placed by a previous vendor. Recently, we made a connection to a webmaster who controls a huge directory network. This person found 100+ links to our client's site on their network and wants $5/link to have them removed. The client was not hit with a manual penalty, so this clean-up could be considered proactive, but an algorithmic 'penalty' is suspected based on historical keyword rankings.
The Issue: We can pay this ninja $800+ to have him/her remove the links from his directory network, and hope it does the trick. When talking about scaling this tactic, we run into some ridiculously high numbers when you talk about providing this service to multiple clients.
The Silver Lining: the Disavow Links file. I'm curious what the effectiveness of creating this around the 100+ directory links could be, especially since the client hasn't been slapped with a manual penalty.
The Debate: Is putting a disavow file together a better alternative to paying for crappy links to be removed? Are we actually solving the bad link problem by disavowing, or just patching it? Would choosing not to pay ridiculous fees and submitting a disavow file for these links be considered a "good faith effort" in Google's eyes (especially considering there has been no manual penalty assessed)?
Intermediate & Advanced SEO | Etna
-
Want to merge high ranking niche websites into a new mega site, but don't want to lose authority from old top level pages
I have a few older websites that SERP well, and I am considering merging some or all of them into a new related website that I will be launching regardless. My old websites display real estate listings and not much else. Each website is devoted to showing homes for sale in a specific neighborhood. The domains are all in the form of Neighborhood1CityHomes.com, Neighborhood2CityHomes.com, etc. These sites SERP well for searches like "Neighborhood1 City homes for sale" and also "Neighborhood1 City real estate" where some or all of the query is in the domain name. Google simply points to the top of the domain, although each site has a few interior pages that are rarely used. There is next to zero backlinking to the old domains, but each links to the other with anchor text like "Neighborhood1 Cityname real estate". That's pretty much the extent of the link profile. The new website will be a more comprehensive search portal where many neighborhoods and cities can be searched. The domain name is a nonsense-word .com not related to actual keywords. The structure will be like newdomain.com/cityname/neighborhood-name/, which is where the neighborhood real estate listings that would replace the old websites will live, and I'd 301 the old sites to the appropriate internal directories of the new site. The content on the old websites is all on the home page of each, at least the content for searches that matter to me and rank well, and I read an article suggesting that Google assigns additional authority to top-level pages (can I link to that here?). I'd be 301-ing each old domain from a top-level page to a 3rd-level interior page like www.newdomain.com/cityname/neighborhood1/. The new site is better than the old sites by a wide margin, especially on mobile, but I don't want to lose all my top positions for some tough phrases. I'm not running analytics on the old sites in question, but each of the old sites has an extensive past history with AdWords (which I don't run any more). So in theory Google knows these old sites are good quality.
Intermediate & Advanced SEO | Gogogomez
-
Template Files .tpl versus .html files
We sell a large selection of insulation products and use template files (.tpl) to collect up-to-date information from a server-side database file that contains some 2,500 line items. When an HTML (.html) file is requested on the Internet, the 'example.tpl' file is accessed, the latest product and pricing information is retrieved, and the result is presented to the viewer as 'example.html'. My question: can the use of .tpl files negatively impact Search Engine acceptance?
Intermediate & Advanced SEO | Collie
-
Disavow Subdomain?
Hi all, I've been checking and it seems like there are only 2 options when disavowing links with Google's tool. Disavow the link: http://spam.example.com/stuff/content.htm Disavow the domain: domain:example.com What can I do if I want to disavow a subdomain, i.e. spam.site.com? I'm also assuming that if I were to disavow the domain it would include all subdomains? Thanks.
Intermediate & Advanced SEO | Carlos-R
-
Can literally any site get 'burned'?
Just curious what people think. The SEOMOZ trust on my site has gone up, all while Google is dropping us in rankings for lots of keywords. Just curious if this can happen to anyone, or whether once you are 100% 'trusted' you're good. We went from 120,000 page views down to about 50,000, all while doubling content, improving the design (at least from a user perspective), and getting more natural links. Seems counterintuitive to Google's mantra of ranking quality. I would guess 'authority' sites never get hit by these updates, right? So when you make it, you've made it (at least from a dropping-like-a-rock perspective; obviously you have to keep working). I'm guessing we just need a bunch more quality links, but I would hate to work on building links, quality content, trust, etc. for it to be something so finicky long term.
Intermediate & Advanced SEO | astahl110