I want to Disavow some more links - but I'm only allowed one .txt file?
-
Hey guys,
Wondering if you good people could help me out on this one?
A few months back (June 19) I disavowed some links for a client by uploading a .txt file listing the offending domains.
However, I've recently noticed some more dodgy-looking domains linking to my client's site, so I went about creating a new "Disavow List".
When I went to upload this new list I was informed that I would be replacing the existing file.
So, my question is, what do I do here?
Make a new list with both old and new domains that I plan on disavowing and replace the existing one?
Or just replace the existing .txt file with the new one, since Google has already recognised that I've disavowed those older links?
-
Cheers Tom.
Exactly the answer I needed!
-
Hi Matthew
You want to add to your current list, so you'll want to upload a file that contains everything you previously disavowed in addition to the new sites you want to disavow.
It's probably worth putting in a description line as well (the disavow file treats any line starting with # as a comment), for example:
domain:badsite.com
badsite2.com/badpage
# These entries were uploaded on 19/09/2013 following a further link audit
And so on. Showing progressive evidence of action taken is always a good sign I feel.
If you uploaded the new file without the old links, for all intents and purposes it would "de-disavow" those links, so you wanna keep them in there.
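If you keep each audit's findings in a separate file, here's a rough Python sketch of how you might merge the old and new entries into one cumulative file before uploading. This isn't part of any Google tooling, and the file names below are just placeholders for wherever you keep your lists:

from datetime import date


def read_entries(path):
    """Return the non-blank, non-comment lines from a disavow .txt file."""
    with open(path, encoding="utf-8") as f:
        return [line.strip() for line in f
                if line.strip() and not line.lstrip().startswith("#")]


# Placeholder file names - point these at wherever you keep your lists.
old_entries = read_entries("disavow_june.txt")        # what was uploaded previously
new_entries = read_entries("disavow_new_audit.txt")   # the latest audit's findings

# Keep every old entry and append only the genuinely new ones, so nothing
# previously disavowed gets dropped (and therefore "de-disavowed").
already_listed = set(old_entries)
merged = old_entries + [e for e in new_entries if e not in already_listed]

with open("disavow_merged.txt", "w", encoding="utf-8") as f:
    f.write(f"# Cumulative disavow list, updated {date.today():%d/%m/%Y}\n")
    f.write("\n".join(merged) + "\n")

print(f"Wrote {len(merged)} entries to disavow_merged.txt")

However you put the file together, remember the tool only keeps the single most recent upload, so whatever you submit always needs to be the complete, cumulative list.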
Hope that helps.
Related Questions
-
Crawler doesn't discover the links in the main nav
Hi Moz Community, We have a headless ecommerce (Magento) client whose site I'm trying to crawl. During the crawl, the tool (Screaming Frog) cannot discover the sub-category URLs in the main navigation when I start crawling from the homepage. Similarly, when I start crawling from one of the sub-category pages, it doesn't crawl any of the product URLs on the sub-category page itself. When I inspect product and sub-category URLs through Search Console, they appear as indexed, and if I view how Googlebot rendered the sub-category page, I can see the product URLs on the sub-category page too. If you have any idea what the issue with Screaming Frog is and would like to help me out, I'd be so grateful! Thanks in advance
Intermediate & Advanced SEO | bbop330
-
What's the best way of crawling my entire site to get a list of NoFollow links?
Hi all, hope somebody can help. I want to crawl my site to export an audit showing: all nofollow links (which links, from which pages), and all external links broken down by follow/nofollow. I had thought Moz would do it, but that's not in Crawl info. So I thought Screaming Frog would do it, but unless I'm not looking in the right place, it only seems to provide this information if you manually click down each link and view the "Inlinks" details. Surely this must be easy?! Hope someone can nudge me in the right direction... Thanks....
Intermediate & Advanced SEO | rl_uk0
-
Disavow Experts: Here's one for ya ....
Not sure how to handle this one. Simply because there are SO MANY .... I want to be careful not to do something stupid ... Just a quick 3 minute video explanation: https://youtu.be/bVHUWTGH21E I'm interested in several opinions so if someone replies - please still chime in. Thanks.
Intermediate & Advanced SEO | HLTalk0
-
Duplicate Content through 'Gclid'
Hello, We've had the known problem of duplicate content through the gclid parameter caused by Google AdWords. As per Google's recommendation, we added the canonical tag to every page on our site so that when the bot came to each page it would go 'Ah-ha, this is the original page'. We also added the parameter to the URL parameters in Google Webmaster Tools. However, it now seems as though a canonical is automatically being given to these newly created gclid pages; see https://www.google.com.au/search?espv=2&q=site%3Awww.mypetwarehouse.com.au+inurl%3Agclid&oq=site%3A&gs_l=serp.3.0.35i39l2j0i67l4j0i10j0i67j0j0i131.58677.61871.0.63823.11.8.3.0.0.0.208.930.0j3j2.5.0....0...1c.1.64.serp..8.3.419.nUJod6dYZmI Therefore these new pages are now being indexed, causing duplicate content. Does anyone have any idea what to do in this situation? Thanks, Stephen.
Intermediate & Advanced SEO | MyPetWarehouse0
-
To allow or disavow, that is the question!
We're in the middle of a disavow process and we're having some difficulty deciding whether or not to disavow links from Justia.com and prweb.com - justia.com alone is giving us 23,000 links with just 76 linked pages. So, to allow, or disavow? That's the question! What do you think guys? Thank you. John.
Intermediate & Advanced SEO | Muhammad-Isap0
-
Acceptable use of availability attribute 'preorder' value in rich snippets schema markup and Google Shopping feed?
Hello all, Could someone please advise on acceptable use of the availability attribute's 'preorder' value in rich snippets schema markup for our websites and the Google Shopping feed? Currently all of our products are either 'in stock' or 'out of stock'; 'available for order' was also an option, but I found that in the 2014 Google Shopping update that value is being merged with 'in stock': "We are simplifying the 'availability' attribute by merging 'in stock' with 'available for order' and removing 'available for order'." The products which we would like to mark as 'preorder' have been in stock and then sold out, however we have a due date for when they will come back into stock, so the customer can preorder the product on our website, i.e. pay in advance to secure their purchase, and then they are provided with a due date for the products. Is this the correct use of the 'preorder' value, or does the product literally have to never have been released before? The guidance we have is: 'You are taking orders for this product, but it's not yet been released.' Is this set in stone? Many thanks in advance and kind regards.
Intermediate & Advanced SEO | jeffwhitfield0
-
Disavow tool removed all our links from webmaster tools
We recently used the Google Disavow tool to remove 200 bad links, but Google has removed nearly all our links from Webmaster Tools: from over 2,000 we now have only 150! Has anyone had the same problem? Any advice would be much appreciated. Thanks Paul
Intermediate & Advanced SEO | webdesigncwd0
-
Links with Parameters
The links from the home page to some internal pages on my site have been coded in the following format by my tech guys: www.abc.com/tools/page.html?hpint_id=xyz If I specify within Google Webmaster Tools that the parameter ?hpint_id should be ignored and that the content for the user does not change, will Google credit me for a link from the home page, or am I losing something here? Many thanks in advance
Intermediate & Advanced SEO | harmit360