Disavow Tool
-
Two questions, any help appreciated:
I have had a spam URL in my disavow file with Google since last September, but it still shows up as linking to my site. Is this correct?
If a spam website has, say, 100 pages all linking with your anchor text, do you disavow the domain itself, or do you have to enter every one of those pages in the disavow file?
-
For the sake of argument: I have a website where some 120-150 spammy links were created. Basically I see a ton of low-quality bookmarking sites that are scraping content from each other. Very few of the links use keyword anchor text (and those anchors are taken from authority sites in the niche as well); the rest (some 80% of them) are direct domain-name anchor text links to the site in question. So, would any of you recommend adding all those links to the disavow tool even though nothing is happening right now in terms of penalties or ranking changes? I'm getting a lot of opposing opinions on this. Thanks!
-
Remember it's one URL per line.
If you want to disavow all of geeky.com, all you need to do is:
domain:geeky.com
That's all!
-
Sorry to sound thick, but on my spreadsheet should it look like this (this is an actual spam link to my site):
domain:geeky.com http://www.geeky.com/
or like this:
domain:geeky.com http://www.geeky.com/spam.html
-
If you want to disavow an entire domain, that's how you enter it.
Let's say you wanted to disavow http://www.spamsite.com/spam.html and all of seomoz.org (I'm sure you don't!).
This is what you'd put in your disavow file:
http://www.spamsite.com/spam.html
domain:seomoz.org
You need to put that "domain:" bit in front of the site's root domain in order to disavow all of the links on the site.
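Putting those pieces together, a complete disavow file is just a plain-text (UTF-8) list with one entry per line. Google also ignores lines starting with "#", which you can use for your own notes. A sketch of what a full file might look like (the domains below are placeholders, not real spam sites):

```
# Outreach to the webmaster failed, so disavow the whole domain
domain:spamsite.com
# Individual spam pages on sites we don't want to disavow entirely
http://www.example-spam.com/spam.html
http://www.example-spam.com/more-spam.html
```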
-
Thank you for your response. Can you explain what you mean by domain:spamsite.com? Do I just enter the full URL of the domain?
-
Hey there
First question - this is fine. The disavow file stops Google from counting that link as part of your link profile, but it doesn't stop the link from being reported as linking to your site. For that to change, the link itself would need to be removed from the linking page.
Second - you're more than welcome to use the domain:spamsite.com command - Google are happy to accept that. So yes, for a site containing 100 links or more, use the domain: command and you'll be fine. I've tried and tested this and it's worked for me.
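The rule of thumb above (list single pages individually, but use the domain: command once a site links from many pages) is mechanical enough to automate. Here's a small sketch of that logic; the function name, the threshold of 3, and the decision to strip a leading "www." are my own assumptions, not anything Google specifies:

```python
from urllib.parse import urlparse

def build_disavow(urls, domain_threshold=3):
    """Group spammy URLs by host. If a host contributes
    domain_threshold or more URLs, disavow the whole domain with
    the domain: command; otherwise list each URL on its own line,
    which is the one-entry-per-line format the disavow file uses."""
    by_host = {}
    for url in urls:
        host = urlparse(url).netloc.lower().removeprefix("www.")
        by_host.setdefault(host, []).append(url)
    lines = []
    for host, host_urls in sorted(by_host.items()):
        if len(host_urls) >= domain_threshold:
            lines.append(f"domain:{host}")
        else:
            lines.extend(sorted(host_urls))
    return "\n".join(lines)
```

Feed it your full list of bad links and it emits text you can paste straight into a disavow file, with heavily-linking domains collapsed to a single domain: entry.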
Hope this helps.