Disavow Tool
-
Two questions, any help appreciated:
I have had a spam URL in my disavow file with Google since last September, but it still shows up as linking to my site. Is this correct?
If a spam website has, say, 100 pages all linking with your anchor text, do you disavow the domain, or do you have to enter all the pages in the disavow spreadsheet?
-
For the sake of this argument, I have a website where some 120-150 spammy links were created. Basically, I see a ton of low-quality bookmarking sites that are scraping content from each other. Very few of the links use keyword anchor text, and those anchors are taken from authority sites in the niche as well; the others (some 80% of them) are direct domain-name anchor text links to the site in question. So, would any of you recommend adding all those links to the disavow tool if nothing is happening in terms of penalties or ranking changes right now? I am getting a lot of conflicting opinions on this matter. Thanks!
-
Remember it's one URL per line.
If you want to disavow all of geeky.com, all you need to do is:
domain:geeky.com
That's all!
-
Sorry to sound thick, but on my spreadsheet would it look like this (using an actual spam link to my site):
domain:geeky.com http://www.geeky.com/
or like this:
domain:geeky.com http://www.geeky.com/spam.html
-
If you want to disavow an entire domain, that's how you enter it.
Let's say you wanted to disavow http://www.spamsite.com/spam.html and all of seomoz.org (I'm sure you don't!)
This is what you'd put in your disavow file:
http://www.spamsite.com/spam.html
domain:seomoz.org
You need to put that "domain:" bit in front of the site's root domain in order to disavow all of the links from that site.
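Put together, a disavow file is just a plain-text file with one entry per line, mixing single-URL and whole-domain entries as needed (Google also accepts comment lines starting with "#"). The entries below reuse the example URLs from this thread:

```text
# Individual pages to disavow
http://www.spamsite.com/spam.html

# Entire domains to disavow
domain:seomoz.org
```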
-
Thank you for your response. Can you explain what you mean by domain:spamsite.com? Do I just enter the full URL address of the domain?
-
Hey there
First question - this is fine. The disavow file stops Google from counting that link as part of your link profile, but it doesn't stop the link being reported as pointing to your site. For that to happen, you would need to have the link physically removed.
Second - you're more than welcome to use the domain:spamsite.com command - Google are happy to accept that. So yes, for a site containing 100 links or more, use the domain: command and you'll be fine. I've tried and tested this and it's worked for me.
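To illustrate the advice above, here is a minimal sketch of turning a list of spam URLs into domain: entries, so a site with 100 spam pages becomes one line in the file. The helper name build_disavow_lines is hypothetical (neither Google nor Moz provides such a script); it's just a quick way to see how whole-domain entries collapse many URLs into one.

```python
from urllib.parse import urlparse

def build_disavow_lines(spam_urls, disavow_whole_domains=True):
    """Build disavow-file lines (one entry per line) from a list of spam URLs.

    When disavow_whole_domains is True, each URL is collapsed into a single
    'domain:' entry, which covers every page on that site.
    """
    lines = []
    seen = set()
    for url in spam_urls:
        if disavow_whole_domains:
            host = urlparse(url).netloc
            # Strip a leading "www." so the entry targets the root domain.
            if host.startswith("www."):
                host = host[4:]
            entry = "domain:" + host
        else:
            entry = url
        if entry not in seen:  # don't repeat entries in the file
            seen.add(entry)
            lines.append(entry)
    return lines

urls = [
    "http://www.spamsite.com/spam.html",
    "http://www.spamsite.com/more-spam.html",
    "http://geeky.com/page1",
]
# Three spam URLs collapse to two domain: entries.
print("\n".join(build_disavow_lines(urls)))
```

Writing the returned lines to a UTF-8 text file gives you something you can upload directly in Search Console's disavow tool.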
Hope this helps.