Disavow Tool
-
Two questions, any help appreciated:
I have had a spam URL in my disavow file with Google since last September, but it still shows up as linked to my site. Is this correct?
If a URL has, say, 100 pages all with your anchor text and it is a spam website, do you disavow the domain URL or do you have to enter all the pages in the disavow spreadsheet?
-
For the sake of this argument, I have a website where some 120-150 spammy links were created. Basically, I see a ton of low-quality bookmarking sites that are somewhat scraping content off each other. There are very few keyword anchor texts, and those are taken from authority sites in the niche as well; the others (some 80% of them) are direct domain-name anchor text links to the site in question. So, would any of you recommend adding all those links to the disavow tool if nothing is happening in terms of penalties or ranking changes right now? I am getting a lot of opposing opinions on this matter. Thanks!
-
Remember it's one URL per line.
If you want to disavow all of geeky.com, all you need to do is:
domain:geeky.com
That's all!
-
Sorry to sound thick, but on my spreadsheet should it look like this (this is an actual spam link to my site):
domain:geeky.com http://www.geeky.com/
or like this:
domain:geeky.com http://www.geeky.com/spam.html
-
If you want to disavow an entire domain, that's how you enter it.
Let's say you wanted to disavow http://www.spamsite.com/spam.html and all of seomoz.org (I'm sure you don't!)
This is what you'd put in your disavow file:
http://www.spamsite.com/spam.html
domain:seomoz.org
You need to put that "domain:" bit in front of the site's root domain in order to disavow all of the links on the site.
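To make the format concrete, here is what a small, hypothetical disavow file could look like (the sites are placeholders; lines beginning with # are treated as comments by Google):
# individual spammy pages
http://www.spamsite.com/spam.html
http://www.spamsite.com/more-spam.html
# whole domains
domain:geeky.com
domain:anotherspamsite.com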
-
Thank you for your response. Can you explain what you mean by domain:spamsite.com? Do I just enter the full URL address of the domain?
-
Hey there
First question - this is fine. The disavow file stops Google from counting that link as part of your link profile, but it doesn't stop it being reported as a link to your site. For that to change, you would need to get the link itself removed.
Second - you're more than welcome to use the domain:spamsite.com command - Google are happy to accept that. So yes, for a site containing 100 links or more, use the domain: command and you'll be fine. I've tried and tested this and it's worked for me.
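If you have the spammy URLs collected in a text file, a rough Python sketch along these lines (the file names are placeholders, not part of the original answer) can collapse them into unique domain: entries for the disavow file:

from urllib.parse import urlparse

# Read one full spammy URL (including http://) per line; file name is a placeholder.
with open("spam_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# Collapse to unique host names, dropping a leading "www." so each
# spam site is disavowed once at the domain level.
domains = set()
for u in urls:
    host = urlparse(u).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if host:
        domains.add(host)

# Write the disavow file: one "domain:" entry per line.
with open("disavow.txt", "w") as out:
    out.write("# spammy bookmarking sites, collapsed to domains\n")
    for d in sorted(domains):
        out.write("domain:" + d + "\n")

Running it writes a disavow.txt with one domain: line per spam site, ready to upload in the disavow tool.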
Hope this helps.
Related Questions
-
Should I establish a Webmaster Tools site per language?
Our company services multiple regions. Our website is set up for France, International (en-gb), Spanish, United States and China. The websites share a common template, format and menu tree, but each has unique content. Should I use a single Google Webmaster Tools site for all of these or establish each of them as a unique website?
On-Page Optimization | bearpaw
-
Google Webmaster Tools Not Showing Correct Data?
Hi, I am experiencing a weird issue. Google Webmaster Tools suggested some HTML improvements a few weeks ago. The suggestions were about duplicate title tags and short meta descriptions. I changed the title tags and meta descriptions, but after three Google updates, Webmaster Tools still shows the same suggestions. Please advise. Thanks
On-Page Optimization | Kashif-Amin
-
Looking for a Tool to Assist with Site Optimization. Does it already exist?
I'm looking for a tool that can help us quickly identify web pages on a client's site that contain a selected keyword phrase. I would like to enter say 100 keyword phrases and the client's URL and receive a report that shows, for each keyword, the client URLs that contained that exact phrase. Does anyone know of a tool that can do this? Thank you, Rosemary Brisco
On-Page Optimization | RosemaryB
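As a rough sketch of the kind of report described (the example URLs and the keywords.txt file name are placeholders, not from the question), a short Python script could flag which pages contain each exact phrase:

from urllib.request import urlopen

# Placeholder list of client URLs to check; in practice this could come
# from a crawl or the XML sitemap.
urls = [
    "http://www.example.com/",
    "http://www.example.com/services/",
]

# keywords.txt (placeholder name): one phrase per line.
with open("keywords.txt") as f:
    phrases = [line.strip() for line in f if line.strip()]

# Fetch each page once and lower-case it for case-insensitive matching.
pages = {u: urlopen(u).read().decode("utf-8", errors="ignore").lower() for u in urls}

# For each phrase, report which pages contain it verbatim.
for phrase in phrases:
    hits = [u for u, html in pages.items() if phrase.lower() in html]
    print(phrase, "->", ", ".join(hits) if hits else "no pages found")
-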
On-page tool idea. What do you think? Like to hear it!
Easy on-page checker idea, feedback welcome. Some script I wrote: is this handy for someone? Try the script yourself. Like most things, I like the Moz on-page grader (a lot!), but it does not do everything I want it to. So sometimes I write my own little checks, and other times I want to see a list of, for example, all images without ALT tags. For this I wrote some code for myself over the last few weeks. Now I have bundled it into a handy little checker that is provided as is and not meant to make money with. It's just my handiwork, so to say, and I would like to hear some feedback on the very simple but, I think, effective method of checking anchors on a front page against the pages they link to. This is a very basic page checker for some important on-page ranking factors. If you type "links" in the option field, then all links found on the page are listed. Per-link anchor quality check: the idea is very simple. If we assume, for the simplicity of this method, that the first anchor text found linking to another page on the website determines the keyword with which that page is indexed, then each page can be checked against that anchor text. I made the links found on the page clickable; when clicked, it starts the same scan, but then only for the anchor text provided with the link clicked. If the on-page checks are OK, I reckon you're doing all right. The idea is to start with the front page of any website and click the followed links; the page linked to is graded for how well it matches the anchor text as a keyword. Try the script yourself. Gr, Daniel
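For illustration only (this is not Daniel's script), a minimal Python sketch of the check described above, fetching a front page, collecting internal links with their anchor text, and testing each anchor against the linked page's title, could look like this:

from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkAndTitleParser(HTMLParser):
    # Collects (href, anchor text) pairs and the page <title>.
    def __init__(self):
        super().__init__()
        self.links = []
        self.title = ""
        self._href = None
        self._anchor = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._anchor = []
        elif tag == "title":
            self._in_title = True

    def handle_data(self, data):
        if self._href is not None:
            self._anchor.append(data)
        if self._in_title:
            self.title += data

    def handle_endtag(self, tag):
        if tag == "a":
            if self._href:
                self.links.append((self._href, "".join(self._anchor).strip()))
            self._href = None
        elif tag == "title":
            self._in_title = False

def parse(url):
    p = LinkAndTitleParser()
    p.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
    return p

def check_front_page(url):
    front = parse(url)
    site = urlparse(url).netloc
    for href, anchor in front.links:
        target = urljoin(url, href)
        # Only follow internal links that have visible anchor text.
        if not anchor or urlparse(target).netloc != site:
            continue
        title = parse(target).title
        verdict = "anchor found in title" if anchor.lower() in title.lower() else "no match"
        print(anchor, "->", target, ":", verdict)

check_front_page("http://www.example.com/")  # placeholder URL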
On-Page Optimization | DanielMulderNL
-
What automated tools produce a number score for A) On-Page Optimization and B) Domain Exact Match?
SEOMoz Community,
- I currently use SEMRush, SEOMoz Open Site Explorer and Archive.org for the other Benefit and Opposition factors.
- I’ve had to manually search pages for keyword use in the title, footer & body for on-page optimization.
- I’ve also manually searched Google for Domain Exact Matches.
Thanks!
Andrew
On-Page Optimization | Todd_Kendrick
-
Term Extractor Tool?
I want to check content (keyword density and such) for a page before I load it to the server. The Term Extractor Tool is great for pages already loaded on the site, but what if I want to scan content before I upload it? Is there a tool out there where I can cut and paste content from a program like Word and have it scanned for keyword relevancy prior to uploading it? Thanks
Gary
On-Page Optimization | fun52dig
-
Best meta title - description writing tool
Hi, I am in the process of writing meta titles and meta descriptions for a client's product portfolio. I have the complete list exported from the CMS system into Excel. I know I can add some clever logic to build the meta information, but for this purpose I want to do this manually. Does anyone have an online editor or an Excel sheet that shows you the character limits when writing the titles and descriptions? This would be handy if someone has one; otherwise it's a case of having to write the Excel sheet to do this. Not that hard to do, but I thought I'd ask the question.
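For what it's worth, a minimal Python sketch assuming the CMS export is saved as meta.csv with url, title and description columns (the file name and the length thresholds are assumptions, not official limits):

import csv

TITLE_LIMIT = 65         # assumed display limit, not an official figure
DESCRIPTION_LIMIT = 155  # assumed display limit, not an official figure

# meta.csv (placeholder name) exported from the CMS with url, title, description columns.
with open("meta.csv", newline="") as f:
    for row in csv.DictReader(f):
        title_len = len(row["title"])
        desc_len = len(row["description"])
        if title_len > TITLE_LIMIT:
            print(row["url"], ": title is", title_len, "chars (over", TITLE_LIMIT, ")")
        if desc_len > DESCRIPTION_LIMIT:
            print(row["url"], ": description is", desc_len, "chars (over", DESCRIPTION_LIMIT, ")")

An =LEN() column in Excel does the same job if you would rather stay in the spreadsheet.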
On-Page Optimization | seohive-222720
-
Page speed tools
Working on reducing page load time, since that is one of the ranking factors that Google uses. I've been using the Page Speed Firefox plugin (requires Firebug), which is free. Pretty happy with it, but wondering if others have pointers to good tools for this task. Thanks...
On-Page Optimization | scanlin