Disavow Tool
-
Two questions, any help appreciated:
I have had a spam URL in my disavow file with Google since last September, but it still shows up as linking to my site. Is this correct?
If a spam website has, say, 100 pages all carrying your anchor text, do you disavow the domain URL, or do you have to enter every page in the disavow file?
-
For the sake of argument: I have a website where some 120-150 spammy links were created. Basically, I see a ton of low-quality bookmarking sites that are scraping content from each other. Very few of the links use keyword anchor text, and those anchors are taken from authority sites in the niche; the rest (some 80% of them) are direct domain-name anchors pointing to the site in question. So, would any of you recommend adding all those links to the disavow tool even though nothing is happening right now in terms of penalties or ranking changes? I am getting a lot of conflicting opinions on this matter. Thanks!
-
Remember it's one URL per line.
If you want to disavow all of geeky.com, all you need to do is:
domain:geeky.com
That's all!
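For reference, the disavow file is just a plain-text (.txt) list with one entry per line, and you can mix individual URLs with domain: entries in the same file; lines starting with # are comments that Google ignores. A made-up example (the spam URLs here are hypothetical):
# individual spammy pages
http://www.example-spam.com/bad-page.html
# entire domains
domain:geeky.com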
-
Sorry to sound thick, but in my file, would it look like this (this is an actual spam link to my site):
domain:geeky.com http://www.geeky.com/
or like this:
domain:geeky.com http://www.geeky.com/spam.html
-
If you want to disavow an entire domain, that's how you enter it.
Let's say you wanted to disavow http://www.spamsite.com/spam.html and all of seomoz.org (I'm sure you don't!)
This is what you'd put in your disavow file:
http://www.spamsite.com/spam.html
domain:seomoz.org
You need to put that "domain:" bit in front of the site's root domain in order to disavow all of the links on the site.
-
Thank you for your response. Can you explain what you mean by domain:spamsite.com? Do I just enter the full URL address of the domain?
-
Hey there
First question - this is fine. The disavow file stops Google from counting that link as part of your link profile, but it doesn't stop it from being reported as linking to your site. For that to stop, you would need the link itself to be removed.
Second - you're more than welcome to use the domain:spamsite.com command; Google is happy to accept that. So yes, for a site containing 100 links or more, use the domain: command and you'll be fine. I've tried and tested this, and it has worked for me.
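If you've already exported the full list of spam URLs from a link report, a short script can collapse them into domain: entries so you don't have to type each one out. Here's a rough Python sketch (the file names spam_urls.txt and disavow.txt are made up, and whether to disavow at the domain level is your call):
# Collapse a list of spam URLs into domain-level disavow entries.
# Assumes spam_urls.txt (hypothetical) holds one URL per line.
from urllib.parse import urlparse

with open("spam_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# Pull out each URL's host and strip any leading "www." so a single
# domain: line covers every page on that site.
domains = sorted({urlparse(u).netloc.removeprefix("www.") for u in urls})

with open("disavow.txt", "w") as out:
    out.write("# domains disavowed in bulk\n")
    for d in domains:
        out.write("domain:" + d + "\n")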
Hope this helps.