Disavow files and .net, .com, .org domain variations
-
When looking at my backlinks if I see something like this:
www.domainPizza.net
www.domainPizza.com
sub.domainPizza.com
www.domainpizza.org
domainPizza.net
https://domainpizza.com
https://www.domainpizza.net

What is the actual list of disavows that I put into the file if I want to disavow this domain? I am seeing so many variations of the same domain.
Thank you.
-
Keep it as precise as possible. Whether you disavow the whole domain or not is your choice, and there is no problem doing that if you need to.
However, if you are sure there is literally only one link on the site, it is probably advisable to disavow only the specific URL.
To the best of my knowledge, there is no problem with disavowing a full domain.
As ThompsonPaul said, if you disavow the whole domain it will affect the entire domain, including subdomains, so treat it with respect.
The other thing is to be sure the link is doing you harm before you remove it. I have seen even so-called spam links knock a site down by a few points once disavowed.
-
Ahhh yes, you are correct!
-
Not quite, David. Disavowing a domain also automatically disavows all subdomains under it (www is just another subdomain, after all.)
So if domain:domainpizza.org is disavowed, sub.domainPizza.org is automatically disavowed as part of that directive. If instead you ONLY want to disavow the subdomain, you must explicitly specify just the exact subdomain and not the naked primary domain.
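To illustrate with the hypothetical domain from the question, the two cases look like this in a disavow file (lines starting with # are comments):

```text
# Disavows domainpizza.org AND every subdomain under it
# (www.domainpizza.org, sub.domainpizza.org, etc.):
domain:domainpizza.org

# Disavows ONLY links from the subdomain, leaving links
# from the primary domain alone:
domain:sub.domainpizza.org
```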
My main reason for disavowing at the domain level is future protection. Not only may my crawler have missed a few links or a bad subdomain, but there is nothing to stop a spammy link site from generating new spammy links or subdomains after I've reviewed it.
Hope that helps clarify?
Paul
-
Hi HLTalk,
You are right about disavowing at a domain level.
This is usually the option I go for because it's generally the overall domain that's spam, rather than an individual URL (of course, there will be exceptions). If you disavow at a URL level, you run the risk of missing URLs that the various SEO tools haven't picked up yet.
For the example in your first post, you would need the following in your disavow file:
domain:domainPizza.net
domain:domainPizza.com
domain:domainpizza.org
domain:sub.domainPizza.com

Whether it's http://, http://www., https:// or https://www. doesn't matter; the protocol doesn't need to be specified in the disavow file. Sub-domains are treated as separate domains and do need to be specified in the disavow file.
As ThompsonPaul pointed out below, sub-domains will also be disavowed when you disavow at a domain level.
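Put together as an upload-ready .txt for the example in the first post (comment lines starting with # are allowed in disavow files), that would be:

```text
# Disavow file for the domainPizza variations.
# Protocol and www. variants do not need separate entries.
domain:domainpizza.net
domain:domainpizza.com
domain:domainpizza.org
# Redundant once domainpizza.com is listed, since the domain:
# directive covers subdomains, but harmless:
domain:sub.domainpizza.com
```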
Cheers,
David
-
Well, no. I do not know an easy way to disavow a whole domain.
I'd just export all URLs from that domain and disavow all of them.
A more accurate and efficient way to disavow the spammy links is to export a backlink report from any platform (such as Ahrefs, Moz, Majestic or others) and then disavow them.
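As a rough sketch of that export-then-disavow workflow, here is a small Python script that collapses a list of exported backlink URLs into unique domain: directives. All names here are mine, not from any tool's API, and it deliberately keeps subdomains distinct; computing the true registrable domain would need a public-suffix list.

```python
from urllib.parse import urlparse

def urls_to_disavow_domains(urls):
    """Collapse exported backlink URLs into sorted, unique
    domain: directives for a disavow file."""
    domains = set()
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]  # www is just another subdomain; fold it in
        if host:
            domains.add(host)
    return ["domain:" + d for d in sorted(domains)]

# Example with the domains from the question:
lines = urls_to_disavow_domains([
    "https://www.domainpizza.net/some/page",
    "http://domainpizza.com/another",
    "https://sub.domainpizza.com/spam",
])
print("\n".join(lines))
# prints:
# domain:domainpizza.com
# domain:domainpizza.net
# domain:sub.domainpizza.com
```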
Also, Ahrefs offers the option to upload a .txt with the URLs you have disavowed, so they are not shown in its reports.
Best of luck!
GR.
-
My understanding is that this information is wrong. If you are dealing with an obviously spammy website that has no value or real content, you are supposed to disavow the root domain. As per Matt Cutts: https://www.seroundtable.com/google-disavow-machete-16844.html
One common issue we see with disavow requests is people going through with a fine-toothed comb when they really need to do something more like a machete on the bad backlinks. For example, often it would help to use the “domain:” operator to disavow all bad backlinks from an entire domain rather than trying to use a scalpel to pick out the individual bad links. That's one reason why we sometimes see it take a while to clean up those old, not-very-good links.
Obviously you don't want to disavow an entire domain if the domain itself is a legit website. That is just basic common sense. But a universal rule that you "must point to the exact link" is factually false. Just like a rule that you "must disavow the root domain" would be factually false. You look at each link and make a decision how to proceed.
So my question was about root domain disavows. I gave some examples above.
-
Hi there.
In order to disavow links, you must point to the exact link you want to disavow.
Here is an example: domainpizza.org/badklink/xx/puppies
Not the root domain.
Best of luck.
GR.