Disavow files and .net, .com, .org, etc.
-
When looking at my backlinks, if I see something like this:
www.domainPizza.net
www.domainPizza.com
sub.domainPizza.com
www.domainpizza.org
domainPizza.net
https://domainpizza.com
https://www.domainpizza.net

What is the actual list of disavows that I put into the file if I want to disavow this domain? I am seeing so many variations of the same domain.
Thank you.
-
Keep it as precise as possible. Whether you disavow the whole domain or not is your choice; there is no problem doing that if you need to.
However, if you are sure there is literally only one link on the site, it is probably advisable to disavow only that specific URL.
To the best of my knowledge there is no problem with disavowing a full domain.
As ThompsonPaul has said, if you disavow the whole domain it will affect the entire domain, including subdomains, so treat it with respect.
The other thing is to be sure a link is actually doing you harm before you remove it; I have seen even so-called spam links knock a site down by a few points once disavowed.
-
Ahhh yes, you are correct!
-
Not quite, David. Disavowing a domain also automatically disavows all subdomains under it (www is just another subdomain, after all.)
So if domain:domainpizza.org is disavowed, sub.domainPizza.org is automatically disavowed as part of that directive. If instead you ONLY want to disavow the subdomain, you must explicitly specify just the exact subdomain and not the naked primary domain.
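In disavow-file terms, the difference between the two looks like this (using the example domain from the question; lines starting with # are comments):
# disavows domainpizza.org plus every subdomain under it (www, sub, and so on)
domain:domainpizza.org
# disavows only that subdomain and leaves the rest of the domain alone
domain:sub.domainpizza.org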
My main reason for disavowing at the domain level is for future protection. Not only may my crawler have missed a few links or a bad subdomain, but there is nothing to stop a spammy link site from generating more new spammy links or subdomains after I've reviewed it.
Hope that helps clarify?
Paul
-
Hi HLTalk,
You are right about disavowing at a domain level.
This is usually the option I go for because (generally) it's the overall domain that's spam rather than an individual URL (of course, there will be exceptions), and if you disavow at a URL level you run the risk of missing some URLs that may not be picked up in the various SEO tools at the time.
For the example in your first post, you would need the following in your disavow file:
domain:domainPizza.net
domain:domainPizza.com
domain:domainpizza.org
domain:sub.domainPizza.com

Whether it's http://, http://www., https://, or https://www. doesn't matter - the protocol doesn't need to be specified in the disavow file. Sub-domains are treated as a separate domain and do need to be specified in the disavow file.
As ThompsonPaul pointed out below, sub-domains will also be disavowed when you disavow at a domain level.
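Putting it together, the uploaded disavow.txt is just a plain text file with one entry per line, and lines starting with # are ignored as comments. For the list in the original question it could look like this:
# domainPizza network, reviewed manually
domain:domainpizza.net
domain:domainpizza.com
domain:domainpizza.org
# already covered by domain:domainpizza.com above, but harmless to list explicitly
domain:sub.domainpizza.com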
Cheers,
David
-
Well, nope. I do not know an easy way to disavow a whole domain.
I'd just export all URLs from that domain and disavow them all.
A more accurate and efficient way to disavow the spammy links is to export a backlink report from a platform such as Ahrefs, Moz, Majestic, or others and then disavow those links (a rough sketch of that workflow is below).
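If you go the export route, a small script can collapse the exported URLs into one domain: entry per linking domain. Here is a rough Python sketch; the column name source_url and the file names are placeholders rather than what any particular tool actually exports, so adjust them to match your export.
import csv
from urllib.parse import urlparse

def build_disavow_lines(csv_path, url_column="source_url"):
    """Collect one domain: entry per linking domain found in the export."""
    # assumes the export contains full URLs including the http(s):// scheme
    domains = set()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            host = urlparse(row[url_column].strip()).netloc.lower()
            if host.startswith("www."):
                host = host[4:]  # www is just another subdomain; the bare domain covers it
            if host:
                domains.add(host)
    return ["domain:" + d for d in sorted(domains)]

if __name__ == "__main__":
    lines = build_disavow_lines("backlinks_export.csv")
    with open("disavow.txt", "w", encoding="utf-8") as out:
        out.write("# generated from a backlink export, review before uploading\n")
        out.write("\n".join(lines) + "\n")
Review the generated file by hand before uploading it; the script has no idea which of those domains are actually spam.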
Also, Ahrefs offers the option to upload a .txt with the URLs to disavow, so that they are not shown in its reports.
Best of luck!
GR.
-
My understanding is that this information is wrong. If you are dealing with an obviously spammy website that has no value or real content, you are supposed to disavow the root domain. As per Matt Cutts: https://www.seroundtable.com/google-disavow-machete-16844.html
One common issue we see with disavow requests is people going through with a fine-toothed comb when they really need to do something more like a machete on the bad backlinks. For example, often it would help to use the “domain:” operator to disavow all bad backlinks from an entire domain rather than trying to use a scalpel to pick out the individual bad links. That's one reason why we sometimes see it take a while to clean up those old, not-very-good links.
Obviously you don't want to disavow an entire domain if the domain itself is a legit website. That is just basic common sense. But a universal rule that you "must point to the exact link" is factually false. Just like a rule that you "must disavow the root domain" would be factually false. You look at each link and make a decision how to proceed.
So my question was about root domain disavows. I gave some examples above.
-
Hi there.
In order to disavow links, you must point to the exact link you want to disavow.
Here is an example: domainpizza.org/badlink/xx/puppies
Not the root domain.
Best of luck.
GR.