Disavow files and .net / .com / .org domain variations
-
When looking at my backlinks if I see something like this:
www.domainPizza.net
www.domainPizza.com
sub.domainPizza.com
www.domainpizza.org
domainPizza.net
https://domainpizza.com
https://www.domainpizza.net

What is the actual list of disavows that I put into the file if I want to disavow this domain? I am seeing so many variations of the same domain.
Thank you.
-
Keep it as precise as possible. Whether you disavow the whole domain or not is your choice; there is no problem doing that if you need to.
However, if you are sure there is literally only one link on the site, it is probably advisable to disavow only that specific URL.
To the best of my knowledge there is no problem with disavowing a full domain.
As ThompsonPaul has said, if you disavow the whole domain it will affect the entire domain, including subdomains, so treat it with respect.
The other thing is to be sure the link is actually doing you harm before you remove it; I have seen even so-called spam links knock a site down by a few points once disavowed.
-
Ahhh yes, you are correct!
-
Not quite, David. Disavowing a domain also automatically disavows all subdomains under it (www is just another subdomain, after all.)
So if domain:domainpizza.org is disavowed, sub.domainPizza.org is automatically disavowed as part of that directive. If instead you ONLY want to disavow the subdomain, you must explicitly specify just the exact subdomain and not the naked primary domain.
My main reason for disavowing at the domain level is future protection. Not only may my crawler have missed a few links or a bad subdomain, but there is nothing to stop a spammy link site from generating more new spammy links or subdomains after I've reviewed it.
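To put that scoping in disavow-file syntax (using the hypothetical domains from the question; lines beginning with # are treated as comments):

```text
# Covers domainpizza.org AND every subdomain under it,
# including www.domainpizza.org and sub.domainpizza.org
domain:domainpizza.org

# Covers ONLY this subdomain, not the naked primary domain
domain:sub.domainpizza.org
```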
Hope that helps clarify?
Paul
-
Hi HLTalk,
You are right about disavowing at a domain level.
This is usually the option I go for because (generally) it's the overall domain that's spam, and rarely an individual URL (of course, there will be exceptions). If you disavow at a URL level, you run the risk of missing URLs which may not have been picked up in the various SEO tools at the time.
For the example in your first post, you would need the following in your disavow file:
domain:domainPizza.net
domain:domainPizza.com
domain:domainpizza.org
domain:sub.domainPizza.com

Whether it's http://, http://www., https://, or https://www. doesn't matter - this doesn't need to be specified in the disavow file. Sub-domains are treated as separate domains and do need to be specified in the disavow file.
As ThompsonPaul pointed out below, sub-domains will also be disavowed when you disavow at a domain level.
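For completeness, domain-level and URL-level entries can be mixed in the same disavow file, and lines starting with # are ignored as comments, so a full file for this example could look something like this (hypothetical domains and URL):

```text
# Domain-level: protocol and www/non-www variations are covered automatically
domain:domainPizza.net
domain:domainPizza.com
domain:domainpizza.org

# URL-level: disavows only this one specific page
https://sub.domainPizza.com/spam-page
```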
Cheers,
David
-
Well, nope. I do not know an easy way to disavow a whole domain.
I'd just export all URLs from that domain and disavow all of them.
A more accurate and efficient way to disavow the spammy links is to export a backlink report from a platform (such as Ahrefs, Moz, Majestic, or others) and then disavow those URLs.
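If you do go the export route, collapsing the exported backlink URLs into unique domain-level directives is easy to script. A minimal sketch (the function name and the sample URLs are my own, using the hypothetical domains from the question):

```python
from urllib.parse import urlparse

def build_disavow_lines(urls):
    """Collapse a list of backlink URLs into unique "domain:" directives."""
    domains = set()
    for url in urls:
        host = urlparse(url).hostname
        if not host:
            continue
        # Fold www/non-www variations of the same domain into one entry
        if host.startswith("www."):
            host = host[4:]
        domains.add(host)
    return sorted("domain:" + d for d in domains)

# Hypothetical export of backlink URLs from an SEO tool
exported = [
    "https://www.domainpizza.net/some-page",
    "http://domainpizza.net/other-page",
    "https://sub.domainpizza.com/spam",
]
print("\n".join(build_disavow_lines(exported)))
# prints:
# domain:domainpizza.net
# domain:sub.domainpizza.com
```

Note the sketch keeps subdomains as distinct entries; collapsing them down to the registrable domain would need something like a public-suffix list, since you can't tell from the hostname alone where the registrable domain ends.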
Also, Ahrefs offers the option to upload a .txt file with the disavowed URLs so that they are not shown in its reports.
Best of luck!
GR.
-
My understanding is that this information is wrong. If you are dealing with an obviously spammy website that has no value or real content, you are supposed to disavow the root domain. As per Matt Cutts: https://www.seroundtable.com/google-disavow-machete-16844.html
One common issue we see with disavow requests is people going through with a fine-toothed comb when they really need to do something more like a machete on the bad backlinks. For example, often it would help to use the “domain:” operator to disavow all bad backlinks from an entire domain rather than trying to use a scalpel to pick out the individual bad links. That's one reason why we sometimes see it take a while to clean up those old, not-very-good links.
Obviously you don't want to disavow an entire domain if the domain itself is a legit website. That is just basic common sense. But a universal rule that you "must point to the exact link" is factually false. Just like a rule that you "must disavow the root domain" would be factually false. You look at each link and make a decision how to proceed.
So my question was about root domain disavows. I gave some examples above.
-
Hi there.
In order to disavow links, you must point to the exact link you want to disavow.
Here is an example: domainpizza.org/badlink/xx/puppies
Not the root domain.
Best of luck.
GR.