Disavow files and .net, .com, .org, etc.
-
When looking at my backlinks if I see something like this:
www.domainPizza.net
www.domainPizza.com
sub.domainPizza.com
www.domainpizza.org
domainPizza.net
https://domainpizza.com
https://www.domainpizza.net
What is the actual list of disavow entries that I put into the file if I want to disavow this domain? I am seeing so many variations of the same domain.
Thank you.
-
Keep it as precise as possible. Whether you disavow the whole domain or not is your choice, and there is no problem doing that if you need to.
However, if you are sure there is literally only one link on the site, it is probably advisable to disavow only that specific URL.
To the best of my knowledge there is no problem with disavowing a full domain.
As ThompsonPaul has said, if you disavow the whole domain it will affect the entire domain, including subdomains, so treat it with respect.
The other thing is to be sure the link is doing you harm before you remove it; I have seen even so-called spam links knock a site down by a few points once disavowed.
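For illustration, here is a minimal sketch of what the two approaches look like in the disavow file, using the domain from the question (the specific path is made up for the example; lines starting with # are comments):

# Disavow only one specific link (hypothetical URL)
https://www.domainpizza.com/spammy-directory/bad-page.html
# Disavow every link from the whole domain, subdomains included
domain:domainpizza.com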
-
Ahhh yes, you are correct!
-
Not quite, David. Disavowing a domain also automatically disavows all subdomains under it (www is just another subdomain, after all).
So if domain:domainpizza.org is disavowed, sub.domainPizza.org is automatically disavowed as part of that directive. If instead you ONLY want to disavow the subdomain, you must explicitly specify just the exact subdomain and not the naked primary domain.
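To sketch that distinction in disavow-file syntax (hypothetical names, same format as the example above):

# Disavows the whole domain AND every subdomain under it
domain:domainpizza.org
# Disavows ONLY this subdomain, leaving the rest of the domain alone
domain:sub.domainpizza.org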
My main reason for disavowing at the domain level is for future protection. Not only may my crawler have missed a few links or a bad subdomain, but there is nothing to stop a spammy link site from generating more new spammy links or subdomains after I've reviewed it.
Hope that helps clarify?
Paul
-
Hi HLTalk,
You are right about disavowing at a domain level.
This is usually the option I go for because (generally) it's the overall domain that's spam rather than an individual URL (of course, there will be exceptions), and if you disavow at a URL level you run the risk of missing some URLs which may not be picked up in the various SEO tools at the time.
For the example in your first post, you would need the following in your disavow file:
domain:domainPizza.net
domain:domainPizza.com
domain:domainpizza.org
domain:sub.domainPizza.com
Whether it's http://, http://www., https://, or https://www. doesn't matter - the protocol doesn't need to be specified in the disavow file. Sub-domains are treated as a separate domain and do need to be specified in the disavow file.
As ThompsonPaul pointed out below, sub-domains will also be disavowed when you disavow at a domain level.
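If you are working from a large backlink export with many variations of the same domain, a short script can collapse them into domain-level entries. This is just an illustrative sketch, not an official tool: the file names backlinks.txt and disavow.txt are assumptions, and it only strips a leading www. rather than computing the true registrable domain.

from urllib.parse import urlparse

# Hypothetical input: one backlink URL per line, exported from your backlink tool.
with open("backlinks.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

domains = set()
for url in urls:
    # urlparse only fills in netloc when a scheme is present, so add one if missing.
    if "://" not in url:
        url = "http://" + url
    host = urlparse(url).netloc.lower()
    # Treat www.example.com and example.com as the same site; other subdomains
    # (e.g. sub.domainpizza.com) stay as separate entries.
    if host.startswith("www."):
        host = host[4:]
    if host:
        domains.add(host)

# Write domain-level directives in the format the disavow file expects.
with open("disavow.txt", "w") as out:
    for d in sorted(domains):
        out.write("domain:" + d + "\n")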
Cheers,
David
-
Well, nope. I do not know an easy way to disavow a whole domain.
I'd just export all the URLs from that domain and disavow them all.
A more accurate and efficient way to disavow the spammy links is to export the backlinks from any platform (such as Ahrefs, Moz, Majestic or others) and then disavow them.
Also, Ahrefs offers the option to upload a .txt of the disavowed URLs so they are not shown in its reports.
Best of luck!
GR.
-
My understanding is that this information is wrong. If you are dealing with an obviously spammy website that has no value or real content, you are supposed to disavow the root domain. As per Matt Cutts: https://www.seroundtable.com/google-disavow-machete-16844.html
One common issue we see with disavow requests is people going through with a fine-toothed comb when they really need to do something more like a machete on the bad backlinks. For example, often it would help to use the “domain:” operator to disavow all bad backlinks from an entire domain rather than trying to use a scalpel to pick out the individual bad links. That's one reason why we sometimes see it take a while to clean up those old, not-very-good links.
Obviously you don't want to disavow an entire domain if the domain itself is a legit website. That is just basic common sense. But a universal rule that you "must point to the exact link" is factually false. Just like a rule that you "must disavow the root domain" would be factually false. You look at each link and make a decision how to proceed.
So my question was about root domain disavows. I gave some examples above.
-
Hi there.
In order to disavow links, you must point to the exact link you want to disavow.
Here is an example: domainpizza.org/badklink/xx/puppies
Not the root domain.
Best of luck.
GR.