Disavow files and .net / .com / .org variations of the same domain
-
When looking at my backlinks if I see something like this:
www.domainPizza.net
www.domainPizza.com
sub.domainPizza.com
www.domainpizza.org
domainPizza.net
https://domainpizza.com
https://www.domainpizza.net

What is the actual list of disavows that I put into the file if I want to disavow this domain? I am seeing so many variations of the same domain.
Thank you.
-
Keep it as precise as possible. Whether you disavow the whole domain or not is your choice; there is no problem doing that if you need to.
However, if you are sure there is literally only one link on the site, it is probably advisable to disavow only that specific URL.
To the best of my knowledge there is no problem with disavowing a full domain.
As ThompsonPaul has said, if you disavow the whole domain it will affect the entire domain, including subdomains, so treat it with respect.
The other thing is to be sure the link is doing you harm before you disavow it; I have seen even so-called spam links knock a site down by a few points once disavowed.
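As a sketch, the two levels of precision look like this inside the disavow file itself (the page path here is a made-up example; lines beginning with # are comments, which Google ignores):

```text
# Only this one specific bad URL:
https://www.domainpizza.net/some-bad-page/

# ...or the entire domain, including every subdomain:
domain:domainpizza.net
```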
-
Ahhh yes, you are correct!
-
Not quite, David. Disavowing a domain also automatically disavows all subdomains under it (www is just another subdomain, after all.)
So if domain:domainpizza.org is disavowed, sub.domainPizza.org is automatically disavowed as part of that directive. If instead you ONLY want to disavow the subdomain, you must explicitly specify just the exact subdomain and not the naked primary domain.
My main reason for disavowing at the domain level is future protection. Not only may my crawler have missed a few links or a bad subdomain, but there is nothing to stop a spammy link site from generating more new spammy links or subdomains after I've reviewed it.
Hope that helps clarify?
Paul
-
Hi HLTalk,
You are right about disavowing at a domain level.
This is usually the option I go for because (generally) it's the overall domain that's spam, and rarely an individual URL (of course, there will be exceptions). If you disavow at a URL level, you run the risk of missing some URLs which may not be picked up in various SEO tools at the time.
For the example in your first post, you would need the following in your disavow file:
domain:domainPizza.net
domain:domainPizza.com
domain:domainpizza.org
domain:sub.domainPizza.com

Whether it's http://, http://www., https://, https://www., doesn't matter - this doesn't need to be specified in the disavow file. Sub-domains are treated as a separate domain and do need to be specified in the disavow file.
As ThompsonPaul pointed out below, sub-domains will also be disavowed when you disavow at a domain level.
Cheers,
David
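Putting those entries together, a complete sketch of the final disavow .txt file for the domains in the question could look like this (comment lines starting with # are allowed in disavow files):

```text
# Disavow every variation of the spammy domain.
# Scheme (http/https) and www. never need to be listed separately.
domain:domainpizza.net
domain:domainpizza.com
domain:domainpizza.org
# Redundant once domainpizza.com is listed (domain: covers subdomains),
# but harmless to include explicitly:
domain:sub.domainpizza.com
```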
-
Well, nope. I do not know an easy way to disavow a whole domain.
I'd just export all URLs from that domain and disavow all of them.
A more accurate and efficient way to disavow the spammy links is to export a backlink report from a platform (such as Ahrefs, Moz, Majestic or others) and then disavow them.
Also, Ahrefs offers the option to upload a .txt file with the disavowed URLs, so they are not shown in its reports.
Best of luck!
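That export-then-disavow step can be sketched as a small script that collapses a list of exported backlink URLs into deduplicated domain: directives. This is a hypothetical helper, not a feature of any of those tools; the function name and the naive www-stripping are assumptions:

```python
from urllib.parse import urlparse

def to_disavow_lines(urls):
    # Collapse exported backlink URLs into deduplicated "domain:" directives.
    # NOTE: this only strips a leading "www."; correctly reducing hosts like
    # example.co.uk to their registrable domain would need a public-suffix list.
    domains = []
    for url in urls:
        host = (urlparse(url).hostname or "").lower()
        if host.startswith("www."):
            host = host[4:]
        if host and host not in domains:
            domains.append(host)
    return ["domain:" + d for d in domains]

# The URL variations from the original question collapse to one line per host:
lines = to_disavow_lines([
    "http://www.domainpizza.net/some-page",
    "https://domainpizza.com/another-page",
    "https://sub.domainpizza.com/yet-another",
    "http://www.domainpizza.org/",
])
print("\n".join(lines))
# domain:domainpizza.net
# domain:domainpizza.com
# domain:sub.domainpizza.com
# domain:domainpizza.org
```

The resulting lines can be pasted straight into a disavow .txt file.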
GR.
-
My understanding is that this information is wrong. If you are dealing with an obviously spammy website that has no value or real content, you are supposed to disavow the root domain. As per Matt Cutts: https://www.seroundtable.com/google-disavow-machete-16844.html
One common issue we see with disavow requests is people going through with a fine-toothed comb when they really need to do something more like a machete on the bad backlinks. For example, often it would help to use the “domain:” operator to disavow all bad backlinks from an entire domain rather than trying to use a scalpel to pick out the individual bad links. That's one reason why we sometimes see it take a while to clean up those old, not-very-good links.
Obviously you don't want to disavow an entire domain if the domain itself is a legit website. That is just basic common sense. But a universal rule that you "must point to the exact link" is factually false. Just like a rule that you "must disavow the root domain" would be factually false. You look at each link and make a decision how to proceed.
So my question was about root domain disavows. I gave some examples above.
-
Hi there.
In order to disavow links, you must point to the exact link you want to disavow.
Here is an example: domainpizza.org/badlink/xx/puppies
Not the root domain.
Best of luck.
GR.
Related Questions
-
Google Indexing & Caching Some Other Domain In Place of Mine - Lost Ranking - Sucuri.net Found Nothing
Again I am facing the same problem with another WordPress blog. Google has suddenly started to cache a different domain in place of mine, and to cache my domain in place of that domain. Here is an example page of my site which is wrongly cached on Google; the same thing is happening with many other pages as well: http://goo.gl/57uluq

That duplicate site (protestage.xyz) shows as fully copied from my client's site, but all its pages now return 404; in Google's cache, however, it is showing my site's pages. site:protestage.xyz shows all pages of my site only, but when we try to open any page it shows a 404 error. My site has been scanned by Sucuri.net Senior Support for any malware and there is none; they scanned all files, the database, etc., and no malware was found on my site.

As per Sucuri.net Senior Support: It's a known Google bug. Sometimes they incorrectly identify the original and the duplicate URLs, which results in messed-up ranking and query results. As you can see, the "protestage.xyz" site was hacked, not yours. And the hackers created "copies" of your pages on that hacked site. And this is why they do it - the "copy" (doorway) redirects websearchers to a third-party site: http://www.unmaskparasites.com/security-report/?page=protestage.xyz

It was not the only site they hacked, so they placed many links to that "copy" from other sites. As a result, Google decided that that copy might actually be the original, not the duplicate. So they basically hijacked some of your pages in search results for some queries that don't include your site domain. Nonetheless, your site still does quite well and outperforms the spammers, for example in this query: https://www.google.com/search?q=%22We+offer+personalized+sweatshirts%22%2C+every+bride#q=%22GenF20+Plus+Review+Worth+Reading+If+You+are+Planning+to+Buy+It%22

But overall, I think both the Google bug and the spammy duplicates have a negative effect on your site. We see such hacks every now and then (both sides: the hacked sites and the copied sites), and here's what you can do in this situation. It's not a hack of your site, so you should focus on preventing copying of the pages:

1. Contact the protestage.xyz site, tell them that their site is hacked, and show them the hacked pages. Hopefully they will clean their site up and your site will have unique content again. Here's their email: flang.juliette@yandex.com
2. You might want to send one more complaint to their hosting provider (OVH.NET), abuse@ovh.net, and explain that the site they host stole content from your site (show the evidence) and that you suspect the site is hacked.
3. Try blocking IPs of the Aruba hosting (real visitors don't use server IPs) on your site. This will prevent that site from copying your site's content (if they do it via a script on the same server). I currently see the site using this IP address: 149.202.120.102. I think it would be safe to block anything that begins with 149.202. This .htaccess snippet should help (you might want to test it):

```
#--------------
Order Deny,Allow
Deny from 149.202.120.102
#--------------
```

4. Use rel=canonical to tell Google that your pages are the original ones: https://support.google.com/webmasters/answer/139066?hl=en It won't help much if the hackers still copy your pages, because they usually replace your rel=canonical with theirs, so Google can't decide which one is real. But without the rel=canonical, hackers have more chances to hijack your search results, especially if they use rel=canonical and you don't.

I should admit that this process may be quite long. Google will not return your previous ranking overnight even if you manage to shut down the malicious copies of your pages on the hacked site. Their indexes would still have some mixed signals (side effects of the black hat SEO campaign) and it may take weeks before things normalize. The same thing is true for the opposite situation: the traffic wasn't lost right after hackers created the duplicates on other sites. The effect builds up with time as Google collects more and more signals. Plus, sometimes they run scheduled spam/duplicate cleanups of their index. It's really hard to tell what was the last drop, since we don't have access to Google internals. However, in practice, if you see some significant changes in Google search results, it's not because of something you just did. In most cases, it's because of something that Google observed for some period of time.

Kindly help me if we can actually do anything to get the site indexed properly again. PS: it happened with this site earlier as well, and that time I had to change domain to get rid of this problem after I could not find any solution for months, and now it has happened again. Looking forward to a possible solution.
Ankit
Intermediate & Advanced SEO | killthebillion
-
Can we use website content on marketplace websites (Etsy / Amazon etc.)?
Hello Webmasters, my name is Dinesh. I am working with Commerce Pundit as a marketing person. We have a question about one of our websites and would like to get more insight on it. We have a page or category named "Engraved Photos on Wood". Here is the page URL: http://www.canvaschamp.com/engraved-photos-on-wood-plaques So my question is about the content which we have added on this page. We have another team handling the marketplace department, and they are using the same content from the above page of the website to do listings on the marketplace websites below. See the listings done by our marketplace team, where you can see that they have used the same content from the above website page as the product info or description of the listing: https://www.etsy.com/listing/237807419/personalized-photo-art-or-custom-text-on?ref=listings_manager_grid
Intermediate & Advanced SEO | CommercePundit
http://www.amazon.in/dp/B01003REIC
http://www.amazon.in/dp/B010037IEM
http://www.amazon.in/dp/B01000JG6I
http://www.amazon.in/dp/B01003HT6Y

Does this create a duplicate content issue with our website? Can the marketplace team use our website content on various marketplace websites? We are very serious about the organic ranking for our website, so please let me know whether this is the right approach or whether we have to ask them to stop this activity. Waiting for your reply. Thanks
Dinesh
Commerce Pundit
-
Wix.com - what issues, if any, are there with this platform and SEO?
I have a client that would like me to support them with SEO on a Wix.com site. I was hoping to get some feedback from the community to see if there are people who have experience in the following areas: supporting the day-to-day operation of a Wix site - specifically, are there any issues I need to watch out for or be aware of if I choose to support this site? From an SEO perspective, is this platform OK or are there some issues I need to be made aware of? I would sincerely appreciate any input or comments on this platform.
Intermediate & Advanced SEO | Ron_McCabe
-
Would changing the file name of an image (not the alt attribute) have an effect on the SEO / ranking of that image, and thus the site?
Would changing the file name of an image - not the alt attribute nor the image itself, so it would be exactly the same, just a name change - have any effect on: a) the site's SEO ranking, or b) the individual image's SEO ranking? (Although I guess if b) were true, it would have an effect on a), even if potentially small.) This is the sort of change I would be thinking of making: ![Red ford truck](2554.jpg) changed to ![Red ford truck](6842.jpg)
Intermediate & Advanced SEO | Sam-P
-
Switching from .co to .com?
I have a site that does pretty well on a .co domain, but I would like to switch over to .com (we own the .com already). If we were to move to the .com and 301 redirect all the .co pages over to their .com versions, would we suffer at all? What would you recommend?
Intermediate & Advanced SEO | StickyWebz
-
Googlebot Can't Access My Sites After I Repair My robots.txt File
Hello Mozzers, a colleague and I have been collectively managing about 12 brands for the past several months, and we have recently received a number of messages in the sites' Webmaster Tools instructing us that "Googlebot was not able to access our site due to some errors with our robots.txt file". My colleague and I, in turn, created new robots.txt files with the intention of preventing the spider from crawling our cgi-bin directory, as follows:

```
User-agent: *
Disallow: /cgi-bin/
```

After creating the robots.txt and manually re-submitting it in Webmaster Tools (and receiving the green checkbox), I received the same message about Googlebot not being able to access the site, the only difference being that this time it was for a different site that I manage. I repeated the process and everything looked correct aesthetically; however, I continued receiving these messages for each of the other sites I manage on a daily basis, for roughly a 10-day period. Do any of you know why I may be receiving this error? Is it not possible for me to block Googlebot from crawling the cgi-bin? Any and all advice/insight is very much welcome. I hope I'm being descriptive enough!
Intermediate & Advanced SEO | NiallSmith
-
.com and .edu difference
Hello, can anyone tell me how big the difference is between a PR5 .com and a PR5 .edu? Double, triple? How big? Cornel
Intermediate & Advanced SEO | Cornel_Ilea
-
Changing a url from .html to .com
Hello, I have a client that has a site whose URLs use a .html extension, and I have read that it's best not to have this. We currently have pages ranking with this .html extension. However, if we take it out, will we lose rankings? Would we need a 301 or something?
Intermediate & Advanced SEO | SEODinosaur