Disavow questions
-
Pretty sure I know the answers to these, but someone asked me to make absolutely sure, so here goes. Any opinions welcome:
-
If I disavow a whole domain, does that include all sub-domains on the domain as well? My answer is clearly yes.
-
If I have a network of really bad links pointing to my website that are already nofollowed, but the sites themselves are awful to be linked from, is it worth putting them in the disavow list anyway to tell Google there is literally no association? I know the whole point of disavow is essentially to nofollow the link.
Opinions much appreciated, thank you guys.
-
-
Great questions. I'll give my 2 cents based on what we've witnessed at Penalty Pros:
1 - Yes, this will do the trick. Just make sure you reference the non-www version in the disavow file, for example "domain:site.com" and NOT "domain:www.site.com". If you want to be extra safe, include the exact subdomains as separate line items as well.
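For illustration, a disavow file is just a plain-text list with one directive or URL per line, and lines starting with # are comments; the belt-and-braces approach above might look like this (domain names hypothetical):

```text
# Covers site.com and all of its subdomains
domain:site.com
# Extra-safe: list the known subdomains explicitly too
domain:www.site.com
domain:blog.site.com
```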
2 - Google's official word is that you don't need to worry about nofollows, but we've encountered a few situations where nofollow links were pointed out as problematic in failed reconsideration requests. This may be human error on the part of the manual team, but it's probably worth disavowing them regardless.
Hope this helps and good luck!
-
In most cases, if you disavow the root, the subdomains will be disavowed as well. But this may not work for certain large hosts: if you're trying to disavow all of wordpress.com or blogspot.com, you really should disavow the subdomains individually.
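Generating one domain: line per subdomain from an exported list of bad link URLs is easy to script. A minimal Python sketch (URLs hypothetical; note it uses the raw hostname rather than computing the registrable domain, which is exactly what you want for subdomain-level disavows):

```python
from urllib.parse import urlparse

def disavow_lines(urls, whole_domain=True):
    """Turn a list of bad backlink URLs into Google disavow-file lines.

    With whole_domain=True, emits one "domain:" directive per unique
    hostname (subdomain-level, the cautious approach for large
    multi-tenant hosts); otherwise each URL is listed individually.
    """
    if not whole_domain:
        return sorted(set(urls))
    hosts = {h for h in (urlparse(u).hostname for u in urls) if h}
    return sorted(f"domain:{h}" for h in hosts)

# Hypothetical export of bad link URLs
bad = [
    "http://badblog1.blogspot.com/post-1",
    "http://badblog1.blogspot.com/post-2",
    "http://spammy-directory.net/listing",
]
print("\n".join(disavow_lines(bad)))
```

The two duplicate blogspot URLs collapse into a single domain:badblog1.blogspot.com line, so the output file stays tidy no matter how many pages on a host link to you.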
Regarding nofollow links, I am 99% sure that you can ignore them. John Mueller said that you don't need to include nofollowed links in a disavow: http://goo.gl/EhpI5O
The reason I say 99% sure and not 100% sure is that a colleague of mine was recently given a nofollowed link as an example link on a failed reconsideration request. The site is not available on archive.org, so I can't go back and check, but my suspicion is that the site may have recently changed its links to nofollow. I have done MANY disavows and had many successful reconsideration requests, and I do not disavow nofollowed links.
-
I totally agree with Andy here. PageRank is not the only metric - not by a long shot. Any association can be negative or positive whether PageRank is passing or not. I would disavow just to get out of the bad neighborhood.
-
I wouldn't take any chances with poor links, even if they are nofollowed. Even though a nofollowed link passes no PageRank, it is still a signal that sends a message back to Google.
I never take a chance, and I disavow these as a matter of course. You never fully know just how much Google is using them.
-Andy
-
According to Dr. Pete: "My understanding is that you can block the root domain that way, yes, but Google seemed to qualify that sub-domains were at their discretion. Unfortunately, we don't have much data yet. If you know that every link from the domain is bad, then I'd use the "domain:example.com" format."
Second question: can nofollow links hurt my site? Matt Cutts said, "Typically, no, unless it is abused and manual action applied."