Nofollow vs. dofollow: the how-to
-
Hi Guys,
Sorry if this is an amateur question; I just wanted to ask about something I noticed a few people discussing: nofollows and dofollows for backlinks. Is there supposed to be some way to set your website up as nofollow or dofollow for backlinks? I noticed a few people saying to make sure that some directories are nofollow. I'd like to know if I can set this up for my own site, as I'm a bit conscious and paranoid about others who might backlink to my site with huge spam, negative SEO, etc.
Any insight into this would be much appreciated.
Thanks all
-
In short: do not concern yourself with negative SEO. Yes, it can happen, but if you monitor your site the way you are - i.e. using Moz diagnostics to regularly crawl backlinks - you will identify spam links and can then go through the disavow procedure. So you have that covered.
However, you should appreciate that if someone creates a link for you in an editorial article, you generally want a followed link. I spend time for clients trying to turn nofollows into follows. That way you get the link juice and, hopefully, a bump in rankings.
Clear as mud? If not, let me know. Good question - you're on the right track.
-
Hi Edward,
You are a little confused about what this means, I think, so let me try to explain.
Each link can be assigned an attribute called rel="nofollow". The owner of the page containing the link controls this attribute, so you can control whether the links on your website are nofollowed, but you have no control over the links other people point at your website.
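For reference, here's what the attribute looks like in the HTML itself (example.com used as a placeholder):

```html
<!-- A normal ("dofollow") link - passes PageRank and anchor text -->
<a href="https://example.com/some-page">Anchor text</a>

<!-- A nofollowed link - rel="nofollow" asks search engines not to
     pass PageRank or anchor text through this link -->
<a href="https://example.com/some-page" rel="nofollow">Anchor text</a>
```

Note there is no rel="dofollow" attribute - a "dofollow" link is simply a link without the nofollow attribute.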
Generally speaking, you want your link profile to contain both, as that demonstrates a healthy link profile.
How does Google handle nofollowed links?
In general, we don't follow them. This means that Google does not transfer PageRank or anchor text across these links. Essentially, using nofollow causes us to drop the target links from our overall graph of the web. However, the target pages may still appear in our index if other sites link to them without using nofollow, or if the URLs are submitted to Google in a Sitemap. Also, it's important to note that other search engines may handle nofollow in slightly different ways.
Using nofollow on your own website
Using nofollow on links from your own website to your own pages discourages Google from crawling and indexing certain pages on your site. For example, if you had a "Login" or "Checkout" page, many people choose to nofollow links to it to stop Google crawling and indexing it. This stops a page with fairly poor content (due to its nature) from being indexed on your site.
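As a sketch, you can do this per-link with rel="nofollow", or page-wide with a robots meta tag in the head of the page itself (a hypothetical checkout page is used here):

```html
<!-- Per-link: ask crawlers not to follow this one internal link -->
<a href="/checkout" rel="nofollow">Checkout</a>

<!-- Page-wide: placed in the <head> of /checkout itself, asks crawlers
     not to index the page or follow any of its links -->
<meta name="robots" content="noindex, nofollow">
```

The meta tag is generally the more reliable option if your goal is to keep the page out of the index, since a nofollow on one internal link doesn't stop Google finding the page through other links.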
It is also used to manage duplicate content: if you know a page is a duplicate of another but it is needed, rather than use canonical tags etc., some people choose to nofollow links to it.
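For comparison, the canonical-tag approach mentioned above looks like this (placeholder URL; it goes in the head of the duplicate page and points at the preferred version):

```html
<!-- In the <head> of the duplicate page -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
```

For genuine duplicates, the canonical tag is usually the better choice, since it consolidates ranking signals onto the preferred URL rather than just cutting the duplicate off.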
In summary: you can't nofollow links that point to your website from external sites (unless you contact the person hosting the link and they agree to do so). Your best defence against spammy links is to monitor your link profile, and when a link pops up that you don't like, follow the normal channels to get it removed.
Nofollow on your own website should only be used to stop Google crawling and indexing certain links and passing link juice, as and when you need it. Google still has a bot that crawls through nofollows, but in general it will respect your wishes.
-
It's important to remember that a healthy link profile will be a mix of both dofollow and nofollow links. There is no rule of thumb that says which links should contain which attributes, but if you were in a generic directory, for example, you would want it to be nofollowed.
More often than not, any link that is given editorially is fine with whatever attribute it comes with. Nofollowed links are very useful; they just don't pass PageRank.
Google is pretty smart at detecting spam links and negative SEO though, due to how these normally appear, so I wouldn't worry too much, unless you have seen something that is concerning you? You are also able to handle any negative SEO or bad links through disavowing the links in Webmaster Tools.
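If you do end up disavowing, the file you upload through Webmaster Tools is just a plain text list of URLs and domains, one per line, with # for comments. A minimal example (domains here are placeholders):

```
# Spammy directory - owner did not respond to removal request
domain:spam-directory.example.com

# Individual spammy page linking to my site
http://badlinks.example.net/page-linking-to-me.html
```

Disavowing should be a last resort, after you've tried getting the links removed at the source.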
I hope this helps a little?
-Andy