Nofollow tags
-
So on the homepage, should all the links like privacy, contact us, etc. be rel="nofollow"?
I want to get a better handle on passing as much link juice as I can from the homepage to important internal pages, and I want to get it right.
Thanks in advance.
-
What about 12 outbound links to external client sites not related to your service?
-
Unfortunately, if you can't place a NOINDEX meta tag due to limitations of the CMS, then you probably won't be able to place a rel=nofollow either... leaving you with a Disallow in your robots.txt.
-
What if you can't place NOINDEX into the HTML head (a limitation of the CMS)? Would an exclusion in robots.txt be enough on its own, or at least better than nofollowing links to the page?
-
Simply exclude or 'disallow' the file path in robots.txt. Then place a NOINDEX, NOFOLLOW meta tag on those pages (in the HTML head, before the body). If you have important links on those pages, then use the meta tag NOINDEX, FOLLOW instead (a rough sketch of both pieces follows below). I hope this helps... please ask for clarification if you need it.
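As a rough sketch of both pieces (the /low-value-section/ path is a hypothetical placeholder - substitute whatever file paths your CMS actually exposes):

```
# robots.txt - block crawling of the low-value section
User-agent: *
Disallow: /low-value-section/
```

And in the HTML head of the pages themselves:

```html
<!-- Page should not be indexed and its links should not be followed -->
<meta name="robots" content="noindex, nofollow">

<!-- Or, if the page contains important links: keep it out of the index
     but still let those links be followed -->
<meta name="robots" content="noindex, follow">
```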
-
Yes - follow the link in my expanded answer above... the link points to Matt Cutts' original article from February 2009 explaining how/when/why the change was made.
-
"They changed this (I think in 2009) to : If you had 10 links on a page and 5 were nofollowed each link would still only pass on 1 PR point. The remaining 5 points essentially disappear into thin air."
Are you 100% sure about this? Any sources to back it up?
Thanks
-
You are "over my head" lol.
So for sitewide contact, privacy, etc., what is the best thing to do?
Thanks!
-
Haha! For some reason I didn't see the other post... thought I was the only responder.
Be well!
-
Anthony, I never said I disagree with you. I did not see your answer at first; I must have opened the thread before you posted it. Reading your answer now: yes, we are in agreement.
-
I'm confused about what you are disagreeing with me about... there is the meta NOFOLLOW tag that is placed at the page level, and the more granular rel=nofollow attribute at the link level. They are not interchangeable, but they simply give more macro or micro control over links on a page. If you read my answer carefully, you will see that we are in complete agreement about link decay when using the rel=nofollow attribute on individual links.
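For anyone following along, here is a minimal sketch of the two mechanisms being compared (the URL is just a placeholder):

```html
<!-- Macro / page level: a meta NOFOLLOW tells crawlers not to follow
     ANY of the links on this page -->
<meta name="robots" content="nofollow">

<!-- Micro / link level: a rel="nofollow" attribute applies only to the
     individual link it is placed on -->
<a href="https://www.example.com/some-page" rel="nofollow">Some page</a>
```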
-
No, you should not.
When the nofollow tag first came out you could "sculpt" PageRank by controlling which pages you passed it on to. This is no longer the case: Google made a change a few years back to stop people from doing this. An example:
When nofollow first came out: if your page had 10 links on it, each link would pass on 1 point of PageRank (PR). If you nofollowed 5 of these links, then each link without the nofollow tag would pass on 2 points.
They changed this (I think in 2009) to: if you had 10 links on a page and 5 were nofollowed, each link would still only pass on 1 PR point. The remaining 5 points essentially disappear into thin air.
So by adding nofollow to internal pages you are wasting your PR. Rather, let it be passed on to your less important pages, which will return a certain amount back to the top level if your linking structure is correct. Only use nofollow for external links to which you don't want to pass PR, e.g. if the destination could be considered a bad neighbourhood. This may not be 100% how it works, but the basic concept is correct; there are extensive explanations of this on Matt Cutts' blog.
-
First there was the NOFOLLOW meta tag for page-level exclusion, and then Google adopted the more granular rel=nofollow attribute for individual links on a page. I find that too many SEOs overuse the rel=nofollow attribute when there is a much more elegant solution available. The reason for this is the now-debunked myth formerly known as the abused tactic of PageRank sculpting. I had a well-known culture/nightlife site in NYC as a client that had placed literally thousands of rel=nofollow attributes on links throughout the site... granted, this does not seem to be your problem, but I digress...
To illustrate my point, Matt Cutts discusses how rel=nofollow attributes affect how Google passes PageRank to other parts of your site (or, more precisely, how nofollows decay the amount of link juice passed). In the case of a few pages or even large directories, etc., I would do the following:
- Disallow crawling of less valuable pages via robots.txt
- Use the meta exclusion NOINDEX, NOFOLLOW tag at the page level - unless these pages pass valuable link juice/anchor text to other parts of the site, in which case use NOINDEX, FOLLOW (the page is not indexed but its important links are followed)
- Also, leave these pages out of your XML sitemap(s) - although you may want to leave them in the HTML sitemap and place a granular rel=nofollow at the link level in the case of a 404 error page (for usability purposes) or a required privacy statement for landing pages (see the sitemap sketch below).
Saving your Googlebot crawl budget for only high-value pages is a great way to get more of those pages into the Google index, providing you with more opportunity to promote your products, services, etc. Also, limiting the number of rel=nofollows used and allowing link juice (or PageRank) to flow more freely throughout your site will prove beneficial.
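To make the sitemap point above concrete, here is a minimal sketch of an XML sitemap that simply omits the low-value pages (all URLs are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only the high-value pages you want Googlebot to spend crawl budget on -->
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/services/</loc></url>
  <url><loc>https://www.example.com/products/widget-a/</loc></url>
  <!-- Privacy policy, 404 page, etc. are simply not listed here; they can
       still appear in the HTML sitemap with rel="nofollow" on the link -->
</urlset>
```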
-
There was a time I would have said yes. Nowadays it's hardly worth the trouble.
However, if it's easy to implement, why not? You might get some marginal benefit out of it.