Nofollow tags
-
So on the homepage, should all the links like privacy, contact us, etc. be rel="nofollow"?
I want to get a better handle on passing as much link juice from the homepage to important internal pages as I can, and I want to get it right.
Thanks in advance.
-
What about 12 outbound links to external client sites not related to your service?
-
Unfortunately, if you can't place a NOINDEX meta tag due to limitations of the CMS, then you probably won't be able to place a rel=nofollow either... leaving you with a disallow in your robots.txt.
-
What if you can't place NOINDEX into the HTML head (a limitation of the CMS)? Would an exclusion in robots.txt be enough on its own (or at least better than nofollow links to the page)?
-
Simply exclude or 'disallow' the file path in robots.txt. Then place a NOINDEX, NOFOLLOW meta tag on those pages (in the HTML head, before the body). If you have important links on those pages, then use the NOINDEX, FOLLOW meta tag instead. I hope this helps... please ask for clarification if you need it.
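As a sketch of the setup described above (the paths are hypothetical examples, not from the original thread):

```
# robots.txt: block crawling of the low-value paths
User-agent: *
Disallow: /privacy.html
Disallow: /terms.html
```

```html
<!-- In the HTML head: keep the page out of the index but still follow its links -->
<meta name="robots" content="noindex, follow">
<!-- Or, if the page has no important links on it -->
<meta name="robots" content="noindex, nofollow">
```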
-
Yes, follow the link in my expanded answer above... the link points to Matt Cutts' original article from February 2009 explaining how/when/why the change was made.
-
"They changed this (I think in 2009) to : If you had 10 links on a page and 5 were nofollowed each link would still only pass on 1 PR point. The remaining 5 points essentially disappear into thin air."
Are you 100% sure about this? Any sources to back this up?
Thanks
-
You are "over my head" lol.
So for sitewide contact, privacy, etc. pages, what is the best thing to do?
Thanks!
-
Haha! For some reason I didn't see the other post... thought I was the only responder.
Be well!
-
Anthony, I never said I disagree with you. I did not see your answer at first; I must have opened the thread before you posted it. Reading your answer now: yes, we are in agreement.
-
I'm confused about what you are disagreeing with me about... there is the meta NOFOLLOW tag that is placed at the page level and the more granular rel=nofollow attribute at the link level. They are not interchangeable but simply give more macro or micro control over links on a page. If you read my answer carefully you will see that we are in complete agreement over link decay using the rel=nofollow attribute on individual links.
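For clarity, the two mechanisms mentioned above look like this in plain HTML (example.com is a placeholder):

```html
<!-- Page-level: the meta NOFOLLOW tag goes in the <head>
     and applies to every link on the page -->
<meta name="robots" content="nofollow">

<!-- Link-level: the rel=nofollow attribute applies to this one link only -->
<a href="http://example.com/" rel="nofollow">Example link</a>
```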
-
No, you should not.
When the nofollow tag first came out you could "sculpt" PageRank by choosing which pages to pass it on to; this is no longer the case. Google made a change a few years back to stop people from doing this. An example:
When nofollow first came out: if your page had 10 links on it, each link would pass on 1 point of PageRank (PR). If you nofollowed 5 of those links, each link without the nofollow attribute would then pass on 2 points.
They changed this (I think in 2009) to: if you had 10 links on a page and 5 were nofollowed, each link would still only pass on 1 PR point. The remaining 5 points essentially disappear into thin air.
So by adding nofollow to internal pages you are wasting your PR. Rather, let it be passed on to your less important pages, which will return a certain amount back to the top level if your linking structure is correct. Only use nofollow for external links to which you don't want to pass PR, e.g. anything that could be considered a bad neighbourhood. This may not be 100% how it works, but the basic concept is correct; there are extensive explanations of this on Matt Cutts' blog.
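The before/after arithmetic above can be sketched as a toy model (purely illustrative; Google's actual algorithm is far more complex than whole "PR points"):

```python
def pr_per_followed_link(total_pr, num_links, num_nofollowed, post_2009=True):
    """Toy model of how nofollow affected PageRank flow per link.

    Pre-2009: PR was split only among followed links, so nofollowing
    some links concentrated PR in the rest ("sculpting").
    Post-2009: PR is split across ALL links, and the shares assigned
    to nofollowed links simply evaporate.
    """
    followed = num_links - num_nofollowed
    if followed == 0:
        return 0.0
    if post_2009:
        return total_pr / num_links
    return total_pr / followed

# 10 PR points, 10 links, 5 of them nofollowed:
print(pr_per_followed_link(10, 10, 5, post_2009=False))  # 2.0 per followed link
print(pr_per_followed_link(10, 10, 5, post_2009=True))   # 1.0; the other 5 points vanish
```

Either way, nofollowing internal links never gains you PR under the post-2009 model, which is the point of the answer above.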
-
First there was the NOFOLLOW meta tag for page-level exclusion, and then Google adopted the more granular rel=nofollow attribute for individual links on a page. I find that too many SEOs overuse the rel=nofollow attribute when a much more elegant solution is available. The reason for this is the now-debunked, formerly abused tactic known as PageRank sculpting. I had a well-known culture/nightlife site in NYC as a client that had placed literally thousands of rel=nofollow attributes on links throughout the site... granted, this does not seem to be your problem, but I digress...
To illustrate my point, Matt Cutts discusses how rel=nofollow attributes affect how Google passes PageRank to other parts of your site (or, more precisely, how nofollows decay the amount of link juice passed). In the case of a few pages, or even large directories, I would do the following:
- Disallow crawling of less valuable pages via robots.txt
- Use the NOINDEX, NOFOLLOW meta exclusion tag at the page level; if these pages pass valuable link juice/anchor text to other parts of the site, use NOINDEX, FOLLOW instead (the page is not indexed but important links are followed)
- Also, leave these pages out of your XML sitemap(s), although you may want to leave them in the HTML sitemap and place a granular rel=nofollow at the link level, e.g. for a 404 error page (for usability purposes) or a privacy statement required on landing pages.
Saving your Googlebot crawl budget for only high-value pages is a great way to get more of those pages into the Google index, giving you more opportunity to promote your products, services, etc. Also, limiting the number of rel=nofollows used and allowing link juice (or PageRank) to flow more freely throughout your site will prove beneficial.
-
There was a time I would have said yes. Nowadays it's hardly worth the trouble.
However, if it's easy to implement, why not? You might get some marginal benefit out of it.