Nofollow tags
-
So on the homepage, should all the links like privacy, contact us, etc. be rel="nofollow"?
I want to get a better handle on passing as much link juice as I can from the homepage to important internal pages, and I want to get it right.
Thanks in advance.
-
What about 12 outbound links to external client sites not related to your service?
-
Unfortunately, if you can't place a NOINDEX meta tag due to limitations of the CMS, then you probably won't be able to place a rel=nofollow either... leaving you with a disallow in your robots.txt.
-
What if you can't place NOINDEX into the HTML head (a limitation of the CMS)? Would an exclusion in robots.txt be enough on its own (or at least better than nofollowing links to the page)?
-
Simply exclude, or 'disallow', the file path in robots.txt. Then place a NOINDEX, NOFOLLOW meta tag on those pages (in the HTML head, before the body). If you have important links on those pages, then use the NOINDEX, FOLLOW meta tag instead. I hope this helps... please ask for clarification if you need it.
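To make that concrete, here is a rough sketch, assuming the pages in question live under a hypothetical /legal/ directory (adjust the path to your own URLs). In robots.txt:

    User-agent: *
    Disallow: /legal/

And in the HTML head of each excluded page, one of the two meta tags described above:

    <meta name="robots" content="noindex, nofollow">
    <meta name="robots" content="noindex, follow">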
-
Yes - follow the link in my expanded answer above... the link points to Matt Cutts' original article from February 2009 explaining how/when/why the change was made.
-
"They changed this (I think in 2009) to : If you had 10 links on a page and 5 were nofollowed each link would still only pass on 1 PR point. The remaining 5 points essentially disappear into thin air."
R u 100% sure about this? any sources to back this up?
Thanks
-
You are "over my head" lol.
So for sitewide contact, privacy, etc...what is the best thing to do?
Thanks!
-
Haha! For some reason I didn't see the other post... thought I was the only responder.
Be well!
-
Anthony, I never said I disagree with you. I did not see your answer at first; I must have opened the thread before you posted it. Reading your answer now, yes, we are in agreement.
-
I'm confused about what you are disagreeing with me about... there is the meta NOFOLLOW tag placed at the page level and the more granular rel=nofollow attribute at the link level. They are not interchangeable; they simply give more macro or micro control over links on a page. If you read my answer carefully, you will see that we are in complete agreement about link decay when using the rel=nofollow attribute on individual links.
-
No, you should not.
When the nofollow attribute first came out, you could "sculpt" PageRank by choosing which pages to pass it on to; this is no longer the case. Google made a change a few years back to stop people from doing this. An example:
When nofollow first came out: if your page had 10 links on it, each link would pass on 1 point of PageRank (PR). If you nofollowed 5 of these links, each link without the nofollow attribute would then pass on 2 points.
They changed this (I think in 2009) to: if you had 10 links on a page and 5 were nofollowed, each followed link would still only pass on 1 PR point. The remaining 5 points essentially disappear into thin air.
So by adding nofollow to internal links you are wasting your PR; instead, let it be passed on to your less important pages, which will return a certain amount back to the top level if your linking structure is correct. Only use nofollow on external links to which you don't want to pass PR, e.g. if the target could be considered a bad neighbourhood. This may not be 100% how it works, but the basic concept is correct; there are extensive explanations of this on Matt Cutts' blog.
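To put the same numbers side by side (a simplified illustration using the 10-link page from the example above):
- Before the change: 10 PR points spread over the 5 followed links = 2 points passed per followed link.
- After the change: 10 PR points spread over all 10 links = 1 point per followed link, and the 5 points assigned to the nofollowed links are simply lost.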
-
First there was the NOFOLLOW meta tag for page-level exclusion, and then Google adopted the more granular rel=nofollow attribute for individual links on a page. I find that too many SEOs overuse the rel=nofollow attribute when there is a much more elegant solution available. The reason for this overuse is the tactic formerly known as PageRank sculpting, which was widely abused and is now more myth than method. I had a well-known culture/nightlife site in NYC as a client that had placed literally thousands of rel=nofollow attributes on links throughout the site... granted, this does not seem to be your problem, but I digress...
To illustrate my point, Matt Cutts discusses how rel=nofollow attributes affect how Google passes PageRank to other parts of your site (or, more precisely, how nofollows decay the amount of link juice passed). In the case of a few pages or even large directories, I would do the following:
- Disallow crawling of less valuable pages via robots.txt
- Use the meta exclusion NOINDEX, NOFOLLOW tag at the page level - unless those pages pass valuable link juice/anchor text to other parts of the site, in which case use NOINDEX, FOLLOW (the page is not indexed but its important links are followed)
- Also, leave these pages out of your XML sitemap(s) - although you may want to leave them in the HTML sitemap and place a granular rel=nofollow at the link level, e.g. for a 404 error page (for usability) or a required privacy statement on landing pages (a quick markup sketch of both options follows this list)
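A minimal markup sketch of the two levels of control mentioned in the bullets above (the URL and anchor text are placeholders):

    <meta name="robots" content="noindex, follow">

    <a href="/privacy-policy" rel="nofollow">Privacy Policy</a>

The first line sits in the page head and keeps the page out of the index while letting its links pass value; the second is a single link in the HTML sitemap (or a landing page footer) that you don't want to pass value through.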
Saving your Googlebot crawl budget for only high-value pages is a great way to get more of those pages into the Google index, providing you with more opportunity to promote your products, services, etc. Also, limiting the number of rel=nofollows used and allowing link juice (or PageRank) to flow more freely throughout your site will prove beneficial.
-
There was a time I would have said yes. Nowadays it's hardly worth the trouble.
However, if it's easy to implement, why not? You might get some marginal benefit out of it.