Duplicate internal links on a page: any benefit to nofollow?
-
Link spam is naturally a hot topic amongst SEOs, particularly post-Penguin. While digging around forums, I watched a video blog from Matt Cutts, posted a while ago, suggesting that Google only pays attention to the first instance of a link on a page.
As most websites will have multiple instances of a link (header, footer and body text), is it beneficial to nofollow the additional instances of that link?
Also, as the first instance of a link will in most cases be within the header nav, does that make the anchor text of the content link critical, or can good on-page optimisation be pulled from the title attribute?
I would appreciate the experiences and thoughts of fellow Mozzers on this.
Thanks in advance!
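To see how many duplicate instances a page actually has before deciding whether to nofollow any of them, a quick audit script can help. This is just an illustrative sketch (the class name and sample HTML are made up for the example); it counts how often each URL is linked from a single page:

```python
# Hypothetical audit sketch: count how many times each URL is linked
# from one page, so you can see which link is the "first instance".
from html.parser import HTMLParser
from collections import Counter

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hrefs = Counter()

    def handle_starttag(self, tag, attrs):
        # Record the href of every anchor tag we encounter.
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs[href] += 1

# Illustrative page: header nav, body text and footer all link /widgets.
page = """
<nav><a href="/widgets" title="Blue widgets">Widgets</a></nav>
<p>Read more about our <a href="/widgets">blue widgets</a>.</p>
<footer><a href="/widgets">Widgets</a> | <a href="/about">About</a></footer>
"""

parser = LinkCounter()
parser.feed(page)
duplicates = {url: n for url, n in parser.hrefs.items() if n > 1}
print(duplicates)  # {'/widgets': 3}
```

On this sample, `/widgets` is linked three times, so only the nav instance (and its anchor text) would count under the "first link" theory.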
-
Thanks Maximise and Sven, both very informative answers.
I have changed the question to a discussion, as it seems there may be no definitive answer to the above, so I look forward to hopefully seeing further input from the Moz community.
-
Hi Justin,
I think you raise a very good point. It is known that nofollow links 'leak' link juice: the juice is not passed through the link, but it isn't redistributed to the other links on the page either. The juice that would have been passed through a followed link simply vanishes when you add a nofollow attribute.
As you said, it is also suggested that Google does not count multiple instances of one link on one page. So the question that remains is: are the second, third, etc. instances of a link treated as nofollow links (leaking link juice), or are they simply ignored (not leaking link juice)?
If they are treated as nofollow links and leak link juice anyway, you might as well add a nofollow attribute and make sure you don't get penalised for them either. On the other hand, if they are normally ignored by Google but start leaking link juice once you add nofollow, you might be doing serious damage to your site.
Quite frankly, I don't know which is the case. However, my gut feeling is that the PageRank-sculpting days are over, so the above reasoning might not be the right way to think about this.
I would simply try not to 'overdo' anything too much. Don't have pages with 200+ links all pointing to perfectly optimised pages with perfectly optimised anchors. I suspect that internal linking is not the driving factor behind Penguin penalties anyway; backlinks are.
Looking forward to seeing what other people think!
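To make the two hypotheses concrete, here is a toy calculation under the classic (and admittedly outdated) PageRank-sculpting model. The function and all the numbers are purely illustrative assumptions, not how Google actually computes anything:

```python
# Toy model of how much juice a page passes on when some of its links
# are duplicates. Purely illustrative; real PageRank involves damping,
# iteration, and many unknowns.

def juice_passed(total_juice, n_links, n_duplicates, duplicates_leak):
    """Split a page's outgoing juice across its links.

    duplicates_leak=True  -> duplicates behave like nofollow links:
                             they consume a share, which evaporates.
    duplicates_leak=False -> duplicates are ignored entirely, so the
                             juice is split only among unique links.
    Returns (share per counted link, total juice actually passed on).
    """
    if duplicates_leak:
        share = total_juice / n_links            # every link takes a slice
        passed = share * (n_links - n_duplicates)
    else:
        share = total_juice / (n_links - n_duplicates)
        passed = total_juice                     # nothing evaporates
    return share, passed

# A page with 10 links, 3 of which are duplicate instances:
share_leak, passed_leak = juice_passed(1.0, 10, 3, duplicates_leak=True)
share_skip, passed_skip = juice_passed(1.0, 10, 3, duplicates_leak=False)
print(round(passed_leak, 2))  # 0.7 -> 30% of the juice evaporates
print(round(passed_skip, 2))  # 1.0 -> nothing is lost
```

Under the "leak" hypothesis 30% of the page's juice is gone either way, so nofollowing the duplicates costs nothing extra; under the "ignore" hypothesis, adding nofollow would be what creates the loss.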
Greets,
Sven Witteveen
Expand Online
-
I think they only pay attention to the anchor text of the first link; this would stop people stuffing multiple links to the same page, each targeting a different term. So if you have a few text links to the same page, make sure the first one contains your primary key phrase.
Nofollowing the rest of the links wouldn't have any positive effect. The total link juice a page can pass on is divided by the total number of links on the page (regardless of follow or nofollow). In fact, it can have a slightly negative effect, as the juice assigned to the nofollowed links essentially evaporates instead of being passed to other pages on your site. Here is an article on how this works.