Link Juice + multiple links pointing to the same page
-
Scenario
The website has a menu consisting of 4 links: Home | Shoes | About Us | Contact Us
Additionally, within the body content we write about various shoe types. We create a link with the anchor text "Shoes" pointing to www.mydomain.co.uk/shoes
In this simple example, we have 2 instances of a link pointing to the same URL.
We have 4 unique links.
In total we have 5 on-page links.
Question
How many links would Google count as part of the link juice model?
How would the link juice be weighted in terms of percentages?
Would changing the anchor text in the body content to, say, "fashion shoes" have a different impact?
Any other advice or best practice would be appreciated.
Thanks Mark
-
Hi Remus & Kurt,
Thank you for your advice.
Mark
-
Remus's answer is good. I would add that Google has a first-link filter. If you have two links pointing from page A to page B, Google only passes link authority (PageRank) and reputation (keywords in the anchor text and relevant surrounding text) through the first link that appears in the code. The second link does not pass anything. So whatever the anchor text of the first link in the code is, that's the anchor text Google is going to use (Remus is right that anchor text has become less important).
The second link does, however, dilute the amount of PageRank passed. So, as Remus pointed out, each link in your scenario only passes 20% of the PageRank. Since Google ignores the second link to the shoes page, that 20% of PageRank does not get passed. I'm not sure if it stays on the page or just gets lost.
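To make that arithmetic concrete, here's a toy Python sketch of the model being discussed. This is just an illustration of the split described above, not Google's actual algorithm, and the URLs are made up to match Mark's scenario:

```python
def link_juice(targets, passable_pagerank=1.0):
    """Toy model: each of the N on-page links is worth 1/N of the
    page's passable PageRank, and (per the first-link filter) a
    second link to an already-linked URL passes nothing."""
    share = passable_pagerank / len(targets)
    passed = {}
    seen = set()
    for url in targets:
        if url in seen:
            continue  # duplicate link: its 1/N share is assumed lost
        passed[url] = share
        seen.add(url)
    return passed

# Mark's scenario: Home, Shoes, About Us, Contact Us in the menu,
# plus a second "Shoes" link in the body copy.
links = ["/", "/shoes", "/about-us", "/contact-us", "/shoes"]
for url, juice in link_juice(links).items():
    print(f"{url}: {juice:.0%}")
```

Under this model each unique page receives 20%, and the duplicate shoes link's 20% simply goes unpassed (only 80% of the passable PageRank leaves the page), which is the open question above about whether it stays on the page or is lost.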
So, what does this all mean? From an SEO standpoint, you want the link with the targeted keyword to be first in the code if you have more than one link to a page. Also, you don't really want to have two links to the same page on that one page. Now, that's from an SEO perspective. From a user perspective, it may make perfect sense to have that second link and the page may convert better. So, you'd just have to decide which is more important...and it's probably the user perspective that's more important.
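If you want to audit which link "wins" under the first-link idea, you can check source order mechanically. A minimal sketch using Python's standard html.parser (the markup and URLs below are invented for illustration):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect (href, anchor text) pairs in source order, so we can
    see which link to a given URL appears first in the code."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

def first_anchor_text(html, url):
    """Return the anchor text of the first link in the code pointing
    to `url`, i.e. the one a first-link filter would presumably use."""
    parser = LinkCollector()
    parser.feed(html)
    for href, text in parser.links:
        if href == url:
            return text
    return None

page = """
<nav><a href="/">Home</a> <a href="/shoes">Shoes</a>
<a href="/about">About Us</a> <a href="/contact">Contact Us</a></nav>
<p>Read about our <a href="/shoes">fashion shoes</a>.</p>
"""
print(first_anchor_text(page, "/shoes"))  # "Shoes" -- the menu link wins
```

In this example the menu's "Shoes" link comes before the in-content "fashion shoes" link, which is exactly the situation where you'd consider reordering if the in-content anchor is the one you want counted.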
Kurt Steinbrueck
OurChurch.Com
-
Hi Mark, really good questions.
- How many links would Google count as part of the link juice model?
There are a lot of opinions on this subject and no clear answer (it's really hard to test). Some time ago Google removed the effect of the "nofollow" attribute for internal links, to cut the advantage SEOs gained through "PageRank sculpting". I think they did this so that search engine optimizers don't have a big advantage over standard websites. My personal opinion is that in terms of link juice lost, Google would count 5, but the benefiting page won't get double the value. I think Google would only count the advantages of one of those links, whichever is best (probably the one in the content). On the other hand, the link juice lost is not so important; the rest of the pages won't necessarily rank for popular terms.
I think that in-content links get far more advantages than just the "juice" and anchor text. The neighboring text is also important, and so is the fact that the link sits in a block of text. It also brings value to the users, who might want to see all the shoe models when reading about them. I think you should definitely use this approach, but just make sure you don't take it to an extreme.
-
20% to each link, but the shoes page won't get 20% x 2 from those 2 duplicates; maybe it will get 25% plus some other advantages (personal opinion!)
-
Changing anchor text had some effect in the past, but recently anchor text has had less and less importance. It probably still has value; it's still an important ranking factor for 2013, and I would use it if I were you. But I would take it to a new level and also think about the words in the context of the link. Try to link from all the relevant sections of the website, and as you point to the shoes page from different contexts, the anchor text will change naturally. For example, you could link through "shoe collection" from an article that compares your shoes with competitor shoes.
I wrote an article for YouMoz a few years ago; some concepts might be a bit outdated because ranking factors have changed a lot since then, but it might give you some ideas to explore from a new perspective -> An Intelligent Way to Plan Your Internal Linking Structure