Link juice and max number of links clarification
-
I understand roughly that "Link Juice" is passed by dividing PR by the number of links on a page. I also understand the juice available is reduced by some portion on each iteration.
- 50 PR page
- 10 links on page
- 50 / 10 = 5, then 5 * .9 = 4.5 PR goes to each link.
Correct?
If so and knowing Google stops counting links somewhere around 100, how would it impact the flow to have over 100 links?
IE
- 50 PR page
- 150 links on the page
- 50 / 150 = .33, then .33 * .9 ≈ .30 PR to each link, BUT only for 100 of them.
After that, the juice is just lost?
Also, I assume Google, to the best of its ability, organizes the links in order of importance such that content links are counted before footer links etc.
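The back-of-envelope model above can be sketched in a few lines of Python. Note the flat 0.9 damping factor and the hard 100-link cap are assumptions from this question, not confirmed Google behavior:

```python
# Naive "link juice" model from the question above.
# Assumptions (not documented Google behavior): a flat 0.9 damping
# factor and a hard cap of 100 counted links per page.

def juice_per_link(page_pr, num_links, damping=0.9, link_cap=100):
    """Return (PR passed to each counted link, number of links counted)."""
    counted = min(num_links, link_cap)
    per_link = (page_pr / num_links) * damping
    return per_link, counted

per_link, counted = juice_per_link(50, 10)
print(per_link, counted)            # 4.5 to each of 10 links

per_link, counted = juice_per_link(50, 150)
print(round(per_link, 2), counted)  # ~0.3 each, but only the first 100 count
```

Under this model, any juice allotted to links past the cap simply evaporates, which is exactly the "is it just lost?" question.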
-
As always in the SEO industry, there's no right answer for any particular case, but I think you've got a really structured approach to it. It would be great to know the results of your experiment. This could be a really good article in the SEOmoz community - let me know how it goes!
-
Agreed, the extreme repetition of the brand keywords and anchor text was one of my first arguments for dropping the section.
I think, from everything I've read so far, there appears to be an additional juice loss at some point, but it would be highly dependent on the trust of the page and the nature of the links. Certainly not a strong enough correlation to make part of my case, however.
-
I think that link #102 may have the same value as link #35; I don't think that adding many links diminishes the value of each one. What I assume, however, is that:
- having many links on one page diminishes the control you have over them, so Google may crawl only some of them and give a different weight to each one. That's why I'd rather use fewer links
- you're right that having more links to your pages increases the chance of those pages ranking in a better position against others. However, as I said before, beware that Google may not crawl all your links all the time. You can achieve the same proportion of importance with fewer links (e.g. 10 links vs. 2 is the same as 100 vs. 20: same weight, more control, and less internal spam risk)
- be wise when you build your links and try not to use too many anchor-rich links. Even on your own site, you don't want to let Google think you're trying to over-optimize your page or its backlink profile. Create variations of your anchors and use them all.
-
The question comes from a circumstance where hundreds of links are contained in a supplemental tab on a product detail page. They link to applications of the product - each being a full product page. On some pages there are only 40 links; on others it can be upwards of 1,000, as the product is used as a replacement part for many other products.
I am championing the removal of the links, if not the whole tab. On a few pages it would be useful to humans, but clearly not on pages with hundreds.
But if Google followed them all, then conceivably it would build a stronger "organic" structure for the catalogue, as important products would get thousands of links - others only a few.
Whatever value this might have, it would be negated if juice leaked faster after 100+ links.
From Matt's article above, "Google might choose not to follow or to index all those links." He also mentions them being a spam signal, so I think it's still wise to keep the count low even if the 100kb limit has been lifted. Clearly there are still ramifications - a concept reinforced by this site's reports and comments.
To my question...from what both of you have said, it doesn't appear there is strong evidence a very high number of links directly causes additional penalty as far as link juice is concerned.
For the record, I'm not calculating PR or stuck on exact counts - my focus always starts with the end user. But, I'd hate to have a structural item that causes undue damage.
-
The context is a parts page where potentially hundreds of links could be associated with other parts the item fits. I'm looking to firm up my argument against the concept, so I want to better understand the true impact of the section.
If it were accelerating the decay of link juice, all the more reason to remove it. If not, the links may actually help certain products appear organically stronger (i.e. a part that fits a greater number of products will have more incoming links).
Navigation is actually quite tight (under 20 links) by modern standards.
-
As eyepaq said, the 100-link limit is not the case anymore. However, even if Google is able to give value to them all, does it really make sense to have so many links on your page? Are you using fat footers? Don't rely on that structure to give value to your internal pages; if you find that 100 links on one page are needed for users to navigate your site, try to restructure it a little and create different categories.
I don't know how much value is lost after 100 links, but you should try to have smaller, themed lists of links, adding a further step to your navigation. Google won't give the same value to those pages, as users won't either.
-
Hi,
You shouldn't count those at all. If you get stuck counting and calculating PR and how much PR is passed from one page to another, you will lose focus on what does matter. This doesn't.
About the 100 links per page - that was a very old technical limitation on Google's side. That is no longer the case.
See more here: http://www.mattcutts.com/blog/how-many-links-per-page/
and a quick two-minute video from Matt Cutts here: http://www.youtube.com/watch?v=l6g5hoBYlf0
So the bottom line is that you should not count and focus on PR and how much PR is passed - only look at things from a normal user's perspective and ask yourself: does this page make sense? Does it make sense to have over 100 links on this page?
Not sure if this was the answer you are looking for but ... hope it helps.
Cheers.
-
I used 'PR' mainly because 'juice points' sounded stupid.
I'm more interested in what happens past the ~100 links.
Does the remaining juice get reallocated or does the page leak at a higher rate?
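The two possibilities in that question can be made concrete with a small sketch. Both the 0.9 damping factor and the 100-link cap are hypothetical values from this thread, not documented behavior:

```python
# Two hypothetical models for what happens past a 100-link cap.
# All numbers (the 0.9 damping, the cap itself) are assumptions
# from this thread, not documented Google behavior.

DAMPING = 0.9
CAP = 100

def juice_lost_model(page_pr, num_links):
    """Divide by ALL links; juice for links past the cap evaporates."""
    per_link = (page_pr / num_links) * DAMPING
    counted = min(num_links, CAP)
    return per_link, per_link * counted  # (per-link value, total passed)

def reallocated_model(page_pr, num_links):
    """Divide only by the links that are actually counted."""
    counted = min(num_links, CAP)
    per_link = (page_pr / counted) * DAMPING
    return per_link, per_link * counted

# 50 PR page with 150 links:
print(juice_lost_model(50, 150))   # each counted link gets ~0.30; ~30 PR passed in total
print(reallocated_model(50, 150))  # each counted link gets 0.45; 45 PR passed in total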
-
Hi Spry, as you already mentioned, not all links have the same weight. There are navigational links, like those in the footer and the menu; Google may give a different weight to each of them, some value may be reduced, and there are other factors Google uses to weight each link on a page that we don't know about but may assume exist.
So even though we can calculate an approximate value of the juice passed from one link to another, I wouldn't rely so much on PR; the time you're spending on these calculations could be given to other tasks. In general, you may assume that the best pages to obtain links from are those nearest to the homepage of a site and with the fewest outgoing links (both internal and external).
Don't rely so much on PR. I've seen so many low-PageRank pages ranking well, and high-PR pages with no rankings, that I think you need to consider other parameters which are more important when it comes to link building: age of the domain, authority, topical relevance, etc.
If your calculations are for on-site optimization, just try to have your main pages higher in your site structure and linked directly from the homepage or from main categories.