Link juice and max number of links clarification
-
I understand roughly that "Link Juice" is passed by dividing PR by the number of links on a page. I also understand the juice available is reduced by some portion on each iteration.
- 50 PR page
- 10 links on page
- 50 / 10 = 5, and 5 * .9 = 4.5 PR goes to each link.
Correct?
If so, and knowing Google stops counting links somewhere around 100, how would it impact the flow to have over 100 links?
I.e.:
- 50 PR page
- 150 links on the page
- 50 / 150 ≈ .33, and .33 * .9 ≈ .30 PR to each link, BUT only for 100 of them.
After that, the juice is just lost?
Also, I assume Google, to the best of its ability, orders the links by importance, such that content links are counted before footer links, etc.
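In code, the naive model in my head looks something like this (a back-of-envelope sketch; the hard 100-link cutoff and the .9 damping factor are my assumptions, not confirmed Google behavior):

```python
def juice_per_link(page_pr, link_count, damping=0.9, cap=100):
    """Naive model: PR is split evenly across all links, reduced by a
    damping factor, and only the first `cap` links receive any value."""
    per_link = (page_pr / link_count) * damping
    counted = min(link_count, cap)
    lost = (link_count - counted) * per_link  # juice that just evaporates?
    return per_link, counted, lost

# 50 PR page, 10 links: 4.5 PR per link, nothing lost
print(juice_per_link(50, 10))   # ~ (4.5, 10, 0.0)

# 50 PR page, 150 links: ~.30 PR per link, but only for 100 of them
print(juice_per_link(50, 150))  # ~ (0.3, 100, 15.0)
```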
-
As always in the SEO industry, there's no single right answer for any particular case, but I think you've got a really structured approach to it. It would be great to know the results of your experiment. This could make a really good article for the SEOmoz community; let me know how it goes!
-
Agreed, the extreme repetition of the brand keywords and anchor text was one of my first arguments for dropping the section.
I think, from everything I've read so far, there appears to be additional juice loss at some point, but it would be highly dependent on the trust of the page and the nature of the links. Certainly not a strong enough correlation to make part of my case, however.
-
I think that link #102 may have the same value as link #35; I don't think that adding many links diminishes the value of each one. What I assume, however, is that:
- having many links on one page diminishes the control you have over them, so Google may crawl only some of them and give a different weight to each one. That's why I'd rather use fewer links
- you're right that having more links to your pages increases the possibility of those pages ranking better against others. However, as I said before, beware that Google may not crawl all your links all the time. You can achieve the same proportion of importance with fewer links (e.g. 10 links vs. 2 is the same as 100 vs. 20: same relative weight, more control, and less internal spam risk; see the sketch after this list)
- be wise when you build your links and try not to use too many anchor-rich links. Even onsite, you don't want to let Google think you're trying to over-optimize your page or its backlink profile. Create variations of your anchors and use them all.
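A quick illustration of the ratio point in the second bullet, with toy numbers (this just shows the arithmetic, not anything Google actually computes):

```python
def importance_ratio(links_to_a, links_to_b):
    """Relative weight of page A vs. page B when weight is taken as
    proportional to the number of internal links each page receives."""
    return links_to_a / links_to_b

print(importance_ratio(10, 2))    # 5.0, with only 12 links to maintain
print(importance_ratio(100, 20))  # 5.0 again, but with 120 links to manage
```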
-
The question comes from a circumstance where hundreds of links are contained in a supplemental tab on a product detail page. They link to applications of the product, each being a full product page. On some pages there are only 40 links; on others there can be upwards of 1,000, as the product is used as a replacement part for many other products.
I am championing the removal of the links, if not the whole tab. On a few pages it would be useful to humans, but clearly not on pages with hundreds of links.
But if Google followed them all, then conceivably it would build a stronger "organic" structure for the catalogue, as important products would get thousands of links and others only a few.
Whatever value this might have, it would be negated if juice leaked faster after 100+ links.
From Matt's article above, "Google might choose not to follow or to index all those links." He also mentions them being a spam signal, so I think it's still wise to keep the count low even if the 100 KB limit has been lifted. Clearly there are still ramifications, a concept reinforced by this site's reports and comments.
To my question: from what both of you have said, it doesn't appear there is strong evidence that a very high number of links directly causes an additional penalty as far as link juice is concerned.
For the record, I'm not calculating PR or stuck on exact counts; my focus always starts with the end user. But I'd hate to have a structural item that causes undue damage.
-
The context is a parts page where potentially hundreds of links could be associated with the other parts the item fits. I'm looking to firm up my argument against the concept, so I want to better understand the true impact of the section.
If it were accelerating the decay of link juice, all the more reason to remove it. If not, the links may actually help certain products appear organically stronger (i.e. a part that fits a greater number of products will have more incoming links).
Navigation is actually quite tight (under 20 links) by modern standards.
-
As eyepaq said, the 100-links limit is no longer the case. However, even if Google is able to give value to them all, does it really make sense to have so many links on your page? Are you using fat footers? Don't rely on that structure to give value to your internal pages. If you find 100 links on one page are needed for users to navigate through your site, try to restructure it a little and create different categories (see the sketch below).
I don't know how much value is lost after 100 links, but you should try to have smaller, themed lists of links, adding a further step to your navigation. Google won't give the same value to one giant list, just as users won't either.
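For example, here's a rough back-of-envelope for splitting one big flat list into a balanced two-level structure (the numbers are illustrative only):

```python
import math

# One flat page with n links vs. a balanced two-level split: a category
# page linking to ~sqrt(n) themed pages, each holding ~sqrt(n) links.
n = 1000  # e.g. the worst-case parts tab mentioned in the question
per_page = math.ceil(math.sqrt(n))
print(per_page)  # 32 -> no page exceeds ~32 links, at the cost of one extra click
```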
-
Hi,
You shouldn't be counting those at all. If you get stuck counting and calculating PR and how much PR is passed from one page to another, you will lose focus on what does matter. This doesn't.
About the 100 links per page: that was a very old technical limitation on Google's side. That is no longer the case.
See more here: http://www.mattcutts.com/blog/how-many-links-per-page/
and a quick two-minute-or-so video from Matt Cutts here: http://www.youtube.com/watch?v=l6g5hoBYlf0
So the bottom line is that you should not count and focus on PR and how much PR is passed. Just look at things as a normal user would and ask yourself: does this page make sense? Does it make sense to have over 100 links on this page?
Not sure if this was the answer you were looking for, but ... hope it helps.
Cheers.
-
I used 'PR' mainly because 'juice points' sounded stupid.
I'm more interested in what happens past the ~100-link mark.
Does the remaining juice get reallocated or does the page leak at a higher rate?
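In other words, which of these two toy models is closer to reality? (Both are pure speculation on my part; neither is confirmed Google behavior.)

```python
def reallocated(page_pr, links, damping=0.9, cap=100):
    """Hypothesis A: the damped PR is split only among the counted links."""
    return (page_pr * damping) / min(links, cap)

def leaked(page_pr, links, damping=0.9, cap=100):
    """Hypothesis B: PR is split among ALL links, but links past the cap
    get nothing, so their share simply evaporates."""
    return (page_pr / links) * damping

print(reallocated(50, 150))  # 0.45 PR to each of the 100 counted links
print(leaked(50, 150))       # ~0.30 PR each; the other 50 shares are lost
```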
-
Hi Spry, as you already mentioned, not all links have the same weight. There are navigational links, like those in the footer and the menu; Google may distribute different weights among them, some value may be reduced, and there are other factors Google uses to weight each link on a page that we don't know about but may assume exist.
So even though we can calculate an approximate value of the juice passed from one link to another, I wouldn't rely so much on PR; the time you're spending on these calculations could be given to other tasks. In general, you may assume that the best pages to obtain links from are the ones nearest to the homepage of a site and with the fewest outgoing links (both internal and external).
Don't rely so much on PR. I've seen so many low-PageRank pages ranking well, and high-PR pages with no rankings, that I think you need to consider other parameters that are more important when it comes to link building: the age of the domain, authority, topical relevance, etc.
If your calculations are made for onsite optimization, just try to have your main pages higher in your site structure and linked directly from the homepage or from main categories.
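To make that heuristic concrete, here's a toy scoring sketch; the formula is invented purely for illustration and is not any real Google calculation:

```python
# Invented scoring, for illustration only: a "good" internal link source
# sits close to the homepage and has few outgoing links.
candidates = [
    # (url, clicks_from_homepage, outgoing_links)
    ("/",            0, 25),
    ("/category-a",  1, 40),
    ("/blog/post-1", 2, 8),
    ("/deep/page",   3, 12),
]

def source_score(depth, outlinks):
    return 1.0 / ((depth + 1) * outlinks)

for url, depth, outlinks in sorted(
        candidates, key=lambda c: source_score(c[1], c[2]), reverse=True):
    print(url, round(source_score(depth, outlinks), 4))
```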