Link juice and max number of links clarification
-
I understand roughly that "Link Juice" is passed by dividing PR by the number of links on a page. I also understand the juice available is reduced by some portion on each iteration.
- 50 PR page
- 10 links on page
- 50 / 10 = 5, then 5 * .9 = 4.5 PR goes to each link.
Correct?
If so, and given that Google reportedly stops counting links somewhere around 100, how would having over 100 links impact the flow?
For example:
- 50 PR page
- 150 links on the page
- 50 / 150 ≈ .33, then .33 * .9 ≈ .30 PR to each link, BUT only for 100 of them.
After that, the juice is just lost?
Also, I assume Google, to the best of its ability, organizes the links in order of importance such that content links are counted before footer links etc.
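To put the same arithmetic in code, here's a toy sketch of the model as I understand it. The 0.9 retention factor and the hard 100-link cutoff are assumptions for illustration only; Google has never published either number.

```python
# Toy model of the naive "link juice" arithmetic described above.
# The 0.9 retention and the 100-link cap are assumptions, not facts.

def juice_per_link(page_pr, num_links, retention=0.9, link_cap=100):
    """Return (juice per counted link, juice lost past the cap)."""
    per_link = (page_pr / num_links) * retention
    counted = min(num_links, link_cap)
    lost = per_link * (num_links - counted)  # juice assumed to evaporate
    return per_link, lost

print(juice_per_link(50, 10))   # -> (4.5, 0.0)     the first example
print(juice_per_link(50, 150))  # -> (~0.3, ~15.0)  50 links past the cap
```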
-
As always in the SEO industry, there's no right answer for any particular case, but I think you've got a really structured approach to it. It would be great to know the results of your experiment. This could make a really good article for the SEOmoz community; let me know how it goes!
-
Agreed, the extreme repetition of the brand keywords and anchor text was one of my first arguments for dropping the section.
I think, from everything I've read so far, there appears to be an additional juice loss at some point, but it would be highly dependent on the trust of the page and the nature of the links. Certainly not a strong enough correlation to make part of my case, however.
-
I think that link #102 may have the same value as link #35; I don't think that adding many links diminishes the value of each one. What I assume, however, is that:
- having many links on one page diminishes the control you have over them, so Google may crawl only some of them and give a different weight to each one. That's why I'd rather put fewer links
- you're right that having more links to your pages increases the chance of those pages ranking better against others. However, as I said before, beware that Google may not crawl all your links all the time. You can achieve the same proportion of importance with fewer links (e.g. 10 links vs. 2 is the same as 100 vs. 20: same relative weight, more control, and less internal spam risk; see the sketch after this list)
- be wise when you build your links and try not to use too many anchor-rich links. Even onsite, you don't want to let Google think you're trying to over-optimize your page or its backlink profile. Create variations of your anchors and use them all.
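To illustrate the proportion point in the second item above, here's a minimal sketch; it's pure arithmetic, not Google's actual math.

```python
# Two internal pages, A and B: the relative weight between them depends
# only on the ratio of links, not on the absolute count.

def relative_weight(links_to_a, links_to_b):
    total = links_to_a + links_to_b
    return links_to_a / total, links_to_b / total

print(relative_weight(10, 2))    # -> (0.833..., 0.166...)
print(relative_weight(100, 20))  # -> (0.833..., 0.166...)  same split
```

Same proportions, far fewer links to keep under control.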
-
The question comes from a circumstance where hundreds of links are contained in a supplemental tab on a product detail page. They link to applications of the product, each being a full product page. On some pages there are only 40 links; on others it can be upwards of 1000, as the product is used as a replacement part for many other products.
I am championing the removal of the links, if not the whole tab. On a few pages it would be useful to humans, but clearly not on pages with hundreds.
But if Google followed them all, then conceivably it would build a stronger "organic" structure for the catalogue, as important products would get thousands of links and others only a few.
Whatever value this might have, it would be negated if juice leaked faster after 100+ links.
From Matt's article above: "Google might choose not to follow or to index all those links." He also mentions large link counts being a spam signal, so I think it is still wise to keep the count low even though the 100KB page-size limit has been lifted. Clearly there are still ramifications, a concept reinforced by this site's reports and comments.
To my question: from what both of you have said, it doesn't appear there is strong evidence that a very high number of links directly causes an additional penalty as far as link juice is concerned.
For the record, I'm not calculating PR or stuck on exact counts - my focus always starts with the end user. But, I'd hate to have a structural item that causes undue damage.
-
The context is a parts page where potentially hundreds of links could be associated with other parts the item fits. I'm looking to firm up my argument against the concept, so I want to better understand the true impact of the section.
If it were accelerating the decay of link juice, all the more reason to remove it. If not, the links may actually help certain products appear organically stronger (i.e. a part that fits a greater number of products will have more incoming links).
Navigation is actually quite tight (under 20 links) by modern standards.
-
As eyepaq said, the 100-link limit is no longer the case. However, even if Google is able to give value to them all, does it really make sense to have so many links on your page? Are you using fat footers? Don't rely on that structure to give value to your internal pages. If you find that 100 links on one page are needed for users to navigate through your site, try to restructure it a little and create different categories.
I don't know how much value is lost after 100 links, but you should try to have smaller, themed lists of links, adding a further step to your navigation. Google won't give the same value to those pages, as users won't either.
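As a rough back-of-envelope comparison (the 0.9 retention figure here is entirely made up), the extra hop costs only a little juice per product while buying you themed, controllable lists:

```python
# Linking 100 products directly vs. via 10 themed category pages.
RETENTION = 0.9   # assumed retention per hop; not a published figure
home_pr = 50.0

# Flat: 100 product links straight from the page.
flat = home_pr / 100 * RETENTION

# Tiered: 10 category links; each category page links 10 products.
category_pr = home_pr / 10 * RETENTION
tiered = category_pr / 10 * RETENTION

print(f"flat per product:   {flat:.4f}")    # 0.4500
print(f"tiered per product: {tiered:.4f}")  # 0.4050
```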
-
Hi,
You shouldn't count those at all. If you get stuck counting and calculating PR and how much PR is passed from one page to another, you will lose focus on what does matter. This doesn't.
About the 100 links per page - that was a very old technical limitation on Google's side. That is no longer the case.
See more here: http://www.mattcutts.com/blog/how-many-links-per-page/
and a quick two-or-so-minute video from Matt Cutts here: http://www.youtube.com/watch?v=l6g5hoBYlf0
So the bottom line is that you should not count and focus on PR and how much PR is passed. Just look at things as a normal user and ask yourself: does this page make sense? Does it make sense to have over 100 links on this page?
Not sure if this was the answer you were looking for, but... hope it helps.
Cheers.
-
I used 'PR' mainly because 'juice points' sounded stupid.
I'm more interested in what happens past the ~100 links.
Does the remaining juice get reallocated or does the page leak at a higher rate?
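To make the two possibilities concrete (both are pure speculation on my part, not documented behavior):

```python
# Two hypothetical treatments of links past a 100-link cap.

def capped_split(page_pr, num_links, retention=0.9, cap=100,
                 reallocate=False):
    counted = min(num_links, cap)
    pool = page_pr * retention
    if reallocate:
        return pool / counted   # leftover juice shared among the first 100
    return pool / num_links     # juice past the cap simply evaporates

print(capped_split(50, 150, reallocate=True))   # -> 0.45 per counted link
print(capped_split(50, 150, reallocate=False))  # -> 0.30 per counted link
```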
-
Hi Spry, as you already mentioned, not all links have the same weight. There are navigational links, like those in the footer or the menu, and Google may distribute weight differently among them; some value may be reduced, and there are other factors Google uses to weight each link on a page that we don't know about, but may assume exist.
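If it helps to picture that, here's a minimal sketch of a weighted split instead of an equal one. The per-type weights are pure invention on my part; nobody outside Google knows the real values.

```python
# Hypothetical per-type link weights -- invented for illustration.
LINK_WEIGHTS = {"content": 1.0, "menu": 0.5, "footer": 0.2}

def distribute(page_pr, link_types, retention=0.9):
    """Split a page's juice across links by type instead of equally."""
    total_weight = sum(LINK_WEIGHTS[t] for t in link_types)
    pool = page_pr * retention
    return [pool * LINK_WEIGHTS[t] / total_weight for t in link_types]

page_links = ["content", "content", "menu", "footer"]
print(distribute(50, page_links))  # content links get 5x a footer link
```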
So given that we can only calculate an approximate value of the juice passed from one link to another, I wouldn't rely so much on PR; the time you're spending on these calculations could be given to other tasks. In general, you may assume that the best pages to obtain links from are pages which are nearest to the homepage of a site and which have the fewest outgoing (both internal and external) links.
Don't rely so much on PR. I've seen so many low-PageRank pages ranking well, and high-PR pages with no rankings, that I think you need to consider other parameters which are more important when it comes to link building: age of the domain, authority, topical relevance, etc.
If your calculations are made for onsite optimization, just try to have your main pages higher in your site structure and linked directly from the homepage or from main categories.