Transferring link juice on a page with over 150 links
-
I'm building a resource section that will hopefully attract a lot of external links. The problem is that the main index page will carry a large number of links (around 150 internal links: 120 pointing to resource sub-pages and 30 being the site's navigational links), so it will dilute the link juice being passed and possibly waste some of it. Each of those 120 sub-pages will in turn contain about 50-100 external links plus the 30 internal navigational links.
To better visualise the matter, think of this resource as a collection of hundreds of blogs, categorised by domain on the index page into those 120 sub-pages.
The question is how to build the primary page (the one with 150 links) so that it passes the most link juice to the site. Or do you think this is OK and I shouldn't be worried about it? (I know there used to be a roughly 100-links-per-page limit.) Any ideas?
Many thanks
-
Hi Jane,
Many thanks for that, good news indeed.
-
Hey Flo,
Good news! This went up literally yesterday: http://www.seroundtable.com/google-link-unlimited-18468.html
See the longer discussion here: https://productforums.google.com/forum/#!topic/webmasters/alde4GNOWp0/discussion
This is the first time a Googler has confirmed that there is no limit on the crawling of internal links.
-
Hi Jane,
Many thanks for your thoughts, highly appreciated.
The website itself has below-average authority, I'd say (DA 30 and PR 3), and that main index page will probably have just a few sentences. It will essentially be an index that sends visitors to the right sub-page, where the actual content lives; solely a list of links that puts users on the right track.
Any idea whether successfully fetching as Google (from within GWT) indicates that they will crawl the page entirely? I know the supposed 100-links-per-page limit was really more of a 100KB crawl limit, and it's very hard to believe that this is still true, as the average page size nowadays is north of 300KB (https://developers.google.com/speed/articles/web-metrics).
Thanks
-
Hi Flo,
I think it totally depends on the usefulness of the page and perhaps the ratio of links to supporting text / resources. A page with 150+ links certainly isn't automatically useless, and many big sites have that number of links on most of their pages due to navigational elements, long blog posts with multiple citations, outbound links from comments (albeit nofollowed), etc.
The 100 links per page limit is (hopefully) rather outdated and stems from when Google visited each page with a limit of how much data it could process. Very long, heavy pages would not be properly crawled, so their links wouldn't be properly crawled either, and PR would not pass properly out of all the links on the page. Google is now much more advanced and is used to dealing with a wide variety of page sizes.
A page with over 150 links can certainly perform well and there should not be a problem with appearing spammy or overdone if each link is contributing to the aim of the page. As I said previously, I would also try to include a fairly good amount of supporting text (or whatever type of supporting resource is appropriate - images, video, etc.) so that the page is not solely a list of links.
Hope this helps!
Jane
Related Questions
-
Link to AMP VS AMP Google Cache VS Standard page?
Hi guys, during link building, which version should I prefer as a link destination: the normal version (PHP page), the AMP page of the website, or the AMP page in the Google cache? The main doubt is between the AMP version of the website and the standard version. Does the canonical meta make the situations equivalent, or is there a better solution? Thank you so much!
Technical SEO | Dante_Alighieri
-
On our site, some wrong links were entered by mistake and Google crawled them. We have fixed those links, but they still show up as Not Found errors. Should we just mark them as fixed, or what is the best way to deal with them?
Some parameter was not sent, so the link was read as null/city and null/country instead of cityname/city.
Technical SEO | Lybrate0606
-
How to find all crawlable links on a particular page?
Hi! This might sound like a newbie question, but I'm trying to find all the crawlable links (that Googlebot sees) on a particular page of my website. I'm trying to use Screaming Frog, but that gives me all the links on that particular page AND on all subsequent pages in the given sub-directory. What I want is ONLY the crawlable links pointing away from a particular page. What is the best way to go about this? Thanks in advance.
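One lightweight way to get only a single page's outgoing links, without a crawler following them into the sub-directory, is to parse that one page's HTML yourself. Here's a minimal sketch using only Python's standard library; the function name `crawlable_links` is my own, and note it only sees links present in the static HTML (anything injected by JavaScript won't show up), while skipping `rel="nofollow"` anchors:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags on one page, skipping
    fragment/mailto/javascript hrefs and rel="nofollow" anchors."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href or href.startswith(("#", "javascript:", "mailto:")):
            return
        rel = attrs.get("rel") or ""
        if "nofollow" in rel:
            return
        # Resolve relative URLs against the page's own URL
        self.links.append(urljoin(self.base_url, href))

def crawlable_links(html, base_url):
    """Return the list of followable link targets found in `html`."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

# Example usage against a live URL (uncomment to run):
# import urllib.request
# html = urllib.request.urlopen("https://example.com/").read().decode("utf-8")
# print(crawlable_links(html, "https://example.com/"))
```

Feed it the raw HTML of just the one URL and you get that page's links only, with nothing recursed.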
Technical SEO | AB_Newbie
-
Pages not being indexed
Hi Moz community! We have a client for whom some of their pages are not ranking at all, although they do seem to be indexed by Google. They are in the real estate sector and this is an example of one: http://www.myhome.ie/residential/brochure/102-iveagh-gardens-crumlin-dublin-12/2289087 In the example above if you search for "102 iveagh gardens crumlin" on Google then they do not rank for that exact URL above - it's a similar one. And this page has been live for quite some time. Anyone got any thoughts on what might be at play here? Kind regards. Gavin
Technical SEO | IrishTimes
-
If my home page never shows up in SERPS but other pages do, does that mean Google is penalizing me?
So the website I do local SEO for, xyz.com, is finally getting better on some keywords (thanks, SEOmoz!), but only on pages like xyz.com/better_widgets_ or xyz.com/mousetrap_removals. Is Google possibly penalizing me for some duplicate content websites I have out there (working on it, I know, I know, it is bad)...
Technical SEO | greenhornet77
-
Testimonial pages
Is it better to have one long testimonial page on your site, or break it down into several smaller pages with testimonials? First time I've posted on the forum. But I'm excited! Ron
Technical SEO | yatesandcojewelers
-
How can I prevent duplicate content between www.page.com/ and www.page.com
SEOMoz's recent crawl showed me that I had errors for duplicate content and duplicate page titles. This is a problem because it found the same page twice due to a '/' on the end of one URL, e.g. www.page.com/ vs. www.page.com. My question is: do I need to be concerned about this? And is there anything I should put in my htaccess file to prevent this from happening? Thanks!
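One common answer, assuming the site runs on Apache with mod_rewrite enabled, is to 301-redirect one form to the other so only a single URL ever resolves. Whether you standardise on the trailing slash or the slash-less version is your choice; just pick one and apply it consistently. A hedged .htaccess sketch that redirects slashed URLs to the slash-less form:

```apache
RewriteEngine On
# Leave real directories alone (their canonical form usually keeps the slash)
RewriteCond %{REQUEST_FILENAME} !-d
# 301-redirect /anything/ to /anything so only one version gets indexed
RewriteRule ^(.+)/$ /$1 [R=301,L]
```

Alternatively (or in addition), a rel="canonical" tag pointing at the preferred version tells search engines which URL should receive the credit.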
Karl
Technical SEO | onlineexpression
-
Duplicate Page Content and Title for product pages. Is there a way to fix it?
We were doing pretty well with our SEO until we added product listing pages. The errors are mostly Duplicate Page Content/Title, e.g. the title "Masterpet | New Zealand Products" appears on both MasterPet product page 1 and product page 2. Because the list of products is displayed across several pages, the crawler detects that these two URLs have the same title. We've gone from 0 errors two weeks ago to 14k+ errors. Is this something we could fix, or should bother fixing? Will our SERP ranking suffer because of this? Hoping someone could shed some light on this issue. Thanks.
Technical SEO | Peter.Huxley59