Transferring link juice on a page with over 150 links
-
I'm building a resource section that will hopefully attract a lot of external links. The problem is that the main index page will carry a large number of links (around 150 internal links: 120 pointing to resource sub-pages and 30 being the site's navigation links), which will dilute the link juice passed through the page and possibly waste some of it. Each of those 120 sub-pages will in turn contain about 50-100 external links plus the 30 internal navigation links.
To better visualise the matter, think of this resource as a collection of hundreds of blogs categorised by subject area: the index page lists the categories, and each of the 120 sub-pages covers one category with its 50-100 external links.
The question is how to build the primary page (the one with 150 links) so that it passes the most link juice to the rest of the site. Or is this fine as it is and I shouldn't be worried about it (I know there used to be a roughly 100-links-per-page guideline)? Any ideas?
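To put rough numbers on what I mean by dilution, here is a back-of-the-envelope sketch in Python using the textbook simplified PageRank hand-off (damping * PR / outlink count); the real algorithm is obviously far more nuanced, and the figures are purely illustrative:

    # Illustration only: the textbook simplified PageRank hand-off,
    # where a page passes damping * PR / outlink_count to each link it carries.
    DAMPING = 0.85      # commonly cited damping factor
    page_pr = 1.0       # hypothetical score of the index page

    for outlinks in (30, 100, 150):
        per_link = DAMPING * page_pr / outlinks
        print(f"{outlinks} links -> each receives ~{per_link:.4f}")

    # 30 links  -> each receives ~0.0283
    # 100 links -> each receives ~0.0085
    # 150 links -> each receives ~0.0057

Going from 100 to 150 links cuts the per-link hand-off by a third in this toy model, which is what I'm worried about.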
Many thanks
-
Hi Jane,
Many thanks for that, good news indeed.
-
Hey Flo,
Good news! This went up literally yesterday: http://www.seroundtable.com/google-link-unlimited-18468.html
See the longer discussion here: https://productforums.google.com/forum/#!topic/webmasters/alde4GNOWp0/discussion
This is the first time a Googler has confirmed the lack of a limit on the crawling of internal links.
-
Hi Jane,
Many thanks for your thoughts, highly appreciated.
The website itself has below-average authority, I'd say (DA 30 and PR 3), and that main index page will probably have just a few sentences; it will simply be an index that sends visitors to the right sub-page, where the actual content lives. It is essentially just a list of links that puts users on the right track.
Any idea whether a successful Fetch as Google (from within GWT) indicates that they will crawl the page in its entirety? I know the supposed 100-links-per-page limit was really more like a ~100KB crawl limit, and it's very hard to believe that still applies when the average page size nowadays is north of 300KB: https://developers.google.com/speed/articles/web-metrics
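In case it's useful to anyone else, this is the quick check I plan to run to see how many KB of HTML the index page actually weighs (a sketch only; the URL is a placeholder and it assumes the Python requests library):

    # Rough check of the raw HTML payload Googlebot would have to fetch.
    # The URL below is a placeholder for the real index page.
    import requests

    resp = requests.get("https://www.example.com/resources/")
    size_kb = len(resp.content) / 1024
    print(f"HTML payload: {size_kb:.1f} KB")
    if size_kb > 100:
        print("Heavier than the old ~100KB figure (if that figure even still matters).")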
Thanks
-
Hi Flo,
I think it totally depends on the usefulness of the page and perhaps the ratio of links to supporting text / resources. A page with 150+ links certainly isn't automatically useless, and many big sites have that number of links on most of their pages due to navigational elements, long blog posts with multiple citations, outbound links from comments (albeit nofollowed), etc.
The 100 links per page limit is (hopefully) rather outdated and stems from when Google visited each page with a limit of how much data it could process. Very long, heavy pages would not be properly crawled, so their links wouldn't be properly crawled either, and PR would not pass properly out of all the links on the page. Google is now much more advanced and is used to dealing with a wide variety of page sizes.
A page with over 150 links can certainly perform well and there should not be a problem with appearing spammy or overdone if each link is contributing to the aim of the page. As I said previously, I would also try to include a fairly good amount of supporting text (or whatever type of supporting resource is appropriate - images, video, etc.) so that the page is not solely a list of links.
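If it helps, a rough way to sanity-check that ratio of links to supporting text is something like the sketch below (assuming Python with the requests and beautifulsoup4 packages installed; the URL is just a placeholder):

    # Sketch: estimate how much of a page's visible text is anchor text.
    # Assumes requests and beautifulsoup4 are installed; URL is a placeholder.
    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://www.example.com/resources/").text
    soup = BeautifulSoup(html, "html.parser")

    links = soup.find_all("a", href=True)
    anchor_chars = sum(len(a.get_text(strip=True)) for a in links)
    total_chars = len(soup.get_text(strip=True)) or 1

    print(f"{len(links)} links on the page")
    print(f"Anchor text is roughly {anchor_chars / total_chars:.0%} of the visible text")

The closer that percentage is to 100%, the more the page reads as solely a list of links.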
Hope this helps!
Jane