Does Navigation Bar have an effect on the link juice and the number of internal links?
-
Hi Moz community,
I am getting the "Avoid Too Many Internal Links" error from Moz for most of my pages, and Google declared the maximum to be 100 internal links. However, most of my pages can't have fewer than 100 internal links, since this is a commercial website and there are many categories I have to show my visitors through the drop-down navigation bar. Without counting the links in the navigation bar, the number of internal links is below 100.
I am wondering if the navigation bar links affect link juice and are counted as internal links by Google. The same question also applies to the links in the footer.
Additionally, what about the products? I have hundreds of products in the category pages, and even though I use pagination I still have many links on the category pages (probably more than 100, without even counting the navigation bar links). Does Google count the product links as internal links, and what is the effect on link juice?
Here is the website if you want to take a look: http://www.goldstore.com.tr
Thank you for your answers.
-
Hi onurcan-ikiz!
Moz has a great blog post that discusses how many links is too many. I would check that out for advice—while there isn't an exact number of links you should not exceed, many people suggest having fewer than 100 links per page.
If your main navigation has a lot of links, I would be worried about the link juice. When your website receives a backlink from another website, hopefully one with a high domain authority (YAY!), that link juice is spread out to ALL the pages linked from the page receiving it.
This means if www.cnn.com (which has a domain authority of 96) links to your company's homepage, they would be spreading some of their authority to you through "link juice". But instead of the majority of that juice/authority being retained on the homepage (thus increasing the authority of that page), you are going to be spreading smaller and smaller amounts of that authority across all 100+ pages linked from the main navigation.
Check out this link juice diagram to get a visual representation of what I am talking about.
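To put rough numbers on that dilution, here is a minimal Python sketch. The figures are hypothetical and the model is deliberately naive (even splitting, no damping factor, no iterative PageRank calculation); it only illustrates that the more outlinks a page carries, the less authority each linked page receives:

```python
# Naive illustration of link-equity dilution across outlinks.
# Hypothetical numbers; real PageRank is iterative and uses a damping factor.

incoming_equity = 10.0  # equity the homepage receives from an external backlink


def equity_per_link(page_equity: float, outlink_count: int) -> float:
    """Assume a page splits the equity it passes evenly across its outlinks."""
    return page_equity / outlink_count


for outlinks in (20, 100, 250):
    share = equity_per_link(incoming_equity, outlinks)
    print(f"{outlinks:>3} outlinks -> each linked page receives {share:.3f} units")
```

With 250 navigation links, each linked page gets roughly a twelfth of what it would get with a 20-link navigation, which is the dilution described above.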
Hope this helps!
-
"100 link rule": At some point in early 2016, John Müller or somebody else at Google said "a reasonable number / a few thousand links at most". Unfortunately I only saved the statement, not the exact source/date.
Nico
-
Hi,
I agree with EGOL that the "100 links" rule is old information.
To answer your question more specifically: yes, all links in your global navigation, in your footer, and on category and product pages are counted as internal links, and all of them (provided you haven't done anything silly like adding "nofollow" attributes) pass link equity throughout your site. For this reason it's important to be strategic about the architecture of your navigation and internal linking structure. Ideally, your most important pages should be included in your navigation and/or footer if possible.
It's not unusual for large eCommerce sites to have significantly more than 100 links on a given page.
For example, Home Depot ranks #2 in Google for the term "flushmount lights" with this page: http://www.homedepot.com/b/Lighting-Ceiling-Fans-Ceiling-Lights-Flushmount-Lights/N-5yc1vZc7nk
As you can see from the attached screenshot, this page has 523 links on it. While it clearly exceeds the "100 links" guideline, this page still has no problem ranking very well for a targeted keyword.
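If you want to sanity-check a count like that on one of your own pages, a rough sketch follows. It assumes the third-party requests and beautifulsoup4 packages are installed; the URL is simply the site from the question, and you can swap in any page. It counts links that resolve to the same host and notes how many carry a nofollow attribute:

```python
# Rough internal-link audit for a single page.
# Assumes: pip install requests beautifulsoup4
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGE_URL = "http://www.goldstore.com.tr/"  # example page to audit
site_host = urlparse(PAGE_URL).netloc

html = requests.get(PAGE_URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

internal = nofollowed = 0
for a in soup.find_all("a", href=True):
    target = urljoin(PAGE_URL, a["href"])       # resolve relative hrefs
    if urlparse(target).netloc == site_host:
        internal += 1
        if "nofollow" in (a.get("rel") or []):  # rel is parsed as a list
            nofollowed += 1

print(f"{internal} internal links, {nofollowed} of them nofollowed")
```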
For verification that Google dropped the "100 links" rule, check out this Matt Cutts video from November 2013: https://www.youtube.com/watch?v=QHG6BkmzDEM
EGOL is also right that Moz should update their suggested SEO best practices to reflect more current methodology.
Hope that's helpful!
Dana
-
"Google declared the max number as 100 internal links."
This is old information.
"Avoid Too Many Internal Links" error from Moz
I think that Moz needs to rethink this, though I know a lot of people will disagree with me... but I am willing to bet big on myself.
Related Questions
-
"Avoid Too Many Internal Links" when you have a mega menu
Using the on-page grader and whilst further investigating internal linking, I'm concerned that, as the ecommerce website has a very link-heavy mega menu, the rule of 100 may be impinging on the contextual links we're creating. Clearly we don't want to nofollow our entire menu. Should we consider noindexing the third level, for example short sleeve shirts here: Clothing > Shirts > Short Sleeve Shirts? What about other pages we don't care to index anyway, such as the login page, the cart, and the search button? Any thoughts appreciated.
Intermediate & Advanced SEO | Ant-Scarborough0
-
Phone number link / critical crawler issue
I've got 15 critical crawler issues coming up, all of which are tel: links to the contact phone number. As this is a taxi firm, these links are pretty vital to customer conversion. Should I worry about these issues from an SEO perspective? If so, is there anything I can do about it?
Intermediate & Advanced SEO | Paul7300
-
Hammered by Spam links
When we moved from one host to another in WordPress Engine, we had this weird redirect insertion thing happen. We 410'd the page cgi-sys/movingpage.cgi, but it hit us hard in the anchors. If you go to Ahrefs, our anchor text is literally all Asian. Anybody have any suggestions? Thank goodness it looks like it has finally stopped. I am looking for creative ways to repopulate our back end with the right stuff. Any thoughts would be great! Here's an example: allartalocaltours.com/tumi-tote-401.html ↳ customerbloom.com/cgi-sys/movingpage.cgi ↳ www.customerbloom.com/cgi-sys/movingpage.cgi ↳ lockwww.customerbloom.com/cgi-sys/movingpage.cgi
Intermediate & Advanced SEO | mattguitar990
-
Linking and non-linking root domains
Hi, Is there any effect on SEO based on the ratio of linking root domains to non-linking root domains, and if so, what is the effect? Thanks
Intermediate & Advanced SEO | halloranc0
-
Heavy Internal Linking Help
One of the sites I work on is a home improvement ecommerce website that does fairly well for its niche. One of the biggest problems we're not sure how to adequately handle is a heavy internal linking issue. The homepage (http://www.fauxpanels.com/) has approx. 226 internal links, which is mainly due to the navigation structure. There are far worse pages, though (the Samples page http://www.fauxpanels.com/samples.php has over 800 internal links).

For the most part, management doesn't want any massive changes to the navigation layout. The top navigation bar has a number of dropdown menus when you hover, the left navigation bar expands to show more choices, and the bottom navigation bar in many instances just repeats links that can be found elsewhere. Also, the product links in the body of the page can be found linked in the left navigation. This is not what I would personally consider the best way to handle navigation, but the Customer Service Department has gotten numerous calls and emails over the years about how much people love our navigation and how easy it is to find things.

My thought was to lessen the number of links by grouping things more often into category/hub pages where applicable so we can remove some of the links. We've also considered nofollowing links, but my understanding is that even if you nofollow a link, the link equity is still divided by the number of on-page links.

So, do any of you much more experienced SEOs have any idea how I can lessen the heavy internal linking without completely redoing the site's navigation layout and without harming link equity, rankings, etc.? Or, conversely, would you consider an average of 200-300 internal links per page not to be a real issue, given the positive effect it has apparently had on user experience?
Intermediate & Advanced SEO | MikeRoberts0
-
Do links in the nav bar help SEO?
If I am building a nav bar, should I use my keywords or make it easier for the user to find what they are looking for? IMO one should ALWAYS build a site based on user experience. If Google and other SEs do count nav links, would it be best to place the more important keywords first?
Intermediate & Advanced SEO | SEODinosaur0
-
Best way to consolidate link juice
I've got a conundrum I would appreciate your thoughts on. I have a main container page listing a group of products, linking out to individual product pages. The problem I have is that all the product pages target exactly the same keywords as the main container page listing all the products. Initially all my product pages were ranking much higher than the container page, as there was little individual text on the container page, and I believe it was being hit with a duplicate content penalty. To get round this, on the container page I have incorporated a chunk of text from each product listed on the page. However, that now means most of the content on an individual product page is also on the container page, so I am worried that I will get a duplicate content penalty on the product pages, as the same content (or most of it) is on the container page. Effectively I want to consolidate the link juice of the product pages back to the container page, but I am not sure how best to do this. Would it be wise to rel=canonical all the product pages back to the container page? Rel=nofollow all the links to the product pages? Or possibly some other method? Thanks
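Not an answer to the consolidation question itself, but before deciding between rel=canonical and nofollow it can help to check what canonical each product page currently declares. A small sketch, again assuming requests and beautifulsoup4 are installed; the URLs are placeholders, not real pages:

```python
# Print the canonical URL each product page currently declares.
# Assumes: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

product_pages = [
    "https://www.example.com/widgets/product-a",  # placeholder URLs
    "https://www.example.com/widgets/product-b",
]

for url in product_pages:
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    canonical = next(
        (link.get("href") for link in soup.find_all("link")
         if "canonical" in (link.get("rel") or [])),
        "(none declared)",
    )
    print(f"{url} -> canonical: {canonical}")
```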
Intermediate & Advanced SEO | James770
-
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place in our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOMoz reading this morning, I came to wonder whether that approach may now be harming us... http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions Specifically, I'm concerned that a) we're blocking the flow of link juice, and b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus0
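As a side note on the crawl-budget question above: the Python standard library's urllib.robotparser can show which parameterised or paginated URLs a given robots.txt blocks for Googlebot. A small sketch with placeholder URLs (swap in your own robots.txt and result pages); note that robotparser follows the original robots.txt specification, so Google-specific wildcard rules may not be interpreted identically:

```python
# Check which URLs a robots.txt blocks for Googlebot (standard library only).
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")  # placeholder site
rp.read()

test_urls = [
    "https://www.example.com/search?q=widgets",              # first results page
    "https://www.example.com/search?q=widgets&page=2",       # pagination variant
    "https://www.example.com/search?q=widgets&sort=price",   # sort-order variant
]

for url in test_urls:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{verdict:7} {url}")
```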