Too many links in header menu
-
I'm working with a few clients whose header menus are getting big. Their sites now easily exceed the recommendation of 100 links per page.
Normally I would recommend cutting down on the links, but in this case these menus genuinely make navigation easier. I honestly think they add value for users.
The dilemma is that I think the menus provide value from a UX standpoint, but I'm not sure about the SEO standpoint.
Any recommendations for this dilemma?
-
Nofollowing some of the links would not help. Google changed the algorithm a while back, so the PageRank allotted to nofollowed links simply disappears instead of being allocated to the other links on the page.
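To make that concrete, here's a toy illustration (my own simplified model, not Google's actual algorithm): a page splits a fixed pool of PageRank evenly across its outgoing links, and after the change, nofollowed links still count in that split, so their share is simply lost.

```python
# Toy model: a page divides a fixed pool of PageRank evenly across its
# outgoing links. All numbers are made up for illustration.
page_rank = 1.0
total_links = 120   # followed + nofollowed both count in the denominator
nofollowed = 20

per_followed_link = page_rank / total_links   # ~0.0083 per followed link
evaporated = nofollowed * per_followed_link   # ~0.1667 lost outright

print(f"Each followed link passes {per_followed_link:.4f}")
print(f"{evaporated:.4f} of the page's PageRank simply disappears")
```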
-
How about having some of those links become nofollow? Would that help? Can someone confirm this?
-
Well, the link juice is distributed according to the number of links on the page. So if you want link juice flowing to all those pages, you're good. If some of those pages don't really need link juice, you would do better to remove the links to them (from an SEO perspective). If you remove links to some of the pages, more link juice will flow to the other pages.
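Under the same simplified model as the sketch above (an illustration, not Google's real math), removing links shrinks the denominator, so each remaining link passes more:

```python
# Same toy model: remove 20 of 120 links and each survivor passes more.
page_rank = 1.0

per_link_before = page_rank / 120   # ~0.0083 per link with all 120 links
per_link_after = page_rank / 100    # 0.0100 per link after removing 20

print(f"Before: {per_link_before:.4f}  After removal: {per_link_after:.4f}")
```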
-
I know that, but what about the distribution of "link juice"?
-
Should be fine. Google won't penalize you for going over 100 links.
Related Questions
-
Maintaining link value during site downtime
We are nearly finished rebuilding a client website, but they want to have a "dark launch" period for 4 days prior to the public site launch. During that 4-day period, we will be converting their server, so they want to take down the old site and instead send users a "coming soon" message. Although we have the old site pages set up to 301 for the public launch, I'm concerned that this dark period is going to hurt the link value on the old site pages. During this 4-day period, should we be setting a 503 status code on the old site that automatically serves the "coming soon" message? Or, should all old site pages be temporarily redirected to the "coming soon" landing page? Any other recommendations are appreciated as well.
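For reference, the 503 approach the question describes usually means serving every URL a 503 (Service Unavailable) status with a Retry-After header, which tells crawlers the outage is temporary. A minimal sketch of the idea in Python (a toy standalone server for illustration, not a production setup; the port and page content are placeholders):

```python
# Serve every request a 503 with Retry-After during a maintenance window.
# A 503 signals a temporary outage, so crawlers should not treat the
# pages (or the link value they have accumulated) as gone.
from http.server import BaseHTTPRequestHandler, HTTPServer

COMING_SOON = b"<html><body><h1>Coming soon</h1></body></html>"

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(503)                    # Service Unavailable
        self.send_header("Retry-After", "345600")  # 4 days, in seconds
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(COMING_SOON)

if __name__ == "__main__":
    HTTPServer(("", 8000), MaintenanceHandler).serve_forever()
```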
Technical SEO | AHartman2
-
Do bad links to a sub-domain which redirects to our primary domain pass link juice and hurt rankings?
Sometime in the distant past there existed a blog.domain.com for domain.com. This was before we started work for domain.com. During the process of optimizing domain.com, we decided to 301 blog.domain.com to www.domain.com. Recently, we discovered that blog.domain.com actually has a lot of bad links pointing at it. By a lot, I mean 5,000+. I am curious to hear people's opinions on the following:
1. Are these links passing bad link juice?
2. Does Google consider links to a subdomain, passed through a 301, to be bad links to our primary domain?
3. What is the best approach to getting these links removed?
Technical SEO | Shredward0
-
Cross-links between sites
Hi, we have several ecommerce sites, and we cross-linked three of them by mistake. We realized that the sites were linked through WMT. We shut down two of the sites about two months ago, but WMT still shows the links coming from them. How do we make sure Google sees that the sites are shut down? Is there a better way of resolving this issue? We no longer use those sites, so we do not need them to be active. What's the best way to show Google that the links are no longer there? The crawler shows it was able to crawl a site 45 days after it was shut down. Thanks, Nick
Technical SEO | orion680
-
How do I remove links to my website?
Hi guys, please can anyone help! Can anyone tell me how on earth I can remove links to my website? My website has been hit by the new Penguin update, and the company that was doing my SEO seems to have built a lot of spammy links. How can I remove these links? Thanks, Gareth
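For reference: beyond asking the linking sites to take the links down, Google provides a disavow tool that accepts a plain-text file of links for it to ignore. A sketch of the file format (the domains below are made-up examples, not real sites):

```
# Example disavow file: one URL or domain per line, '#' lines are comments.
# These entries are hypothetical placeholders.
domain:spammy-directory.example
domain:paid-links.example
http://low-quality-blog.example/post-with-link.html
```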
Technical SEO | GAZ090
-
Wiki Contextual Links
I want to understand what Wiki contextual links are and how they are helpful for SEO. I hear Google likes them. Is that true?
Technical SEO | KS__0
-
Drop-down navigation and link juice
Hi! We desperately need to overhaul our site navigation setup, and we have so many categories that we think our site could really benefit from a drop-down navigation similar to what these sites have: http://www.paychex.com/ http://www.bmc.com/ We've held off on this type of navigation in the past because we only saw people build it in Flash, and we knew that wouldn't be good for link juice. But these two sites are using HTML and CSS, which seems like a much better approach and good for SEO. Do you agree? We want to make the switch but are worried about losing linking power by nesting our navigation in HTML elements and CSS styling.
Technical SEO | sciway0
-
Too many on-page links for WP blog page
Hello, I have set my WP blog to post to a single page, so new posts all go to that page, making it the blog. An SEOmoz campaign crawl says there are too many links on one page. Does this mean that, as I post to this page, the search engines see it as one page full of links rather than as individual blog posts? I worry that if I continue to add more posts (which obviously I want to), the links will keep increasing, meaning they will be discounted because there are too many. What can I do to rectify this? Many thanks in advance
Technical SEO | mozUser14692366292850
-
External Links from own domain
Hi all, I have a very weird question about external links to our site from our own domain. According to GWMT we have 603,404,378 links from our own domain to our domain (see screen 1). We noticed, when we drilled down, that these come from disabled subdomains like m.jump.co.za. In the past we redirected all traffic from subdomains to our primary www domain, but it seems that for some time Google had been able to crawl some of our subdomains. In December 2010 we fixed this so that all subdomain traffic 301-redirects to the primary domain. For example, http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/. The weird part is that the number of external links kept growing and is now sitting at a massive number. On 8 April 2011 we took a different approach: we created a landing page for m.jump.co.za, and all other requests generated 404 errors. We added all the directories to robots.txt, and we also manually removed all the directories from GWMT. Now, three weeks later, the number of external links just keeps growing. Here are some stats:
11-Apr-11: 543,747,534
12-Apr-11: 554,066,716
13-Apr-11: 554,066,716
14-Apr-11: 554,066,716
15-Apr-11: 521,528,014
16-Apr-11: 515,098,895
17-Apr-11: 515,098,895
18-Apr-11: 515,098,895
19-Apr-11: 520,404,181
20-Apr-11: 520,404,181
21-Apr-11: 520,404,181
26-Apr-11: 520,404,181
27-Apr-11: 520,404,181
28-Apr-11: 603,404,378
I am now thinking of cleaning up robots.txt, re-including all the excluded directories in GWMT, and seeing if Google can get rid of all these links. What do you think is the best solution for getting rid of all these invalid pages?
[Attached screenshots: moz1.PNG, moz2.PNG, moz3.PNG]
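As a sanity check, something like the sketch below (using the third-party "requests" library; the URL pair is the one from the question) can confirm that an old subdomain URL still returns a single 301 to its www equivalent:

```python
# Spot-check that retired subdomain URLs still 301 to the primary domain.
import requests

checks = [
    ("http://m.jump.co.za/search/ipod/", "http://www.jump.co.za/search/ipod/"),
]

for old_url, expected in checks:
    # Don't follow the redirect; we want to inspect the first response.
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location == expected
    print(f"{'OK ' if ok else 'BAD'} {old_url} -> {resp.status_code} {location}")
```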
Technical SEO | JacoRoux0