Too Many Links on One Page - What to Do?!
-
Hello Geniuses, Prodigies, and Experts of the Field,
My website pages for www.1099pro.com have too many links per page, somewhere around 150-175, and I understand that each page should ideally have fewer than 100. Most of these links, approximately 105, come from the dropdown navigation options in the header toolbar or from the footer links.
My take is that these links make our site easier to navigate, but I'm sure they are hurting my PageRank / SERPs. Is there a best way to handle a situation like this? I'd really prefer not to alter the header/footer layout of the entire site by removing 50-75 navigational links. The only other idea I have is below, but I have no idea whether it would work.
- For any link that I do not care to pass PageRank through, add a rel="nofollow" attribute. This would be my favorite option if it is viable.
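If it helps, here's a quick way to audit how many links a page has and how many are already nofollowed, using only Python's standard library. This is just a sketch; the sample markup below is made up for illustration, not taken from 1099pro.com.

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count anchor tags with an href, and how many carry rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.total = 0
        self.nofollow = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if "href" in attrs:
            self.total += 1
            # rel can hold several space-separated tokens, e.g. "nofollow noopener"
            if "nofollow" in attrs.get("rel", "").split():
                self.nofollow += 1

# Hypothetical navigation snippet for demonstration
sample = """
<nav>
  <a href="/pricing">Pricing</a>
  <a href="/terms" rel="nofollow">Terms</a>
  <a href="/privacy" rel="nofollow noopener">Privacy</a>
</nav>
"""

counter = LinkCounter()
counter.feed(sample)
print(counter.total, counter.nofollow)  # 3 2
```

You could feed this the HTML of any page on the site to get a before/after count while experimenting with nofollow.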
-
That's good to hear, and thanks for the input!
The Moz page grader told me that over 100 links was too many, and so did a commenter on a separate post. All clear now, though.
-
The reasoning behind limiting the number of links is that the amount of authority a page passes is divided by the total number of links on that page, regardless of whether they are nofollowed. So, the fewer the links, the more authority you pass to each of those internal pages. To answer your subsidiary question: there would be no SEO benefit from nofollowing these links.
That being said, usability always trumps this in my book. Go into your Google Analytics and see which of these links people are actually clicking. If they are using the drop-down links, leave them. If they are only clicking on the head link, consider chopping them.
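A back-of-the-envelope illustration of the "divided equity" model described above (a deliberate simplification; real ranking involves much more than this). Under this model, nofollowing a link does not redirect its share of equity to the remaining links, so that share simply evaporates:

```python
def equity_per_followed_link(page_authority, total_links, nofollowed_links=0):
    """Authority passed to each followed link, assuming equity is split
    across ALL links on the page, including nofollowed ones."""
    share = page_authority / total_links
    followed = total_links - nofollowed_links
    return share, followed

# 150 links on the page: each link gets 1/150 of the authority.
share_all, _ = equity_per_followed_link(1.0, 150)

# Nofollowing 50 of them does NOT raise the share of the remaining 100;
# each followed link still receives only 1/150.
share_nf, followed = equity_per_followed_link(1.0, 150, nofollowed_links=50)

print(round(share_all, 4))  # 0.0067
print(round(share_nf, 4))   # 0.0067 -- unchanged
print(followed)             # 100 links still pass equity
```

This is why pruning links (fewer total links) can concentrate authority, while nofollowing them cannot.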
-
As long as they are all listed in the sitemap, don't worry about it. William is correct that Google lifted the 100-links-per-page limit. The question is: do the links serve a purpose, or are they there just to increase the page count? Perhaps you are not seeing the engagement or SEO results you want because of how the site structure is organized?
BTW, who or what told you there was a need to reduce the link count? Unless you were manually penalized, why is there a need to do this?
-
Google updated their guidelines a while ago and no longer suggests 100 or fewer links per page. Now the guideline simply states, "Keep the links on a given page to a reasonable number," which is subjective. https://support.google.com/webmasters/answer/35769?hl=en
With a site like yours, full of different kinds of forms and the like, it's logical to have 100+ links per page. There are other options for you as well if you believe these links are hurting, but according to Google they likely are not.
If you wanted to try something different, you could build out detailed category pages for each section of things you offer on the site and make those the pages that rank for your terms. That way, the number of links on your main page is dramatically reduced, and the user experience might improve, since things aren't quite as condensed.