Too many links on page -- how to fix
-
We are getting reports that there are too many links on most of the pages in one of the sites we manage.
Not just a few too many -- 275, versus the target of fewer than 100.
The entire site is built with a very heavy global navigation, which contains a lot of links -- so while the users don't see all of that, Google does.
Short of re-architecting the site, can you suggest ways to provide site navigation that don't violate this rule?
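For reference, one quick way to sanity-check what a crawler actually counts is to tally the anchors in the raw HTML. A minimal Python sketch, assuming the requests and beautifulsoup4 packages are available and using a placeholder URL:

```python
# Minimal sketch: count every <a href> in the raw HTML of a page.
# Assumes the requests and beautifulsoup4 packages are installed;
# the URL below is only a placeholder.
import requests
from bs4 import BeautifulSoup

def count_links(url: str) -> int:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Crawlers read the HTML source, so links hidden by CSS count too.
    return len(soup.find_all("a", href=True))

if __name__ == "__main__":
    print(count_links("https://www.example.com/"))
```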
-
Dr. Pete has a good post about this warning at http://www.seomoz.org/blog/how-many-links-is-too-many that may help you out.
-
How much damage does this issue really do?
I see many sites in our industry (enterprise software) whose pages are equally heavy with links. They still seem to enjoy a high page rank. Look at Microsoft, IBM, Oracle, Red Hat...
Just wondering how this compares to other issues that would be easier to address.
-
I don't know what you mean that users don't see the links and Google does.
I mean that most of these links are on drop-down menus so they aren't visible unless someone hovers over the tab. My point was that the user experience isn't affected by the presence of all of these links on the page, even though it is a problem from an SEO perspective.
The page in question is: http://www.novell.com/products/groupwise/ -- but since we use the same global navigation across the entire site, every page on the site is getting the same SEOMoz warning about too many links.
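Because those drop-down links are ordinary anchors in the HTML, they count toward the total even though CSS hides them until hover. A rough sketch of how to see how much of the per-page total the global navigation contributes (assuming requests and beautifulsoup4 again; the nav / #header selectors are only guesses for whatever element actually wraps the menu):

```python
# Rough sketch: how many of the page's links sit inside the global nav?
# Assumes requests + beautifulsoup4; the container selectors are guesses.
import requests
from bs4 import BeautifulSoup

url = "http://www.novell.com/products/groupwise/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

all_links = soup.find_all("a", href=True)
nav = soup.find("nav") or soup.find(id="header")  # assumed nav container
nav_links = nav.find_all("a", href=True) if nav else []

print(f"total links: {len(all_links)}, inside global nav: {len(nav_links)}")
```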
-
I do think a navigation restructure is the way to go, and a way to better focus your link juice. Nofollowing certain links no longer works for internal site links - Google sees through that now.
Instead of the category navigation including one or more sub-category links on every page (I'm guessing that's how it currently works), include the sub-category links on the main category pages rather than in the global navigation. For example, don't include red widget, blue widget, and green widget under the main 'widget' navigation; instead, add links to red, blue, and green widgets on the main widget page. Hope that makes sense.
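A toy sketch of that restructure, with made-up category names, just to show how the per-page link count changes:

```python
# Toy example (made-up categories) of moving sub-category links out of
# the sitewide navigation and onto their parent category pages.

categories = {
    "widgets": ["red-widgets", "blue-widgets", "green-widgets"],
    "gadgets": ["small-gadgets", "large-gadgets"],
}
sub_count = sum(len(subs) for subs in categories.values())

# Links the navigation adds to *every* page on the site:
nav_links_before = len(categories) + sub_count  # cats + all sub-cats = 7
nav_links_after = len(categories)               # cats only           = 2

# A single category page after the change carries its share of the nav
# plus its own sub-category links in the body:
widgets_page_after = nav_links_after + len(categories["widgets"])  # 2 + 3 = 5

print(nav_links_before, nav_links_after, widgets_page_after)
```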
-
You should bucket the categories into 10-20 primary categories. If you have lower-level categories that are potential traffic drivers, you can feature a few select ones alongside the primary categories. Every link takes away PageRank from the host page, and too many links on every page will devalue those pages. It's hard to give clear directions without seeing the site.
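To put rough numbers on that dilution point, a toy calculation using the simplified, original PageRank idea that each link gets an equal share of whatever the page can pass (real ranking is far more nuanced than this):

```python
# Back-of-the-envelope only: assumes each link gets an equal share of
# whatever equity the page can pass, which is a simplification of the
# original PageRank model, not how Google weights links today.

passable_equity = 1.0  # normalized; whatever the page has to pass

per_link_275 = passable_equity / 275
per_link_100 = passable_equity / 100

print(f"per-link share with 275 links: {per_link_275:.4f}")  # ~0.0036
print(f"per-link share with 100 links: {per_link_100:.4f}")  # 0.0100
```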
I don't know what you mean that users don't see the links and Google does. Are you hiding links? That may penalize the site.
Related Questions
-
https://www.fitness-china.com/pilates-equipment -- How to find the most relevant internal link pages
For https://www.fitness-china.com/pilates-equipment, how do I find the most relevant internal pages to link from? When I search for this page, I find https://www.fitness-china.com/pilates-equipment-kr and https://www.fitness-china.com/pilates-equipment-jp. Is there any other better way?
On-Page Optimization | | ahislop5740 -
Which Is More Important? Building a web page for customer reviews or a careers page?
Hello, I am wondering which would be more important to have on a website: a customer review page or a careers page? And as far as an SEO advantage goes, which is more important and why?
On-Page Optimization | | Nicks1230 -
Google Search Console issue: "This is how Googlebot saw the page" showing part of page being covered up
Hi everyone! Kind of a weird question here but I'll ask and see if anyone else has seen this: In Google Search Console when I do a fetch and render request for a specific site, the fetch and blocked resources all look A-OK. However, in the render, there's a large grey box (background of navigation) that covers up a significant amount of what is on the page. Attaching a screenshot. You can see the text start peeking out below (had to trim for confidentiality reasons). But behind that block of grey IS text. And text that apparently in the fetch part Googlebot does see and can crawl. My question: is this an issue? Should I be concerned about this visual look? Or no? Never have experienced an issue like that. I will say - trying to make a play at a featured snippet and can't seem to have Google display this page's information, despite it being the first result and the query showing a featured snippet of a result #4. I know that it isn't guaranteed for the #1 result but wonder if this has anything to do with why it isn't showing one. VmIqgFB.png
On-Page Optimization | | ChristianMKG0 -
Not sure if I need to be concerned with duplicate content plus too many links
Someone else supports this site in terms of making changes so I want to make sure that I know what I am talking about before I speak to them about changes. We seem to have a lot of duplicate content and duplicate titles. This is an example http://www.commonwealthcontractors.com/tag/big-data-scientists/ of a duplicate. Do I need to get things changed? The other problem that crops up on reports is too many on page links. I am going to get shot of the block of tags but need to keep the news. Is there much else I can do? Many thanks.
On-Page Optimization | | Niamh20 -
Too many on page links - created by filters
I have an ecommerce site and SEOmoz "Crawl Diagnostics Summary" points out that I have too many hyperlinks on most of my pages. The most recent thing I've done that could be the culprit is the creation of a number of product filters. Each filter I put on the page creates a hyperlink off that page. As an example, there's a filter available for manufacturers. Under that, there are 8 new filter links, thus new hyperlinks. On one category there are 60 new links created because of filters. I feel like these filters have made the user experience on the site better BUT they have dramatically increased the number of outbound links off the page. I know keeping it to under 100 is a rule-of-thumb but at the same time there must be some validity to trying to limit them. Do you have any recommendation on how I can "have my cake and eat it too?" Thanks for any help!
On-Page Optimization | | jake3720 -
How many outbound links is too many outbound links?
As a part of our SEO strategy, we have been focusing on writing several high quality articles with unique content. In these articles we regularly link to other websites when they are high quality, authoritative sites. Typically, the articles are 500 words or more and have 3-5 outbound links, but in some cases there are as many as 7 or 8 outbound links. Before we get too carried away with outbound links, I wanted to get some opinions on how many outbound links we should be trying to include and more information on how the outbound links work. Do they pass our website's authority on to the other website? Could our current linking strategy cause future SEO problems? Finally, do you have any suggestions for guidelines we should be using? Thank you for your help!
On-Page Optimization | | airnwater0 -
Do NoFollow links still split link equity?
So I realize that Google will split link equity between all links on any given page. Example, if a landing page has 10 links then the authority from the landing page is split into 10 and each link given its own smaller amount of equity from that landing page. My question is if I were to turn 9 of the 10 links on this page to NoFollow links would the equity still remain split 10 ways or would it simply pass all of it to the one DoFollow link left on the page?
On-Page Optimization | | PageOnePowerGang0 -
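Echoing the earlier answer that nofollowing internal links no longer works for sculpting: under the widely reported post-2009 behavior, equity is still divided by the total number of links on the page, and the nofollowed shares simply evaporate rather than being redistributed. A toy sketch with assumed numbers:

```python
# Toy numbers for the question above, assuming the widely reported
# post-2009 behavior: nofollowed links still count in the divisor,
# but their share is dropped rather than redistributed.

passable_equity = 1.0
total_links = 10
followed_links = 1  # the other 9 carry rel="nofollow"

per_link_share = passable_equity / total_links                # 0.1
passed_on = per_link_share * followed_links                   # 0.1, not 1.0
evaporated = per_link_share * (total_links - followed_links)  # 0.9

print(passed_on, evaporated)
```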
Duplicate pages
Hi, I am using a CMS that generates dynamic urls that according to the SeoMoz tool will be indexed as duplicate pages. The pages in question are forms, blog-posts etc. that are not crucial to achieve ranking for. I do worry though about the consequences of having 20 (non-duplicate) pages with static urls and about 100 pages that are duplicates with dynamic urls. What consequences will this have for the speed that the robots crawl the site, and could there be negative effects on ranking for the entire domain?
On-Page Optimization | | vibelingo0