'Too many links' on our pages.
-
This figure includes links that sit within our navigational menus. Is there a way to block this somehow so that Google and Moz do not read them as 'internal links'?
Thanks in advance.
-
Thanks for the responses! Very helpful advice.
-
You can't tell Google not to include navigation links as internal links, as they are internal links. We just had this happen to a client of ours who had 100 links just in his navigation alone. We were able to reduce those links to 25 and we have already seen an improvement. You'd be surprised how many links you don't "need" in your navigation.
-
Hi Ashley,
It has been a while since I have seen this question. Figured I'd jump in on this one.
In the early years of crawlers, limiting a page to 100 internal links was a good rule of thumb, mostly because crawlers didn't have the resources to follow a ton of links. A lot has changed since that rule was introduced: spiders have gotten smarter and can handle far more data than they could back in the day.
But a spider's ability to process content shouldn't be the only factor when you decide how many links to put on a page. The rule of 100 is still a good guideline once you start thinking about how link juice is passed and what you really want the search engines to know about. Dr. Pete wrote a blog post here on Moz about this, which you can read here.
Once you get into the 100-link range, the search engines don't place much value on any individual link. Conversely, if you limit yourself to 25 or so links, a search engine can correlate page X to content Y much more strongly.
To address your question of how to limit the links Google or Moz sees: the best way is to reduce how your pages are internally linked. If you have a very large navigation in your header or footer, it may make sense to change how that navigation is built. But if you really feel that having all those links in the navigation helps your visitors, you can use techniques like Ajax calls to load the links on hover or click. Most crawlers won't execute those scripts, though they are getting smarter.
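To make that Ajax idea concrete, here is a minimal sketch of hover-loaded navigation. Everything here is illustrative, not from Moz: the `#more-nav-trigger` and `#more-nav` element IDs and the `/nav-links.json` endpoint are assumed names you would replace with your own. Crawlers that don't run JavaScript would only see the trigger element, not the deferred links.

```javascript
// Hypothetical sketch: defer secondary nav links behind a hover so
// non-JS crawlers never see them as on-page links.
// Builds the list-item markup for an array of { href, label } objects.
function buildNavLinks(items) {
  return items
    .map(({ href, label }) => `<li><a href="${href}">${label}</a></li>`)
    .join("");
}

// Only wire up the hover handler in a browser environment.
if (typeof document !== "undefined") {
  const trigger = document.querySelector("#more-nav-trigger"); // assumed ID
  trigger.addEventListener(
    "mouseenter",
    async () => {
      const res = await fetch("/nav-links.json"); // assumed endpoint
      const items = await res.json();
      document.querySelector("#more-nav").innerHTML = buildNavLinks(items);
    },
    { once: true } // fetch the links at most once
  );
}
```

Note the trade-off: links injected this way are invisible to crawlers that don't execute scripts, which is the point here, but it also means they pass no link equity and won't be discovered for indexing, so only do this for links you genuinely don't want counted.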
In summary, the rule of 100 is an old rule but still important, just not in the way it was originally introduced. Today you can probably expect most crawlers to handle 150-300 links before giving up. If you have high page and domain authority, you may find that simply ignoring the warning is the best course of action. However, if you're still working to get those rankings up, it's likely worth the time to build a better navigation that logically points to the most important sections from each page.
I hope this helps,
Don