Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies. More details here.
Too many on page links
-
Hi
I know it was previously recommended to stick to under 100 links on a page, but I've run a crawl and mine are now over this at 130+.
How important is this now? I've read a few articles saying it's not as crucial as it used to be.
Thanks!
-
Hi Becky!
First, I would like to say it's great that you're being proactive in making sure your webpage doesn't have too many links on it! But, luckily for you, this is not something you need to worry about. 100 is a suggested number, not a threshold that will penalize you if you go over.
Google’s Matt Cutts posted a video explaining why Google no longer has that 100-links-per-page Webmaster guideline—so be sure to check that out! It's commonly thought that having too many links will negatively impact your SEO results, but that hasn't been the case since 2008. However, Google has said if a site looks to be spammy and has way too many links on a single page—Google reserves the right to take action on the site. So, don't include links that could be seen as spammy and you should be fine.
Check out this Moz blog that discusses how many links is too many for more information!
-
Thank you for the advice, I'll take a look at the articles
Brilliant, the round table sounds great - I'll sign up for this
-
I honestly wouldn't worry, Becky. The page looks fine, the links look fine, and it is certainly not what you would call spammy.
Link crafting was a 'thing' a number of years ago, but today Google pretty much ignores this, as has been shown many times in testing.
However, you can benefit from internal links, but that is a different discussion. Read this if you are interested.
If you are interested, SEMrush is hosting a round-table discussion on eCommerce SEO on Thursday that could be useful to you. Two others and I will be speaking on a number of issues.
-Andy
-
Thanks for the advice, I've looked into this before.
We have menu links and product links because it's an ecommerce site, so I wouldn't be able to remove any of these.
I've found it hard to decrease the link count any further on primary pages. For example, http://www.key.co.uk/en/key/aluminium-sack-truck has 130 links.
Any advice would be appreciated
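One practical way to see where those 130 links come from is to bucket them by page section during a crawl (navigation vs. body content vs. footer). Below is a minimal sketch using only Python's standard library, run against a toy HTML snippet standing in for a crawled product page rather than the live URL; the markup is hypothetical, not taken from the site in question:

```python
from html.parser import HTMLParser
from collections import Counter

class LinkCounter(HTMLParser):
    """Counts <a href> links, bucketed by the nearest enclosing section tag."""
    def __init__(self):
        super().__init__()
        self.section = "body"       # default bucket for links outside nav/header/footer
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag in ("nav", "header", "footer"):
            self.section = tag
        elif tag == "a" and any(k == "href" for k, _ in attrs):
            self.counts[self.section] += 1

    def handle_endtag(self, tag):
        if tag in ("nav", "header", "footer"):
            self.section = "body"

# Toy page standing in for a crawled ecommerce product page (hypothetical markup):
html = """
<nav><a href="/">Home</a><a href="/trucks">Trucks</a></nav>
<p>See our <a href="/sack-trucks">sack trucks</a> and <a href="/trolleys">trolleys</a>.</p>
<footer><a href="/contact">Contact</a></footer>
"""

counter = LinkCounter()
counter.feed(html)
print(dict(counter.counts))  # {'nav': 2, 'body': 2, 'footer': 1}
```

A breakdown like this makes it easier to judge whether the bulk of a page's links are template-level (menu, footer) or genuinely part of the page's content.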
-
Confirmation from Google here that the limit on links per page is around 3,000:
https://www.deepcrawl.com/knowledge/news/google-webmaster-hangout-notes-friday-8th-july-2016/
I would consider that to be a lot, though.
-Andy
-
Brilliant thank you!
-
In the "old days" (yup, I go back that far), Google's search index crawler wasn't all that powerful. So it would ration itself on each page and simply quit trying to process all the content on the page after a certain number of links and certain character count. (That's also why it used to be VERY important that your content was close to the top of your page code, not buried at the bottom of the code).
The crawler has been beefed up to the point where this hasn't been a limiting factor per page for a long time, so it will traverse pretty much any links you feed it. But I +1 both Andy's and Mike's advice about considering the usability and link-power dilution of having extensive numbers of links on a page. (This is especially important for your site's primary pages, since one of their main jobs is to help flow their ranking authority down to important/valuable second-level pages.)
Paul
-
Hi Becky,
Beyond the hypothetical limit, there is the consideration that a really large number of links divides the link authority of the page, and therefore decreases the relative value of each of those links to the pages they point to.
Depending on the page holding all these links, user experience, purpose of linked-to pages, etcetera, this may or may not be a consideration, but worth thinking about.
Good luck!
- Mike
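The dilution Mike describes can be illustrated with the classic PageRank split, in which a page's distributable authority is shared evenly across its outbound links. This is a deliberately simplified model (real ranking systems are far more nuanced), and the function name and damping value here are illustrative assumptions, not anything from this thread:

```python
# Simplified, illustrative model: classic PageRank distributes a page's
# authority evenly across its outbound links, scaled by a damping factor.
def equity_per_link(page_authority: float, num_links: int, damping: float = 0.85) -> float:
    """Authority passed to each linked page under the classic even-split model."""
    return page_authority * damping / num_links

for n in (10, 100, 130, 3000):
    print(f"{n:>5} links -> {equity_per_link(1.0, n):.5f} per link")
```

Going from 100 links to 130 changes each link's share only modestly, which is consistent with the advice above that a page like Becky's is nothing to worry about; it is the jump to thousands of links where per-link value becomes negligible.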
-
Hi Becky,
If the links are justified, don't worry. I have clients with 300-400 and no problems with their positions in Google.
That doesn't mean it will be the same for everyone, though - each site is different, and sometimes you can have too many. But think it through: if you conclude that most of the links aren't needed and are just stuffing in keywords, then look to make changes.
But on the whole, it doesn't sound like an issue to me - there are no hard and fast rules around this.
-Andy