What's the max number of links you should ever have on a page?
-
Our homepage has a few hundred links, and our index pages (pages that link to our spintext pages) have about 900 links on them and no other content. Our SEO guy said we have to keep the links under 1,000, but I wanted to see what you guys think.
-
I think in Danny Dover's book he suggests no more than 150.
-
"Link Juice" passes from one page to another based on the number of links on that page. Here's what the SEOmoz basic guide to SEO says about it;
Number of Links on a Page
According to the original PageRank formula, the value that a link passes is diluted by the presence of other links on a page. Thus, getting linked-to by a page with few links is better than being linked-to by the same page with many links on it (all other things being equal). The degree to which this is relevant is unknowable (and in our testing, it appears to be important, but not overwhelmingly so), but it's certainly something to be aware of as you conduct link acquisition.
So the more links you have on a page, the less value each link passes.
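To make that dilution concrete, here's a minimal sketch of the idea in Python. The 0.85 damping factor and the assumption that every outlink gets an equal share are simplifications borrowed from the original PageRank formula for illustration only; it's not how Google actually scores links today.

```python
# Simplified illustration of link value dilution (original PageRank idea).
# Assumptions: damping factor of 0.85, the page's own score is 1.0, and
# every outlink receives an equal share. Real-world scoring is far more complex.

def value_passed_per_link(page_score, outlink_count, damping=0.85):
    """Each outlink gets an equal slice of the page's passable value."""
    return damping * page_score / outlink_count

for links in (10, 100, 150, 900):
    per_link = value_passed_per_link(1.0, links)
    print(f"{links:>4} links on the page -> {per_link:.4f} passed per link")

# 10 links pass roughly 0.085 each, while 900 links pass roughly 0.0009 each,
# which is the dilution the quote above describes.
```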
The SEOmoz optimization report says to avoid "Excessive" internal and external links, and I think that number is somewhere between 100 and 150.
I don't think you should aim for a "maximum" number of links. Instead, aggressively reduce the number of links until you're down to the minimum you actually need. See also: Link Sculpting
-
900 links, no content, and something called "spintext": that sets off a few warning bells for me, because it sounds like the site was built more for search engines than for users. Are you able to drop a link to the site? If so, we could give a more informed opinion. Either way, 900 does seem high.
-
I don't think there is any single "good" rule of thumb, as it really depends on your website and how much authority you have, but your SEO guy is certainly right that you should try to cut back on the number of links. I don't know what your website is, but anywhere close to 1,000 is probably too many. Mathematically speaking, the more links you have on any given page, the more ways the "link juice" gets divided up. For this reason, it makes sense to focus internal links on your most important pages, particularly when they come from one of your strongest pages, like your homepage.
SEOmoz's Pro tool uses 100 links as its standard: any page with more than 100 links gets flagged. In Danny Dover's book, Search Engine Optimization Secrets, I believe the number he cites is 150.
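If you want to spot-check your own pages against a threshold like that, a rough sketch along these lines would do it. It assumes the third-party requests and beautifulsoup4 libraries and simply counts every anchor tag with an href, which is cruder than what the Pro tool reports, but it's enough to flag obvious outliers.

```python
# Rough on-page link counter for a single URL.
# Assumes `requests` and `beautifulsoup4` are installed
# (pip install requests beautifulsoup4).
import requests
from bs4 import BeautifulSoup

def count_links(url, threshold=100):
    """Return the number of <a href> links on a page and whether it exceeds the threshold."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    links = [a["href"] for a in soup.find_all("a", href=True)]
    return len(links), len(links) > threshold

total, over = count_links("https://www.example.com/")  # hypothetical URL
print(f"{total} links found" + (" - over the 100-link threshold" if over else ""))
```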
Dr. Pete wrote a tremendous post on this topic earlier this year which is certainly worth a read.
The rule of thumb I would use: Are these links good for the user? Is this the optimal way to offer easy navigation to my visitors? Could this be done more simply, cleanly, or efficiently?
At the end of the day, if your page is authoritative enough, something like too many links isn't going to pose a problem. But if the links are hurting user experience or weakening your SEO efforts, then they can certainly hold you back.