What's the max number of links you should ever have on a page?
-
Our homepage has a few hundred links, and our index pages (pages that link to our spintext pages) have about 900 links on them with no content. Our SEO guy said we have to keep the links under 1000, but I wanted to see what you guys think.
-
I think in Danny Dover's book he suggests no more than 150.
-
"Link juice" passes from one page to another based on the number of links on that page. Here's what the SEOmoz basic guide to SEO says about it:
Number of Links on a Page
According to the original PageRank formula, the value that a link passes is diluted by the presence of other links on a page. Thus, getting linked-to by a page with few links is better than being linked-to by the same page with many links on it (all other things being equal). The degree to which this is relevant is unknowable (and in our testing, it appears to be important, but not overwhelmingly so), but it's certainly something to be aware of as you conduct link acquisition.
So the more links you have on a page, the less value each link passes.
The SEOmoz optimization report says to avoid "Excessive" internal and external links, and I think that number is somewhere between 100 and 150.
I don't think you should aim for a "maximum" number of links. Instead, aggressively reduce the number of links to find the minimum. See also: Link Sculpting
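To make the dilution idea above concrete, here's a minimal sketch of the simplified "original PageRank" arithmetic the SEOmoz guide refers to: a page's value is split evenly among its outgoing links after a damping factor. The numbers and the 0.85 damping value come from the original PageRank paper; real search engines use many more signals, so treat this purely as an illustration.

```python
# Illustrative sketch of the PageRank dilution idea: a page's rank is
# split evenly among its outlinks, after damping. Hypothetical numbers.

DAMPING = 0.85  # damping factor from the original PageRank paper


def value_passed_per_link(page_rank: float, outlink_count: int) -> float:
    """Value each outgoing link passes under the simplified model."""
    if outlink_count == 0:
        return 0.0
    return (page_rank * DAMPING) / outlink_count


# A page with 100 links passes ten times less per link than one with 10.
print(value_passed_per_link(1.0, 10))   # 0.085 per link
print(value_passed_per_link(1.0, 100))  # 0.0085 per link
```

So under this model a link from a 900-link index page is worth a tiny fraction of the same link from a focused page, which is the arithmetic behind trimming link counts.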
-
900 links and no content, and something called "spintext". That sets off a few warning bells for me that the site isn't built for users as much as for the search engines. Are you able to drop a link to the site? If so, we could give a little more opinion. 900 does seem a bit high for links.
-
I don't think there is any "good" rule of thumb out there, as it really depends on your website and the amount of authority you have, but I would say that your SEO guy is certainly right in that you should try to cut back on the number of links you have. I don't know what your website is, but I would generally say that anywhere close to 1000 is probably too many. Mathematically speaking, the more links you have on any given page, the more the "link juice" is going to be divided up. For this reason, it makes sense to focus internal links on your most important pages, particularly if they are coming from one of your strongest pages, like your home page.
SEOmoz's Pro tool uses 100 links as their standard. Any pages with more than 100 links get flagged. In Danny Dover's book, Search Engine Optimization Secrets, I believe he cites his number as 150.
Dr. Pete wrote a tremendous post on this topic earlier this year which is certainly worth a read.
The rule of thumb I would use: Are these links good for the user? Is this the optimal way to offer easy navigation to my visitors? Can this be done easier, cleaner, or more efficiently?
At the end of the day, if your page is authoritative enough, something like too many links isn't going to pose a problem. But if they're affecting user experience, or if they're weakening your SEO efforts, then they can certainly hold you back.
Related Questions
-
No: 'noindex' detected in 'robots' meta tag
Pages on my site show "No: 'noindex' detected in 'robots' meta tag". However, when I inspect the page's HTML, it does not show noindex. In fact, it shows index, follow. The majority of pages show the error and are not indexed by Google... Not sure why this is happening. The page below in Search Console shows the error above...
Technical SEO | Sean_White_Consult
-
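One way to sanity-check the mismatch described in that question is to parse the HTML you're actually served and look for a robots meta tag. The sketch below uses Python's standard-library `html.parser`; note that Google also honours the `X-Robots-Tag` HTTP response header, which inspecting the HTML alone will miss, and that is a common cause of exactly this kind of discrepancy.

```python
# Minimal check for a noindex directive in a page's <meta name="robots">
# tag. Remember: an X-Robots-Tag HTTP header can also set noindex, and
# this HTML-only check will not see it.
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.robots_content = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr_map = dict(attrs)
            if attr_map.get("name", "").lower() == "robots":
                self.robots_content = attr_map.get("content", "")


def has_noindex(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    content = (parser.robots_content or "").lower()
    return "noindex" in content


print(has_noindex('<meta name="robots" content="index, follow">'))  # False
print(has_noindex('<meta name="robots" content="noindex">'))        # True
```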
When do you use 'Fetch as Google' on Google Webmaster?
Hi, I was wondering when and how often you use 'Fetch as Google' on Google Webmaster, and do you submit individual pages or the main URL only? I've googled it, but I just got more confused. I'd appreciate it if you could help. Thanks
Technical SEO | Rubix
-
Will really old links have any benefit being 301'd
I have a client who, when they rebuilt their site, never had any of their old URLs 301'd. I've now managed to locate a few of these links and am going to redirect them. The site was rebuilt in 2006/07, and it ranked page one and #1 for lots of relevant keywords. If I redirect these to the current pages, will the rankings still carry over?
Technical SEO | lauratagdigital
-
Mysterious drop in the Number of Pages Crawled
The # of crawled pages on my campaign dashboard had been 90 for months. Approximately a week ago it dropped down to 25 crawled pages, and many links went with it. I have checked with my webmaster, and he said no changes have been made which would cause this to happen. I am looking for suggestions on how I can go about troubleshooting this issue, and possible solutions. Thanks in advance!
Technical SEO | GladdySEO
-
How do I know which page a link is from
I've got an interesting situation. I hope you can help. I have a list of links but I'm not sure which pages of my site they are from. How do I know which page a specific link is from? Thanks in advance.
Technical SEO | VinceWicks
-
How is link juice passed to links that appear more than once on a given page?
For the sake of simplicity, let's say Page X has 100 links on it, and it has 100 points of link juice. Each page being linked to would essentially get 1 point of link juice. Right? Now let's say Page X links to Page Y 3 times and Page Z 5 times, and every other link only once. Does this mean that Page Y would get 3 "link juice points" and Page Z would get 5? Note: I know that the situation is much more complex than this, such as the devaluation of footer links, etc. However, I am interested to hear people's take on the above scenario, assuming all else is equal.
Technical SEO | bheard
-
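The arithmetic in that question can be sketched directly. Under the naive model the asker describes, a page's juice is split evenly across every link slot, so a target linked N times receives N shares. Real engines are known to treat repeated links differently (e.g. the widely discussed "first link counts" tests), so this is purely the scenario's arithmetic, not how Google actually scores it.

```python
# Naive per-target link-juice split for the Page X scenario: 100 juice,
# 100 link slots, with Page Y linked 3 times and Page Z linked 5 times.
from collections import Counter


def juice_per_target(total_juice, outlinks):
    """Split juice evenly across link slots; repeated targets get N shares."""
    per_link = total_juice / len(outlinks)
    counts = Counter(outlinks)
    return {target: n * per_link for target, n in counts.items()}


links = ["page-y"] * 3 + ["page-z"] * 5 + [f"other-{i}" for i in range(92)]
shares = juice_per_target(100.0, links)
print(shares["page-y"])  # 3.0
print(shares["page-z"])  # 5.0
```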
Handling '?' in URLs.
Adios! (or something), I've noticed in my SEOmoz campaign that I am getting duplicate content warnings for URLs with query strings. For example: /login.php?action=lostpassword /login.php?action=register etc. What is the best way to deal with these types of URLs to avoid duplicate content penalties in search engines? Thanks 🙂
Technical SEO | craigycraig
-
Switching ecommerce CMSs - Best way to write URL 301s and sub pages?
Hey guys, What a headache I've been going through the last few days trying to make sure my upcoming move is near-perfect. Right now all my URLs are written like this: /page-name (all lowercase, exact, no forward slash at the end). In the new CMS they will be written like this: /Page-Name/ (with the forward slash at the end). When I generate an XML sitemap in the new ecommerce CMS, it lists the category pages with a forward slash at the end, just like they show up throughout the CMS. This seems sloppy to me, but I have no control over it. Is this OK for SEO? I'm worried my PR 4, well-built ecommerce website is going to lose value to small (but potentially large) errors like this. If this is indeed not good practice, is there a resource about not using the forward slash at the end of URLs in sitemaps I can present to the community at the platform? They are usually real quick to make fixes if something is not up to standard. Thanks in advance, -First Time Ecommerce Platform Transition Guy
Technical SEO | Hyrule
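For a migration like the one in that last question, the usual approach is to generate an old-to-new 301 map so every lowercase, no-trailing-slash URL redirects to its new capitalised, trailing-slash counterpart. Here's a hypothetical sketch of that mapping step; the URL patterns are made up to match the question's examples, and a real migration would feed the resulting map into the server's redirect rules.

```python
# Hypothetical old-to-new URL mapping for the CMS migration described
# above: /page-name  ->  /Page-Name/  (capitalise each hyphenated word,
# ensure exactly one trailing slash). Example paths are invented.

def old_to_new(path: str) -> str:
    """Map a lowercase hyphenated path to the new CMS's URL style."""
    segment = path.strip("/")
    capitalised = "-".join(word.capitalize() for word in segment.split("-"))
    return f"/{capitalised}/"


old_urls = ["/blue-widgets", "/contact-us", "/red-widgets-sale"]
redirect_map = {old: old_to_new(old) for old in old_urls}
print(redirect_map["/blue-widgets"])  # /Blue-Widgets/
```

Whichever form you pick, the key is consistency: one canonical version per page, with the other version 301ing to it, and the sitemap listing only the canonical form.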