What's the max number of links you should ever have on a page?
-
Our homepage has a few hundred links, and our index pages (pages that link to our spintext pages) have about 900 links on them with no content. Our SEO guy said we have to keep the links under 1,000, but I wanted to see what you guys think.
-
I think in Danny Dover's book he suggests no more than 150.
-
"Link Juice" passes from one page to another based on the number of links on that page. Here's what the SEOmoz basic guide to SEO says about it;
Number of Links on a Page
According to the original PageRank formula, the value that a link passes is diluted by the presence of other links on a page. Thus, getting linked-to by a page with few links is better than being linked-to by the same page with many links on it (all other things being equal). The degree to which this is relevant is unknowable (and in our testing, it appears to be important, but not overwhelmingly so), but it's certainly something to be aware of as you conduct link acquisition.
So the more links you have on a page, the less value each link passes.
The SEOmoz optimization report says to avoid "Excessive" internal and external links, and I think that number is somewhere between 100 and 150.
I don't think you should aim for a "maximum" number of links. Instead, aggressively reduce the number of links until you find the minimum you actually need. See also: Link Sculpting
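To make the dilution idea above concrete, here's a rough sketch of the per-link share implied by the original PageRank formula. The damping factor, function name, and numbers are all illustrative, not anything from SEOmoz's tools:

```python
# Hypothetical illustration of PageRank-style dilution: the value a page
# passes through each link shrinks as the total link count grows.

DAMPING = 0.85  # damping factor from the original PageRank paper

def value_passed_per_link(page_rank: float, total_links: int) -> float:
    """Simplified per-link share: PR(page) * d / C(page)."""
    return page_rank * DAMPING / total_links

# The same page passes far less through each individual link as links pile up.
print(f"150 links:  {value_passed_per_link(1.0, 150):.5f} per link")
print(f"1000 links: {value_passed_per_link(1.0, 1000):.5f} per link")
```

Under this simplification, a page with 1,000 links passes each target less than a sixth of what the same page with 150 links would.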
-
900 links with no content, and something called "spintext", set off a few warning bells for me: the site sounds like it's built more for search engines than for users. Are you able to drop a link to the site? If so, we could give a more informed opinion. That link count does seem a bit high.
-
I don't think there is any "good" rule of thumb out there, as it really depends on your website and the amount of authority you have, but I would say your SEO guy is certainly right that you should try to cut back on the number of links you have. I don't know what your website is, but I would generally say that anything close to 1,000 is probably too much. Mathematically speaking, the more links you have on any given page, the more the "link juice" is going to be divided up. For this reason, it makes sense to focus internal links on your most important pages, particularly if they are coming from one of your strongest pages, like your home page.
SEOmoz's Pro tool uses 100 links as its standard: any page with more than 100 links gets flagged. In Danny Dover's book, Search Engine Optimization Secrets, I believe he cites 150.
Dr. Pete wrote a tremendous post on this topic earlier this year which is certainly worth a read.
The rule of thumb I would use: Are these links good for the user? Is this the optimal way to offer easy navigation to my visitors? Can this be done easier, cleaner, or more efficiently?
At the end of the day, if your page is authoritative enough, having too many links isn't going to pose a problem. But if they're affecting user experience, or if they're weakening your SEO efforts, then they can certainly hold you back.
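The "link juice gets divided up" point from the answer above can be sketched with a toy power-iteration over a made-up four-page site. The graph, damping factor, and iteration count are purely illustrative, not a real crawl or anyone's actual formula:

```python
# Toy PageRank power-iteration over a hypothetical four-page site,
# comparing a homepage that links to everything against one that
# focuses its links on two key pages.

DAMPING = 0.85

def pagerank(links, iterations=50):
    """links: {page: [pages it links to]}; returns an approximate PR per page."""
    pages = list(links)
    pr = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = DAMPING * pr[page] / len(outlinks)  # juice split evenly
            for target in outlinks:
                new[target] += share
        pr = new
    return pr

# Homepage linking to every page vs. only to the two key pages (a and b).
broad = pagerank({"home": ["a", "b", "c"], "a": ["home"], "b": ["home"], "c": ["home"]})
focused = pagerank({"home": ["a", "b"], "a": ["home"], "b": ["home"], "c": ["home"]})
assert focused["a"] > broad["a"]  # key pages gain when homepage links are focused
```

In this made-up graph, trimming the homepage's outlinks from three to two noticeably raises the score of the pages it still links to, which is the same intuition behind cutting a 900-link index page down.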