"Too many on page links" phantom penalty? What about big sites?
-
So I am consistently over the recommended "100 links" rule on our site's pages because of our extensive navigation and plentiful footer links (somewhere around 300 links per page). I know there is no official penalty for this, but rather that it dilutes the "link juice" each link passes. I guess my question is really about how places like Zappos and Amazon get away with this. They have WAY over 100 links per page; in fact, I think the Zappos footer alone is 100+ links. This overage doesn't seem to affect their domain rankings and authority, so why does SEOmoz place so much emphasis on this error?
-
I can totally agree with that statement. Perhaps I misspoke. I'm not asking them to set my guidelines, just noting that their error reporting draws a hard line at 100 instead of keeping that line fuzzy, like it really is.
-
I cannot speak for SEOmoz, but I personally think one of the reasons they don't tell you what a good range is comes down to the fact that it depends on your site structure as well as your authority. Telling people how many links they can have based on domain authority alone, knowing nothing of their site structure, could lead to bad practices.
Like Marcus says above:
"Now, only you can determine what a reasonable amount is for your site. If all pages have tons of authority, then great, but if not, you may want to rethink your navigation. "
-
Thanks for the reply. I guess my question then is: why doesn't SEOmoz give you a range of links appropriate for your domain authority? For example, if my domain authority is 35, what range of links would be appropriate? If it's 65, likewise, what amount of links would be considered permissible? I say range because I know it's not a hard-and-fast rule. It's just hard to see thousands of errors glaring at you every day when in fact they may not be affecting your domain authority the way the reports imply.
-
This was a rule from back in the day, when Google's advice was not to have more than 100 links per page. The thing is, whilst that exact rule no longer really applies, there is still a lot of good reason not to have a link to every page on every page.
If you have this kind of huge navigation:
- Is it good for users? Will they really navigate through 300-odd links?
- Do you really want to evenly distribute PageRank across all pages on the site and signal that each page is equally important? (See the quick sketch below.)
Also, if not all of those links are getting crawled, the PageRank distribution can get messed up, with certain pages receiving nothing because of the behemoth navigation.
It's important to note, though, that this has never been a penalty issue; it is really just a bad internal SEO (and possibly usability) issue.
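To make that second bullet concrete, here is a quick toy simulation: a simplified PageRank model with made-up numbers, not Google's actual algorithm, just the general idea of score flowing through links.

```python
# Toy PageRank simulation -- a rough sketch with made-up numbers, not
# Google's actual algorithm. Compares a "flat" site where every page
# links to every other page (mega-navigation) against a hierarchy.

DAMPING = 0.85
ITERATIONS = 50

def pagerank(links, n):
    """links: dict mapping page -> list of pages it links to."""
    ranks = {page: 1.0 / n for page in links}
    for _ in range(ITERATIONS):
        new_ranks = {page: (1 - DAMPING) / n for page in links}
        for page, outlinks in links.items():
            # Each page splits its passable score evenly across its outlinks.
            share = DAMPING * ranks[page] / len(outlinks)
            for target in outlinks:
                new_ranks[target] += share
        ranks = new_ranks
    return ranks

n = 6
pages = range(n)

# Flat: every page links to all the others.
flat = {p: [q for q in pages if q != p] for p in pages}

# Hierarchy: 0 = home, 1-2 = sections, 3-5 = leaf pages.
hierarchy = {0: [1, 2], 1: [0, 3, 4], 2: [0, 5], 3: [1], 4: [1], 5: [2]}

print("flat:     ", {p: round(r, 3) for p, r in pagerank(flat, n).items()})
print("hierarchy:", {p: round(r, 3) for p, r in pagerank(hierarchy, n).items()})
```

In the flat graph every page ends up with an identical score, which is exactly the "every page is equally important" signal described above; the hierarchy concentrates score on the home and section pages you actually want to rank.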
This is an interesting read:
http://www.seomoz.org/blog/how-many-links-is-too-many
Also, you can see Google still recommends that you 'keep the links on a given page to a reasonable amount':
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769#1
Now, only you can determine what a reasonable amount is for your site. If all pages have tons of authority, then great, but if not, you may want to rethink your navigation.
Hope this helps
Marcus
-
The 100 links thing is more of a general recommendation than a "rule". The answer can differ from site to site. Back in the day, there was a limit of 100 links crawled per page because Google put a cap on how much of a page they would crawl in order to save some bandwidth.
In current times, this is more about how much domain authority your site has. If you have a lot of links on a page, each one receives less "juice", because the PageRank that flows out of the page keeps getting divided among them.
Sites like Amazon and Zappos have a lot of authority and PageRank, so this is not as big an issue for them.
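To put rough numbers on that division idea, here is a back-of-the-envelope sketch: the 1.0 below is a notional amount of passable equity, not Google's actual formula.

```python
# Hypothetical "link juice" division -- illustrative only, not Google's
# real formula. Whatever equity a page can pass is split across its
# outgoing links, so more links means less passed per link.

page_equity = 1.0  # notional amount of equity the page can pass on

for num_links in (100, 300):
    per_link = page_equity / num_links
    print(f"{num_links} links -> {per_link:.4f} equity per link")

# Output:
# 100 links -> 0.0100 equity per link
# 300 links -> 0.0033 equity per link
```

On a high-authority page that 1.0 is a much bigger pie to begin with, which is why the same 300-link footer hurts Zappos far less than it would hurt a newer site.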