"Too many on page links" phantom penalty? What about big sites?
-
So I am consistently over the recommended "100 links" rule on our site's pages because of our extensive navigation and plentiful footer links (somewhere around 300 links per page). I know there is no official penalty for this, but rather that it dilutes the "link juice" passed through each link. I guess my question is more about how places like Zappos and Amazon get away with this? They have WAY over 100 links per page... in fact, I think the Zappos footer alone is 100+ links. This overage doesn't seem to affect their domain rankings and authority, so why does SEOmoz place so much emphasis on this error?
-
I can totally agree with that statement. Perhaps I misspoke. I'm not asking for them to set my guidelines, but rather just noting that their error reporting draws a firm line at 100, as opposed to keeping that line fuzzy like it really is.
-
I cannot speak for SEOMoz, but I personally think one of the reasons they don't tell you what a good range is comes down to the fact that it depends on your site structure as well as your authority. Telling people how many links they can have based on domain authority alone, knowing nothing of their site structure, could lead to bad practices.
Like Marcus says above:
"Now, only you can determine what a reasonable amount is for your site. If all pages have tons of authority, then great, but if not, you may want to rethink your navigation. "
-
Thanks for the reply. I guess then my question is... why doesn't SEOmoz give you a range of links appropriate for your domain authority? For example, if my domain authority is 35, what range of links would be appropriate? If it's 65, likewise, what number of links would be considered permissible? The reason I say range is because I know it's not a hard-and-fast rule. It's just hard to see thousands of errors glaring at you every day when in fact they may not be affecting your domain authority the way the error report implies.
-
This was a rule from back in the day, when Google's advice was not to have more than 100 links per page. The thing is, whilst this exact rule is not really the case now, there are still good reasons not to have a link to every page on every page.
If you have this kind of huge navigation:
- Is it good for users? Will they really navigate through 300-odd links?
- Do you really want to evenly distribute PageRank across all pages in the site and indicate that each page is equally important?
Also, if not all of the links are getting crawled, there is potential for PageRank distribution to be messed up, with certain pages getting nothing due to the behemoth navigation.
It's important to note, though, that this has never been a penalty issue; it is really just a bad internal SEO (and possibly usability) issue.
This is an interesting read:
http://www.seomoz.org/blog/how-many-links-is-too-many
Also, you can see Google still recommend that you 'keep the links on a given page to a reasonable amount':
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769#1
Now, only you can determine what a reasonable amount is for your site. If all pages have tons of authority, then great, but if not, you may want to rethink your navigation.
Hope this helps
Marcus -
The 100 links thing is more of a general recommendation than a "rule". The answer can differ from site to site. Back in the day, there was a hard limit of 100 links crawled per page, because Google put a cap on how much of a page they would crawl in order to save some bandwidth.
These days, it is more about how much domain authority your site has. If you have a lot of links, they are all drawing on your PageRank, and each one gets less "juice" as the amount that flows through keeps getting divided.
Sites like Amazon and Zappos have a lot of authority and PageRank, so this is not as big of an issue for them.
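To make the "juice keeps getting divided" point concrete, here is a toy sketch of the classic simplified PageRank model, where the equity a page passes is split evenly among its outbound links. The 0.85 damping factor comes from the original PageRank formulation; the page score and link counts are made-up numbers purely for illustration, not anything Google publishes:

```python
def juice_per_link(page_rank: float, outbound_links: int, damping: float = 0.85) -> float:
    """Equity passed through each link under the simplified even-split model."""
    return page_rank * damping / outbound_links

# A hypothetical page worth 1.0: compare 100 links vs a 300-link mega-navigation.
per_link_100 = juice_per_link(1.0, 100)
per_link_300 = juice_per_link(1.0, 300)

# Tripling the link count cuts each individual link's share to a third.
print(per_link_100 / per_link_300)  # 3.0
```

This is why a high-authority site like Amazon can afford a 300-link footer: a third of a very large number is still plenty, while a third of a modest number may round down to almost nothing.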