"Too many on page links" phantom penalty? What about big sites?
-
So I am consistently over the recommended "100 links" rule on our site's pages because of our extensive navigation and plentiful footer links (somewhere around 300 links per page). I know there is no official penalty for this, but rather that it dilutes the "link juice" passed through each link. I guess my question is more about how places like Zappos and Amazon get away with this. They have WAY over 100 links per page... in fact, I think the Zappos footer alone is 100+ links. This overage doesn't seem to affect their domain rankings and authority, so why does SEOmoz place so much emphasis on this error?
-
I can totally agree with that statement. Perhaps I misspoke. I'm not asking for them to set my guidelines, but rather just noting that their error reporting draws a hard line at 100, as opposed to keeping that line fuzzy like it really is.
-
I cannot speak for SEOmoz, but I personally think one of the reasons they don't tell you what a good range is comes down to your site structure as well as authority. Telling people how many links they can have based on domain authority alone, knowing nothing of their site structure, may lead to bad practices.
Like Marcus says above:
"Now, only you can determine what a reasonable amount is for your site. If all pages have tons of authority, then great, but if not, you may want to rethink your navigation. "
-
Thanks for the reply. I guess then my question is... why doesn't SEOmoz give you a range of links appropriate for your domain authority? For example, if my domain authority is 35, what range of links would be appropriate? If it's 65, likewise, what amount would be considered permissible? The reason I say range is because I know it's not a hard-and-fast rule. It's just hard to see thousands of errors glaring at you every day when in fact they may not be affecting your domain authority like they say.
-
This was a rule from back in the day, when Google's advice was not to have more than 100 links per page. The thing is, whilst this exact rule is no longer really the case, there are still good reasons not to have a link to every page on every page.
If you have this kind of huge navigation:
- Is it good for users? Will they really navigate through 300-odd links?
- Do you really want to evenly distribute PageRank across all pages in the site and indicate that each page is equally important?
Also, if all the links are not getting crawled, PageRank distribution can get messed up, with certain pages not getting anything due to the behemoth navigation.
It's important to note, though, that this has never been a penalty issue; it is really just a bad internal SEO (and possibly usability) issue.
This is an interesting read:
http://www.seomoz.org/blog/how-many-links-is-too-many
Also, you can see Google still recommends that you 'keep the links on a given page to a reasonable amount':
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769#1
Now, only you can determine what a reasonable amount is for your site. If all pages have tons of authority, then great, but if not, you may want to rethink your navigation.
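If you want a quick way to see how many links a given page actually carries before deciding what is "reasonable," a minimal sketch using only Python's standard-library HTML parser (counting every anchor tag with an `href`, which is roughly what a crawler would follow) might look like this:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts anchor tags that carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only <a> tags with an href are followable links
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html: str) -> int:
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# Inline example; for a live page you could feed it the fetched
# HTML of your own URL instead.
sample = '<nav><a href="/a">A</a><a href="/b">B</a></nav><a name="x">no href</a>'
print(count_links(sample))  # 2
```

Run that against your own page's HTML and you will know whether you are really at 300 links or whether duplicated nav/footer markup is inflating the number.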
Hope this helps
Marcus -
The 100 links thing is more of a general recommendation than a "rule". The answer can differ from site to site. Back in the day, this was a limit of 100 links crawled per page, because Google put a cap on how much they would crawl in order to save bandwidth.
In current times, this is more about how much domain authority your site has. If you have a lot of links on a page, they are all going to be leeching your PR, and each one gets less "juice" as the amount that flows out keeps getting divided.
Sites like Amazon and Zappos have a lot of authority and PR, so this is not as big an issue for them.
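To make the "divided juice" idea concrete, here is a sketch using the classic simplified PageRank model (value passed on is split evenly across outbound links, with the usual 0.85 damping factor). This is an illustration of the principle, not Google's actual, far more complex algorithm:

```python
def juice_per_link(page_rank: float, outbound_links: int,
                   damping: float = 0.85) -> float:
    """Simplified PageRank model: the value a page passes on is
    split evenly among its outbound links. Illustrative only."""
    return (page_rank * damping) / outbound_links

# A page worth 1.0 with a lean 20-link navigation vs. a 300-link one:
print(round(juice_per_link(1.0, 20), 6))   # 0.0425 per link
print(round(juice_per_link(1.0, 300), 6))  # 0.002833 per link
```

In this toy model, each link on a 300-link page passes roughly 1/15th of what it would on a 20-link page, which is why a huge navigation matters much more on a low-authority site than on an Amazon or a Zappos.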