Too Many On-Page Links Reported By SEOmoz
-
Hi,
I recently ran a crawl report for my blog, dapazze.com, and found that SEOmoz is reporting that many pages on my blog have more than 100 internal links.
I opened OSE and searched for one of my pages that was reported to contain more than 100 links, and found that it contains 464 internal links.
Please have a look at it. In OSE I chose the option to Show "All" links from "only internal" pages to "this page", which reports this number.
I see almost every page in my blog linking to every other page. And this is not a problem only for me. I have also searched for some popular blogs, like ProBlogger.net, ShoutMeLoud.com, and HellBoundBloggers.com, and all of them show the same thing.
Should I be worried about this? What is the actual problem?
-
Rahul, the links do count as internal links if they are on the page. It doesn't matter how they are generated. Again, 100 is just a general guideline; if there is user benefit to having more than 100, then I would say you have nothing to worry about.
The number of links on your site is not burdening the user, so I doubt they are causing you any issues.
-
OK, thanks for helping me understand the problem.
But will it hurt if all these sidebar, footer, and navigation links get counted as internal links? I only make about 5 internal links from the body of a post, but all these other links push the count past 100.
Will these cause any problem for me?
-
The article is excellent. It made the concept very clear. Thank you.
-
Here's a good post Dr. Pete wrote on the subject of too many links that may help.
-
Hi Rahul,
The links you are seeing in Open Site Explorer are both internal links and links from other domains pointing to you.
Your internal links show as 261 using the On-Page Report.
The other 180 or so links are other domains linking to you, which is good.
Looking at the page http://dapazze.com/2012/10/win-a-commentluv-premium-single-site-and-multi-site-license-worth-about-154-giveaway-of-october/, the crawler is taking into account all the anchor links, navigational links, footer links, comment links, etc., which is easily 200+.
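To illustrate Don's point, here is a rough sketch (not Moz's actual crawler, and the sample HTML is hypothetical) of how a crawler counts every `<a href>` on a page, so template links in the nav, sidebar, footer, and comments all add to the total alongside the few editorial links in the post body:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Counts every <a href> the way a crawler would: body,
    navigation, footer, and comment links are all included."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        host = urlparse(href).netloc
        # Relative URLs and same-host URLs count as internal links.
        if not host or host == self.site_host:
            self.internal += 1
        else:
            self.external += 1

# Hypothetical page: only two links are in the article body,
# but the template's nav and footer links inflate the count.
html = """
<nav><a href="/">Home</a><a href="/about/">About</a></nav>
<article>
  <a href="/2012/10/older-post/">older post</a>
  <a href="http://problogger.net/">ProBlogger</a>
</article>
<footer><a href="/privacy/">Privacy</a></footer>
"""
counter = LinkCounter("dapazze.com")
counter.feed(html)
print(counter.internal, counter.external)  # 4 internal, 1 external
```

On a real blog template with category lists, archives, and comment authors, the same effect easily pushes a page past 100 internal links even when the post body contains only a handful.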
Hope that helps,
Don
-
Hi, thanks a lot for helping me understand the problem. But I'm still confused about one thing. Please have a look at the link I gave you. According to SEOmoz, almost all the pages on my blog are being linked from that particular page. In reality, my post contains only a few internal links, not even 10 I guess. So how can it reach up to 464 links when I am not physically linking to those pages, and I don't see any such links currently present on the page? You can check yourself if you want to confirm.
Not only this page; other pages show similar stats. And not only my blog: I have noticed the same thing on many popular blogs like ProBlogger, Hongkiat.com, etc. How can they all be making the same mistake? Is it a mistake on my/our side, or is the SEOmoz crawler reporting internal links wrongly?
-
Hi Rahul,
In regards to on page links...
Like most things SEOmoz informs you about, you should take it as a very important suggestion, but not necessarily an absolute rule.
Here is a direct quote from Matt Cutts (REF: MattCutts.com):
"But in some cases, it might make sense to have more than a hundred links. Does Google automatically consider a page spam if your page has over 100 links? No, not at all. The “100 links” recommendation is in the “Design and content” guidelines section, and it’s the Quality guidelines that contain the things that we consider webspam (stuff like hidden text, doorway pages, installing malware, etc.). Can pages with over 100 links be spammy? Sure, especially if those links are hidden or keyword-stuffed. But pages with lots of links are not automatically considered spammy by Google."
That being said, you should also note that by having that many links, you are diluting the link juice each page passes on to next to nothing. Each link passes a fraction of the "link juice": if you have 100 links, each link passes about 1%; when you get into 400+, you are effectively passing nothing along to any subpages. This can be really problematic. Say you have a very poorly performing article on your site, and you are linking to it from every other page. You are telling the search engines "hey, this is a great page," but in reality they think it isn't, and you have wasted juice that could have benefited your higher-quality pages.
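The dilution described above can be sketched with a toy PageRank-style calculation. The even split and the 0.85 damping factor are simplifying assumptions (real ranking is far more complex), but the arithmetic shows why 400+ links leave almost nothing per link:

```python
# Simplified model: a page's passable equity is split evenly
# among its outgoing links, after a standard damping factor.
def equity_per_link(page_equity, num_links, damping=0.85):
    """Equity each linked page receives from this page."""
    return page_equity * damping / num_links

for n in (5, 100, 464):
    share = equity_per_link(1.0, n)
    print(f"{n:>4} links -> {share:.4f} equity per link")
```

With 5 links each target gets 0.17 of the page's passable equity; with 464 links, under 0.002, which is the "next to nothing" effect in concrete numbers.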