Too Many On-Page Links
-
Most of my pages are flagged with "Too Many On-Page Links".
If you view the website you will see this is mainly down to the top navigation drop-down menu: http://www.cwesolutions.co.uk
So if I wanted to reduce the number of links, I would have to use category links with landing pages.
How much does having "Too Many On-Page Links" affect my website ranking? Is it really important, and would I notice a difference if I changed it?
-
When a report alerts that there are too many links on a page, it is usually warning of authority leakage more than anything else.
Granted, the insertion of 'just' in my previous statement probably misleads...
I agree, though, that if you were to link every page to every page, it would just cause a royal headache for crawlers - why one would do that is beyond me.
-
Sorry Pete, that was a typo where I meant to say 'internal linkage is classed differently from external linkage' (I've amended my reply).
-
To say "linking just dilutes the authority from the page" is not strictly true.
In this particular instance I do not believe that the menu is OTT in terms of links, but if you have every page linking to every page you just end up creating a mesh of links that search engines can waste crawl time trying to decipher.
Too many links with poor structure can mean that your crawl allocation is wasted, which results in fewer of your pages being indexed properly.
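To make that concrete, here is a minimal sketch of the kind of per-page check these warnings are based on, assuming Python with the requests and beautifulsoup4 packages installed (the packages and the internal/external split are my illustration, not what any particular audit tool actually runs):

```python
# A rough per-page link count: fetch one page and split its links
# into internal and external. Package choices (requests, beautifulsoup4)
# are assumptions, not what any audit tool actually uses.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def count_links(page_url):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    site_host = urlparse(page_url).netloc

    internal = external = 0
    for anchor in soup.find_all("a", href=True):
        href = urljoin(page_url, anchor["href"])  # resolve relative URLs
        if urlparse(href).netloc == site_host:
            internal += 1
        else:
            external += 1
    return internal, external

internal, external = count_links("http://www.cwesolutions.co.uk")
print(f"{internal} internal links, {external} external links")
```

Run against a flagged page, the internal count will usually be dominated by the navigation menu, which is exactly the situation described above.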
-
Thanks Geoff, but what did you mean here...
Internal linkage is classed differently as internal linkage?
-
What makes you think you have too many on-page links? Internal linkage is classed differently from external linkage. Linking just dilutes the authority from the page (amongst other crawlability and usability factors), but if it's internal linkage, at least the authority is kept on your domain.
Simple, effective and easy-to-use navigation is critical to website usability and will have an effect on performance in search engines. In your example, I wouldn't expect any negative ranking effects to occur as a result of the style of navigation menu your website utilises. If that is deemed most useful for your customers, then that's the best approach.
-
Use a tool such as SEOmoz's Crawl Test or Xenu's Link Sleuth to determine any crawling issues. There are a number of free online tools for this too.
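If you would rather script a quick check yourself, a very small breadth-first crawl that reports errors and non-200 responses is one way to surface obvious crawling issues. This is only a sketch under the same Python/requests/beautifulsoup4 assumptions as above, not how Crawl Test or Link Sleuth actually work internally:

```python
# A tiny DIY crawl test: breadth-first walk of internal links,
# printing every URL that errors or returns a non-200 status.
# Sketch only - not the internals of any named tool.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl_test(start_url, max_pages=50):
    host = urlparse(start_url).netloc
    queue, seen = deque([start_url]), {start_url}
    visited = 0
    while queue and visited < max_pages:
        url = queue.popleft()
        visited += 1
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            print(f"ERROR  {url} ({exc})")
            continue
        if resp.status_code != 200:
            print(f"{resp.status_code}  {url}")
            continue
        # Queue any internal links we have not seen yet.
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)

crawl_test("http://www.cwesolutions.co.uk")
```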
-
Thanks for your reply and explanation. One more question... How would I know if the bots are having a problem crawling my website?
-
Too many links on a page can make it difficult for bots to crawl your site. If you have over 100 links on a page, make certain you have a sitemap that lays out a road map for the bots.
If your site is being crawled effectively, don't worry about it. I would only worry if the linking structure were impacting the user experience or if deeper pages weren't getting crawled.
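One direct way to answer "how would I know" is to check your server access logs for what Googlebot is actually requesting. A rough sketch, assuming a standard combined-format Apache/Nginx log (the log path is a placeholder, and a user-agent match alone can be spoofed):

```python
# Tally Googlebot hits per URL from a combined-format access log.
# The log path is a placeholder; user-agent strings can be spoofed,
# so reverse-DNS verification is the stricter check.
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"  # placeholder - adjust for your server
REQUEST_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+)')

hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" in line:
            match = REQUEST_RE.search(line)
            if match:
                hits[match.group(1)] += 1

for url, count in hits.most_common(20):
    print(f"{count:6d}  {url}")
```

If deeper pages never show up in this tally while navigation-heavy pages dominate, that is a sign the linking structure is eating your crawl allocation.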