Too Many On-Page Links
-
Most of my pages have "Too Many On-Page Links".
If you view the website you will see this is mainly down to the top navigation drop down menu: http://www.cwesolutions.co.uk
So if I wanted to reduce the number of links I would have to have category links with landing pages.
How much does having "Too Many On-Page Links" affect my website ranking? Is it really important, and would I notice a difference if I changed it?
-
A report alerting that there are too many links on a page is typically warning of authority leakage more than anything else.
Granted, the insertion of 'just' in my previous statement probably misleads...
I agree, though, that if you were to link every page to every page, it would just cause a royal headache for crawlers - why one would do that, though, is beyond me.
-
Sorry Pete, that was a typo where I meant to say 'internal linkage is classed differently from external linkage' (I've amended my reply).
-
To say "linking just dilutes the authority from the page" is not strictly true.
In this particular instance I do not believe that the menu is over the top in terms of links, but if you have every page linking to every page, you just end up creating a mesh of links that search engines can waste crawl time trying to decipher.
Too many links with poor structure can mean that your crawl allocation is wasted, which results in fewer of your pages being indexed properly.
-
Thanks Geoff, but what did you mean here...
Internal linkage is classed differently as internal linkage?
-
What makes you think you have too many on-page links? Internal linkage is classed differently from external linkage. Linking just dilutes the authority from the page (amongst other crawlability and usability factors), but if it's internal linkage, at least the authority is kept on your domain.
Simple, effective and easy-to-use navigation is critical to website usability and will have an effect on performance in search engines. In your example, I wouldn't expect any negative ranking effects to occur as a result of the style of navigation menu your website utilises. If that is deemed most useful for your customers, then that's the best approach.
-
Use a tool such as SEOmoz' Crawl Test or Xenu's Link Sleuth to determine any crawling issues. There are a number of free online tools for this too.
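If you just want a rough on-page link count without running a full crawl tool, it can be scripted with Python's standard library. This is a minimal sketch (the sample HTML is made up for illustration), not a substitute for the crawl tools mentioned above, which also surface redirect chains, broken links, and depth issues:

```python
# Count <a href> links in a page's HTML using only the standard library.
# Duplicated navigation (e.g. header + footer menus) shows up as repeated hrefs.
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []  # every href encountered, duplicates included

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def count_links(html):
    parser = LinkCounter()
    parser.feed(html)
    return len(parser.links)

html = '<nav><a href="/a">A</a><a href="/b">B</a></nav><p><a href="/a">A again</a></p>'
print(count_links(html))  # 3 total links, including the duplicate to /a
```

Feeding it the downloaded HTML of a real page gives the raw number the "too many on-page links" warnings are counting.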
-
Thanks for your reply and explanation. One more question... How would I know if the bots are having a problem crawling my website?
-
Too many links on a page can make it difficult for bots to crawl your site. If you have over 100 links on a page, make certain you have a sitemap that lays out a road map for the bots.
If your site is being effectively crawled, don't worry about it. I would worry if the linking structure is impacting the user experience or if deeper pages weren't getting crawled.
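As a rough illustration of the sitemap suggestion, here is a minimal sketch that generates sitemap-protocol XML with Python's standard library. The URLs are placeholders; a real sitemap would list your actual pages and be referenced from robots.txt or submitted to the search engines:

```python
# Minimal sketch: build sitemap-protocol XML for a list of URLs.
# The URLs below are placeholders, not real pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://www.example.com/", "https://www.example.com/services"]))
```

Real sitemap generators also emit optional fields like `lastmod`, but `loc` alone is a valid entry per the sitemap protocol.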
Related Questions
-
Are slideshows etc. the new Splash Pages?
I just did SEO audits of approx. 50 websites in the tourism sector. Nearly all had poor Google PageSpeed ratings, partly down to that, among other factors. I also feel that slideshows, large images, and videos in headers are poor for usability. I say get the content people need to engage with in front of them ASAP. Are there any stats or studies that can provide insight on this? I've been telling those with these designs to keep an eye on bounce rates and let that guide them.
Web Design | anndonnelly0
-
What are the downsides and/or challenges to putting page paths (www.example.com/pagepath) on a different server?
Hi, Our company is organized into three different segments, and our development team recently needed to switch a portion of the business to a subdomain because they wanted to move to a different server platform. We are now seeing the impact of moving this segment of the business to a subdomain on the main domain. SEO is hurting and our Moz score has dropped significantly. One fix they are debating is moving everything back to one domain, but placing segments of the business on different page paths and hosting specific paths on different servers. I.e. the main domain could be www.example.com hosted in one location, and then www.example.com/segment1 would be hosted on a different server. They are hoping to accomplish this using some sort of proxy/caching redirection solution. The goal of this change would be to recapture our domain strength. Is this a good option or not? If not, what are the challenges and issues you see arising from doing something like this? I don't know of any other site set up this way. Thanks in advance.
Web Design | bradgreene0
-
Duplicate Content Home Page http Status Code Query
Hi All, We have just done a site-wide URL migration (from the old URL structure to the new one) and set up our 301s etc., but have this one issue where I don't know if it's a problem or not. We have one URL - www.domain.co.uk/ (with a trailing slash) - which has been set up to 301 redirect back to www.domain.co.uk. However, when I check the server response code, it comes back as 200. So although it appears to visually 301 redirect if I put the URL in the toolbar, the status code says different. Could this be seen as a potential duplicate home page, and if so, any idea how I could get around it if we can't solve the root cause? This is on a CakePHP framework. Thanks, Pete
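The confusion described above is common: a browser follows redirects silently, so the address bar alone can't tell a real 301 from a 200. One way to check is to request the raw status code without following redirects. A minimal Python sketch using only the standard library (the commented URL is a placeholder, not the real site):

```python
# Sketch: report the HTTP status code a URL actually returns, without
# following redirects, so a "visual" redirect can't mask a 200.
import http.client
from urllib.parse import urlparse

def raw_status(url):
    """Return the status code of the first response for `url`."""
    parts = urlparse(url)
    if parts.scheme == "https":
        conn = http.client.HTTPSConnection(parts.netloc, timeout=10)
    else:
        conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    status = conn.getresponse().status
    conn.close()
    return status

# A true 301 redirect should report 301 here, not 200:
# raw_status("http://www.example.co.uk/")
```

If this reports 200 for the trailing-slash URL, the "redirect" is probably happening client-side or after an intermediate response, which is worth raising with the developers.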
Web Design | PeteC120
-
Lots of Listing Pages with Thin Content on Real Estate Web Site - Best to Set Them to No-Index?
Greetings Moz Community: As a commercial real estate broker in Manhattan I run a web site with over 600 pages. Basically the pages are organized in the following categories:
1. Neighborhoods (Example: http://www.nyc-officespace-leader.com/neighborhoods/midtown-manhattan) - 25 pages, low bounce rate
2. Types of Space (Example: http://www.nyc-officespace-leader.com/commercial-space/loft-space) - 15 pages, low bounce rate
3. Blog (Example: http://www.nyc-officespace-leader.com/blog/how-long-does-leasing-process-take) - 30 pages, medium/high bounce rate
4. Services (Example: http://www.nyc-officespace-leader.com/brokerage-services/relocate-to-new-office-space) - 3 pages, high bounce rate
5. About Us (Example: http://www.nyc-officespace-leader.com/about-us/what-we-do) - 4 pages, high bounce rate
6. Listings (Example: http://www.nyc-officespace-leader.com/listings/305-fifth-avenue-office-suite-1340sf) - 300 pages, high bounce rate (65%), thin content
7. Buildings (Example: http://www.nyc-officespace-leader.com/928-broadway) - 300 pages, very high bounce rate (exceeding 75%)
Most of the listing pages do not have more than 100 words. My SEO firm is advising me to set them to "No-Index, Follow". They believe the thin content could be hurting me. Is this an acceptable strategy? I am concerned that when Google detects 300 pages set to "No-Index" they could interpret this as the site seeking to hide something and penalize us. Also, the building pages have a low click-through rate. Would it make sense to set them to "No-Index" as well? Basically, would it increase authority in Google's eyes if we set pages that have thin content and/or low click-through rates to "No-Index"? Any harm in doing this for about half the pages on the site? I might add that while I don't suffer from any manual penalty, volume has gone down substantially in the last month. We upgraded the site in early June and somehow 175 pages were submitted to Google that should not have been indexed. A removal request has been made for those pages. Prior to that we were hit by Panda in April 2012, with search volume dropping from about 7,000 per month to 3,000 per month. Volume had increased back to 4,500 by April this year only to start tanking again. It was down to 3,600 in June. About 30 toxic links were removed in late April and a disavow file was submitted with Google in late April for removal of links from 80 toxic domains. Thanks in advance for your responses!! Alan
Web Design | Kingalan1
What is the best tool to view your page as Googlebot?
Our site was done with ASP.NET and a lot of scripting. I want to see what Google can see and what it can't. What is the best tool that duplicates Googlebot? I have found several, but they seem old or inaccurate.
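Short of a dedicated tool, a common first step is simply fetching the page with a Googlebot user-agent string and comparing the HTML served to the crawler against what a browser receives. A minimal Python sketch: note this only changes the User-Agent header, so it won't replicate how Googlebot renders script-heavy pages, which is the real concern on a heavily scripted ASP.NET site:

```python
# Sketch: fetch a URL identifying as Googlebot, to compare the served HTML
# against a normal browser request. Does NOT execute JavaScript.
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_as_googlebot(url, timeout=10):
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

If the markup returned here differs significantly from a browser's view-source, the server is serving different content to crawlers, which is worth investigating.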
Web Design | EcommerceSite0
-
Decreasing Page Load Time with Placeholder Images - Good Idea or Bad Idea?
In an effort to decrease our page load time, we are looking at making a change so that all product images on any page past page 1 load with a placeholder image. When the user clicks to the next page, it then loads all of the images for that page. Right now, all of the product divs are loaded into a Javascript array and loaded in chunks to the page display div. Product-heavy pages significantly increase load time as the browser loads all of the images from the product HTML before the Javascript can rewrite the display div with page-specific product HTML. In order to get around this, we are looking at loading the product HTML with a small placeholder image and then substituting the appropriate product image URLs when each page is output to the display div. From a user experience standpoint, this change will be seamless and they won't be able to tell the difference; plus, they will potentially benefit from a shorter wait when loading the images for the page in question. However, the source of the page will show the same placeholder image for every product in a given category page. How much of a negative impact will this have on SEO?
Web Design | airnwater0
-
Google Penalizing Websites that Have Contact Forms at Top of Website Page?
Has anyone else heard of Google penalizing websites for having their contact forms located at the top of the website? For example http://www.austintenantadvisors.com/ Look forward to hearing other thoughts on this.
Web Design | webestate1
-
How is link juice split between navigation?
Hey All, I am trying to understand link juice as it relates to duplicate navigation. Take for example a site that has a main navigation contained in dropdowns containing 50 links (fully crawlable and indexable); then in the footer of said page that navigation is repeated, so you have a total of 100 links with the same anchor text and URLs. For simplicity's sake, will the link juice be divided among those 100 and passed to the corresponding pages, or does the "1st link rule" still apply and thus only half of the link juice will be passed? What I am getting at is: if there was only one navigation menu and the page was passing 50 link juice units, then each of the subpages would get passed 1 link juice unit, right? But if the menu is duplicated, then the possible link juice is divided by 100, so only 0.5 units are being passed through each link. However, because there are two links pointing to the same page, is there a net of 1 unit? We have several sites that do this for UX reasons, but I am trying to figure out how badly this could be hurting us in page sculpting and passing juice to our subpages. Thanks for your help! Cheers.
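The arithmetic in the question above can be sketched as a toy model. To be clear, this only reproduces the even-split reasoning being described; it is not how any search engine actually computes link equity, and it ignores the "1st link rule" variant entirely:

```python
# Toy model of the even-split arithmetic only: split a page's juice evenly
# across every link, then sum per unique target. NOT a real ranking model.
def juice_per_target(total_juice, links):
    """links: target URLs, duplicates allowed (e.g. header + footer nav)."""
    per_link = total_juice / len(links)
    out = {}
    for target in links:
        out[target] = out.get(target, 0.0) + per_link
    return out

single_nav = ["/page%d" % i for i in range(50)]
double_nav = single_nav * 2  # header navigation repeated in the footer

print(juice_per_target(50, single_nav)["/page0"])  # 1.0 unit per page
print(juice_per_target(50, double_nav)["/page0"])  # 0.5 + 0.5 = 1.0 net
```

Under pure even-splitting the duplicated menu nets out to the same per-page total; the open question in the thread is whether the second link counts at all, which this toy model does not settle.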
Web Design | prima-2535090