I have a client where every page has over 100 links
-
Some links are in the main navigation (which has secondary and tertiary levels) and some are repeated in the left navigation. Every page ends up with over 100 links when crawled.
From a practical standpoint, would you (a) delete the 3rd-level links (or at least argue for that) or (b) rel='nofollow' them? From a usability standpoint, this setup works, since visitors are almost one click away from everything. From a crawl standpoint, I see some pages missed in Google (the sitemap has over 200 links).
Looking for current, best-practice on-page SEO advice to set these guys on the road to success.
-
The site isn't low quality (there are no ads and they don't sell anything -- it is a scientific site) -- it's just that EVERY link is available as a secondary or tertiary link. My initial thought is to simply get rid of the tertiary level within the main nav, cutting out roughly half of the links; on any inside page, those links are available in the left-side nav anyway. The smallest link count on a page is about 110, and the largest is pushing 250. I just wondered about everyone's opinion.
Rendering the menu via jQuery, as Chad suggests, might help. This is a WordPress-based site, so I'll have to look into it carefully since the client has to be able to edit the menu too.
We've already begun mapping out clicks as goals (conversions) within GA.
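For illustration, here's a rough sketch of the kind of click tracking we're wiring up (this assumes the standard Universal Analytics ga() snippet is on the page; the '.left-nav a' selector and the category/action names are placeholders, not our actual markup):

```javascript
// Send a GA event whenever a visitor clicks a link in the left-hand navigation.
// Those events can then be mapped to an event-based goal in the GA admin.
jQuery(function ($) {
  $('.left-nav a').on('click', function () {
    if (typeof ga === 'function') {
      // category, action, label -- the label records which link was clicked
      ga('send', 'event', 'Navigation', 'left-nav-click', $(this).attr('href'));
    }
  });
});
```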
-
If the site has good authority/PageRank I wouldn't worry about it, although I would look at in-page analytics to see whether people actually click the links. If it's set up like this to pass link juice, the advice above is more applicable; if not, check whether people are actually clicking the links, and if they aren't, I'd suggest an alternative navigation.
-
First off, never use rel='nofollow' on links to your own site.
Personally, I would trim down the left menu if you can, or find a cleaner, JavaScript-driven way to present the data. The 100-link rule isn't a law written in stone, although SEOmoz's tools do flag pages that go over 100 links. The "100 link" lore comes from a Matt Cutts blog post:
http://www.mattcutts.com/blog/how-many-links-per-page/
If you look closely, you may notice that there are more than 100 links even on the page where Matt wrote about this, so it's a fairly loose guideline in my eyes. From my own professional experience, if every page on your site has 500 links, you're going to hurt for it. But if you have 125 links on quite a few pages, or publish a blog post that's an insanely useful resource linking to a few hundred people, you'll still be just fine.
I'd think of it as just one more signal of a potentially abusive or low-quality site. If your site isn't under heavy scrutiny for other reasons, and you don't go totally nuts with links, you'll probably be fine, but there is a lot of wisdom in Matt Cutts's post all the same. Eliminate the unnecessary and things will work better, in and out of Google.
-
Use jQuery -- it will basically solve your "too many links on a page" problem.
Chad
-
Nofollow is not a good idea. If you have good PageRank I wouldn't worry about it, but remove the duplicate links first.
-
We had a similar issue with one site -- a footer with a ton of links to all parts of the site, duplicated on every page. The feeling was that it flattened out the site's link structure too much, so we changed it so that the footer was loaded via AJAX on page load.
In our case, I don't think it made much difference performance-wise, so I can't say for certain whether it will help you, but it is a way to clean up your link structure.
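Roughly, the approach looked like this (a simplified sketch, not our production code -- the container id and the URL are placeholders; on a WordPress site you'd more likely point it at an admin-ajax.php or REST endpoint that returns the menu markup):

```javascript
// Leave the big block of footer links out of the initial HTML and pull it in
// with jQuery after the page has loaded.
jQuery(function ($) {
  $('#footer-links').load('/footer-links.html');
});
```

The same pattern would work for the tertiary menu discussed earlier in the thread.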