I have a client where every page has over 100 links
-
Some links are in the main navigation (it has a secondary and tertiary level) and some links are repeated in the left navigation. Every page has over 100 links if crawled.
From a practical standpoint, would you (a) delete the third-level links (or at least argue for that) or (b) rel='nofollow' them? From a usability standpoint, this setup works, as everything is almost one click away. From a crawl standpoint, I see some pages missed in Google (the sitemap has over 200 links).
Looking for the best on-page current SEO advice to set these guys on the road to success.
-
The site isn't low quality (there are no ads and they don't sell anything -- it is a scientific site) -- it's just that EVERY link is available as a secondary or tertiary link. My initial thought is to simply get rid of the tertiary level within the main nav, cutting out roughly half of the links. On any inside page, they are available on a left-side nav anyway. The smallest number of links is about 110, the largest is pushing 250. I just wondered about everyone's opinion.
Rendering the menu via jQuery as Chad suggests might help. This is a WordPress-based site, so I'll have to really look into it, as they have to edit it too.
We've already begun mapping out clicks as goals (conversions) within GA.
-
If the site has good authority/PageRank, I wouldn't worry about it, although I would look at in-page analytics to see whether people actually click the links. If it's set up like this to pass link juice, the above is more applicable; if not, look at whether people are actually clicking the links, and if they aren't, I'd suggest an alternative navigation.
-
First off, never use rel='nofollow' on links to your own site.
Personally, I would trim up the left menu if you can, or find an easier / creative JavaScript-driven way to present the data. The 100 links thing isn't a law written in stone. SEOmoz's tools do yell about it if you go over 100 links. This "100 link lore" comes from a Matt Cutts blog post:
http://www.mattcutts.com/blog/how-many-links-per-page/
If you look closely, you may notice that there are more than 100 links even on the page where Matt wrote about this. It's kind of a loose guideline in my eyes. From my own professional experience, if every page on your site has 500 links, you're going to hurt for it. But if you have 125 links on quite a few pages, or put out a blog post that's just an insane resource that links to a few hundred people, you'll still be just fine.
I'd think of it as just one more signal of a potentially abusive or low-quality site. If your site isn't under heavy scrutiny for other reasons, and you don't go totally nuts with links, you'll probably be just fine, but there is a lot of wisdom in Matt Cutts's post all the same. Eliminate the unnecessary and things will work better, in and out of Google.
-
Use jQuery; it will basically solve your too-many-links-on-a-page problem.
Chad
-
Nofollow is not a good idea. If you have good PageRank, I wouldn't worry about it, but remove duplicate links first.
-
We had a similar issue with one site -- a footer with a ton of links to all parts of the site, duplicated on every page. The feeling was that it flattened out the site's link structure too much, so we changed it so that the footer was loaded via AJAX on page load.
In that case, I don't think it made much difference performance-wise, so I can't say for certain whether it will help you. But it is a way to clean up your link structure.
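For anyone wanting to try this, here is a minimal sketch of the AJAX-on-page-load approach, not the exact implementation we used. The fragment URL (`/footer-links.html`) and the `#site-footer` element id are assumptions you'd adapt to your own markup:

```javascript
// Sketch: load a link-heavy footer after page load so the links are
// not present in the initially delivered HTML.
// The endpoint (/footer-links.html) and element id (#site-footer)
// are hypothetical names for illustration.

// Pure helper: build the footer markup from an array of
// {href, label} objects. Kept separate from the DOM code so the
// markup logic is easy to test on its own.
function buildFooterLinks(links) {
  return '<ul class="footer-links">' +
    links.map(function (link) {
      return '<li><a href="' + link.href + '">' + link.label + '</a></li>';
    }).join('') +
    '</ul>';
}

// In the browser, with jQuery, inject the markup once the DOM is
// ready -- either by fetching a prebuilt HTML fragment:
//
// $(function () {
//   $.get('/footer-links.html', function (html) {
//     $('#site-footer').html(html);
//   });
// });
//
// or by fetching link data as JSON and building the markup client-side:
//
// $(function () {
//   $.getJSON('/footer-links.json', function (links) {
//     $('#site-footer').html(buildFooterLinks(links));
//   });
// });
```

Whether the links actually stay out of the index depends on how the search engine handles JavaScript, so treat this as a structural cleanup rather than a guaranteed crawl fix.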