Best Site navigation solution
-
Hi there,
We are getting our website redesigned and would like to know whether or not to increase the number of links in our site-wide navigation.
At the moment we have around 30 links in the navigation. We want to use an expanding navigation menu and add links to our most important categories. If we increase to 60-70, would that be all right? (What is the highest we can go?)
At the moment the categories that get links from the navigation are ranking pretty well. If we increase the number of links, would we lose those rankings?
What will be the pros and cons of increasing navigation links?
Second question: we are also adding links to our top 10 categories in the footer. Would this be OK as far as SEO and Google are concerned?
Many Thanks
-
A link limit of 100 was suggested by Matt Cutts.
No, it's not. The 100-link-limit suggestion was dropped a while ago, though it is indeed best not to go wild with it.
But the more pages you link to from your home page, and that link back to your home page, the better the home page will rank.
This isn't really true either. If that were the case, I could just make a site with 1,000,000 pages all linking back to the homepage and expect it to rank. In reality, the only way I'll get value flowing back up to the homepage is to attract links to those 1,000,000 pages from external sites.
-
A link limit of 100 was suggested by Matt Cutts.
But the more pages you link to from your home page, and that link back to your home page, the better the home page will rank.
If you haven't already, read this article and play with the calculator -
The more links on a page, the less value each one carries. For a visualisation, check out the images here.
So going from 30 to 80 links is obviously going to dilute the authority each link passes; however, it should also mean there are fewer 'clicks' the bots have to go through to reach all your pages, so it may increase the number of pages indexed.
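As a rough illustration (assuming the classic simplified PageRank model, where a page's distributable link equity is split evenly across its outbound links): with 30 navigation links, each link passes roughly 1/30 ≈ 3.3% of that equity, while with 80 links each passes roughly 1/80 ≈ 1.25%. On the crawl-depth side, 30 links per level reach about 30 × 30 = 900 pages within two clicks of the home page, whereas 80 links per level reach about 80 × 80 = 6,400.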
Navigation links are (apparently) often recognised by Googlebot anyway, and links within them are treated differently from links within the main content.
If having these links makes for a better user experience, then I'd definitely go for it, as I don't think it will have a big negative impact (I have sites with over 100 nav links that get crawled fine and rank well).
As for the footer links, traditionally links in footers haven't carried much value (having been widely spammed in theme marketing), but again they should be fine to have.
You could also start coding your pages with HTML5 semantic tags such as <nav> and <footer>, as I anticipate that links and content in these new section types will eventually have SEO implications a little down the line.
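A minimal sketch of that kind of markup (the category names and URLs below are placeholders, not your real structure):

```html
<!-- Site-wide navigation grouped in the HTML5 <nav> element -->
<nav aria-label="Main navigation">
  <ul>
    <li><a href="/category-one/">Category One</a></li>
    <li><a href="/category-two/">Category Two</a></li>
    <!-- ...the rest of the top-level category links... -->
  </ul>
</nav>

<!-- Top-category links grouped in the HTML5 <footer> element -->
<footer>
  <ul>
    <li><a href="/category-one/">Category One</a></li>
    <li><a href="/category-two/">Category Two</a></li>
    <!-- ...up to the top 10 categories... -->
  </ul>
</footer>
```

The semantic grouping costs nothing to add and makes the role of each block of links explicit in the markup, whether or not search engines end up weighting it.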