Best method to stop crawler access to extra Nav Menu
-
Our shop site has a 3 tier drop down mega-menu so it's easy to find your way to anything from anywhere. It contains about 150 links and probably 300 words of text.
We also have a more context-driven single layer of sub-category navigation as well as breadcrumbs on our category pages.
You can get to every product and category page without using the drop down mega-menu.
Although the mega-menu is a helpful tool for customers, it means that every single page in our shop carries an extra 150 links to pages that aren't necessarily related or relevant to the page content. From a crawler's point of view, instead of a nice tree-like crawling structure we have an unstructured mesh where everything is linked to everything else.
I'd like to hide the mega-menu links from being picked up by a crawler, but what's the best way to do this?
I can add rel="nofollow" to all the mega-menu links, but is the link text still counted as page content even if the links aren't followed? It's a lot of text, if nothing else.
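For what it's worth, adding nofollow is just a per-link attribute change. A rough sketch of annotating a menu's markup (hypothetical markup; a string transform is used here purely for illustration, where a real implementation would change the menu template or use an HTML parser). Note that even nofollowed links leave their anchor text in the HTML, which is exactly the concern above:

```javascript
// Add rel="nofollow" to every anchor in a menu's HTML string.
// NOTE: a regex is fine for this illustration, but a real
// implementation should edit the template or use an HTML parser.
function addNofollow(menuHtml) {
  return menuHtml.replace(/<a\s/g, '<a rel="nofollow" ');
}

const menu = '<nav><a href="/mens/tops">Mens Tops</a><a href="/mens/jeans">Mens Jeans</a></nav>';
console.log(addNofollow(menu));
```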
Another possibility we're considering is to have the mega-menu only populate with links when its main button is hovered over, so it isn't part of the initial page-load content at all.
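That hover-triggered approach could be sketched roughly like this (the endpoint name and element IDs are hypothetical; the key point is that the links are absent from the initial HTML and only fetched on interaction):

```javascript
// Build menu markup from JSON link data returned by the server.
function renderMenu(items) {
  return items.map(i => `<a href="${i.url}">${i.label}</a>`).join('');
}

// Fetch and inject the links only on first hover, so they are not
// part of the initial page load. (Browser-only; guarded so the
// snippet is also loadable outside a browser.)
if (typeof document !== 'undefined') {
  const button = document.querySelector('#mega-menu-button');
  const panel = document.querySelector('#mega-menu-panel');
  let loaded = false;
  button.addEventListener('mouseenter', async () => {
    if (loaded) return;
    loaded = true;
    const items = await fetch('/api/menu-links').then(r => r.json()); // hypothetical endpoint
    panel.innerHTML = renderMenu(items);
  });
}
```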
Or we could use a crude-yet-effective system we've used for some other menus: Base64-encoding the content inline so it isn't readable by a spider.
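The inline-encoding idea might look like this (a crude sketch: the menu HTML is shipped Base64-encoded and decoded on the client; keep in mind a crawler that executes JavaScript could still decode it):

```javascript
// Server side: encode the menu HTML so it isn't plain text in the source.
// (btoa/atob are browser globals, also available in modern Node;
// older server-side code would use Buffer.from(html).toString('base64').)
function encodeMenu(html) {
  return btoa(html);
}

// Client side: decode and inject on load.
function decodeMenu(encoded) {
  return atob(encoded);
}

const encoded = encodeMenu('<a href="/mens/tops">Mens Tops</a>');
console.log(decodeMenu(encoded)); // round-trips back to the original markup
```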
What would you do and why?
Thanks,
James
-
I agree, Alan.
Mega menus are a good way to dilute the link equity of your pages, and in most cases they aren't needed at all. Keep the top-level navigation simple and give every page a submenu containing only the links relevant to that section.
For example, a mega menu might be:
Home | Mens (Mens Tops, Mens Jeans, Mens Coats) | Womens (Womens Tops, Womens Jeans, etc.) | Contact Us
In this example it would be better to have a single top-level menu:
Home | Mens | Womens | Contact Us
Then, when you're in the Mens or Womens section, show links to "Tops", "Jeans" and "Coats". That way the links are relevant to the section the visitor is in, and they reinforce the structure of that section to search engines.
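A contextual submenu like that can be driven by a simple section-to-links map (section names taken from the example above; the data structure itself is hypothetical):

```javascript
// Map each top-level section to its own submenu links.
const submenus = {
  mens: [
    { label: 'Tops', url: '/mens/tops' },
    { label: 'Jeans', url: '/mens/jeans' },
    { label: 'Coats', url: '/mens/coats' },
  ],
  womens: [
    { label: 'Tops', url: '/womens/tops' },
    { label: 'Jeans', url: '/womens/jeans' },
  ],
};

// Return only the links relevant to the current section, so each
// page links within its own branch of the site hierarchy.
function submenuFor(section) {
  return submenus[section] || [];
}

console.log(submenuFor('mens').map(l => l.label)); // [ 'Tops', 'Jeans', 'Coats' ]
```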
After giving it further thought, I would suggest not having a mega menu at all, because it may harm your on-page optimisation efforts in the long term.
-
Ben is partially correct. Unfortunately, Google has claimed for a while that they process JavaScript, and they recently stated they've begun reading AJAX. Of course, they do a lousy job of it and don't always get it right, which just makes things muddier.
So from an SEO best practices perspective, you shouldn't have the menu(s) in the first place, at all.
You may also THINK they're good for users, but has any significant study been performed to confirm that? You'd need to check click-through rates on all the links to know for sure.
What I've found through years of auditing sites with such menus is that most of the deeper links almost never get clicked from within these menus. Instead, the menus overwhelm users, which is why it's better not to have them from a UX perspective.
If you abandon them and go with more traditional hierarchical index and sub-index pages, and if those are properly optimized, you'll not only eliminate the massive SEO problem but in fact get more of your category pages to have higher ranking strength and authority over time.
IF you're going to keep them in any form because you don't want to go to the extreme I recommend, then yes - AJAX would likely be the only scenario that offers the least likelihood of search engines choking on the over-use of links.
And for the record, the real current problem with all those links on every page is duplicate content confusion - all of those URLs at the source level dilute the uniqueness of content on every page of the site. That also means you're harming the topical focus of every page. So whatever you do, AJAX or doing away with them altogether is going to be of high value long term.
- Alan Bleiweiss - Click2Rank's Search Team Director
-
In my experience, you can't really 'hide' the mega-menu links from a crawler if they are generated server-side by a content management system. If a link is in the page's HTML, it will be crawled by bots.
Mega menus are generally implemented with CSS and JavaScript, so you might want to look at using AJAX to fetch the relevant links from the database and then inserting them into the page with JavaScript.
This isn't a perfect solution, but most bots don't execute JavaScript, so all they will see are the links that the content management system serves directly.