Best method to stop crawler access to extra Nav Menu
-
Our shop site has a 3-tier drop-down mega-menu so it's easy to find your way to anything from anywhere. It contains about 150 links and probably 300 words of text.
We also have a more context-driven single layer of sub-category navigation as well as breadcrumbs on our category pages.
You can get to every product and category page without using the drop down mega-menu.
Although the mega-menu is a helpful tool for customers, it means that every single page in our shop carries an extra 150 links to pages that aren't necessarily related or relevant to its content. From a crawler's point of view, instead of a nice tree-like crawling structure we've got more of an unstructured mesh where everything links to everything else.
I'd like to hide the mega-menu links from being picked up by a crawler, but what's the best way to do this?
I can add rel="nofollow" to all the mega-menu links, but are the links still registered as page content even if they're not followed? It's a lot of text if nothing else.
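To be concrete, this is the sort of change I mean (the #mega-menu selector is made up, and we'd really emit the attribute server-side in our templates rather than in the browser):

```javascript
// Sketch: stamp rel="nofollow" onto every mega-menu link.
// (Our worry: the anchors and their ~300 words of text are still
// sitting in the page source either way.)
document.querySelectorAll('#mega-menu a').forEach(function (link) {
  link.setAttribute('rel', 'nofollow');
});
```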
Another possibility we're considering is to have the mega-menu only populate with links when its main button is hovered over, so it's not part of the initial page-load content at all.
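Something along these lines, say (the /menu.json endpoint and element IDs are hypothetical):

```javascript
// Sketch of option two: fetch the menu links on first hover, so they
// never appear in the initial page source at all.
var menuLoaded = false;
document.getElementById('mega-menu-trigger').addEventListener('mouseenter', function () {
  if (menuLoaded) return; // only fetch once
  menuLoaded = true;
  fetch('/menu.json')
    .then(function (response) { return response.json(); })
    .then(function (items) {
      var menu = document.getElementById('mega-menu');
      items.forEach(function (item) {
        var link = document.createElement('a');
        link.href = item.url;
        link.textContent = item.label;
        menu.appendChild(link);
      });
    });
});
```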
Or we could use a crude-but-effective trick we've applied to some other menus: base-encoding the content inline so it's not readable by a spider.
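Roughly like this, for illustration, assuming the encoded markup ships in something like a <script type="text/plain" id="mega-menu-data"> block (names invented):

```javascript
// Sketch of the base-encoding trick: the menu markup ships base64-encoded
// in a non-executing script block, so the raw source contains no <a> tags
// until this decode runs in the browser.
var encoded = document.getElementById('mega-menu-data').textContent.trim();
document.getElementById('mega-menu').innerHTML = atob(encoded);
```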
What would you do and why?
Thanks,
James
-
I agree, Alan.
Mega-menus are a good way to dilute the link equity of your pages, and in most cases one isn't needed at all. Keep the top-level navigation simple and give every page a submenu containing links relevant to its section.
E.g., the mega-menu might be:
Home | Mens (Mens Tops, Mens Jeans, Mens Coats) | Women (Womens Tops, Womens Jeans, etc.) | Contact us
In this example it would be better to have a single top-level menu:
Home | Mens | Women | Contact us
Then, when you're in the Mens or Women section, show links to "Tops", "Jeans" and "Coats". That way those links are relevant to the section you're in and reinforce the structure of that section to search engines.
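As a rough illustration of the idea (a hypothetical Node/Express route; paths and names are made up):

```javascript
// Sketch: each section's pages render only that section's submenu links
// into the HTML, so crawlers see a clean hierarchy instead of a mesh.
const express = require('express');
const app = express();

const submenus = {
  mens:  ['Tops', 'Jeans', 'Coats'],
  women: ['Tops', 'Jeans', 'Coats'],
};

app.get('/:section', (req, res) => {
  const items = submenus[req.params.section] || [];
  const subnav = items
    .map((i) => `<a href="/${req.params.section}/${i.toLowerCase()}">${i}</a>`)
    .join(' | ');
  res.send(`<nav>Home | Mens | Women | Contact us</nav><nav>${subnav}</nav>`);
});

app.listen(3000);
```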
After giving it further thought, I would suggest not having a mega-menu at all, because it may harm your on-page optimisation efforts in the long term.
-
Ben's partially correct. Unfortunately, Google has been claiming for a while that they do process JavaScript, and they recently stated they've begun reading AJAX. Of course they do a lousy job of it and don't always get it right, which just muddies things further.
So from an SEO best-practices perspective, you shouldn't have the menu(s) in the first place.
You may also THINK they're good for users, but has any significant study been performed to confirm that? You'd need to check click-through rates on all the links to know for sure.
What I've found through years of auditing sites with such menus is that most of the deeper links almost NEVER get clicked from within them. Instead, the menus overwhelm users. That's why it's better not to have them from a UX perspective.
If you abandon them and go with more traditional hierarchical index and sub-index pages, and if those are properly optimized, you'll not only eliminate the massive SEO problem but also see more of your category pages gain ranking strength and authority over time.
IF you're going to keep them in some form because you don't want to go to the extreme I recommend, then yes, AJAX is probably the approach least likely to have search engines choke on the over-use of links.
And for the record, the real current problem with all those links on every page is duplicate-content confusion: all of those URLs at the source level dilute the uniqueness of content on every page of the site, which also means you're harming the topical focus of every page. So whatever you do, AJAX or doing away with the menus altogether is going to be of high value long term.
- Alan Bleiweiss - Click2Rank's Search Team Director
-
In my experience, you can't really 'hide' the mega-menu links from a crawler if they are generated server-side by a content management system. If a link is in the page's HTML, a bot will crawl it.
Mega-menus are generally built with CSS and JavaScript, so you might look at using AJAX to pull the relevant links from the database and then JavaScript to insert them into the page.
This isn't a great solution, but since bots have historically been poor at executing JavaScript, what they will see is mostly just the links served directly by the content management system.
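A bare-bones sketch of what I mean (the /megamenu endpoint and element ID are hypothetical):

```javascript
// Sketch: pull the menu links in over AJAX after page load, so they appear
// in the DOM but not in the HTML the CMS serves to a non-JavaScript crawler.
fetch('/megamenu')
  .then(function (response) { return response.text(); })
  .then(function (html) {
    document.getElementById('mega-menu').innerHTML = html;
  });
```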