Best method to stop crawler access to extra Nav Menu
-
Our shop site has a three-tier drop-down mega-menu, so it's easy to find your way to anything from anywhere. It contains about 150 links and probably 300 words of text.
We also have a more context-driven single layer of sub-category navigation as well as breadcrumbs on our category pages.
You can get to every product and category page without using the drop down mega-menu.
Although the mega-menu is a helpful tool for customers, it means that every single page in our shop carries an extra 150 links to stuff that isn't necessarily related or relevant to the page content. From a crawler's point of view, instead of a nice tree-like crawling structure, we've got more of an unstructured mesh where everything is linked to everything else.
I'd like to hide the mega-menu links from being picked up by a crawler, but what's the best way to do this?
I can add rel="nofollow" to all the mega-menu links, but are the links still registered as page content even if they're not followed? It's a lot of text, if nothing else.
Another possibility we're considering is to have the mega-menu only populate with links when its main button is hovered over, so it's not part of the initial page-load content at all.
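If we went that route, a minimal sketch might look like this. The endpoint, selectors and markup are all hypothetical; the key point is just that the loader runs at most once, on first hover, so the links never appear in the initial HTML:

```javascript
// Wrap a loader so it runs at most once, caching its result.
function once(fn) {
  let done = false;
  let result;
  return (...args) => {
    if (!done) {
      done = true;
      result = fn(...args);
    }
    return result;
  };
}

// Browser usage (hypothetical endpoint and selectors - not part of
// the initial HTML a crawler sees):
// const loadMenu = once(async () => {
//   const res = await fetch('/menu-fragment');
//   document.querySelector('.mega-menu').innerHTML = await res.text();
// });
// document.querySelector('.mega-menu-toggle')
//   .addEventListener('mouseenter', loadMenu);
```

The `once` guard matters because `mouseenter` fires on every hover, and we only want to hit the server the first time.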
Or we could use a crude but effective system we've used for some other menus: base-encoding the content inline so it's not readable by a spider.
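For reference, that base-encoding trick amounts to something like the following sketch. The markup is illustrative; `btoa`/`atob` are standard in browsers, and the encoding step would normally happen server-side:

```javascript
// A snippet of menu markup, shipped in the page source only as base64.
const menuHtml = '<li><a href="/mens/tops">Mens Tops</a></li>';

// Server-side: encode before rendering the page.
const encoded = btoa(menuHtml);

// Client-side: decode and inject on page load.
const decoded = atob(encoded);
// decoded is identical to menuHtml, so the menu renders normally for
// users, while the raw source exposes only an opaque base64 string.
```

Of course, this only hides the links from crawlers that don't execute the decoding script.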
What would you do and why?
Thanks,
James
-
I agree Alan,
Mega menus are a good way to dilute the link equity of your pages, and in most cases they aren't needed at all. Keep the top-level navigation simple and have a submenu on each page containing links relevant to that section.
E.g. a mega menu might be:
Home | Mens (Mens Tops, Mens Jeans, Mens Coats) | Womens (Womens Tops, Womens Jeans, etc.) | Contact Us
In this example it would be better to have a single top-level menu:
Home | Mens | Womens | Contact Us
Then when you're in the Mens or Womens section, show links to "Tops", "Jeans" and "Coats". That way those links are relevant to the section you're in, and they reinforce the structure of that section to search engines.
After giving it further thought, I would suggest not having a mega menu at all, because it may harm your on-page optimisation efforts in the long term.
-
Ben's partially correct. Unfortunately, Google has been claiming they process JavaScript for a while now, and they recently stated they've begun reading AJAX. Of course they do a lousy job of it and don't always get it right, which just makes things even muddier.
So from an SEO best practices perspective, you shouldn't have the menu(s) in the first place, at all.
You may also THINK they're good for users, but has any significant study been performed to confirm that? You'd need to check click-through rates on all the links to know for sure.
What I've found through years of auditing sites with such menus is that it's almost always the case that most of the deeper links NEVER get clicked on from within these menus. Instead, they're overwhelming to users. This is why it's better not to have them from a UX perspective.
If you abandon them and go with more traditional hierarchical index and sub-index pages, and if those are properly optimized, you'll not only eliminate the massive SEO problem but in fact get more of your category pages to have higher ranking strength and authority over time.
IF you're going to keep them in any form because you don't want to go to the extreme I recommend, then yes - AJAX is likely the one approach that offers the least likelihood of search engines choking on the overuse of links.
And for the record, the real current problem with all those links on every page is duplicate-content confusion - all of those URLs at the source level dilute the uniqueness of content on every page of the site. That also means you're harming the topical focus of every page as well. So whatever you do, AJAX or doing away with the menus altogether is going to be of high value long term.
- Alan Bleiweiss - Click2Rank's Search Team Director
-
From my experience, I don't think you can really 'hide' the mega-menu links from a crawler if they are generated server-side by a content management system. If a link is present in the page HTML, it will be crawled by a bot.
The usual way to build a mega menu is with CSS and JavaScript, so you might want to look at using AJAX to fetch the relevant links from the database and then use JavaScript to insert them into the page.
This isn't a perfect solution, but most bots don't execute JavaScript, so what they will see is only the links served up directly by the content management system.
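As a rough sketch of that approach (the JSON shape, endpoint and selector are all made up for illustration), the client-side piece could render the fetched link data like this:

```javascript
// Turn fetched link data into menu markup. The { title, links } shape
// is hypothetical - whatever the CMS endpoint actually returns would do.
function buildMenuHtml(sections) {
  return sections
    .map(({ title, links }) => {
      const items = links
        .map(({ href, text }) => `<li><a href="${href}">${text}</a></li>`)
        .join('');
      return `<section><h3>${title}</h3><ul>${items}</ul></section>`;
    })
    .join('');
}

// Browser usage (hypothetical endpoint and selector):
// const sections = await (await fetch('/api/menu')).json();
// document.querySelector('.mega-menu').innerHTML = buildMenuHtml(sections);
```

Since the links only exist after the script runs, a crawler that doesn't execute JavaScript sees none of them - though given how Google now attempts to run JavaScript, treat this as mitigation rather than a guarantee.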