Best method to stop crawler access to extra Nav Menu
-
Our shop site has a 3 tier drop down mega-menu so it's easy to find your way to anything from anywhere. It contains about 150 links and probably 300 words of text.
We also have a more context-driven single layer of sub-category navigation as well as breadcrumbs on our category pages.
You can get to every product and category page without using the drop down mega-menu.
Although the mega-menu is a helpful tool for customers, it means that every single page in our shop has an extra 150 links on it that go to stuff that isn't necessarily related or relevant to the page content. This means that, from a crawler's perspective, instead of a nice tree-like crawling structure we've got more of an unstructured mesh where everything is linked to everything else.
I'd like to hide the mega-menu links from being picked up by a crawler, but what's the best way to do this?
I can add a nofollow to all mega-menu links, but are the links still registered as page content even if they're not followed? It's a lot of text if nothing else.
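For reference, by "add a nofollow" I mean marking up each menu link something like this (the class name and paths are illustrative, not our actual markup):

```html
<nav class="mega-menu">
  <!-- rel="nofollow" hints that link equity shouldn't flow through the link,
       but the anchor text and URLs are still right there in the page source
       for a crawler to read -->
  <a href="/mens/tops/" rel="nofollow">Mens Tops</a>
  <a href="/mens/jeans/" rel="nofollow">Mens Jeans</a>
  <!-- ...and roughly 148 more -->
</nav>
```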
Another possibility we're considering is to set the mega-menu to only populate with links when its main button is hovered over, so the menu isn't part of the initial page load content at all.
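Roughly something like this sketch (the element IDs and fragment URL are hypothetical):

```javascript
// Minimal sketch: the mega-menu markup is absent from the initial HTML and is
// fetched only on first hover, so it never appears in the raw page source.
var trigger = document.getElementById('mega-menu-trigger');     // hypothetical id
var container = document.getElementById('mega-menu-container'); // hypothetical id
var loaded = false;

trigger.addEventListener('mouseenter', function () {
  if (loaded) { return; } // only fetch once
  loaded = true;
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/menu/mega-menu-fragment.html'); // hypothetical endpoint
  xhr.onload = function () {
    if (xhr.status === 200) {
      container.innerHTML = xhr.responseText;
    }
  };
  xhr.send();
});
```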
Or we could use a crude yet effective system we've used for some other menus: Base64-encoding the content inline so it's not readable by a spider.
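That idea looks roughly like this (the markup string is a stand-in; on the live site the encoding would happen server-side and only the pre-encoded string would appear in the HTML source):

```javascript
// Crude sketch of the inline-encoding idea. btoa() is used here only so the
// example is self-contained and runnable; the real template would ship just
// the Base64 string, keeping the links unreadable in the raw source.
var menuHtml = '<ul><li><a href="/mens/">Mens</a></li></ul>'; // stand-in for ~150 links
var encodedMenu = window.btoa(menuHtml);                      // what would ship in the page
document.getElementById('mega-menu-container').innerHTML = window.atob(encodedMenu);
```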
What would you do and why?
Thanks,
James
-
I agree, Alan.
Mega menus are a sure way to dilute the link equity of your pages, and in most cases they aren't needed at all. Keep the top-level navigation simple and have a submenu on each page containing links relevant to that section.
E.g. a mega-menu might be:
Home | Mens (Mens Tops, Mens Jeans, Mens Coats) | Womens (Womens Tops, Womens Jeans, etc.) | Contact Us
In this example it would be better to have a single top-level menu:
Home | Mens | Womens | Contact Us
Then, when you're in the Mens or Womens section, show links to "Tops", "Jeans" and "Coats". That way those links are relevant to the section you're in and reinforce the structure of that section to search engines.
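A rough sketch of that markup (the URLs are illustrative):

```html
<!-- Simple top-level nav, present on every page -->
<nav>
  <a href="/">Home</a>
  <a href="/mens/">Mens</a>
  <a href="/womens/">Womens</a>
  <a href="/contact/">Contact Us</a>
</nav>

<!-- Contextual submenu, rendered only on pages within the Mens section -->
<nav>
  <a href="/mens/tops/">Tops</a>
  <a href="/mens/jeans/">Jeans</a>
  <a href="/mens/coats/">Coats</a>
</nav>
```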
After giving it further thought, I would suggest not having a mega menu at all, because it may harm your on-page optimisation efforts in the long term.
-
Ben's partially correct. Unfortunately, Google has been claiming for a while that they do process JavaScript, and they recently stated they've begun reading AJAX. Of course they do a lousy job of it and don't always get it right, which just makes things even more muddy.
So from an SEO best practices perspective, you shouldn't have the menu(s) in the first place, at all.
You may also THINK they're good for users, but has any significant study been performed to confirm that? You'd need to check click-through rates on all the links to know for sure.
What I've found through years of auditing sites with such menus is that most of the deeper links almost NEVER get clicked from within them. Instead, the menus overwhelm users. That's why it's better not to have them from a UX perspective.
If you abandon them and go with more traditional hierarchical index and sub-index pages, and if those are properly optimized, you'll not only eliminate the massive SEO problem but in fact get more of your category pages to have higher ranking strength and authority over time.
IF you're going to keep them in any form because you don't want to go to the extreme I recommend, then yes - AJAX is likely the approach with the least likelihood of search engines choking on the over-use of links.
And for the record, the real current problem with all those links on every page is duplicate content confusion - all of those URLs at the source level dilute the uniqueness of content on every page of the site. That also means you're harming the topical focus of every page as well. So whatever you do, AJAX or doing away with the menus altogether is going to be of high value long term.
- Alan Bleiweiss - Click2Rank's Search Team Director
-
In my experience you can't really 'hide' mega-menu links from a crawler if they're generated server-side by a content management system. If a link is present in the page's HTML, a bot will crawl it.
Mega menus are generally built with CSS and JavaScript, so you might look at using AJAX to fetch the relevant links from the database and then insert them into the page with JavaScript.
This isn't a perfect solution, but most bots don't execute JavaScript, so all they will see is the links served up directly by the content management system.
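If it helps, a bare-bones sketch of that pattern (the endpoint and JSON shape are hypothetical):

```javascript
// Fetch the menu links as JSON and build the DOM client-side, so the raw
// HTML served to a non-JavaScript crawler contains no mega-menu links.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/api/mega-menu'); // hypothetical endpoint returning e.g.
                                   // [{"href": "/mens/tops/", "label": "Mens Tops"}, ...]
xhr.onload = function () {
  if (xhr.status !== 200) { return; }
  var list = document.createElement('ul');
  JSON.parse(xhr.responseText).forEach(function (link) {
    var item = document.createElement('li');
    var anchor = document.createElement('a');
    anchor.href = link.href;
    anchor.textContent = link.label;
    item.appendChild(anchor);
    list.appendChild(item);
  });
  document.getElementById('mega-menu-container').appendChild(list);
};
xhr.send();
```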