Best method to stop crawler access to extra Nav Menu
-
Our shop site has a three-tier drop-down mega-menu, so it's easy to find your way to anything from anywhere. It contains about 150 links and probably 300 words of text.
We also have a more context-driven single layer of sub-category navigation as well as breadcrumbs on our category pages.
You can get to every product and category page without using the drop-down mega-menu.
Although the mega-menu is a helpful tool for customers, it means that every single page in our shop carries an extra 150 links to pages that aren't necessarily related or relevant to its content. From a crawler's point of view, instead of a nice tree-like crawl structure, we've got an unstructured mesh where everything links to everything else.
I'd like to keep the mega-menu links from being picked up by a crawler - what's the best way to do this?
I could add rel="nofollow" to all the mega-menu links, but do the links still register as page content even if they're not followed? It's a lot of text, if nothing else.
Another possibility we're considering is to have the mega-menu only populate with links when its main button is hovered over, so the links aren't part of the initial page load at all.
Or we could use a crude but effective trick we've applied to some other menus: base-encoding the menu content inline so it's not readable by a spider.
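To picture how those last two ideas could fit together, here's a minimal sketch (an assumption of how it might be wired up, not our actual code): the CMS would emit the menu markup as a Base64 string in a data attribute, and it would only be decoded into the DOM on first hover, so the links never appear as readable HTML in the initial page source. The element IDs and the data-menu attribute are illustrative only.

```javascript
// Minimal sketch, assuming the CMS renders the menu markup as a
// Base64-encoded data attribute rather than as plain links.
// Element IDs and the data-menu attribute are assumptions for the example.
document.addEventListener('DOMContentLoaded', function () {
  var trigger = document.getElementById('mega-menu-trigger'); // assumed ID
  var panel = document.getElementById('mega-menu-panel');     // assumed ID
  var decoded = false;

  trigger.addEventListener('mouseenter', function () {
    if (decoded) { return; }                        // decode once per page view
    decoded = true;
    var encoded = panel.getAttribute('data-menu');  // Base64 string emitted by the CMS
    if (!encoded) { return; }
    panel.innerHTML = atob(encoded);                // atob() decodes Base64 in the browser
  });
});
```

Of course, a crawler that executes JavaScript and simulates hover could still reach the links, so this only keeps them out of the raw HTML.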
What would you do and why?
Thanks,
James
-
I agree, Alan.
Mega menus are a good way to dilute the link equity of your pages, and in most cases they aren't needed at all. Keep the top-level navigation simple and give every page a submenu containing links relevant to its section.
E.g., the mega-menu might be:
Home | Men's (Men's Tops, Men's Jeans, Men's Coats) | Women's (Women's Tops, Women's Jeans, etc.) | Contact Us
In this example it would be better to have a single top-level menu:
Home | Men's | Women's | Contact Us
Then, when you're in the Men's or Women's section, show links to "Tops", "Jeans" and "Coats". That way those links are relevant to the section you're in and reinforce the structure of that section to search engines.
After giving it further thought, I would suggest not having a mega-menu at all, because it may harm your on-page optimisation efforts in the long term.
-
Ben's partially correct. Unfortunately, Google has been claiming for a while that they do process JavaScript, and they recently stated they've begun reading AJAX content. Of course, they do a lousy job of it and don't always get it right, which just makes things even muddier.
So from an SEO best practices perspective, you shouldn't have the menu(s) in the first place, at all.
You may also THINK they're good for users, but has any significant study been performed to confirm that? You'd need to check click-through rates on all the links to know for sure.
What I've found through years of auditing sites with such menus is that most of the deeper links almost NEVER get clicked from within them. Instead, the menus overwhelm users, which is why it's better not to have them from a UX perspective.
If you abandon them and go with more traditional hierarchical index and sub-index pages, and if those are properly optimized, you'll not only eliminate the massive SEO problem but in fact get more of your category pages to have higher ranking strength and authority over time.
IF you're going to keep them in any form because you don't want to go to the extreme I recommend, then yes - AJAX is probably the option least likely to leave search engines choking on the over-use of links.
And for the record, the real problem right now with all those links on every page is duplicate-content confusion: all of those URLs at the source level dilute the uniqueness of the content on every page of the site, which also harms the topical focus of every page. So whatever you do, AJAX or doing away with the menus altogether is going to be of high value long term.
- Alan Bleiweiss - Click2Rank's Search Team Director
-
In my experience, you can't really 'hide' the mega-menu links from a crawler if they are generated server-side by a content management system. If a link is in the page's HTML, a bot will crawl it.
Mega-menus are generally built with CSS and JavaScript, so you might want to look at using AJAX to fetch the relevant links from the database and then insert them into the page with JavaScript.
This isn't a perfect solution, but since bots generally don't execute JavaScript reliably, they will mostly see only the links that are served up directly by the content management system.
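A rough sketch of that approach, purely as an illustration (the /api/mega-menu endpoint, the JSON shape and the element ID are assumptions, not a real CMS API): the server-rendered page contains only an empty menu container, and the links are requested and injected client-side.

```javascript
// Rough sketch of the AJAX approach: the initial HTML contains only an
// empty <ul id="mega-menu"></ul>; the links are fetched separately and
// injected with JavaScript. The /api/mega-menu endpoint and the JSON
// shape are assumed for illustration.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/api/mega-menu'); // this endpoint could also be disallowed in robots.txt
xhr.onload = function () {
  if (xhr.status !== 200) { return; }
  var items = JSON.parse(xhr.responseText); // e.g. [{ "url": "/mens/tops", "label": "Tops" }, ...]
  var menu = document.getElementById('mega-menu');
  menu.innerHTML = items
    .map(function (item) {
      return '<li><a href="' + item.url + '">' + item.label + '</a></li>';
    })
    .join('');
};
xhr.send();
```

That said, as Alan notes above, Google does attempt to execute JavaScript these days, so this reduces rather than eliminates the chance of the links being crawled.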