Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Click to Reveal vs. Rollover Navigation: Which Is Better for Organic?
-
Hi,
Any thoughts, data, or insights as to which is better in a top navigation: click to reveal the nav links, or roll over to reveal them? Regular content in an accordion (click to reveal) is evidently not best practice. Does that apply to navigation as well?
Thanks! Best... Mike
-
Interesting UX question. Short answer: a click menu is best, but it's not black and white.
Naturally it's more subtle than that. You mention regular content. Regular content hidden by any mechanism is not very user friendly. Accordions are often overlooked, and text hidden in the hover state of an image is a client favourite that is also terrible UX practice. The mechanism doesn't matter too much; the problem is that content is hidden behind an un-signposted mechanism. The author knows it's there, but your visitor will not.
A menu isn't content, though; it's a different beast. A menu needs to exhibit good information hierarchy. We try to keep our main menu to seven items or fewer, essentially for clarity of the first tier of offerings. This can often necessitate sub-menus. Sub-menus are hidden content, so we're just arguing the toss about mechanism. First off, then, we'd suggest a nice little signpost, such as a downward arrow, to show which main items have sub-menus.
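To illustrate that signposting idea, here's a minimal sketch: given a menu tree, flag which top-level items hide a sub-menu so they can carry a downward arrow. The data model and names are mine, purely for illustration, not from any particular CMS or framework.

```typescript
// Hypothetical menu model; the structure and labels are illustrative.
interface MenuItem {
  label: string;
  children?: MenuItem[];
}

// Append a downward-arrow signpost to any top-level item that hides a
// sub-menu, so visitors can tell which items reveal more on selection.
function signpostLabels(menu: MenuItem[]): string[] {
  return menu.map((item) =>
    item.children && item.children.length > 0 ? `${item.label} ▾` : item.label
  );
}

const nav: MenuItem[] = [
  { label: "Home" },
  { label: "Services", children: [{ label: "SEO" }, { label: "UX Review" }] },
  { label: "Contact" },
];

const labels = signpostLabels(nav); // ["Home", "Services ▾", "Contact"]
```

However you render it, the point is the same: the arrow tells the visitor, before they interact, that something is hidden there.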
Also note that there are no hover states on touch devices, so unless you're planning a second type of menu for touch, your choice is made for you: it will certainly need to be selection based rather than hover based.
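That's the nice thing about a selection-based menu: the same toggle logic serves mouse, touch, and keyboard alike. A tiny sketch of that shared state (names illustrative; selecting an open item closes it, selecting another switches to it):

```typescript
// At most one sub-menu is open at a time, tracked by its label (or null).
type OpenMenu = string | null;

// Selecting an item toggles its sub-menu; selecting a different item
// switches straight to that one. Works identically for click and tap.
function toggleSubmenu(current: OpenMenu, selected: string): OpenMenu {
  return current === selected ? null : selected;
}

let open: OpenMenu = null;
open = toggleSubmenu(open, "Services"); // opens "Services"
open = toggleSubmenu(open, "Services"); // selecting again closes it
```

Wire that one function to click, tap, and Enter-key events and you have a single menu behaviour across all devices, rather than a hover menu plus a touch fallback.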
Selecting to get something is more in keeping with how everything else on the web works: text links, buttons, and so on. Hover feels more immediate, but if your site's demographic is broad, bear in mind that the dexterity required will elude a percentage of your audience. Consider the accessibility implications of this and your audience's needs.
For example, hover menus can be a real pain when the sub-menu content is wider than the trigger area. This will have happened to all of you: hover over the main menu item, see the sub-menu item you want, move the mouse to select it... oh dear, the sub-menu has disappeared. You left the hover area before reaching the sub-menu, and the hover state was lost. As well as accidental deactivation, it's quite possible to get annoying accidental activation with hover too.
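If you do go hover despite all that, one common mitigation is a short grace period: keep the sub-menu open briefly after the pointer leaves the trigger, so the diagonal move toward a wide sub-menu doesn't dismiss it. A minimal sketch of the decision logic, where the 300 ms threshold is my assumption rather than any standard:

```typescript
// Keep the sub-menu open for a short grace period after the pointer
// leaves the trigger. Threshold is illustrative; tune to your menu.
const GRACE_MS = 300;

function shouldClose(
  leftTriggerAt: number, // timestamp (ms) the pointer left the trigger
  now: number,           // current timestamp (ms)
  overSubmenu: boolean   // pointer has reached the sub-menu itself
): boolean {
  if (overSubmenu) return false;          // made it to the sub-menu: stay open
  return now - leftTriggerAt >= GRACE_MS; // otherwise close once grace expires
}
```

In practice you'd call this from a debounced mouseleave handler; the grace period also damps the accidental-activation problem, since a matching delay on open filters out pointers merely passing through.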
As well as your audience, consider the sub-menu itself. If you have a couple of small items, hover may be fine; a massive mega-menu will nearly always be better toggled by selection. On that note, if you're using mega-menus, consider Nielsen Norman Group's excellent guide here: https://www.nngroup.com/articles/mega-menus-work-well/
PS: I'd encourage everyone to start thinking about selection rather than 'clicks'. I still slip up myself, but 'click' is an outmoded, desktop-centric term that is dangerous to bandy about when making responsive websites. Much as your anchor text should never be "Click here", we should always be thinking about "selection". Selection speaks to intent and action rather than physical methodology: that methodology can be clicking, yes, but also tapping, voice command, keyboard input, etc.
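That mindset shows up directly in code: several input methods collapse into one "selection" action. A sketch (the event names follow the DOM spec, but the dispatch table itself is illustrative):

```typescript
// A simplified view of the input events that can express "selection".
type InputEventKind =
  | { type: "click" }
  | { type: "keydown"; key: string }
  | { type: "pointerup"; pointerType: "mouse" | "touch" | "pen" };

// One predicate answers "did the user select this?" regardless of method.
function isSelection(e: InputEventKind): boolean {
  switch (e.type) {
    case "click":
      return true;
    case "keydown":
      return e.key === "Enter" || e.key === " "; // keyboard activation
    case "pointerup":
      return true; // mouse click, finger tap, and stylus all count
  }
}
```

Build your menu handlers against "did the user select this?" and the physical method, whether mouse, finger, or keyboard, stops mattering.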