Click To Reveal vs Rollover Navigation Better For Organic?
-
Hi,
Any thoughts, data or insights as to which is better in a top navigation: click to reveal the nav links, or rollover to reveal them? Regular content in an accordion (click to reveal) is evidently not best practice. Does that apply to navigation as well?
Thanks! Best... Mike
-
Interesting UX question. Short answer: a click menu is best, but it's not black and white.
Naturally it's more subtle than that. You mention regular content. Regular content being hidden by any mechanism is not very user friendly. Accordions can often be overlooked, and text hidden in the hover state of an image is a client favourite that is also terrible UX practice. The mechanism doesn't matter too much - it's the fact that content is hidden by an un-signposted mechanism. The author knows it's there, but your visitor will not.
A menu isn't content, though; it's a different beast. A menu needs to exhibit good information hierarchy. We try to keep our main menu to seven items or fewer, essentially for clarity of the first tier of offerings. This can often necessitate sub-menus. Sub-menus are hidden content too, so we're really just arguing the toss about the mechanism. First off, then, we'd suggest a nice little signpost such as a downward arrow to show which main items have sub-menus.
Also note that there are no hover states on touch devices, so unless you're planning a second type of menu for touch, your choice is made for you: it will certainly need to be selection-based rather than hover-based.
Selecting to get something is also more in keeping with how everything else on the web works: text links, buttons and so on. Hover feels more immediate, but if your site's demographic is broad, bear in mind that the dexterity required will elude a percentage of your audience. Consider the accessibility implications of this and your site's client needs.
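To make the selection-based approach concrete, here's a minimal sketch of a click/tap-toggled sub-menu. The markup pattern, the .has-submenu class and the CSS hook are assumptions for illustration, not taken from any particular site. The trigger is a button carrying aria-expanded, so the same mechanism works for mouse, touch and keyboard users:

```typescript
// Minimal sketch: toggle sub-menus on selection (click, tap, or Enter on a button)
// instead of hover. The ".has-submenu" class and markup pattern are assumptions:
//   <li class="has-submenu"><button aria-expanded="false">Services</button><ul>...</ul></li>
// CSS would display the <ul> only while the sibling button has aria-expanded="true".

function initSelectionMenus(): void {
  const triggers = Array.from(
    document.querySelectorAll<HTMLButtonElement>('.has-submenu > button')
  );

  const closeAll = (): void =>
    triggers.forEach((t) => t.setAttribute('aria-expanded', 'false'));

  triggers.forEach((trigger) => {
    trigger.addEventListener('click', () => {
      const isOpen = trigger.getAttribute('aria-expanded') === 'true';
      closeAll(); // only one sub-menu open at a time
      trigger.setAttribute('aria-expanded', String(!isOpen));
    });
  });

  // Keyboard users (and everyone else) can dismiss an open sub-menu with Escape.
  document.addEventListener('keydown', (event) => {
    if (event.key === 'Escape') closeAll();
  });
}

initSelectionMenus();
```

The same button can double as the signpost mentioned above: style it with a downward arrow so visitors know a sub-menu exists before they select anything.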
For example, hover menus can be a real pain when the sub-menu content is wider than the trigger area. This will have happened to all of you: hover over the main menu item, see the sub-menu item you want, move the mouse to select it... oh dear, the sub-menu has disappeared on you. You left the hover area before reaching the sub-menu, and the hover state was lost. As well as accidental deactivation, it's quite possible to get annoying accidental activation with hover too.
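If you do stick with hover on desktop, one common mitigation for that disappearing sub-menu is a short close delay, or "grace period", before the menu hides. A rough sketch, again with an assumed .has-submenu class and an arbitrary 300 ms delay:

```typescript
// Minimal sketch: keep a hover sub-menu open briefly after mouseleave, so the
// pointer can travel to a sub-menu that's wider than its trigger without the
// menu vanishing. Selector, class name and delay are illustrative assumptions.

const CLOSE_DELAY_MS = 300;

document.querySelectorAll<HTMLLIElement>('.has-submenu').forEach((item) => {
  let closeTimer: number | undefined;

  item.addEventListener('mouseenter', () => {
    window.clearTimeout(closeTimer); // cancel any pending close
    item.classList.add('is-open');   // CSS shows the sub-menu while .is-open is present
  });

  item.addEventListener('mouseleave', () => {
    closeTimer = window.setTimeout(() => {
      item.classList.remove('is-open');
    }, CLOSE_DELAY_MS);
  });
});
```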
As well as the audience, consider the sub-menu itself. If you only have a couple of small items, hover may be fine; a massive mega-menu will nearly always be better toggled by selection. On that note, if you're using mega-menus, consider Nielsen Norman Group's excellent guide here: https://www.nngroup.com/articles/mega-menus-work-well/
PS: I'd encourage everyone to start thinking about selection rather than "clicks". I still slip up myself, but "click" is an outmoded, desktop-centric term that is dangerous to bandy about when making responsive websites. Much as your anchor text should never be "Click here", we should always be thinking about "selection". Selection speaks to intent and action rather than physical methodology; that methodology can be clicking, yes, but also tapping, a voice command, keyboard input and so on.