Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not removing the content completely - many posts will remain viewable - we have locked both new posts and new replies.
CSS vs JavaScript vs jQuery drop-down navigation
-
From a user and SEO perspective, what is the best way to code a drop-down menu nav bar? Is it best to use CSS, JavaScript, or a scripting library like jQuery?
I am thinking about overall best practice that will not have a negative impact on SERPs.
I am also thinking about what will work best on all types of devices, i.e. desktops, laptops, smartphones and tablets.
What are the pros and cons of using CSS for drop-down menus?
What are the pros and cons of using JavaScript for drop-down menus?
And the same question for jQuery.
Thank you all in advance for your ideas.
-
You can't go wrong with CSS - endless styling possibilities. I'd also be wary of JavaScript, because scripts are executed in order on the page: if the menu's JS fails to load or throws an error, the scripts that come after it may not run either.
-
Thumbs up to you too, Joel, and good luck with your project.
-
Seems like the SEO jury has spoken and CSS it is. Thank you all for your help on this matter. Thumbs up to you all.
-
Hands down, CSS is the preferred way to make drop-down menus. Google, and to a lesser extent other search engines, has improved drastically in its ability to parse JavaScript and jQuery, so you may be able to get away with it, but it really should be avoided if possible. With the newer CSS3 styles you can get almost any look you'd like.
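For illustration, here's a rough sketch of a CSS-only drop-down; the markup, class names and URLs are made up, just one common pattern:

    <!-- Hypothetical markup: the sub-menu links are plain anchors, so crawlers can follow them -->
    <style>
      .main-nav > ul > li { display: inline-block; }                     /* top-level items side by side */
      .main-nav li.has-dropdown { position: relative; }                  /* positioning context for the sub-menu */
      .main-nav ul.dropdown { display: none; position: absolute; }       /* sub-menu hidden by default */
      .main-nav li.has-dropdown:hover > ul.dropdown { display: block; }  /* revealed on hover, no scripting needed */
    </style>
    <nav class="main-nav">
      <ul>
        <li><a href="/">Home</a></li>
        <li class="has-dropdown">
          <a href="/products/">Products</a>
          <ul class="dropdown">
            <li><a href="/products/widgets/">Widgets</a></li>
            <li><a href="/products/gadgets/">Gadgets</a></li>
          </ul>
        </li>
      </ul>
    </nav>

Because the sub-menu items are ordinary links in the HTML, they stay crawlable even while visually hidden. The one caveat is that :hover alone doesn't map cleanly to touch screens, so phones and tablets usually need a tap or focus fallback on top of this.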
-
Hi Joel,
I echo Marek's comments. However, I'm a great fan of making 100% sure that the bots can access everything I want them to, so if I'm ever in doubt I go with a CSS and HTML combination as much as possible. We go all-in on Ajax and jQuery only on pages that are primarily about user experience and engagement, e.g. when people are searching for specific things and the page needs to be ultra fast and efficient. Even then, we try to ensure there are crawlable pages that output the full content of a search wherever possible, so we get the SEO benefit too. It also helps the (admittedly few) visitors who have JavaScript disabled.
I've seen so many ecommerce sites with great content that is blocked in some way: a button has to be pressed or a form submitted before it appears, and if I'm not mistaken, bots can't access that easily.
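As a rough illustration of that difference (the URLs here are invented): a crawler will happily follow a plain link, but it generally won't submit a form or trigger a script-only control to reach the content behind it.

    <!-- Crawlable: an ordinary link the bot can follow to a full, indexable listing page -->
    <a href="/search/blue-widgets/">See all blue widgets</a>

    <!-- Hard to crawl: the same results reachable only by submitting a form -->
    <form action="/search" method="post">
      <input type="hidden" name="q" value="blue widgets">
      <button type="submit">Show results</button>
    </form>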
Hope this helps.
-
Hi Joel,
In my opinion CSS is "the best": simple, easy to use, easy to change, very good page-load speed, etc.
From what I've read on many forums, jQuery and JS are accessible to robots, so there is nothing stopping you from using them.
But:
CSS - better code-to-text ratio (no long scripts in the page code)
CSS - simple usage and changes (CSS3, HTML5)
CSS - faster loading (just simple text and HTML)
In my opinion, now that we have HTML5 and CSS3 there is no better way - it's an innovative and simple solution.
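For the device part of the question, a rough sketch of how a CSS3 media query could adapt the same hypothetical menu from the earlier example for small screens (the breakpoint and class names are only examples, and these rules assume they come after the base drop-down styles so they override them):

    /* Illustrative only: on narrow screens, stack the nav and keep sub-menus expanded in normal flow */
    @media (max-width: 600px) {
      .main-nav > ul > li { display: block; }                        /* top-level items stack vertically */
      .main-nav ul.dropdown { display: block; position: static; }    /* sub-menu always visible instead of hover-only */
    }

Everything stays plain HTML and CSS, so nothing depends on JavaScript being enabled or parsed.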
Marek
Related Questions
-
How to fix non-crawlable pages affected by CSS modals?
I stumbled across something new when doing a site audit in SEMrush today: modals. The case: several pages could not be crawled because of (modal:) in the URL. What I know: "A modal is a dialog box/popup window that is displayed on top of the current page", built with CSS and JS. What I don't know: how to prevent crawlers from finding them.
Web Design | Dan-Louis
-
Does too much inline CSS impact SEO rankings
Hello, Does implementing a lot of inline CSS have a negative impact on SEO rankings? I imagine it could affect page speed, but are there any other issues I might run into?
Web Design | STP_SEO
-
Drop Down Menus and Crawlability
Hello, We are working on a complete site redesign. One of the mock-ups being reviewed is of a page that encompasses an entire category of products, but the only way the user can see the products is to fill out several drop-down menus; a subset of products matching those criteria then appears. Once that list appears, the user can click on each product and be taken to its product page. I'm concerned that this layout will pose a crawlability issue, since click activity and drop-down menus have always been a problem for bots in the past - has anything changed? Will the bot be able to follow the links to these product pages if it can't see them because it can't fill out the form? Also, depending on the functionality of this 'form', I'm assuming the product listing will be populated dynamically from another source, which means the product links will not live in the HTML of the page and hence cannot be crawled. Does anyone know how this is normally handled? Do the actual results usually live elsewhere, or do they live in the HTML of that page? Any thoughts or clarity around this would be appreciated.
Web Design | Colbys
-
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
Hi all, So far I have found this one: http://unused-css.com/ It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages, so we really need something that can handle a site larger than their limit. I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS? Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they will only do one page at a time? Thanks!
Web Design | danatanseo
-
Side Nav vs. Top Nav
I have a client that currently has a side navigation and wants to know how changing to a top nav will affect her SEO. We always recommend a top nav for user experience, but I am not sure if there is a direct effect on SEO. Would the change affect it? Thoughts?
Web Design | hwade
-
How would restructuring the navigation of my website affect my rankings?
I want to restructure the navigation of my website for a few reasons:
1. It isn't intuitive/clear to the user.
2. It is way too big; it has too many links and thus pushes the number of links on many pages above 100.
3. I want to get rid of file extensions as part of the URLs (.html, .php).
4. I want to achieve a "tree"-like navigation system, with categories, subcategories and so on.
In the process of cleaning up my website, I had to 301 redirect a lot of duplicate pages, fix broken links, etc. I already have a lot of 301 redirects, and in restructuring the navigation I know I'm going to add more. Will the addition of new 301 redirects have an effect on my rankings? (I'm basically going to be changing all of the URLs.) What kind of SEO effect will restructuring the navigation at the top of the page (reducing the number of links in the main menu) have on my site? What is the best strategy to implement in this situation?
Web Design | deuce1s
-
How is link juice split between navigation?
Hey All, I am trying to understand link juice as it relates to duplicate navigation. Take, for example, a site whose main navigation is contained in dropdowns with 50 links (fully crawlable and indexable); that navigation is then repeated in the footer of the page, so you have a total of 100 links with the same anchor text and URLs. For simplicity's sake, will the link juice be divided among those 100 links and passed to the corresponding pages, or does the "1st link rule" still apply, so that only half of the link juice is passed? What I am getting at is: if there were only one navigation menu and the page were passing 50 link juice units, then each of the subpages would be passed 1 link juice unit, right? But if the menu is duplicated, then the available link juice is divided by 100, so only 0.5 units are passed through each link. However, because there are two links pointing to the same page, is there a net of 1 unit? We have several sites that do this for UX reasons, but I am trying to figure out how badly this could be hurting us in page sculpting and passing juice to our subpages. Thanks for your help! Cheers.
Web Design | prima-253509
-
WordPress vs. MVC framework
What are the benefits of choosing an MVC framework such as CodeIgniter or CakePHP over WordPress? WordPress has so many plugins and a universally known UI for customers that it just saves a ton of time. However, a lot of the 'big guys' like SEOmoz and Distilled(?) use CakePHP and other MVC frameworks, so it has me wondering what the benefits are... anyone?
Web Design | DonnieCooper