How do I optimize drop-down menus?
-
Thanks in advance!
-
I agree with Federico: as long as the menu is readable to search engines, you should not have an issue.
You could try seo-browser.com to see if the menu is visible to search engines.
-
What do you mean by "optimize drop down menus"?
I am guessing you are asking how to get search engine crawlers to recognize your menus. If so, there's really no need to do anything.
Drop-down menus are typically built with CSS or JavaScript (or a combination of both). Search engine crawlers are already smart enough to recognize and run JS/CSS code, which means there's no need for you to do anything (unless your menus are really messy and unusable for users).
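For a concrete reference point, here is a minimal sketch of a crawlable drop-down: the links live in plain HTML and CSS only handles the show/hide, so nothing depends on a crawler executing JavaScript. The class names and URLs below are made up for illustration:

```html
<!-- Menu links exist in the markup itself, so crawlers see them without running JS -->
<nav>
  <ul class="menu">
    <li>
      <a href="/products">Products</a>
      <ul class="submenu">
        <li><a href="/products/widgets">Widgets</a></li>
        <li><a href="/products/gadgets">Gadgets</a></li>
      </ul>
    </li>
  </ul>
</nav>
<style>
  .submenu { display: none; }              /* hidden until hover */
  .menu li:hover .submenu { display: block; }
</style>
```

Menus built this way remain visible in a text-only view (such as seo-browser.com mentioned above), which is a quick sanity check that crawlers can reach the links.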
Hope that helps!
Related Questions
-
Traffic Dropping To Website
Hi. In Google Analytics I have noticed up to 50% of traffic coming to the website drops off at the home page, and drops further from other pages on the site. I realise some may say that this could be down to various factors such as server issues, poor web design, or the wrong traffic reaching the site. I have corrected the following: there was an issue with both www.domain.com and www.domain.com/home existing, and Screaming Frog and Moz showed that these both had duplicate meta tagging issues. Initially I had created a separate page called 'home' to include in the main nav bar under the slider, but yesterday I replaced this page with a request in functions.php to place 'home' in the nav bar as a redirect back to the www.domain.com home page. This works great. So I now have the following permanent 301 redirects: non-www to www in the .htaccess file, plus two permanent 301 redirects in the nav bar. I wonder if this is acceptable protocol for the nav bar redirects, and whether you could advise if the actions I have taken will have any negative impact on the site's SEO, link structure, crawlability or indexing. Thanks.
Web Design | | SEOguy1
-
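As a side note on the question above: the non-www to www redirect it mentions is typically a short mod_rewrite block in .htaccess. A hedged sketch, with domain.com standing in for the real domain:

```apache
# 301-redirect non-www requests to the www host (domain.com is a placeholder)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```

A server-side rule like this is generally preferable to redirect pages wired into the navigation, since it answers with a single 301 before any page markup is involved.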
Help, site traffic has dropped significantly since we changed from http to https
Heya, so I am just in charge of the content on the site and the SEO content, not the actual back-end stuff. A little under two weeks ago we switched to https, and our site traffic has been down a lot ever since. When I SERP-check our keywords, they don't seem to have dropped in the rankings. Here is what I got when I asked our dev guy whether 301 redirects were put in: "I did not add any redirects, so all of the content is accessible on both unless individual links get hardcoded one way or the other. The only thing in place is a Cloudflare plugin which rewrites links in cached pages to match the way it's accessed, so if for example you access a page over https you don't get the version cached with a bunch of http links, since that would throw up mixed content warnings in the browser. Other than that, WP mostly generates all its links to match whatever protocol you are accessing the current page with. We can make specific pages redirect one way or the other in the future if we want to, though..." As a startup, site traffic is a metric we track to gauge progress, so I really need to get to the bottom of whether it was the change from http to https that caused the drop, and if so, what we can do about it. Also, in case it is relevant: the bounce rate is now sky high (from an average of 15% to 64% this last week!). Any help is very welcome! Site: https://mobileday.com Thank you!
Web Design | | MobileDay1 -
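For context on the question above: the usual fix when a site serves the same content on both http and https is a site-wide server-side 301 to the https version, so only one protocol version gets indexed. A minimal .htaccess sketch, assuming Apache with mod_rewrite:

```apache
# Force HTTPS with a permanent redirect so http URLs consolidate onto https
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

One caveat: behind Cloudflare's flexible SSL mode the origin server can see every request as plain http, which would make this rule loop; in that setup the condition may need to check the X-Forwarded-Proto header instead.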
Penguin 2.0 drop due to poor anchor text?
Hi, my website experienced a 30% drop in organic traffic following the Penguin 2.0 update, and after years of designing my website with SEO in mind, generating unique content for users, and only focusing on relevant websites in my link building strategy, I'm a bit disheartened by the drop in traffic.

Having rolled out a new design of my website at the start of April, I suspect that I've accidentally messed up the structure of the website, making my site difficult to crawl, or making Google think that my site is spammy. Looking at Google Webmaster Tools, the number 1 anchor text on the site is "remove all filters" - which is clearly not what I want! The "remove all filters" link on my website appears when my hotels page loads with filters or sorting or availability dates in place - I included that link to make it easy for users to view the complete hotel listing again. An example of this link is towards the top right-hand side of this page: http://www.concerthotels.com/venue-hotels/agganis-arena-hotels/300382?star=2

With over 6000 venues on my website, this link has the potential to appear thousands of times, and while the anchor text is always "remove all filters", the destination URL will be different depending on the venue the user is looking at. I'm guessing that to Google this looks VERY spammy indeed! I tried to make the filtering/sorting/availability pages less visible to Google's crawl when I designed the site, through the use of forms, jQuery and JavaScript etc., but it does look like the crawler is managing to access these pages and find the "remove all filters" link. What is the best approach to take when a standard "clear all..." type link is required on a listing page, without making the link appear spammy to Google? It's a link which is only in place to benefit the user - not to cause trouble!

My final question to you guys is: do you think this one sloppy piece of work could be enough to cause my site to drop significantly following the Penguin 2.0 update, or is it likely to be a bigger problem than this? And if it is probably due to this piece of work, is it likely that solving the problem could result in a prompt rise back up the rankings, or is there going to be a black mark against my website going forward that slows down recovery? Any advice/suggestions will be greatly appreciated. Thanks, Mike
Web Design | | mjk260 -
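One common pattern for the "remove all filters" situation described above (a sketch, not a guaranteed Penguin fix): give filtered listing URLs a canonical tag pointing at the unfiltered listing, so the thousands of filtered variants consolidate onto one URL per venue instead of looking like thousands of distinct anchor targets. Using the example URL from the question:

```html
<!-- In the <head> of a filtered URL such as
     /venue-hotels/agganis-arena-hotels/300382?star=2,
     point the canonical at the unfiltered listing -->
<link rel="canonical"
      href="http://www.concerthotels.com/venue-hotels/agganis-arena-hotels/300382" />
```

With the canonical in place, the "remove all filters" link itself is just an internal link to the canonical URL, which is a normal, non-spammy pattern.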
How Does Google differentiate a keyword you are optimizing for and a non-keyword?
So, let's say that my company is called John's Business Consulting and I offer outsourced HR work (recruiting, evaluating, personality assessments, background checks). For my home page, "Business Consulting" is the keyword I want to rank for. But "recruiting services" and "talent development" are also terms that describe services I offer and could potentially be keywords, so how do I get Google not to dilute my authority for "business consulting"?
Web Design | | wlw20090 -
Is this causing me to drop in rank?
Today I noticed I was dropping (a pretty big jump) for some keywords, so I checked the source of a page and noticed that my source code has two canonical URLs: one to the home page, and one to /page-title. I just changed themes recently, and the drop happened after I changed themes. Is this what's causing me to drop in rank for certain terms? You can view the source here: http://noahsdad.com/physical-characteristics/
Web Design | | NoahsDad0 -
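For reference on the duplicate-canonical question above: a page should emit exactly one canonical tag, since two tags pointing at different URLs give Google conflicting signals and it may pick either one (or ignore both). The URL below is taken from the question:

```html
<head>
  <!-- Exactly one canonical per page; remove the extra tag the theme injects -->
  <link rel="canonical" href="http://noahsdad.com/physical-characteristics/" />
</head>
```

In WordPress this duplication often happens when both the theme and an SEO plugin output a canonical tag, so disabling one of the two sources is the usual fix.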
Could Website redesign be a cause of drop in rankings?
We had a complete redesign of our website and moved it over to WordPress several months ago. As URLs changed, we had appropriate 301 redirects done. Rankings for our top keywords dropped, but others remained intact. Our SEO company told us rankings drop when a redesign is done, but I thought that if we did all the redirects properly (which they approved), it wouldn't be much of a problem. Additionally, we've been steadily adding good new content. Any advice?
Web Design | | rdreich490 -
Can "poor" subdomains drop PR of the root domain?
The PageRank of my company's website has dropped from a 6 to a 4 over the past year or so. In that time, we implemented subdomains for development sites to show clients progress on their websites. I noticed that our "dev" sites are being indexed while in development, and my question is: will Google drop the PageRank of our root domain purely because of these "dev" subdomains? Example - our site is www.oursite.com; dev site - development1.oursite.com. I just began investigating the drop and this came to mind yesterday, but I am not too sure what type of impact these non-credible subdomains will have on our root domain. Any thoughts?
Web Design | | ckilgore0 -
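A common way to keep dev subdomains like the one above out of the index is a blocking robots.txt served from each dev subdomain (robots.txt is per-host, so this does not affect the root domain). A sketch, using the subdomain name from the question:

```
# robots.txt served at development1.oursite.com/robots.txt
User-agent: *
Disallow: /
```

One caveat: robots.txt stops crawling but not necessarily indexing of URLs Google has already discovered through links, so HTTP authentication or a noindex directive on the dev sites is the stronger option once pages are already in the index.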
The primary search keywords for our news release network have dropped like a rock in Google... we are not sure why.
Hi, On April 11th, a month after the Farmer update was released for U.S. users of Google, the primary keywords for ALL our sites dropped significantly in Google. I have some ideas why, but I wanted to get some second opinions too.

First off, I did some research on whether Google did anything on the 11th of April... they did. They implemented the Farmer update internationally, but that does not explain why our ranks did not drop in March for U.S. Google users... unless they rolled out their update based on where the site's domain is registered... in our case, Canada.

The primary news release site is www.hotelnewsresource.com, but we have many running on the same server, e.g. www.restaurantnewsresource.com, www.travelindustrywire.com and many more. We were number 1 or had top ranks for terms like "Hotel News", "Hotel Industry", "Hotel Financing", "Hotel Jobs", "Hotels for Sale", etc., and now, for most of these, we have dropped in a big way. It seems that Google has issued a penalty for every internal page we link to. A couple of obvious issues with the current template we use: too many links, and we intend to change that asap, but it has never been a problem before. The domain hotelnewsresource.com is 10 years old and still holds a PageRank of 6.

Secondly, the way our news system works, it's possible to access an article from any domain in the network. E.g. I can read an article that was assigned to www.hotelnewsresource.com on www.restaurantnewsresource.com... we don't post links to the irrelevant domain, but it does sometimes get indexed. So, we are going to implement the Google source meta tag option.

The bottom line is that I think we put too much faith in the maturity of the domain... thinking that might protect us... not the case, and it's now a big mess. Any insight you can offer would be greatly appreciated. Do you think it was Farmer or possibly something else? Thanks, Jarrett
Web Design | | jarrett.mackay0
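The "Google source meta tag option" mentioned in the question above presumably refers to Google News' syndication-source meta tag, which marks which URL is the preferred source when the same article is accessible under several domains. A hedged sketch (the article URL is illustrative, not a real page):

```html
<!-- Placed on a copy of the article served from the wrong domain,
     pointing at the domain the article was originally assigned to -->
<meta name="syndication-source"
      content="http://www.hotelnewsresource.com/article12345.html">
```

For duplicates outside Google News, a cross-domain rel="canonical" tag serves the same consolidation purpose and is the more broadly supported signal.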