What are the best options for a website built with JavaScript drop-down navigation menus to get those menus indexed by Google?
-
This concerns f5.com, a large website with navigation menus that drop down on hover. The sub-nav items (example: “DDoS Protection”) are not cached by Google and therefore do not pass internal link equity properly to help those sub-pages rank well.
The best option, naturally, would be to change the nav menus from JS to CSS, but barring that, is there another option? Would Schema.org's SiteNavigationElement work as an alternative?
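For reference, the sort of SiteNavigationElement markup I have in mind would look roughly like this (just a sketch, not something that is implemented; the entries are placeholders based on our nav labels):

```html
<!-- Illustrative sketch only: SiteNavigationElement expressed as JSON-LD.
     Entries/URLs are placeholders, not the full f5.com navigation. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "itemListElement": [
    {
      "@type": "SiteNavigationElement",
      "position": 1,
      "name": "DDoS Protection",
      "url": "https://f5.com/products/security/distributed-denial-of-service-ddos-protection"
    },
    {
      "@type": "SiteNavigationElement",
      "position": 2,
      "name": "Security",
      "url": "https://f5.com/products/security"
    }
  ]
}
</script>
```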
-
Meh, I guess not. It's just like talking about it to clients or friends. I've made some fine noise with lots of technical words.
-
Hi Carl - Did you see Travis' thoughtful response to your question?
-
I would generally prefer CSS over JS for navigational elements, but that probably isn't the problem here. Google can crawl JavaScript and attribute links fine. And per SEMrush, the site appears to be enjoying a pretty sharp uptick in organic traffic recently, which seems at odds with a major indexation problem.
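If the team does eventually move the menus to CSS, something as simple as the sketch below is usually enough: plain anchor tags in the initial HTML plus a :hover rule, so every sub-nav item exists as a crawlable link without any script running. (Illustrative markup only; the paths and class names are placeholders, not F5's actual nav.)

```html
<!-- Minimal CSS-only hover dropdown; paths and class names are placeholders. -->
<nav>
  <ul class="main-nav">
    <li class="has-dropdown">
      <a href="/products/security">Security</a>
      <ul class="dropdown">
        <li><a href="/products/security/distributed-denial-of-service-ddos-protection">DDoS Protection</a></li>
      </ul>
    </li>
  </ul>
</nav>
<style>
  .main-nav .dropdown { display: none; }
  .main-nav .has-dropdown:hover .dropdown { display: block; }
</style>
```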
I'm not sure if it's my network (I'm on a subpar connection right now), but I noticed that some CSS and JS files were timing out when I crawled the site. That could be a big problem: if Google can't fetch those resources, it can't fully render the pages. I would advise that someone check the server log files and see whether those files are regularly timing out. Ideally, CSS and JS files should be combined/concatenated where possible to reduce the chance of such rendering issues.
More on that from SE Roundtable
I checked the cache for the EN version of a few of those pages, and they appear to be cached fine.
cache:https://f5.com/products/security/distributed-denial-of-service-ddos-protection yields a cached copy of the page, which is pretty much what we want.
But I do see a few issues that could lead to problems with indexation and display. The site has a number of different language versions/translations; however, I noticed that hreflang annotations are missing. It's strongly recommended that hreflang be implemented. You're good on the language meta tag Bing recommends, though.
That can cause problems, especially on a site that large. I researched Radware, their competitor, years ago, and F5 seems like the type of organization that would pay for decent translation. (My German and Spanish are too limited for me to judge the quality of the translations.) But if the translated content is automatically generated, that would more than likely lead to indexation problems as well.
Another thing I see is that each translation is marked as canonical. This could also cause problems with display and link equity.
Here's more on internationalization from Moz and Google.
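For what it's worth, the usual pattern is a self-referencing canonical on each language version plus a full set of hreflang alternates pointing at every version, roughly like the sketch below. (The /de/ and /es/ paths are assumptions for illustration, not F5's real URL structure.)

```html
<!-- Sketch of the head of the English DDoS Protection page; translated pages would
     carry the same hreflang set plus their own self-referencing canonical.
     The /de/ and /es/ paths are assumed for illustration. -->
<link rel="canonical" href="https://f5.com/products/security/distributed-denial-of-service-ddos-protection" />
<link rel="alternate" hreflang="en" href="https://f5.com/products/security/distributed-denial-of-service-ddos-protection" />
<link rel="alternate" hreflang="de" href="https://f5.com/de/products/security/distributed-denial-of-service-ddos-protection" />
<link rel="alternate" hreflang="es" href="https://f5.com/es/products/security/distributed-denial-of-service-ddos-protection" />
<link rel="alternate" hreflang="x-default" href="https://f5.com/products/security/distributed-denial-of-service-ddos-protection" />
```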
I would also look for ways to build internal links to the important products (DDoS mitigation is supposed to be a huge money maker now) from the home page, in the body content, not just in boilerplate areas (nav, footer, etc.).
Edit: I forgot to mention that the mobile menu doesn't appear to link directly to the important products. I would make sure the experience is the same across devices.
Related Questions
-
Google Not Indexing Pages (Wordpress)
Hello, recently I started noticing that Google is not indexing our new pages or our new blog posts. We are simply getting a "Discovered - Currently Not Indexed" message on all new pages. When I click "Request Indexing" it takes a few days, but eventually the page does get indexed and appears on Google. This is very strange, as our website has been around since the late '90s and the quality of the new content is neither duplicate nor "low quality". We started noticing this happening around February. We also do not have many pages, maybe 500 maximum. I have looked at all the obvious answers (allowing for indexing, etc.), but just can't seem to pinpoint a reason why. Has anyone had this happen recently? It is getting very annoying having to manually go in and request indexing for every page, and it makes me think there may be some underlying issues with the website that should be fixed.
Technical SEO | Hasanovic1
-
Best practices for types of pages not to index
Trying to better understand best practices for when and when not to use content="noindex". Are there certain types of pages that we shouldn't want Google to index? Contact form pages, privacy policy pages, internal search pages, archive pages (using WordPress)? Any thoughts would be appreciated.
Technical SEO | RichHamilton_qcs0
-
Meta Titles and Meta Descriptions are not Indexing in Google
Hello everyone, I have a WordPress website on which I installed the All in One SEO plugin and wrote meta titles and descriptions for every page and post, then submitted the website for indexing. But after Google crawled it, the meta titles and descriptions shown by Google are something different, not found in the content. I even verified the cached version of the website and went through the source code that was crawled at that moment; the meta title I wrote is present there. Apart from this, the same URLs display the exact meta titles and descriptions I wrote in the Yahoo and Bing search engines. Can anyone explain how to resolve this issue? Website URL: thenewyou (dot) in. Regards,
Technical SEO | SatishSEOSiren0
-
Image Indexing Issue by Google
Hello all, my URL is www.thesalebox.com. I submitted my image sitemap in Google Webmaster Tools on 10th Oct 2013, but Google still has not indexed any of my web images. Please refer to my sitemaps, www.thesalebox.com/AppliancesHomeEntertainment.xml and www.thesalebox.com/Hardware.xml; my webmaster status and image indexing status are below. Can you please help me understand why my images are not indexed in Google yet? Is there any issue? Please give me suggestions. Thanks!
Technical SEO | CommercePundit0
-
Unnecessary pages getting indexed in Google for my blog
I have a blog, dapazze.com, and I have been suffering from a problem for a long time. I found out that Google has indexed hundreds of replytocom links and image attachment pages for my blog. I had to remove these pages manually using the URL removal tool. I had used "Disallow: ?replytocom" in my robots.txt, but Google disobeyed it. After that, I removed the parameter from my blog completely using the SEO by Yoast plugin. But now I see that Google has again started indexing these links even though they are no longer present on my blog (I use #comment). Google has also indexed many of my admin and plugin pages, whereas they are disallowed in my robots.txt file. Have a look at my robots.txt file here: http://dapazze.com/robots.txt Please help me solve this problem permanently.
Technical SEO | rahulchowdhury0
-
Fixing a website redirect situation that resulted in a drop in traffic
Hi, I'm trying to help someone fix the following situation: they had a website, www.domain.com, that was generating a steady amount of traffic for three years. They then redesigned the website a couple of months ago, and the website developer redirected the site to domain.com but did not set up analytics on domain.com. We noticed that there was a drop in traffic to www.domain.com but have no idea if domain.com is generating any traffic since analytics wasn't installed. To fix this situation, I was going to find out from the developer if there was a good reason to redirect the site. What would have prompted the developer to do this if www.domain.com had been used already for three years? Then, unless there was a good reason, I would change the redirect back to what it was before - domain.com redirecting to www.domain.com. Presumably this would allow us to regain the traffic to the site www.domain.com that was lost when the redirect was put in place. Does this sound like a reasonable course of action? Is there anything that I'm missing, or anything else that I should do in this situation? Thanks in advance! Carolina
Technical SEO | csmm0
-
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt with something like:
staging.domain.com
User-agent: *
Disallow: /
for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | fthead9