Ever Wise to Intentionally Use JavaScript for Global Navigation?
I may be going against the grain here, but I'm going to throw this out there and I'm interested in hearing your feedback...
We are a fairly large online retailer (50k+ SKUs) whose category and subcategory pages show well over 100 links (the refinement links on the left alone can quickly add up to 50+). What's worse, when you hover over our global navigation you get a hover menu of over 80 more links, all of which the bot sees as ordinary links.
Now I realize the common rule of thumb is not to exceed 100 links on a page (and if you do the math, you can see we blow past that well before the bots even reach the content we actually want them to crawl).
So...
Is it wise to intentionally shield these global nav links from the bots by using JavaScript?
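For context, here is a minimal sketch of what "shielding via JavaScript" usually means in practice: the menu markup is absent from the initial HTML and only injected client-side. The link data and element IDs below are hypothetical, and note that Googlebot has been able to execute JavaScript for some time, so this approach is not a guaranteed way to hide links.

```javascript
// Sketch (hypothetical data): keep the mega-menu out of the initial HTML
// payload and build it client-side instead.
const NAV_LINKS = [
  { href: '/mens-shoes', label: "Men's Shoes" },
  { href: '/womens-shoes', label: "Women's Shoes" },
  // ...80+ more entries in a real mega-menu
];

// Pure function: turn the link data into menu markup.
function renderMenuHtml(links) {
  const items = links
    .map(({ href, label }) => `<li><a href="${href}">${label}</a></li>`)
    .join('');
  return `<ul class="mega-menu">${items}</ul>`;
}

// Only touch the DOM when one exists (i.e. in a real browser).
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', () => {
    const nav = document.querySelector('#global-nav');
    nav.insertAdjacentHTML('beforeend', renderMenuHtml(NAV_LINKS));
  });
}
```

A crawler that does not render JavaScript sees only the empty `#global-nav` container; a browser (or a rendering crawler) sees the full menu.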
I had this same conversation with someone yesterday about a very similar set-up.
In a 2009 blog post, Matt Cutts said that the main reason not to include over 100 links was for user experience. It used to be for technical reasons, but those no longer apply. Here is the post: http://www.mattcutts.com/blog/how-many-links-per-page/
Lots of links mean lots of markup, which slows the page down, and they divide PageRank fairly heavily across many destinations. However, in the age of mega-menus I don't think the number of links is, in itself, a problem.
Just for reference (the answer for your situation may differ), our conversation ended with a decision to reduce the number slightly, restructuring to leak less PageRank to unimportant pages. Overall we still have a LOT of links and are happy with that.
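To make the PageRank-dilution point concrete, here is a toy calculation under the classic simplified model, where a page's distributable link equity is split evenly among its outgoing links. Real ranking is far more nuanced (damping, nofollow, link position, etc.), so the numbers are illustrative only.

```javascript
// Toy model: each outgoing link gets an equal share of the page's
// distributable equity (ignoring damping, nofollow, and link weighting).
function equityPerLink(pageEquity, linkCount) {
  if (linkCount === 0) return 0;
  return pageEquity / linkCount;
}

// A category page with 50 refinement links plus an 80-link mega-menu:
const shareWithMenu = equityPerLink(1, 130);   // roughly 0.0077 per link
// The same page if the 80 menu links were hidden from crawlers:
const shareWithoutMenu = equityPerLink(1, 50); // 0.02 per link
```

Under this crude model, trimming the 80 menu links would more than double the share flowing to each remaining link, which is the "leak less page rank" reasoning in a nutshell.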