Drop Down Menus and Crawlability
-
Hello,
We are working on a complete site redesign. One of the mock-ups being reviewed is of a page that encompasses an entire category of products, but the only way the user can see the products is to fill out several drop-down menus; a subset of products that match those criteria will then appear.
Once that list appears, the user will be able to click on any of the products and be taken to its product page.
I'm concerned that this layout will pose a crawlability issue, since click activity and drop-down menus have always been a problem for bots in the past. Has anything changed? Will the bot be able to follow the links to these product pages if it can't see them because it can't fill out the form?
Also, depending on the functionality of this 'form', I'm assuming the product listing will be populated dynamically and pulled from another source, which means the product links will not live in the HTML of the page and hence cannot be crawled. Does anyone know how this is normally handled? Do the actual results usually live elsewhere, or do they live in the HTML of that page?
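To make the two scenarios concrete, here is a rough sketch of what I mean - all of the markup, URLs and the /api/products endpoint below are hypothetical, just for illustration:

```html
<!-- Scenario 1: the product links already live in the page source, hidden
     until the form is completed. A crawler parsing the raw HTML can still
     find and follow them. -->
<ul id="results-a" style="display: none;">
  <li><a href="/products/widget-a">Widget A</a></li>
  <li><a href="/products/widget-b">Widget B</a></li>
</ul>

<!-- Scenario 2: the list starts empty and JavaScript pulls matching
     products from another source after the form is submitted. The raw
     HTML contains no product links for a crawler to follow. -->
<form id="finder">
  <select name="size">
    <option value="small">Small</option>
    <option value="large">Large</option>
  </select>
  <button type="submit">Show products</button>
</form>
<ul id="results-b"></ul>
<script>
  document.getElementById('finder').addEventListener('submit', function (e) {
    e.preventDefault();
    var params = new URLSearchParams(new FormData(e.target));
    fetch('/api/products?' + params) // hypothetical endpoint
      .then(function (res) { return res.json(); })
      .then(function (products) {
        document.getElementById('results-b').innerHTML = products
          .map(function (p) {
            return '<li><a href="' + p.url + '">' + p.name + '</a></li>';
          })
          .join('');
      });
  });
</script>
```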
Any thoughts or clarity around this would be appreciated.
-
"But if they are already in the html, would that be considered cloaking?"
There are times when presenting something in HTML that is otherwise not visible, but having other features on the page that allow people to read / click / access that content in other ways, is fine. Linking is tricky, though, because links are so inherently valuable in terms of SEO. You really can't be too careful.
I'd be wary of presenting links (a subset or the full set) in HTML if there is a form process to actually arrive at the links' targets. Essentially you'd be linking to products X, Y and Z on a page for search engines, but requiring a specific input from users to see X, Y or Z - an input that only very few visitors are actually likely to make. I would say this qualifies as showing different content for SEO's sake rather than providing a UX alternative that is pretty much the same thing. Others may disagree with me on that - I'm being cautious here.
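To illustrate, the pattern I'd be wary of looks something like this (hypothetical markup): the full set of product links sits in the source for search engines to find, hidden from everyone who hasn't completed the form:

```html
<!-- All product links are present in the source for crawlers, but hidden
     from users until the form is completed. Search engines see every link;
     almost no visitor ever will. This is the pattern I'd be cautious about. -->
<form id="product-finder">
  <select name="size">
    <option value="small">Small</option>
    <option value="large">Large</option>
  </select>
  <button type="submit">Find products</button>
</form>

<div id="all-products" style="display: none;">
  <a href="/products/x">Product X</a>
  <a href="/products/y">Product Y</a>
  <a href="/products/z">Product Z</a>
</div>
```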
I would very much like to see the HTML if you are still active in this thread when it is produced, but you may be left with a situation where the pages need to be linked to elsewhere throughout the site to ensure they are crawled.
-
Thanks, Jane. I don't have the raw HTML because only a static design has been produced at this point.
I'm not sure I want the form filled out by the bots; I just want to make sure that the links that are the end result of filling out the form are crawlable, because that will be the only path to the product pages. I've been speaking with IT to figure out whether the links will already be in the HTML even if they are not displayed on the page, or whether the links are dynamically generated from another location, which would mean they will not be crawlable. They are not sure yet. But if they are already in the HTML, would that be considered cloaking, since the user cannot see them until they fill out the form? And even then they will only see a small subset of the links.
My other concern for this page is that we are taking our largest parent category and putting ALL of the products on one page - you just can't get to them until you fill out the form. My worry is that this page will be way too broad; this parent category is normally made up of several subcategories. I don't think we will rank well for some long-tail terms because there is going to be so much broad content on the page pertaining to so many different types of products.
Any thoughts or suggestions are appreciated.
-
Hi Kelli,
From what you have described here, I don't think this will easily be crawled. Obviously the necessary code for the click activity is included in the HTML, and whilst Google has been known to complete forms and take actions on sites before, it's far from guaranteed that it will do this.
Usually when Google completes actions like this, it's not desirable - you used to see websites with millions of junk URLs indexed because Google had either "filled out forms" itself, or spammers had dynamically generated millions of versions of a form to fill Google's index, in order to harm a competitor. It's not common to want Google to complete activity like this, rather than just let it crawl the information deliberately given to it in HTML.
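As a hypothetical illustration of how those junk URLs come about: with a plain GET form like the sketch below, every combination of selections produces its own distinct URL, and anything - crawler or spammer - that starts generating combinations can flood an index:

```html
<!-- A GET form: each submission produces a distinct, indexable URL such as
     /search?size=small&colour=red. Generate enough combinations and an
     index fills up with near-duplicate junk pages. -->
<form action="/search" method="get">
  <select name="size">
    <option value="small">Small</option>
    <option value="medium">Medium</option>
    <option value="large">Large</option>
  </select>
  <select name="colour">
    <option value="red">Red</option>
    <option value="blue">Blue</option>
  </select>
  <input type="submit" value="Search" />
</form>
```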
I would be really curious to see what the menus looked like in HTML though. That would give us a better idea of whether it's likely Google will crawl any of the content.
If the menus are not crawlable, there is a range of other good options (that can also be user-friendly and attractive) for menu development. The Distilled guide is a good resource.
If I am able to look at the raw HTML for the planned menus, please let me know. If you'd rather not post it here, feel free to PM me (I am not a Moz staff member - I used to be - but I act as an associate contractor for the company) or email jane.copland@gmail.com.
Cheers,
Jane
-
Thanks, Evan. Please keep in mind that this is not the navigation; it is essentially a form on a page that dynamically generates a list of product-page links. My question is: if those products cannot be viewed until the form is filled out, how can the bots see them?
This form will require click activity to fill out, not just hovering over it. And it isn't really the drop-downs themselves I need crawled - the drop-down items are not links, they are just decisions the user has to make in order to see matching products.
Even if the bot could fill out the form, it would only display a small subset of product links. If this is the only page that will link to all of our products in a particular category, I want to make sure that all of those product pages get crawled. So I was wondering whether all of the product links will still be seen by the bots even though the user will not be able to see them.
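To be clear about what the drop-downs contain, they are something like this (hypothetical markup) - plain option values, with no URLs anywhere for a bot to follow:

```html
<!-- The drop-down items are plain <option> values, not links. Nothing in
     the form itself points at a product page. -->
<select name="product-type">
  <option value="type-a">Type A</option>
  <option value="type-b">Type B</option>
</select>
<select name="material">
  <option value="steel">Steel</option>
  <option value="brass">Brass</option>
</select>
```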
-
Hey Kelli,
I'm not entirely sure what the mock-up design is like, but I have used drop-down menus in the past, and as long as they are in the HTML, bots should be able to crawl them. I have found this article helpful in the past: https://www.distilled.net/blog/seo/site-navigation-for-seo/
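For example, a crawlable drop-down menu of the kind that article covers is usually just nested lists of ordinary links, shown and hidden with CSS - a rough sketch:

```html
<!-- The sub-menu links are ordinary <a> tags in the HTML, only hidden
     visually until hover, so bots can crawl them without any clicking. -->
<nav>
  <ul class="menu">
    <li>
      <a href="/category">Category</a>
      <ul class="submenu">
        <li><a href="/category/product-1">Product 1</a></li>
        <li><a href="/category/product-2">Product 2</a></li>
      </ul>
    </li>
  </ul>
</nav>
<style>
  .menu .submenu { display: none; }
  .menu li:hover .submenu { display: block; }
</style>
```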
Hopefully this is helpful.
-
Thanks, but I cannot Fetch as Googlebot because the page is not live yet; we are wireframing the design first.
-
A simple way to see how Google sees your page is to use the "Fetch as Googlebot" function in Google Webmaster Tools. That way you can see if anything is not being crawled. The more traditional way to set this up would be to have a search bar above the fold, and then have category pages people can click through to browse if they want. Messy drop-downs are never fun.
Let me know if that helps.