Drop Down Menus and Crawlability
-
Hello,
We are working on a complete site redesign. One of the mock-ups being reviewed is of a page that encompasses an entire category of products, but the only way the user can see the products is to fill out several drop-down menus; a subset of products matching those criteria will then appear.
Once that list appears, the user will be able to click on each of the products and be taken to the product page.
I'm concerned that this layout will pose a crawlability issue, since click activity and drop-down menus have always been a problem for bots in the past. Has anything changed? Will the bot be able to follow the links to these product pages if it can't see them because it can't fill out the form?
Also, depending on the functionality of this 'form', I'm assuming the product listing will be populated dynamically and pulled from another source, which means the product links will not live in the HTML of the page and hence cannot be crawled. Does anyone know how this is normally handled? Do the actual results usually live elsewhere, or do they live in the HTML of that page?
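To make the distinction concrete, here is a minimal sketch (all product names and paths are invented for illustration): the same link markup can be generated server-side, in which case the anchors ship in the initial HTML a crawler downloads, or injected client-side after the form is submitted, in which case the raw page source contains only an empty container.

```javascript
// Renders a list of product links as an HTML fragment.
function renderProductLinks(products) {
  return products
    .map(p => `<li><a href="/products/${p.slug}">${p.name}</a></li>`)
    .join("\n");
}

const products = [
  { slug: "widget-a", name: "Widget A" },
  { slug: "widget-b", name: "Widget B" },
];

// Server-side rendering: the anchors are present in the page source.
const serverHtml = `<ul id="results">\n${renderProductLinks(products)}\n</ul>`;

// Client-side rendering: the browser fetches the products after the form is
// submitted and injects the same markup, so the downloaded source contains
// only the empty container.
const clientHtml = `<ul id="results"></ul>`;

console.log(serverHtml.includes('href="/products/widget-a"')); // true
console.log(clientHtml.includes('href="/products/widget-a"')); // false
```

Whether crawlers can see the links in the second case depends on whether the search engine executes the JavaScript, which is exactly the uncertainty being discussed in this thread.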
Any thoughts or clarity around this would be appreciated.
-
"But if they are already in the html, would that be considered cloaking?"
There are times when presenting something in HTML that is otherwise not visible, but having other features on the page that allow people to read / click / access that content in other ways, is fine. Linking is tricky, though, because links are so inherently valuable in terms of SEO. You can't really be too careful.
I'd be wary of presenting links (a subset or the full set) in HTML if there is a form process to actually arrive at the links' targets. Essentially you'll be linking to products X, Y and Z on a page for search engines, but requiring a specific input from a user to see X, Y or Z - an input that only very few visitors overall are actually likely to make. I would say this qualifies as showing different content for SEO's sake rather than providing a UX alternative that is pretty much the same thing. Others may disagree with me on that - I'm being wary here.
I would very much like to see the HTML if you are still active in this thread when it is produced, but you may be left with a situation where the pages need to be linked to elsewhere throughout the site to ensure they are crawled.
-
Thanks, Jane. I don't have the raw HTML because only a static design has been produced at this point.
I'm not sure I want the form filled out by the bots; I just want to make sure that the links that are the end result of filling out the form are crawlable, because that will be the only path to the product pages. I've been speaking with IT to figure out whether the links will already be in the HTML even if they are not displayed on the page, or whether the links are dynamically generated from another location, which would mean they will not be crawlable. They are not sure yet. But if they are already in the HTML, would that be considered cloaking, since the user cannot see them until they fill out the form? And even then they will only see a small subset of the links.
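Once a build exists, one quick way to answer the "are the links already in the HTML?" question is to scan the raw page source (what "view source" shows, before any JavaScript runs) for anchor tags. A hypothetical helper, with invented markup for illustration: links that are merely hidden with CSS still appear in the source, while links injected later by script do not.

```javascript
// Extracts every href value from a raw HTML string.
function extractHrefs(html) {
  const hrefs = [];
  const re = /<a\s[^>]*href="([^"]+)"/g;
  let match;
  while ((match = re.exec(html)) !== null) {
    hrefs.push(match[1]);
  }
  return hrefs;
}

// A link hidden with CSS is still present in the downloaded source,
// so a crawler can find it:
const hiddenInHtml =
  '<div style="display:none"><a href="/products/widget-a">Widget A</a></div>';

// A link injected later by JavaScript is not in the source at all:
const injectedByJs = '<div id="results"></div>';

console.log(extractHrefs(hiddenInHtml)); // [ '/products/widget-a' ]
console.log(extractHrefs(injectedByJs)); // []
```

Note this only tells you whether the links are technically discoverable; whether hiding them behind the form counts as cloaking is the separate judgment call Jane discusses above.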
My other concern for this page is that we are taking our largest parent category and putting ALL of the products on one page - you just can't get to them until you fill out the form. My worry is that this page will be way too broad; this parent category is normally made up of several subcategories. I don't think we will rank well for some long-tail terms because there is going to be so much broad content on the page pertaining to so many different types of products.
Any thoughts or suggestions are appreciated.
-
Hi Kelli,
From what you have described here, I don't think this will easily be crawled. Obviously the necessary code for the click activity is included in the HTML, and whilst Google has been known to complete forms and take actions on sites before, it's far from guaranteed that it will do this.
Usually when Google completes actions like this, it's not desirable - you used to see websites with millions of junk URLs indexed because Google had either "filled out forms" itself, or spammers had dynamically generated millions of versions of a form to fill Google's index, in order to harm a competitor. It's not common to want Google to complete activity like this, rather than just let it crawl the information deliberately given to it in HTML.
I would be really curious to see what the menus looked like in HTML though. That would give us a better idea of whether it's likely Google will crawl any of the content.
If the menus are not crawlable, there is a range of other good options (that can also be user-friendly and attractive) for menu development. The Distilled guide is a good resource.
If I am able to look at the raw HTML for the planned menus, please let me know. If you'd rather not post it in here, feel free to PM me (I am not a Moz staff member - I used to be - but I act as an associate contractor for the company) or email jane.copland@gmail.com.
Cheers,
Jane
-
Thanks Evan. Please keep in mind that this is not the navigation; it is essentially a form on a page that dynamically generates a list of product page links. My question is: if those products cannot be viewed until the form is filled out, how can the bots see them?
This form will require click activity to fill out, not just hovering over it. And I don't just want the dropdowns to be crawled; the dropdown items themselves are not links, they are just decisions that the user has to make in order to see matching products.
Even if the bot could fill out the form, it is only going to display a small subset of product links. If this is the only page that will have links to all of our products in a particular category, I want to make sure that all of those product pages will get crawled. So I was wondering whether all of the product links will still be seen by the bots even though the user will not be able to see them.
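One common workaround for this situation, sketched below with invented product and URL names, is to keep the drop-down form for users but also emit plain, paginated HTML listing pages (for example, one per subcategory) that link to every product. That gives crawlers a path to each product page that does not depend on the form at all.

```javascript
// Builds a map of crawlable subcategory listing URLs to HTML fragments,
// paginating each subcategory so no single page links to everything.
// URL scheme and product slugs are hypothetical.
function buildSubcategoryPages(productsByType, pageSize) {
  const pages = {};
  for (const [type, slugs] of Object.entries(productsByType)) {
    for (let i = 0; i < slugs.length; i += pageSize) {
      const pageNum = Math.floor(i / pageSize) + 1;
      const items = slugs
        .slice(i, i + pageSize)
        .map(slug => `<li><a href="/products/${slug}">${slug}</a></li>`)
        .join("");
      pages[`/category/${type}/page-${pageNum}`] = `<ul>${items}</ul>`;
    }
  }
  return pages;
}

const pages = buildSubcategoryPages(
  { widgets: ["widget-a", "widget-b", "widget-c"], gadgets: ["gadget-a"] },
  2
);
console.log(Object.keys(pages));
// [ '/category/widgets/page-1', '/category/widgets/page-2', '/category/gadgets/page-1' ]
```

As a side benefit, subcategory pages like these would also address the "one page is too broad" concern above, since each listing page can focus on one product type.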
-
Hey Kelli,
I'm not entirely sure what the mock-up design is like, but I have used drop-down menus in the past, and as long as they are in the HTML, bots should be able to crawl them. I have found this article helpful in the past: https://www.distilled.net/blog/seo/site-navigation-for-seo/
Hopefully this is helpful.
-
Thanks, but I cannot fetch as Googlebot because the page is not live yet; we are wireframing the design first.
-
A simple way to see how Google sees your page is to use the "Fetch as Googlebot" function in Google Webmaster Tools. This way you can see if there is anything not being crawled. The more traditional way to set this up would be to have a search bar above the fold, and then have category pages people can click through to browse if they want. Messy drop-downs are never fun.
Let me know if that helps.