Drop Down Menus and Crawlability
-
Hello,
We are working on a complete site redesign. One of the mock-ups being reviewed is of a page that encompasses an entire category of products, but the only way the user can see the products is to fill out several drop-down menus, after which a subset of products matching those criteria will appear.
Once that list appears, the user will then be able to click on each of the products and will then be taken to the product page.
I'm concerned that this layout will pose a crawlability issue, since click activity and drop-down menus have always been a problem for bots in the past. Has anything changed? Will the bot be able to follow the links to these product pages if it can't see them because it can't fill out the form?
Also, depending on the functionality of this 'form', I'm assuming the product listing will be populated dynamically and pulled from another source, which means the product links will not live in the HTML of the page and hence cannot be crawled. Does anyone know how this is normally handled? Do the actual results usually live elsewhere, or do they live in the HTML of that page?
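For context, here's the distinction as I understand it (a minimal sketch with hypothetical markup, not our actual page): a bot that doesn't execute JavaScript only finds links that are present in the HTML the server returns.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags, the way a non-JS crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# Links present in the served HTML: findable even if hidden by CSS.
static_html = """
<div id="results" style="display:none">
  <a href="/products/x">Product X</a>
  <a href="/products/y">Product Y</a>
</div>
"""

# Links injected by client-side script after the form is submitted:
# nothing for a plain HTML parser to find in the source.
dynamic_html = """
<div id="results"></div>
<script>loadProducts();  // populates #results via AJAX</script>
"""

print(extract_links(static_html))   # ['/products/x', '/products/y']
print(extract_links(dynamic_html))  # []
```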
Any thoughts or clarity around this would be appreciated.
-
"But if they are already in the html, would that be considered cloaking?"
There are times when presenting something in HTML that is not otherwise visible is fine, provided other features on the page allow people to read, click, or access that content in other ways. Linking is tricky, though, because links are so inherently valuable in terms of SEO. You can't really be too careful.
I'd be wary of presenting links (a subset or full set) in HTML if there is a form process to actually arrive at the links' targets. Essentially you'd be linking to products X, Y and Z on a page for search engines, but requiring a specific input from a user to see X, Y or Z - an input that only very few visitors overall are actually likely to make. I would say this qualifies as showing different content for SEO's sake rather than providing a UX alternative that is essentially the same thing. Others may disagree with me on that - I'm being wary here.
I would very much like to see the HTML if you are still active in this thread when it is produced, but you may be left with a situation where the pages need to be linked to elsewhere throughout the site to ensure they are crawled.
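To illustrate the kind of check I mean (a rough sketch with made-up URLs): once the site is built, you can model it as "page → links present in the served HTML" and see which product pages are unreachable by following plain links alone.

```python
from collections import deque

# Hypothetical site graph: page -> links present in its raw HTML.
# The category page lists no products because they only appear
# after the form is used.
site = {
    "/": ["/category"],
    "/category": [],
    "/products/x": [],
    "/products/y": [],
}

def crawl(start="/"):
    """Breadth-first crawl following only links found in the served HTML."""
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for link in site.get(page, []):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

reachable = crawl()
orphaned = {page for page in site if page not in reachable}
print(sorted(orphaned))  # ['/products/x', '/products/y']
```

Any page in the orphaned set needs links from elsewhere in the site (or a sitemap) to be crawled.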
-
Thanks, Jane. I don't have the raw HTML because only a static design has been produced at this point.
I'm not sure I want the form filled out by the bots; I just want to make sure that the links that are the end result of filling out the form are crawlable, because that will be the only path to the product pages. I've been speaking with IT to figure out whether the links will already be in the HTML even if they are not displayed on the page, or whether the links are dynamically generated from another location, which would mean they are not crawlable. They are not sure yet. But if they are already in the HTML, would that be considered cloaking, since the user cannot see them until they fill out the form? And even then they will only see a small subset of the links.
My other concern for this page is that we are taking our largest parent category and putting ALL of the products on one page - you just can't get to them until you fill out the form. My worry is that this page will be way too broad; this parent category is normally made up of several subcategories. I don't think we will rank well for long-tail terms because there is going to be so much broad content on the page pertaining to so many different types of products.
Any thoughts or suggestions are appreciated.
-
Hi Kelli,
From what you have described here, I don't think this will easily be crawled. Presumably the necessary code for the click activity will be included in the HTML, and whilst Google has been known to complete forms and take actions on sites before, it's far from guaranteed that it will do this.
Usually when Google completes actions like this, it's not desirable - you used to see websites with millions of junk URLs indexed because Google had either "filled out forms" itself, or spammers had dynamically generated millions of versions of a form to fill Google's index in order to harm a competitor. It's not common to want Google to complete activity like this, rather than just letting it crawl the information deliberately given to it in HTML.
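To make that concrete (purely hypothetical dropdown values, nothing from your actual form): even a few small drop-downs multiply into a lot of distinct URLs if each combination of selections becomes its own parameterised address.

```python
from itertools import product
from urllib.parse import urlencode

# Hypothetical dropdown options on the form.
options = {
    "size": ["s", "m", "l"],
    "color": ["red", "blue"],
    "type": ["indoor", "outdoor"],
}

# Every combination a bot (or spammer) could submit becomes its own URL.
urls = [
    "/category?" + urlencode(dict(zip(options, combo)))
    for combo in product(*options.values())
]

print(len(urls))  # 12 distinct URLs from just 3 small drop-downs
print(urls[0])    # /category?size=s&color=red&type=indoor
```

With realistic option counts the combinations run into the thousands, which is how those junk-URL blowups happen.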
I would be really curious to see what the menus looked like in HTML though. That would give us a better idea of whether it's likely Google will crawl any of the content.
If the menus are not crawlable, there is a range of other good options (that can also be user-friendly and attractive) for menu development. The Distilled guide is a good resource.
If I am able to look at the raw HTML for the planned menus, please let me know. If you'd rather not post it in here, feel free to PM me (I am not a Moz staff member - I used to be - but I act as an associate contractor for the company) or email jane.copland@gmail.com.
Cheers,
Jane
-
Thanks, Evan. Please keep in mind, this is not the navigation; it is essentially a form on a page that dynamically generates a list of product page links. My question is: if those products cannot be viewed until the form is filled out, how can the bots see them?
This form will require click activity to fill out, not just hovering over it. And it's not just that I want the drop-downs to be crawled - the drop-down items themselves are not links; they are just decisions the user has to make in order to see matching products.
Even if the bot could fill out the form, it will only display a small subset of product links. If this is the only page that will have links to all of our products in a particular category, I want to make sure that all of those product pages will get crawled. So I was wondering whether all of the product links will still be seen by the bots even though the user will not be able to see them.
-
Hey Kelli,
I'm not entirely sure what the mock-up design is like, but I have used drop-down menus in the past, and as long as they are in the HTML, bots should be able to crawl them. I have found this article helpful in the past: https://www.distilled.net/blog/seo/site-navigation-for-seo/
Hopefully this is helpful.
-
Thanks, but I cannot fetch as Googlebot because the page is not live yet - we are wireframing the design first.
-
A simple way to see how Google sees your page is to use the "Fetch as Googlebot" function in Google Webmaster Tools. That way you can see if anything is not being crawled. The more traditional way to set this up would be to have a search bar above the fold, and then have category pages people can click through to browse if they want. Messy drop-downs are never fun.
Let me know if that helps.