Drop Down Menus and Crawlability
-
Hello,
We are working on a complete site redesign. One of the mock-ups being reviewed is of a page that encompasses an entire category of products, but the only way the user can see the products is to fill out several drop-down menus; a subset of products matching those criteria will then appear.
Once that list appears, the user will then be able to click on each of the products and will then be taken to the product page.
I'm concerned that this layout will pose a crawlability issue, since click activity and drop-down menus have always been a problem for bots in the past. Has anything changed? Will the bot be able to follow the links to these product pages if it can't see them because it can't fill out the form?
Also, depending on the functionality of this 'form', I'm assuming the product listing will be populated dynamically and pulled from another source, which means the product links will not live in the HTML of the page and hence cannot be crawled. Does anyone know how this is normally handled? Do the actual results usually live elsewhere, or do they live in the HTML of that page?
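To illustrate what I mean, here's a rough sketch of the two scenarios. This is hypothetical markup, not our actual site - the product names and URLs are invented, and the "crawler" is just a crude stand-in for a bot that reads raw HTML without executing JavaScript:

```javascript
// Scenario A: product links present in the initial HTML (crawlable).
const staticHtml = `
  <ul class="products">
    <li><a href="/products/widget-a">Widget A</a></li>
    <li><a href="/products/widget-b">Widget B</a></li>
  </ul>`;

// Scenario B: links injected by JavaScript after the form is submitted --
// the initial HTML contains only an empty container.
const dynamicHtml = `<div id="results"><!-- populated via XHR --></div>`;

// Crude stand-in for a crawler that reads raw HTML without executing JS.
function extractLinks(html) {
  return [...html.matchAll(/href="([^"]+)"/g)].map(m => m[1]);
}

console.log(extractLinks(staticHtml));  // ["/products/widget-a", "/products/widget-b"]
console.log(extractLinks(dynamicHtml)); // [] -- nothing for the bot to follow
```

In scenario B there is simply nothing in the source for a non-rendering bot to follow, which is the situation I'm worried about.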
Any thoughts or clarity around this would be appreciated.
-
"But if they are already in the html, would that be considered cloaking?"
There are times when presenting something in HTML that is otherwise not visible, but having other features on the page that allow people to read / click / access that content in other ways, is fine. Linking is tricky because links are so inherently valuable in terms of SEO. You can't really be too careful.
I'd be wary of presenting links (a subset or full set) in HTML if there is a form process to actually arrive at the links' targets. Essentially you'll be linking to products X, Y and Z on a page for search engines, but requiring a specific input from a user to see X, Y or Z - an input that only very few overall visitors are actually likely to make. I would say this qualifies as showing different content for SEO's sake rather than providing a UX alternative that is pretty much the same thing. Others may disagree with me on that - I'm being wary here.
I would very much like to see the HTML if you are still active in this thread when it is produced, but you may be left with a situation where the pages need to be linked to elsewhere throughout the site to ensure they are crawled.
-
Thanks, Jane. I don't have the raw html because only a static design has been produced at this point.
I'm not sure I want the form filled out by the bots; I just want to make sure that the links that are the end result of filling out the form are crawlable, because that will be the only path to the product pages. I've been speaking with IT to figure out whether the links will already be in the HTML even if they are not displayed on the page, or whether the links are dynamically generated from another location, which would mean they are not crawlable. They are not sure yet. But if they are already in the HTML, would that be considered cloaking, since the user cannot see them until they fill out the form? And even then they will only see a small subset of the links.
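For reference, here's a rough sketch of the setup IT might be describing in the first case, where the links do already live in the HTML and the form only filters what's displayed. The products and criteria are invented for illustration, not our real data:

```javascript
// Invented product data, standing in for whatever the catalogue contains.
const products = [
  { name: "Widget A", url: "/products/widget-a", color: "red",  size: "S" },
  { name: "Widget B", url: "/products/widget-b", color: "blue", size: "M" },
  { name: "Widget C", url: "/products/widget-c", color: "red",  size: "M" },
];

// Server-side: every product link is rendered into the page, so a crawler
// reading the raw HTML can follow all of them.
const fullListingHtml = products
  .map(p => `<a class="product" href="${p.url}">${p.name}</a>`)
  .join("\n");

// Client-side: the dropdown "form" only decides which of the already-present
// links are shown; nothing is fetched or generated on the fly.
function matchesCriteria(product, criteria) {
  return Object.entries(criteria).every(([key, value]) => product[key] === value);
}

const visible = products.filter(p => matchesCriteria(p, { color: "red", size: "M" }));
console.log(visible.map(p => p.name)); // ["Widget C"]
```

In that version the bot sees the full set of links in the source even though any given user only ever sees the filtered subset - which is exactly why I'm asking whether that counts as cloaking.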
My other concern for this page is that we are taking our largest parent category and putting ALL of the products on one page - you just can't get to them until you fill out the form. My worry is that this page will be way too broad; this parent category is normally made up of several subcategories. I don't think we will rank well for some long-tail terms because there is going to be so much broad content on the page pertaining to so many different types of products.
Any thoughts or suggestions are appreciated.
-
Hi Kelli,
From what you have described here, I don't think this will easily be crawled. Obviously the necessary code for the click activity is included in the HTML, and whilst Google has been known to complete forms and take actions on sites before, it's far from guaranteed that it will do this.
Usually when Google completes actions like this, it's not desirable - you used to see websites with millions of junk URLs indexed because Google had either "filled out forms" itself, or spammers had dynamically generated millions of versions of a form to fill Google's index, in order to harm a competitor. It's not common to want Google to complete activity like this, rather than just let it crawl the information deliberately given to it in HTML.
I would be really curious to see what the menus looked like in HTML though. That would give us a better idea of whether it's likely Google will crawl any of the content.
If the menus are not crawlable, there is a range of other good options (that can also be user-friendly and attractive) for menu development. The Distilled guide is a good resource.
If I am able to look at the raw HTML for the planned menus, please let me know. If you'd rather not post it here, feel free to PM me (I am not a Moz staff member - I used to be - but I act as an associate contractor for the company) or email jane.copland@gmail.com.
Cheers,
Jane
-
Thanks Evan. Please keep in mind, this is not the navigation; it is essentially a form on a page that dynamically generates a list of product page links. My question is: if those products cannot be viewed until the form is filled out, how can the bots see them?
This form will require click activity to fill out, not just hovering over it. And it's not just the dropdowns I want crawled - the dropdown items themselves are not links, they are just decisions the user has to make in order to see matching products.
Even if the bot could fill out the form, it is only going to display a small subset of product links. If this is the only page that will have links to all of our products in a particular category, I want to make sure that all of those product pages will get crawled. So I was wondering if all of the product links will still be seen by the bots even though the user will not be able to see them.
-
Hey Kelli,
I'm not entirely sure what the mock-up design is like, but I have used dropdown menus in the past, and as long as they are in the HTML, bots should be able to crawl them. I have found this article helpful in the past: https://www.distilled.net/blog/seo/site-navigation-for-seo/
Hopefully this is helpful.
-
Thanks, but I cannot fetch as googlebot because the page is not live yet, we are wireframing the design first.
-
A simple way to see how Google sees your page is to use the "Fetch as Googlebot" function in Google Webmaster Tools. This way you can see if there is anything not being crawled. The more traditional way to set this up would be to have a search bar above the fold, and then category pages people can click through to browse if they want. Messy drop-downs are never fun.
Let me know if that helps.