Does content in an accordion rank as well as content in a text box?
-
Does content rank better in a full-view text layout than in a clickable accordion?
I read somewhere that, because users need to click into an accordion, the content may be considered hidden on the page and therefore may not rank as well. Is this true?
Accordion example (see the features section): https://www.workday.com/en-us/applications/student.html
-
Google will not treat content that is concealed behind tabs, accordions, or any other element where JavaScript is used to reveal it in the same way as content that is visible by default. However, it will still be indexed, so pages may rank for search phrases related to content contained within the hidden sections.
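To make the mechanism concrete, here is a minimal sketch of a typical accordion toggle (the markup structure and class names are assumptions for illustration, not taken from the Workday example). The panel text ships in the initial HTML, and JavaScript only switches its visibility, which is why it can still be crawled and indexed even though it is not visible by default.

```typescript
// Assumed markup (illustrative only):
//   <button class="accordion-toggle">Features</button>
//   <div class="accordion-panel" hidden>...panel text already in the HTML...</div>
document.querySelectorAll<HTMLButtonElement>(".accordion-toggle").forEach((toggle) => {
  toggle.addEventListener("click", () => {
    const panel = toggle.nextElementSibling as HTMLElement | null;
    if (!panel) return;

    const isOpen = !panel.hasAttribute("hidden");
    if (isOpen) {
      panel.setAttribute("hidden", "");  // collapse: the text stays in the DOM
    } else {
      panel.removeAttribute("hidden");   // expand: nothing new is fetched
    }
    toggle.setAttribute("aria-expanded", String(!isOpen));
  });
});
```

The contrast would be an accordion that only fetches the panel text over the network after a click; in that case the content is absent from the page until the user interacts and may not be indexed at all.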
Why does Google devalue hidden content?
Google’s focus is on ensuring that the user experience within its search results is as good as possible. If the algorithm gave full weight to content hidden using JavaScript, that experience could be compromised.
For example, say a user searches for a term that is matched on a page but only in the hidden section. The user then clicks the search result to go through to that page but can’t immediately see the information they’re looking for because it’s hidden. They give up and return to the search results or head to another website.
This, in Google’s assessment, would not be a high-quality user experience, and the content within the hidden sections is therefore down-weighted.
In Summary
- Content hidden within tabs, accordions, or other elements that rely on JavaScript to reveal it to users is likely to be treated differently by Google and assigned far less importance
- Websites should therefore take a considered approach and use this method only to hide content that is of secondary importance to the primary topic of the page, or that covers related topics
-
Hi there,
Absolutely not. In fact, I believe content in accordions can outrank content presented as open text on a page, although not for technical reasons.
Accordions are easier to fit into a page and can answer multiple user inquiries at once without throwing a wall of text at your visitors as they browse. Google reads accordions just the same as it reads open text. The difference comes down to user interaction and satisfaction metrics.
Think about it like this:
You are browsing for pricing of a product. You also want to know shipping details and whether said product is safe to use for your 4-year-old.
Your search returns 2 companies in your area that provide said product.
The first website throws 3,000 words at you in blocks, requiring you to scroll for what feels like hours without a clear indication of where to find the answer to your questions.
The second website can be scrolled in about 2 seconds and features an accordion with headlines and direct answers to your questions, without making you wade through other content. Now we're cooking with gas.
In addition, accordion content lends itself to direct-answer formats, which in turn lend themselves to being showcased on SERPs (see the markup sketch below). So not only will rankings improve, but so will traffic (there are tons of studies showing that top-10 rankings drive traffic, but few people realize that metadata and snippets can improve your odds of capturing first-page traffic better than positioning alone).
Over time, this website will generate more and more authority for this product and its related search queries, overtaking the first.
To answer your question directly: Google treats both forms of content equally, but (all else being equal) user metrics will generate greater link-building potential, greater readership, more shares, etc. for the site featuring an accordion setup.
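One common way to express accordion-style question-and-answer content as a direct-answer format is FAQ structured data. The following is a minimal sketch, not taken from the thread: the question names, answer text, and the client-side injection approach are all illustrative assumptions, and eligibility for FAQ rich results changes over time, so treat it as an example of the markup pattern rather than a guarantee of snippets.

```typescript
// Sketch: build FAQPage JSON-LD from hypothetical accordion headlines/answers
// and inject it into the page head. Server-rendering the same JSON-LD in the
// initial HTML is generally safer for crawlers than injecting it client-side.
const faqJsonLd = {
  "@context": "https://schema.org",
  "@type": "FAQPage",
  mainEntity: [
    {
      "@type": "Question",
      name: "How much does the product cost?", // hypothetical accordion headline
      acceptedAnswer: { "@type": "Answer", text: "Pricing starts at $49 per month." },
    },
    {
      "@type": "Question",
      name: "Is it safe for young children?", // hypothetical accordion headline
      acceptedAnswer: { "@type": "Answer", text: "Yes, it is certified for ages 3 and up." },
    },
  ],
};

const script = document.createElement("script");
script.type = "application/ld+json";
script.textContent = JSON.stringify(faqJsonLd);
document.head.appendChild(script);
```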
Look forward to what others have to say on this,
Rob