Tabbed Content Revisited
-
Hi-diddly-ho SEO gurus, quick question.
I just saw this article and wanted to get thoughts from the people here. https://www.searchenginejournal.com/google-says-now-ok-put-content-behind-tabs/178020/
I am constantly at war with our UX guy on this subject because he believes, along with our CEO, that tabbed and accordion-style information is better from a UX standpoint: less clutter on the page, but with the information still readily available. I am not here to argue that point, but I was wondering if you agree with the article posted here. I had to tell them to slow their roll until I could get something a little more concrete on the matter.
-
Hi Cassie,
I would agree with the article - I have been doing this for months now. In fact, I have found that accordion content improves both UX and SEO: since Google can now read all of the information in tabbed and accordion layouts, you can put a lot more information above the fold, and the tab titles and content give you extra opportunities to target relevant and semantically related keywords.
My favourite uses for this are in e-commerce, since ranking product and category pages can be extremely tricky, especially if you or your client want to use duplicate content. Content placement can be difficult to decide on, but accordions and tabs let you front-load your content without forcing your visitors to scroll for ages to find your products.
Long story short, tabs and accordions improve UX (all the information is present, but visitors choose what to view) and SEO (easier relevancy, answering all of a visitor's questions for better user metrics, lower bounce rates, etc.). Every time I have used this strategy it has resulted in ranking improvements and better user metrics in GA. If you are still unsure, try split testing it for yourself: apply it to one page that has similar metrics to another page on your website, track both pages for rankings, drop-off rates, etc., and see which performs better.
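If it helps, here is a rough sketch of the kind of markup I mean - the class names and copy are just placeholders, and your CMS will generate something different, but the point is that every panel's content stays in the HTML source even while it is visually collapsed, so Google can still crawl it:

<div class="product-tabs">
  <button class="tab" aria-expanded="true" aria-controls="panel-description">Description</button>
  <div id="panel-description" class="tab-panel">
    <p>Full product description lives here in the source and is visible by default.</p>
  </div>
  <button class="tab" aria-expanded="false" aria-controls="panel-shipping">Shipping &amp; Returns</button>
  <div id="panel-shipping" class="tab-panel" hidden>
    <p>Shipping and returns copy - still in the source, just hidden until the visitor opens the tab.</p>
  </div>
</div>

The JavaScript that toggles the panels doesn't really matter; what matters is that the copy is actually in the page source rather than fetched only after a click.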
Hope this helps and feel free to reach out if you have further questions or need clarification!
Cheers,
Rob
Related Questions
-
Rewriting content dilemma: should I include keywords?
Hi, I am rewriting a piece of content because it was only 300 words, lower than the top 10 results. I did some keyword research on Ahrefs, Moz, KeywordTool.io, and SEMrush, and there are a lot of variations: the main keyword (ios 12) has variations like (ios 12 download), (ios 12 install), (ios 12 not supported), and so on. People have been searching for all of these terms along with ios 12, so it feels like if my page covers all of these variations I have a better chance to rank. Does that make sense?
Intermediate & Advanced SEO | SIMON-CULL
-
Similar content, targeting different states
I have read many answers about not having duplicated pages target different states (or cities). Here is the problem: we have the same content serving different pages for several provinces in Canada, and we intentionally can't change it much. We don't want these pages to compete within the same province. What would be the best approach to avoid getting penalized while keeping our SERP positions? Initially we thought about hreflang, but we can't really apply it to the province/state attributes (see the sketch below). Thanks in advance!
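For context, this is roughly the hreflang pattern we looked at - the URLs are placeholders, not our real pages - and it only goes down to language and country, which is why it doesn't help us at the province level:

<!-- hreflang values are language plus optional country codes, e.g. "en-ca" -->
<!-- there is no province-level value, so two English pages for Ontario and Alberta end up with the same code -->
<link rel="alternate" hreflang="en-ca" href="https://example.com/ontario/" />
<link rel="alternate" hreflang="en-ca" href="https://example.com/alberta/" />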
Intermediate & Advanced SEO | MSaffou2018
-
Duplicate Content through 'Gclid'
Hello, we've had the well-known problem of duplicate content caused by the gclid parameter that Google AdWords appends. As per Google's recommendation, we added the canonical tag to every page on our site so that when the bot came to each page it would go 'Ah-ha, this is the original page'. We also added the parameter to the URL parameters in Google Webmaster Tools. However, it now seems as though these newly created gclid pages are being given a canonical of their own; see the results below:
https://www.google.com.au/search?espv=2&q=site%3Awww.mypetwarehouse.com.au+inurl%3Agclid&oq=site%3A&gs_l=serp.3.0.35i39l2j0i67l4j0i10j0i67j0j0i131.58677.61871.0.63823.11.8.3.0.0.0.208.930.0j3j2.5.0....0...1c.1.64.serp..8.3.419.nUJod6dYZmI
These new pages are therefore being indexed, causing duplicate content. Does anyone have any idea what to do in this situation? Thanks, Stephen.
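For reference, the canonical we added to each page looks roughly like this - the product path here is only an example, not a real URL on our site:

<link rel="canonical" href="https://www.mypetwarehouse.com.au/example-product-page" />

The expectation was that any gclid variation of a URL would point back to the clean version of that page rather than being indexed on its own.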
Intermediate & Advanced SEO | MyPetWarehouse
-
Duplicate Content Issues :(
I am wondering how we can solve our duplicate content issues. Here is the thing: there are only so many ways you can write a description of a used watch.
http://beckertime.com/product/mens-rolex-air-king-no-date-stainless-steel-watch-wsilver-dial-5500/
http://beckertime.com/product/mens-rolex-air-king-stainless-steel-date-watch-wblue-dial-5500/
What's different between these two? The dial color. We have a lot of the same model numbers but with different conditions, dial colors, and bands. What ideas do you have?
Intermediate & Advanced SEO | KingRosales
-
How do I syndicate content for SEO benefit?
Right now I am working on an e-commerce website, and I have found that some of its content is the same as the manufacturer's website. You can visit the following pages to see what I mean:
http://www.vistastores.com/casablanca-sectional-sofa-with-ottoman-ci-1236-moc.html
http://www.abbyson.com/room/contemporary/casablanca-detail
http://www.vistastores.com/contemporary-coffee-table-in-american-white-oak-with-black-lacquer-element-ft55cfa.html
http://www.furnitech.com/ft55cfa.html
I don't want to go with robots.txt, meta robots NOINDEX, or a canonical tag, because there are 5,000+ products on the website with duplicate content. So I am thinking of adding a source URL on each product page with a dofollow attribute, along the lines of the sketch below. Do you think that will help save my website from a duplicate content penalty? Or how else should I syndicate content for SEO benefit?
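To be clear, this is only a rough idea of what I mean by a source URL - the class name and wording are placeholders, nothing that exists on the site yet:

<!-- a plain link is "dofollow" by default, so no special attribute is needed -->
<p class="content-source">
  Product details courtesy of the manufacturer:
  <a href="http://www.abbyson.com/room/contemporary/casablanca-detail">Abbyson Casablanca Sectional</a>
</p>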
Intermediate & Advanced SEO | CommercePundit
-
About robots.txt for resolving duplicate content
I have a problem with duplicate content and titles. I have tried many ways to resolve it, but because of the site's code I am still stuck, so I have decided to use robots.txt to block the duplicate content. The first question: how do I write a robots.txt rule to block all URLs like these?
http://vietnamfoodtour.com/foodcourses/Cooking-School/
http://vietnamfoodtour.com/foodcourses/Cooking-Class/
User-agent: *
Disallow: /foodcourses
(Is that right?)
And the parameter URLs:
http://vietnamfoodtour.com/?mod=vietnamfood&page=2
http://vietnamfoodtour.com/?mod=vietnamfood&page=3
http://vietnamfoodtour.com/?mod=vietnamfood&page=4
User-agent: *
Disallow: /?mod=vietnamfood
(Is that right? I have a folder containing the module - could I use Disallow: /module/* instead?)
The second question: which takes priority, robots.txt or the meta robots tag? What happens if I block a URL in robots.txt but that URL's meta robots tag is "index, follow"?
Intermediate & Advanced SEO | magician
-
Removing Duplicate Page Content
Since joining SEOmoz four weeks ago I've been busy tweaking our site, a Magento e-commerce store, and have successfully removed a significant portion of the errors. Now I need to remove or hide duplicate pages from the search engines, and I'm wondering what the best way to attack this is. Can I solve it in one central location, or do I need to do something in the Google and Bing webmaster tools? Here is a list of the duplicate content:
http://www.unitedbmwonline.com/?dir=asc&mode=grid&order=name
http://www.unitedbmwonline.com/?dir=asc&mode=list&order=name
http://www.unitedbmwonline.com/?dir=asc&order=name
http://www.unitedbmwonline.com/?dir=desc&mode=grid&order=name
http://www.unitedbmwonline.com/?dir=desc&mode=list&order=name
http://www.unitedbmwonline.com/?dir=desc&order=name
http://www.unitedbmwonline.com/?mode=grid
http://www.unitedbmwonline.com/?mode=list
Thanks in advance, Steve
Intermediate & Advanced SEO | SteveMaguire
-
Duplicate Content from Article Directories
I have a small client with a PR2 website, 268 links from 21 root domains, MozTrust 5.5, and MozRank 4.5. However, whenever I check the number of links in Google with a link: query, Google always returns none. My client has a blog with many articles on it, but they have also submitted every blog article to article directories, plainly and simply creating duplicate content. Is this the reason their link: count is coming up as none? Is there anything that can be done to correct the situation?
Intermediate & Advanced SEO | danielkamen