Which is more effective: jQuery + CSS for tabbed content, or creating unique pages for each tab?
-
We are building a from-scratch directory site and trying to determine the best way to structure our pages. Each general listing page has four sections of specific information.
Which is the better strategy for SEO: using tabs (e.g. jQuery + CSS) and putting all of the content on one page (and will all of the content still be indexable if it's loaded with jQuery?), or creating a unique page for each section?
- jQuery tabs: sitename.com/listing-name#section1
- Unique pages: sitename.com/listing-name/section1
If I go with option one, I risk the content not being crawlable by Google if it can't read through the scripting. However, I feel like the individual pages won't rank if there's only a small amount of content in each section. Is it better to keep all of the content on one page and focus on building links to it? Or is it better to build out the section pages and invest in adding quality content to them, so that long term there's more specificity for long-tail searches and a better-quality search experience on Google?
We are also set up to have "../listing-type/listing-name/" but are considering removing 'listing-type' and just having "../listing-name/". Do you think this would be more advantageous for boosting rankings?
I know that was like five questions. I've been doing a lot of research and these are the things that I'm still scratching my head about. Some general direction would be really great!
Thank you!
-
Thanks, Casey. I'm interested to see if there are any varying opinions. I've had a few votes in favor of your methodology, but a couple of critics arguing for the alternative.
I think one page with all of the content will work well for us. Plus, the user experience should improve since we won't have to load a new page each time the user wants to see additional information.
-
Hi Grant,
There are multiple ways of going about this, I'm sure, but here is my take.
To me, this sort of depends on the content of all four tabs and whether it is relevant and valuable to the user on this page. Here are a couple of questions to ask yourself:
- Does the user really want to load a new page to see a small section that may or may not have belonged on the previous page?
- Does it make sense for a user to go to a new page? (Is there a ton of content in these sections?)
- Is each section targeting a new keyword, or supporting the main keyword?
jQuery + CSS will be just fine
As long as your developer knows what they're doing, Google will index all of the content on the page whether the tabs are built with jQuery or (better yet) pure CSS. Google should treat sitename.com/listing-name#section1 the same as sitename.com/listing-name. Just make sure the code structure is set up to support the content hierarchy you want.
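To illustrate, here is a minimal sketch of crawlable tabs (the section IDs, labels, and class names are placeholders, not your actual markup): all of the content sits in the HTML, and jQuery only toggles which panel is visible.

```html
<!-- Minimal sketch of crawlable tabs: all content is in the HTML,
     jQuery only toggles visibility. IDs and labels are placeholders. -->
<ul class="tabs">
  <li><a href="#section1">Section 1</a></li>
  <li><a href="#section2">Section 2</a></li>
</ul>

<div id="section1" class="tab-panel">Full section 1 content here...</div>
<div id="section2" class="tab-panel">Full section 2 content here...</div>

<style>
  .tab-panel { display: none; }
  .tab-panel.active { display: block; }
</style>

<script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>
<script>
  // Show the panel matching a hash, falling back to the first section.
  function showTab(hash) {
    var target = hash && $(hash).length ? hash : '#section1';
    $('.tab-panel').removeClass('active');
    $(target).addClass('active');
  }
  // Clicking a tab updates the hash natively; we just switch panels.
  $('.tabs a').on('click', function () {
    showTab($(this).attr('href'));
  });
  // Deep-link support for URLs like /listing-name#section2.
  showTab(window.location.hash);
</script>
```

Because the text is delivered in the initial HTML rather than fetched when a tab is clicked, Google indexes it the same way it would a normal single page, and the #section links still let users deep-link to a specific tab.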
**../listing-type/listing-name/ vs. ../listing-name/**
I think this comes down to what these listings are. If this were, say, a real estate website, it would make sense to set it up like:
- ../house/123-main-st/
- ../apartment/432-main-st/
If it makes sense to add a listing type, I say go for it.
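If it helps to picture the difference, here is a rough sketch of how the two URL structures might be routed. This is purely illustrative and assumes a Node/Express setup with a made-up lookup helper, not your actual stack:

```js
// Illustrative only: how the two listing URL structures could be routed.
// Assumes Express; getListing is a hypothetical stand-in for a database lookup.
const express = require('express');
const app = express();

function getListing(listingType, listingName) {
  // Placeholder lookup -- a real site would query its listings data here.
  return `Listing: ${listingName}${listingType ? ' (' + listingType + ')' : ''}`;
}

// Option A: keep the listing type, e.g. /house/123-main-st/
app.get('/:listingType/:listingName', (req, res) => {
  res.send(getListing(req.params.listingType, req.params.listingName));
});

// Option B: flat URLs, e.g. /123-main-st/ (in practice you'd pick one option)
app.get('/:listingName', (req, res) => {
  res.send(getListing(null, req.params.listingName));
});

app.listen(3000);
```

Either structure can work; the main thing is to pick one and keep it consistent so the same listing isn't reachable at two different URLs.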
Again, this can differ depending on what type of content you are providing, but this should give you a good sense of general direction.
Thanks,
Casey