Directory Structure
-
Hi,
We are creating a new content directory for online courses hosted on our site. Like a typical directory, we have high-level categories and then more granular subcategories. A course will typically be in only one high-level category and then multiple subcategories. What would be the best URL structure for an individual course? Should we force users to pick one 'master' subcategory that gets included in their URL? Or should we just not include the subcategory at all in the URL? Right now we've been thinking about:
OurUrl.com/upper-category/sub-category/course-title
or
-
Is a course going to be tagged with two or more categories, or just one? For instance, an SEO course could be filed under both "Internet" and "Marketing".
If you think about it, this is the same problem that some eCommerce sites have with product classification, so my suggestion is to do what eCommerce sites commonly do to handle (or avoid) these cases:
www.domain/course-title/
That way you avoid any possible duplication from assigning a course to more than one category, avoid links pointing to two versions of the course's page, and - important now - keep the eventual social sharing and signals from being split across two identical pages.
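A minimal sketch of how a flat, category-free course URL could be generated from a course title (the `course_url` helper and the `www.domain.com` placeholder are illustrative assumptions, not part of the original answer):

```python
import re
import unicodedata

def course_url(title: str, domain: str = "www.domain.com") -> str:
    """Build a flat course URL containing no category segments."""
    # Strip accents, lowercase, and collapse non-alphanumeric runs to hyphens.
    slug = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    slug = re.sub(r"[^a-z0-9]+", "-", slug.lower()).strip("-")
    return f"https://{domain}/{slug}/"

print(course_url("Advanced SEO & Content Marketing"))
# → https://www.domain.com/advanced-seo-content-marketing/
```

Because the category never appears in the path, the same course page can be linked from any number of category listings without creating duplicate URLs.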
Related Questions
-
Question on site structure
My client is a nationwide company. They provide building maintenance services in 7 different cities. In each city they provide a different range of services. They currently have a single service page for each service, with no mention on that page of the cities in which they offer the service. The service pages are getting no SERP visibility. We are running Paid Search and recommending SEO. I'm wondering whether it would be beneficial to build out specific service pages for each city so the content is more relevant to both users and search engines. What is best practice in this situation? The client wants to dominate SERPs in each market for the services they offer.
On-Page Optimization | SEOinSunnyNelson
-
Use of subfolders for my casino directory
Been trying to find an answer to this for some time, so I hope someone can lend a hand. I've started building out my casino website Nerdybet, which will list regulated gaming vendors within each US state. Part of the idea is to create a high-level web directory of legal betting sites. Now, say I want to list all legal casinos within a specific US state. What's the most search-engine-friendly way to build out the hierarchy? To clarify, this will be the main casino directory page: https://www.nerdybet.com/gambling-sites-directory Based on that, I can now pick either: a) Nerdybet.com/gambling-sites-directory/stateX or b) Nerdybet.com/stateX Which method would you say is best? And why so? Thank you.
On-Page Optimization | llevy
-
URL structure for professional services across multiple industries
I am working with a company that does consulting work across multiple industries, but the services are essentially the same. Example services: they implement "Customer Relationship Management" systems and "Data Archiving" solutions. Example industries: the services above can each apply to "Oil & Gas" or "Retail". Example URL structures: mysite.com/oil-gas <-- This page would also contain links to all of the services provided to the Oil & Gas industry. mysite.com/oil-gas/customer-relationship-management-system mysite.com/retail mysite.com/retail/customer-relationship-management-system This seems like the best way to go, as long as I'm writing unique content for each industry and each service (i.e. I need to explain how a CRM solution solves specific problems in Retail and OTHER specific problems in Oil & Gas). While there will certainly be some overlap, this approach seems logical to me. The URL length isn't too long either, which is nice. The company currently focuses solely on services in its URL structure (not a very deep site): mysite.com/customer-relationship-management-system mysite.com/data-archiving Since they have already worked with hundreds of clients in multiple industries, it seems smarter to start focusing more on individual customer segments. Would anyone else do this differently? Thanks, Alex
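Under the structure described above, the page set is just one hub page per industry plus the cross product of industries and services. A hedged sketch (the `build_service_urls` helper is hypothetical; the slugs come from the examples in the question):

```python
def build_service_urls(industries, services):
    """Industry hub pages plus one industry-specific page per service."""
    urls = []
    for industry in industries:
        urls.append(f"/{industry}")  # hub page linking to that industry's services
        urls += [f"/{industry}/{service}" for service in services]
    return urls

for url in build_service_urls(
    ["oil-gas", "retail"],
    ["customer-relationship-management-system", "data-archiving"],
):
    print(url)
```

With 2 industries and 2 services this yields 6 pages, each needing its own unique, industry-specific content as the question notes.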
On-Page Optimization | MeasureEverything
-
URL Structure
What's the best way to set up a URL structure? When a user goes through the funnel, should it show in the URL? Like this: domain.com/thickness/high-density/1-mil-plastic-bags (1 mil plastic bags is a subcategory - when the user is at this page they will see many products. When they select one, it brings them to a product detail page, which I think should be done like this: domain.com/product-name regardless of the funnel that brought them there. Does this make sense?) or domain.com/1-mil-plastic-bags Also, is there a limit to how many "/" can be used?
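The idea in the question - one flat product URL regardless of the funnel path that led there - can be sketched as below. The `canonical_product_url` helper is hypothetical; its output would typically feed a rel="canonical" tag or a redirect on the product detail page:

```python
from urllib.parse import urlparse

def canonical_product_url(request_path: str) -> str:
    """Collapse any funnel path down to the flat canonical product URL."""
    # Keep only the final path segment (the product slug).
    slug = urlparse(request_path).path.rstrip("/").split("/")[-1]
    return f"/{slug}"

print(canonical_product_url("/thickness/high-density/1-mil-plastic-bags"))
# → /1-mil-plastic-bags
```

However a visitor navigates the funnel, every product detail page resolves to the same single URL, so link equity is not split across funnel variants.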
On-Page Optimization | EcomLkwd
-
URLs and folder structure for an E-commerce
Hi there :-) I'm helping a friend who has an e-commerce store selling nail polish in Brazil. I'm a little in doubt about the URLs and folder structure. Two questions: 1. There are 10 products per category and 50 categories. Should I put them all in the root folder or create 2 major categories (25 subcategories each)? 2. What's the better product page URL (the store has around 500)? nailpolish.com/IMPORT/BRAND/NAME-OF-THE-PRODUCT OR nailpolish.com/COMPLETE-NAME-OF-THE-PRODUCT What's the best recommendation?
On-Page Optimization | SeoMartin1
-
How do i block an entire category/directory with robots.txt?
Does anyone have any idea how to block an entire product category, including all the products in that category, using the robots.txt file? I'm using WooCommerce in WordPress and I'd like to prevent bots from crawling every single one of my product URLs for now. The confusing part is that I have several different URL structures linking to every single one of my products, for example www.mystore.com/all-products, www.mystore.com/product-category, etc. I'm not really sure how I'd write it in the robots.txt file, or where to place the file. Any help would be appreciated, thanks.
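One way to check a candidate robots.txt rule before deploying it is Python's standard urllib.robotparser. The `/product-category/shoes/` path below is an assumption based on WooCommerce's default permalink base - substitute the store's real URLs. Note that the file itself belongs at the site root (www.mystore.com/robots.txt) and blocks crawling, not indexing:

```python
from urllib import robotparser

# Candidate rules; "/product-category/shoes/" is a hypothetical WooCommerce
# category path - replace it with your store's actual permalink base.
rules = """\
User-agent: *
Disallow: /product-category/shoes/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://www.mystore.com/product-category/shoes/red-pumps/"))  # blocked
print(rp.can_fetch("*", "https://www.mystore.com/product-category/boots/"))            # still crawlable
```

Because the question mentions several URL structures pointing at the same products, each path prefix (e.g. /all-products/) would need its own Disallow line.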
On-Page Optimization | bricerhodes
-
How Google differentiates web sites like directories?
Hi, I want to ask how Google differentiates websites like directories or company-listing websites. How does it understand that it is normal for a directory site to have many links? Are there any guides or links about what to do and what to avoid, and how to do SEO for a directory website?
On-Page Optimization | vladokan
-
Complex navigation structure leaving me puzzled with Meta keywords! Would love some help...
Hi there So I have a main navigation. It includes 5 categories. Each category contains 4-6 subcategories. Within these subcategories, there are 6-10 sub-subcategories. It's a rather complex navigation, but it allows the end user to land exactly where they want without much mooching. Now my issue is the use of keywords. Should I be feeding the keywords used in the main category through to the subcategory and the sub-subcategory, as they are all linked, or should I use unique keywords for each subcategory and sub-subcategory? I have added an image of the nav layout so you can see how it works. I hope that makes sense? Would love some help!
On-Page Optimization | onlineforequine