Website architecture - levels vs filters and authority loss - Enterprise SEO
-
Hi Everyone,
I am participating in the development of a marketplace website where the main traffic channel will be SEO. We have run into the directories (levels) vs. filters question.
1. Does everyone still agree that if we have too many levels, authority is lost as you go down through them? And does everyone agree that there should be a maximum of 3 levels, never 4?
Example 1
www.domain.com/level1/level2/level3
vs
www.domain.com/level1
In theory, the content at "level3" will have lower page authority (PA) than the content at "level1".
2. Does everyone agree that for enterprise SEO (huge marketplace websites) filters are a better idea than levels?
Example 2
www.domain.com/level1/level2/level3
vs
www.domain.com/filter-option1
In theory, the content at "level3" will have lower page authority than the content at "filter-option1".
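To make what I mean by "levels" concrete, here is a minimal Python sketch (a hypothetical helper, not our production code) that counts folder depth as it appears in the URL path:

```python
from urllib.parse import urlparse

def folder_depth(url: str) -> int:
    """Count path segments: /level1/level2/level3 -> 3."""
    path = urlparse(url).path
    return len([segment for segment in path.split("/") if segment])

for url in [
    "https://www.domain.com/level1/level2/level3",
    "https://www.domain.com/level1",
    "https://www.domain.com/filter-option1",
]:
    print(folder_depth(url), url)  # depths: 3, 1, 1
```

So the filter-style URL sits at the same depth as "level1", which is the whole point of my second question.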
Thanks so much in advance
-
Hi Ryan,
Thank you for your advice. The marketplace is going to be a big site, so I think we will go with both: filters (mostly for product segmentation) and folders for blog pages, contact pages, and so forth (roughly the scheme sketched below). Thanks again.
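To be concrete, something like this — all paths and slugs are invented for illustration, not our real URLs:

```python
# Hypothetical hybrid URL scheme: flat, filter-style URLs for product
# listings; conventional folders for editorial and support sections.
FOLDER_SECTIONS = {
    "blog":    "/blog/{slug}",   # e.g. /blog/choosing-a-widget
    "contact": "/contact/",
    "faq":     "/faq/{slug}",
}

def product_listing_url(*filter_slugs: str) -> str:
    """Flatten the active filters into a single root-level slug."""
    return "/" + "-".join(filter_slugs)

print(product_listing_url("widgets", "blue", "under-50"))
# -> /widgets-blue-under-50 (one level deep, however many filters apply)
```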
-
Hi Andy and Everett,
Thanks for getting back to me. Your "follow the winners" advice is spot on, I think, and from what I can see the "winners" are using filters more than folders for products. The site I am working on is going to be a big marketplace (more than 450,000 products), so I will mostly go with the filters approach. We will be adding region as a filter. I will use folders for blogs, contact pages, and so forth.
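Since region is going in as a facet, one thing I am planning early is which filter combinations deserve their own indexable URL. A rough sketch of the canonical policy I have in mind (the facet names and the whitelist are just my working assumptions):

```python
# Sketch: emit rel=canonical on filtered listing pages so that
# near-duplicate filter combinations consolidate onto one indexable URL.
# The whitelist of "indexable" facets is a hypothetical policy.
INDEXABLE_FACETS = {"category", "region"}  # facets worth a landing page

def canonical_url(base: str, active_facets: dict) -> str:
    """Keep whitelisted facets in the canonical; drop sort, price, etc."""
    kept = {k: v for k, v in active_facets.items() if k in INDEXABLE_FACETS}
    slug = "-".join(value for _, value in sorted(kept.items()))
    return f"{base}/{slug}" if slug else base

facets = {"category": "widgets", "region": "europe", "sort": "price-asc"}
print(f'<link rel="canonical" href="{canonical_url("https://www.domain.com", facets)}">')
# -> <link rel="canonical" href="https://www.domain.com/widgets-europe">
```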
Thanks again! The site should be live in about a year or so.
-
Folders tend to be useful for large structural sections of a site, especially for addressing issues that arise within a given area (e.g. separate folders for the blog, forum, FAQ, etc.), since each section may run different software and need its own analysis in Analytics.
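As a quick illustration of that analysis point: once sections live in their own folders, segmenting traffic by section becomes trivial (the URLs below are made up):

```python
from collections import Counter
from urllib.parse import urlparse

def top_folder(url: str) -> str:
    """Map a URL to its structural section, e.g. /blog/post-1 -> 'blog'."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    return segments[0] if segments else "(root)"

pageviews = [
    "https://www.domain.com/blog/choosing-widgets",
    "https://www.domain.com/faq/shipping",
    "https://www.domain.com/widgets-blue-under-50",
]
print(Counter(top_folder(u) for u in pageviews))
# Counter({'blog': 1, 'faq': 1, 'widgets-blue-under-50': 1})
```

Note how the folder sections group cleanly, while each flat filter URL becomes its own bucket — one trade-off of the filter approach.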
Once those sections are identified, using filters within them is fine, and it is often the default naming convention of a given software solution anyway. Ultimately, SEO is one consideration in URL structure, but not the only one you'll have to weigh. As Andy says above, it depends largely on the needs of the site. Plus, page-level engagement and sharing will outweigh any URL-structure benefits. Cheers!
-
Hi Carla,
As a website gets larger, it makes sense to have a tiered structure rather than place everything one 'click' from the root. Yes, PA will diminish as you get deeper into a site, but a carefully planned internal linking strategy will counteract that, as will building links directly to internal pages.
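To illustrate why internal linking offsets depth, here is a toy PageRank run (the four-page graph and the 0.85 damping factor are invented purely for demonstration); adding a single homepage link to the deepest page raises its score:

```python
# Toy PageRank: linking to a deep page from a strong page lifts its score.
def pagerank(links: dict, iters: int = 50, d: float = 0.85) -> dict:
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / len(pages) for p in pages}
        for src, outs in links.items():
            for dst in outs:
                new[dst] += d * rank[src] / len(outs)
        rank = new
    return rank

deep = "/level1/level2/level3"
site = {
    "/": ["/level1"],
    "/level1": ["/level1/level2"],
    "/level1/level2": [deep],
    deep: ["/"],
}
print(round(pagerank(site)[deep], 3))  # reachable only via the deep chain
site["/"] = ["/level1", deep]          # now also linked from the homepage
print(round(pagerank(site)[deep], 3))  # score rises
```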
As for filters vs. levels, it depends hugely on the site, niche, and users, and on what competitors are doing ('follow the winners', as it were), as well as what makes the most sense for the site structure. There is no definitive answer to this one.
As for a 4th level: if a common-sense structure calls for one, don't try to bypass it and ruin usability for the sake of one less click. I have worked with many sites that have more than 4 tiers and never encountered an issue.
I hope this helps!
-Andy