Single Folder vs Root
-
I'm working on a multi-state attorney website and I'm going back and forth on URLs. I thought I'd see what the community thinks.
lawsite.com/los-angeles/car-accident-lawyer vs. lawsite.com/los-angeles-car-accident-lawyer
I should note this site will have over a dozen city locations, with different practices.
-
My Friend,
I think that is fine. I would do that.
I wish you all the best in your project!
-
Don't overthink it, really. I'm working on the same structure right now with positive effects, i.e. /subject/another-subject/. It would be good to link to all the independent pages from /subject/ as well, including a dropdown menu on /subject/ listing all the /another-subject/ pages.
-
Agreed, thanks!
-
Thanks for the great reply. Yes, quite a few practice areas. So it sounds like I should go the city folder route.
Follow-up question: do you think I should do /westchester-attorney/slip-and-fall-accident-lawyer, or am I getting a little spammy?
-
I recommend Joseph's approach. It has many benefits: manageability, scalability, and SEO. You can address all the practice areas available in each specific location, as well as rank the firm more strongly in each location through keyword relevance.
-
Hello Friend,
Good question.
Are they only doing car accident cases? I assume that they are doing more.
Doing a folder for the city will allow you to create a hub city page that links out to the different practice pages for that city, and they should all link back to support the hub page. See how this firm did it:
https://mirmanlawyers.com/westchester/ (tier 2, pillar page, hub page)
https://mirmanlawyers.com/westchester/car-accident-lawyer/
https://mirmanlawyers.com/westchester/slip-and-fall-accident-lawyer/
If you only have one practice to focus on, I suggest you go for lawsite.com/los-angeles-car-accident-lawyer, but if you have many practices, I would go for lawsite.com/los-angeles/car-accident-lawyer and create a valuable sub-page for each practice in each location.
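To make that structure concrete, here's a minimal sketch in Python (the city and practice lists are hypothetical) of how the city hubs and their practice sub-pages map out:

    # A minimal sketch (hypothetical city and practice lists) of the
    # hub-and-spoke URL plan: one hub page per city folder, one sub-page
    # per practice under it, each linking back to its hub.
    CITIES = ["los-angeles", "westchester"]                # hypothetical
    PRACTICES = ["car-accident-lawyer",
                 "slip-and-fall-accident-lawyer"]          # hypothetical

    def build_url_map(domain):
        """Map each city hub URL to the practice sub-pages it links to."""
        url_map = {}
        for city in CITIES:
            hub = f"https://{domain}/{city}/"
            url_map[hub] = [f"{hub}{practice}/" for practice in PRACTICES]
        return url_map

    for hub, subpages in build_url_map("lawsite.com").items():
        print(hub)                  # e.g. https://lawsite.com/los-angeles/
        for url in subpages:
            print("  ->", url)      # e.g. .../los-angeles/car-accident-lawyer/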
I wish you the best of luck with your project!
-
Related Questions
-
Does a root domain get SEO power from its subdomains?
Hi there! I'd appreciate your help with the following case:

a) The current 10-year-old website (a community) on the root domain "example.com" (250,000 incoming quality backlinks) will move to the new subdomain "newsub.example.com", with 301 redirects to the new subdomain for all current subfolders.
b) A new website (a shop) will launch on the root domain "example.com".

Question: will the new website on "example.com" get SEO power from the old website on "newsub.example.com"? By SEO power I mean link juice/authority/trust/history/etc. from the 250,000 backlinks. What I'm trying to achieve: maintain the built-up SEO power for the root domain "example.com". Thanks for sharing your thoughts on this!

P.S. Plenty has been written about subdomains inheriting from their root domains (so please don't share input on the subdomain vs. subfolder debate), but I can't find satisfactory info about the other direction: root domains inheriting from their subdomains, e.g. whether wikia.com gets SEO power from its subdomains superman.wikia.com, starwars.wikia.com, etc.

Intermediate & Advanced SEO | ebebeb
-
Crawl and Indexation Error - Googlebot can't/doesn't access specific folders on microsites
Hi, this is my first time posting here. I am just looking for some feedback on an indexation issue we have with a client, and any thoughts on possible next steps or items I may have overlooked.

To give some background, our client operates a website for the core brand and also a number of microsites based on specific business units, so you have corewebsite.com along with bu1.corewebsite.com, bu2.corewebsite.com, and so on. The content structure isn't ideal, as each microsite follows a structure of bu1.corewebsite.com/bu1/home.aspx, bu2.corewebsite.com/bu2/home.aspx, and so on. In addition, each microsite has duplicate folders from the other microsites: bu1.corewebsite.com has the indexable folder bu1.corewebsite.com/bu1/home.aspx but also bu1.corewebsite.com/bu2/home.aspx, and likewise bu2.corewebsite.com has bu2.corewebsite.com/bu2/home.aspx but also bu2.corewebsite.com/bu1/home.aspx. There are 5 different business units, so this duplicate-content scenario applies across all microsites. The situation is being addressed in the medium-term development roadmap and will be rectified in the next iteration of the site, but that is still a ways out.

The issue: about 6 weeks ago we noticed a drop-off in search rankings for two of our microsites (bu1.corewebsite.com and bu2.corewebsite.com). Over a period of 2-3 weeks, pretty much all our terms dropped out of the rankings and search visibility dropped to essentially zero. I can see that pages from the websites are still indexed, but oddly it is the duplicate-content pages: bu1.corewebsite.com/bu3/home.aspx or bu1.corewebsite.com/bu4/home.aspx are still indexed, and similarly on the bu2.corewebsite microsite, bu2.corewebsite.com/bu3/home.aspx and bu4.corewebsite.com/bu3/home.aspx are indexed, but no pages from the BU1 or BU2 content directories seem to be indexed under their own microsites.

Logging into Webmaster Tools, I can see the error "Google couldn't crawl your site because we were unable to access your site's robots.txt file." This was a bit odd, as there was no robots.txt in the root directory, but I got some weird results when I checked the BU1/BU2 microsites in the technicalseo.com robots.txt tool. Also, because there is a redirect from bu1.corewebsite.com/ to bu1.corewebsite.com/bu4.aspx, I thought maybe there could be something there, so we removed the redirect and added a basic robots.txt to the root directory for both microsites. After this we saw a small pickup in site visibility (a few terms popped into our Moz campaign rankings but dropped out again pretty quickly), and the error message in GSC persisted.

Steps taken so far after that:
1. In Google Search Console, I confirmed there are no manual actions against the microsites.
2. Confirmed there are no instances of noindex on any of the pages for BU1/BU2.
3. A number of the main links from the root domain to the BU1/BU2 microsites have a rel="noopener noreferrer" attribute, but we looked into this and found it has no impact on indexation.
4. We saw some people had similar issues when using Cloudflare, but our client doesn't use that service.
5. Using a response/redirect header checker tool, we noticed a timeout when trying to mimic Googlebot accessing the site.
6. Following on from point 5, we got hold of a week of server logs from the client. I can see Googlebot successfully pinging the site and not getting 500 response codes from the server, but I couldn't see any instance of it trying to index microsite BU1/BU2 content.

So it seems to me that the issue could be something server-side, but I'm at a bit of a loss for next steps. Any advice at all is much appreciated!

Intermediate & Advanced SEO | ImpericMedia
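A minimal way to reproduce the check in step 5, fetching robots.txt while identifying as Googlebot, might look like this (Python with the requests library; the hostnames are the placeholders from the question):

    # A rough reproduction of the step-5 check: request robots.txt with a
    # Googlebot user agent and watch for timeouts or unexpected status codes.
    # Hostnames are the placeholders from the question; requires `requests`.
    import requests

    GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                    "+http://www.google.com/bot.html)")

    def check_robots(host):
        url = f"http://{host}/robots.txt"
        try:
            resp = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA},
                                timeout=10)
            print(host, resp.status_code, f"({len(resp.text)} bytes)")
        except requests.exceptions.Timeout:
            print(host, "timed out - consistent with the GSC crawl error")
        except requests.exceptions.RequestException as exc:
            print(host, "request failed:", exc)

    for microsite in ["bu1.corewebsite.com", "bu2.corewebsite.com"]:
        check_robots(microsite)

-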
Alternative HTML Structure for indexation of JavaScript Single Page Content
Hi there, we are currently setting up a pure-HTML version of our site amazine.com for bots, so the content as well as the navigation will be fully indexed by Google. We will show Google exactly the same content the user sees (except for the fancy JS effects): all bots get pure HTML, and real users see the JS-based version.

My first question is whether everyone agrees that this is the way to go, or if there are alternatives for getting the content indexed. Are there best practices? All JS-based websites must have this problem, so I am hoping someone can share their experience.

The second question regards the optimal number of content pieces ("Stories") displayed per page and the best method to paginate. Should we display e.g. 10 stories and use ?offset in the URL, or display 100 stories per page to Google and maybe use rel="next"/"prev" instead? Generally, I would really appreciate any pointers and experiences from you guys, as we haven't done this sort of thing before! Cheers, Frank

Intermediate & Advanced SEO | FranktheTank-47497
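For context, a minimal sketch of the serve-HTML-to-bots approach described above (Python/Flask and the user-agent list are illustrative assumptions, not necessarily amazine.com's actual stack):

    # A minimal sketch of the serve-HTML-to-bots idea: detect crawler user
    # agents and return prerendered HTML, while real users get the JS app.
    # Flask and the bot list are illustrative assumptions.
    from flask import Flask, request

    app = Flask(__name__)
    BOT_SIGNATURES = ("googlebot", "bingbot", "yandexbot", "duckduckbot")

    def is_bot(user_agent):
        ua = user_agent.lower()
        return any(sig in ua for sig in BOT_SIGNATURES)

    @app.route("/stories")
    def stories():
        if is_bot(request.headers.get("User-Agent", "")):
            # Crawlers get plain HTML with real, crawlable pagination links.
            return ('<h1>Stories</h1>'
                    '<a href="/stories?page=2">Next page</a>')
        # Everyone else gets the JavaScript single-page app shell.
        return '<div id="app"></div><script src="/app.js"></script>'

    if __name__ == "__main__":
        app.run()

-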
Subdomain blog vs. subfolder blog in 2013
So I've read the posts here: http://moz.com/community/q/subdomain-blog-vs-subfolder-blog-in-2013 and many others, the Matt Cutts video, etc. Does anyone have direct experience showing it's still best practice to use the subfolder? (Hopefully a Moz employee can chime in.) I have a client looking to use HubSpot, and they are citing the Matt Cutts video. I'm in charge of SEO/marketing and am at odds with them now. I'd like to present the client with more info than "in my experience, I've seen subdirectories work in the past." Any help? Articles? Etc.

Intermediate & Advanced SEO | no6thgear
-
Can Keyword-Stuffing on a Single Page Penalize My Entire Site?
Hi forum! I want to improve my internal linking by adding keyword-rich anchor text to my search results pages (my site has an internal search engine for products). For example, if I were a shoe store, my product search engine results are currently:

-Running
-Hiking
-Walking
-Track

and I want to make them actual keyword terms by changing them to:

-Running Shoes
-Hiking Shoes
-Walking Shoes
-Track Shoes

This creates a problem: the keyword "shoes" is stuffed on the page. I don't care how well these dynamic search results pages appear in search, only the actual product pages. Is it okay to keyword-stuff on these pages, or would it penalize my entire site?

Intermediate & Advanced SEO | Travis-W
-
Approximate linking root domains we need based on these metrics
Our top 4 competitors for a single term we're targeting have the following metrics:

1. PA 45, DA 89, 6 linking root domains to page, 40,000 linking root domains to domain
2. PA 53, DA 100, 3 linking root domains to page, 1.6 million to domain
3. PA 32, DA 37, 4 linking root domains to page, 200 to domain
4. PA 55, DA 66, 6 linking root domains to page, 3,300 to domain

All other optimization is about the same, except that (2) has half of the keyword phrase in the domain and the whole keyword phrase in the URL. Also, everybody else has the plural form in their title and meta description, and the singular is what I typed in. We have the whole keyword phrase in the domain. The above 4 results were internal pages; ours is a home-page rank.

Our metrics: PA 33, DA 22, 30 linking root domains to page, 43 linking root domains to site.

How tough will it be for us to compete? How many strong linking root domains will it take?

Intermediate & Advanced SEO | BobGW
-
Query / Discussion on Subdomain and Root domain passing authority etc
I've seen Rand's videos on subdomains and best practices at:
http://www.seomoz.org/blog/whiteboard-friday-the-microsite-mistake
http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites

I have a question/theory, though, related to an issue I am having. We have built our website, and now we are looking at adding third-party forums, blogs, etc. (all part of one CMS). The problem is that these need to be on a separate subdomain to work correctly (I won't go into the specific IT details, but this is what my IT gurus have advised). So I can have something like:
http://cms.mysite.com/forum/

Obviously, after reading Rand's posts and other material, this is far from ideal. However, I have another idea: run the CMS from the root and the main website from the www subdomain, e.g.:
www.mysite.com
mysite.com/blog

Now, my theory is that because so many websites (possibly the majority, especially smaller sites) don't use 301 redirects between root and www, search engines may make an exception in this case and treat both as the same domain, so this could possibly be a way of getting around the issue. This is just a theory of mine, based solely on the thought that there are so many websites out there that don't 301 root to www (or vice versa) that it could be in the search engines' self-interest to make an exception and count these as one domain, not two. What are your thoughts on this, and have any tests been done to see whether this is the case?

Thanks

Intermediate & Advanced SEO | James77
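For anyone wanting to test the premise, here is a small sketch that checks whether a site 301s between root and www (Python with the requests library; mysite.com is the placeholder from the question):

    # A small test of the premise: does a site 301 between root and www?
    # mysite.com is the placeholder from the question; requires `requests`.
    import requests

    def first_hop(url):
        # allow_redirects=False exposes the first response instead of
        # following the redirect chain.
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code in (301, 302, 307, 308):
            return f"{resp.status_code} -> {resp.headers.get('Location')}"
        return f"{resp.status_code} (no redirect)"

    for variant in ["http://mysite.com/", "http://www.mysite.com/"]:
        print(variant, first_hop(variant))

-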
Subdomains vs. Subfolders for unique categories & topics
Hello, we are in the process of redesigning and migrating 5 previously separate websites (all on different niche topics, including dining, entertainment, retail, and real estate) under one umbrella site for the property in which they exist. From the property homepage, you will now be able to access all of the individual category sites within. As each niche microsite will be focused on a different topic, I am wondering whether it is best for SEO to use subdomains such as category.mainsite.com or subfolders such as mainsite.com/category. I have seen it done both ways on large corporate sites (e.g., IKEA uses subdomains for different country sites, and Apple uses subfolders), so I am wondering what makes the most sense for this particular umbrella site. Any help is greatly appreciated. Thanks, Melissa

Intermediate & Advanced SEO | grapevinemktg