Subdomain question for law firm in Indiana, Michigan, and New Mexico.
-
Hi Gang,
Our law firm has offices in the states of Indiana, Michigan, and New Mexico. Each state is governed by unique laws, and each state has its own "flavor," etc.
We currently are set up with the main site as:
http://www.2keller.com (Indiana)
Subdomains as:
http://michigan.2keller.com (Michigan)
http://newmexico.2keller.com (New Mexico)
My client questions this strategy from time to time, and I want to see if anyone can offer some reassurance, or a perspective I haven't thought of.
Our reason for setting up the sites in this manner is to ensure that each site speaks to state-specific practice areas (for instance, New Mexico does nursing home abuse, whereas the other states don't, etc.) and state-specific ethics law (for instance, in some states you can advertise your dollar amount recoveries, and others you can't.) There are so many differences between each state that the content would seem to warrant it.
Local citations and listings are another reason these sites are set up in such a fashion. The firm is a member of several local state directories and memberships, and by having these links go directly to the subdomain they reference, I can see this being another advantage.
Also, inside each state there are separate pages set up for specific cities. We geo-target major cities in each state, and trying to do all of this under one domain for 3 different states would seemingly get very confusing, very quickly.
I had thought of setting up the various state pages through folders on the main domain, but again, there is too much state-specific info to make this seem like a logical approach. Granted, the linking and content creation would be easier for one site, but I don't think we can accomplish this in a clean way with the offices being in such different locales.
I guess I'm wondering if there are some things I'm overlooking here?
Thanks guys/gals!
-
Crazy, I have quite a bit of experience with this exact scenario: law firms using geo subdomains to target specific areas.
Here are my findings and suggestions, based on actual results and experience:
- SEO on domain.com benefits atlanta.domain.com. This is a fact. If Starbucks decided to create subdomains tomorrow for every location, those subdomains would benefit from its 91 DA. That's how FindLaw, lawyers.com, and all those guys get first-page placement with high DA and low PA.
- Digital Diameter is right, subdomains are more effective and directories are more efficient. UNLESS you have a really good multi-site CMS. Then you can be equally efficient and more effective.
I hope this answers your question, if you want some help or have any other questions, PM me.
-
Much appreciated... Can you see the reply above I sent to Mike and offer your thoughts?
-
Thanks, Mike. I agree with your reply, but I suppose my main concern is more about whether our site becomes too convoluted as we begin geo-targeting states and the major cities within them. It would seem to be an organizational nightmare, making sure that users are getting the experience they expect when visiting the site. Users in New Mexico don't care about Indiana law and copy, and vice versa. There are so many topics related to specific states, and so much content, that I worry about it becoming haphazard when restricted to one domain. Thoughts?
-
Subdomains (more effective):
In short the benefit is that Google will see each subdomain as a locally focused, independent site.
However, this is also the disadvantage of subdomains.
While they are more likely to be seen as locally focused, each subdomain will have to be managed and provided with its own unique content and links, so it can quickly become much more effort.
Folders (more efficient):
Folders offer much more synergy, as they are seen as a single site, but they are also seen as less locally and independently targeted than subdomains.
-
Randal,
I think in this instance, first and foremost, let's talk about URL structure. From an organic search perspective, structuring URLs this way (http://michigan.2keller.com) will hinder any positive SEO you do on your main URL. Google views your current structure as individual domains, so none of the SEO work done on 2keller.com will transfer to the other domains.
How the URL is structured should not have any effect on how you add the content. We deal with national clients with multiple locations all the time. How you want to structure this is http://www.2keller.com/michigan or http://www.2keller.com/newmexico. This would allow your team to do search marketing work only once and would add efficiencies to your workflow.
I know your main concern is the amount of state-specific content. You can still create the pages the exact same way as before from a content perspective. Just have a solid internal linking structure on 2keller.com guiding people to the relevant state pages, or use geo-targeting that recognizes the visitor's IP address and auto-directs them to the right area. Hope this helps. Let us know if you have any questions.
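The geo-routing idea above can be sketched as a simple lookup: the visitor's state would come from a geolocation service (assumed here, not specified in the thread), and anything unrecognized falls back to the main site:

```python
# Sketch of IP-based auto-routing under a folder structure. The state
# code is assumed to come from an upstream geolocation lookup; the
# section paths match the folder proposal in the answer above.
from typing import Optional

STATE_TO_SECTION = {
    "MI": "/michigan",
    "NM": "/newmexico",
    "IN": "/",  # Indiana content lives on the main site
}

def landing_path(visitor_state: Optional[str]) -> str:
    """Pick the section a visitor should land on, defaulting to the main site."""
    if visitor_state is None:
        return "/"
    return STATE_TO_SECTION.get(visitor_state.upper(), "/")
```

In practice, many sites surface this as a "Looking for our Michigan office?" banner rather than a hard redirect, since crawlers and out-of-state visitors arrive from IPs that don't match their intent.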