Domain vs Subdomain for Multi-Location Practice
-
I have a client with two locations (Orlando and Tampa) who would like to keep the current domain (DA 29) for both. We want to target additional cities within each service area. Each service area would target 2 cities on its main pages and 4-5 more cities with "SEO" landing pages that contain unique content specific to each city.
Would I be better off creating subdomains (orlando.domain.com and tampa.domain.com), creating subfolders (www.domain.com/orlando, etc.), or keeping the domain as is and creating city-specific SEO pages? We want to spread the domain authority to both locations.
-
Hi Jason,
As far as the bots are concerned, there is no difference between subdomains and subfolders in terms of SEO. I've always gone with subfolders myself, as I find them easier to manage than a bunch of separate subdomains, but you are free to use whichever format you are most comfortable with.
Just make sure you are creating really good, unique content for these city landing pages to avoid looking manipulative. Remember, true local rankings hinge on physical location - not service area - so these city landing pages will need to be thought of as more organic than local, and all traditional SEO tactics apply.
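If it helps to visualize the subfolder approach, here is a minimal sketch of how the structure might look (the city names below are hypothetical examples, not a recommendation for specific targets):

```text
domain.com/orlando/               main Orlando service-area page (targets 2 cities)
domain.com/orlando/winter-park/   city landing page with unique local content
domain.com/orlando/kissimmee/     city landing page with unique local content
domain.com/tampa/                 main Tampa service-area page (targets 2 cities)
domain.com/tampa/st-petersburg/   city landing page with unique local content
```

Keeping everything on one hostname means every city page both inherits from and feeds back into the same DA 29 domain, which is what you said you wanted.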
Related Questions
-
Does redirecting 100 domains to the main domain affect SEO?
Hi guys, an email software vendor managed by a different area of my company redirected 100 domains used for unsolicited email campaigns to my main domain. These domains are very likely to get blacklisted at some point. My SEO tool is now showing all of those domains as "linking" to my main site via follow links. The vendor states that this will not affect my main domain/website in any way. I'm highly concerned. I would appreciate your professional opinion about this. Thanks!
Intermediate & Advanced SEO | anagentile
-
How to Evaluate Original Domain Authority vs. Recent 'HTTPS' Duplicate for a Potential Domain Migration?
Hello everyone, our site has used 'http' for the domain since the start. Everything has been set up for this structure and Google is only indexing these pages. Just recently a second version was created on 'https'. We know having both up is the worst-case scenario, but now that both are up, is it worth just switching over, or would the original domain authority warrant keeping it on 'http' and redirecting the 'https' version - assuming speed and other elements wouldn't be an issue and it's done correctly? Our thought was that if we could do this quickly, it would be easier to just redirect the 'https' version, but we were not sure if the pros of 'https' would be worth the resources. Any help or insight would be appreciated. Please let us know if there are any further details we could provide that might help. Looking forward to hearing from all of you! Thank you in advance for the help. Best,
Intermediate & Advanced SEO | Ben-R
-
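For what it's worth, if the decision is to consolidate on 'https', the usual mechanism is a site-wide 301 redirect. A minimal sketch for Apache, assuming mod_rewrite is enabled - adapt to your own server setup:

```apache
# Send every http request to its https equivalent with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The same approach works in reverse (redirecting 'https' back to 'http' by matching `%{HTTPS} on`) if the original-authority argument wins out.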
Bad Domain Links - Penguin? - Moz vs. Search Console Stats?
I've been trying to figure out why my site www.stephita.com has lost its Google ranking over the past few years. I had originally thought it was due to the Panda updates, but now I'm concerned it might be because of the Penguin update. It's hard for me to pinpoint, as I haven't been actively looking at my traffic stats in recent years. So here's what I just noticed: in Google Search Console's "Links to Your Site" report, I discovered there are 301 linking domains, and over 75% of them seem spammy. I didn't actively create those links. I'm using the Moz Open Site Explorer tool to audit my site, and I noticed it reports a smaller set of linking domains, about 70 right now. Is there a reason why Moz wouldn't necessarily find all 300 domains? What's the best way to clean this up? I saw there's a disavow option in Google Search Console, but it states that it's not the preferred approach and that I should be contacting the webmasters of all the domains - which I assume makes it impossible to get a real person on the other end to remove these link references. Help! 🙂 What should I do?
Intermediate & Advanced SEO | TysonWong
-
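If disavowal does turn out to be the right call, the file Google expects is plain text with one entry per line; the domains below are placeholders, not real examples from any link profile:

```text
# Spammy linking domains exported from Search Console
domain:spam-example-one.com
domain:spam-example-two.net
# Individual URLs can also be listed instead of whole domains:
http://spam-example-three.org/directory/page.html
```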
Consolidating two separate domains and redirecting towards a new replatformed domain
A client has two different sites selling the same products with the same content. They would like to replatform onto Magento while redirecting those two sites to the new URL. The question is: besides monitoring the 301 redirects, is there anything else to take into consideration when consolidating two sites into one new site?
Intermediate & Advanced SEO | RocketWeb
-
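Beyond monitoring, the redirect itself is usually the easy part. A sketch of a catch-all for the two retired domains in nginx (all hostnames here are placeholders):

```nginx
# Catch-all for both retired domains; preserve the path so deep links map 1:1
server {
    listen 80;
    server_name oldshop-a.com www.oldshop-a.com oldshop-b.com www.oldshop-b.com;
    return 301 https://www.newshop.com$request_uri;
}
```

Where old URLs have no exact equivalent on the new Magento site, redirecting them to the closest category page rather than the homepage, and migrating canonical tags, XML sitemaps, and Search Console properties, are the other usual checklist items.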
Should I serve images from the same Top level domain as the current domain?
We run a multi-domain e-commerce website that targets each country respectively: .be -> Belgium, .co.uk -> United Kingdom, etc., and .com for all other countries. We also serve our product images from a media subdomain, e.g. "media.ourdomain.be/image.jpg". This means that all TLDs load their images from the .be media subdomain, which is actually seen as an outbound link. We are considering changing this setup so that images are served from the same domain as the current TLD, which would make more sense: .be will serve images from media.ourdomain.be, .co.uk from media.ourdomain.co.uk, etc. My question is: does Google Image Search take the extension of the TLD into consideration, so that, for example, German users would be more likely to see an image served from a .de domain?
Intermediate & Advanced SEO | jef2220
-
What are recommended best practices for hosting native advertising sponsored content on domains for large publishers?
On top of clear on-page sponsored-content labeling, would you add a meta robots noindex to native advertising pages? Google recently came out against this type of content in Google News - is there a similar directive for the main index?
Intermediate & Advanced SEO | hankazarian
-
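If the noindex route is chosen for sponsored posts, the page-level directive is a standard meta robots tag (sketch below; whether to apply it is the editorial question, not a technical one):

```html
<!-- In the <head> of each sponsored-content page -->
<meta name="robots" content="noindex">
```

For non-HTML assets, the equivalent is an `X-Robots-Tag: noindex` HTTP response header.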
If we add noindex to a subdomain, will the traffic to that subdomain still generate domain authority for the primary domain?
We are trying to decide whether a password-protected site, which we will noindex, should be set up as a subdomain or as its own domain. The determining factor is whether that noindexed subdomain will still contribute domain authority to the primary domain, given that it's noindexed. Any ideas?
Intermediate & Advanced SEO | grayloon
-
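As a side note, if the subdomain route is chosen, one low-maintenance way to noindex everything on it is a server-wide response header rather than per-page meta tags. A minimal Apache sketch, assuming mod_headers is available:

```apache
# In the subdomain's vhost or .htaccess: mark every response noindex
Header set X-Robots-Tag "noindex, nofollow"
```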
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About six months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention and resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us:
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice and b) by preventing Google from crawling the full depth of our search results (i.e. pages > 1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus
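For reference, parameter-style restrictions like those described usually look something like the sketch below (the paths and parameter names are placeholders; Google honors the `*` wildcard in robots.txt):

```text
User-agent: *
# Block sort-order and other parameter variants of result pages
Disallow: /*?sort=
Disallow: /*&sort=
# Block deep pagination of results
Disallow: /results/*?page=
```

Note the trade-off the linked posts raise: URLs blocked in robots.txt cannot pass link juice, whereas a meta robots "noindex, follow" on those pages keeps them out of the index while still letting crawlers follow the links on them.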