Subdirectory vs. Subdomain
-
I work for a large franchise organization that is weighing the pros and cons of using subdomains versus subdirectories for our franchisee locations. What are the pros and cons of each approach?
-
As far as the post-panda world goes, I found this post to be helpful for both beginner and advanced SEOs alike: http://www.whitefireseo.com/site-architecture/subdomain-or-subfolder-post-panda/360/
-
Sure - I don't like giving out my own clients here, but a quick search for, say, Seattle Hotels, a very competitive niche, shows the following:
#3 just under tripadvisor and expedia: http://www.starwoodhotels.com/whotels/property/overview/index.html?propertyID=1154
#4 (and #1 in map pack): http://www.fairmont.com/seattle/
These hotels may or may not be franchises, but the business model makes no difference to Google. The important thing is that these two sites are both ranking in various cities for "<city> hotels". Each uses a slightly different technique: Starwood uses a subfolder with dynamic property-ID URLs, while Fairmont simply uses a city-name subfolder.
-
Jared,
Can you give me examples of franchises you know that rank well using this method?
thanks!
LP
-
Nakul,
Thank you for your reply!
Answers to your questions:
-
Franchisees could request changes through corporate, which would have to be approved and implemented by the corporate website manager.
-
It would be difficult to create and manage content for 250+ locations without having duplicate content. I believe we will have a significant amount of duplicate content, with only the city name differing.
-
We would like the main site to rank well for the non-geo-specific search "product" and the franchise content to rank well for geo-specific searches like "product dallas".
I would appreciate your response based on these clarifications!
thanks!
-
-
Great points here by both Matt and Nakul. Of particular importance are who will have access to edit and how you'll handle duplicate content. If you want each of your franchise owners to have editing capabilities for their own store, then it may be easier to use subdomains from a permissions perspective.
As for duplicate content, you'll need to worry about that regardless of whether you use a sub or a folder.
As a case study, I've worked on two very large projects like this, and in both cases I used subfolders for the locations. The folders were the city name in both cases. Each location ranks in position 1 or 2 across the board, and each location shows well in the map packs.
-
For a franchise scenario, it would be best to use a sub-domain; however, it does depend on other things as well.
1. Would franchises be able to control/add/edit any content on these sites?
2. Are there any duplicate content issues between multiple sites?
3. Do all of them need to be indexed in the search engines and targeting search traffic?
As a general rule, for SEO, sub-folders are much better, but in your scenario it's a different case.
You might also want to refer to other similar questions here on SEOmoz.
- http://www.seomoz.org/q/blogs-are-best-when-hosted-on-domain-subdomain-or
- http://www.seomoz.org/q/setting-up-a-company-blog-subdomain-or-new-url
- http://www.seomoz.org/q/blog-vs-blog
and the post from Matt Cutts
http://www.mattcutts.com/blog/subdomains-and-subdirectories/
-
As far as I understand, subdomains are good when you want each subdomain judged on its own merit. For instance, WordPress uses me.wordpress.com and you.wordpress.com. That means my site won't bring everyone else down if I get penalized. However, if you had wordpress.com/me and wordpress.com/you, we'd be on the same domain, so you'd find it harder to improve your own SEO while I am dragging you down.
So if you're trying to separate content, use subdomains. If it's all one company/organization and you are confident all franchisees will do the right thing, use subfolders.
If you went the subdomain route, the SEO work would be 2x (or 50x if you had 50 franchises), but each site would stand alone. That means if the Boston franchise did a lot of work, they'd be much better off than the Atlanta one that did less. If the main franchise.com is doing all the SEO for everyone, you'll want to use subfolders.
At least that's my understanding of the difference!
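One concrete way to see the structural difference described above: a subdomain is a different hostname at the DNS level, while a subfolder is just a path on the same host. A minimal sketch using Python's standard library (the franchise URLs here are hypothetical, purely for illustration):

```python
from urllib.parse import urlparse

# Hypothetical franchise URLs: a subdomain changes the hostname itself,
# while a subfolder leaves the hostname alone and only changes the path.
sub = urlparse("https://boston.franchise-example.com/menu")
folder = urlparse("https://www.franchise-example.com/boston/menu")

print(sub.netloc)     # boston.franchise-example.com — a separate host
print(folder.netloc)  # www.franchise-example.com — same host
print(folder.path)    # /boston/menu — the location lives in the path
```

This is why permissions and penalties tend to split along the same line: hosts can be delegated and judged separately, while paths share one host.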
Related Questions
-
Hundreds of Subdomains under a powerful domain
Hello, I own a domain FeedsPortal - com As you can see from the link profile, it has some fantastic referring domains and links. Because of this, it has a DA of 86, with a good CF/TF. The problem is that nearly all of these powerful links are for the subdomains under the main domain. For example, it has a link on this page on MSN... https://www.msn.com/en-us/news/offbeat/finland-home-of-the-dollar103000-speeding-ticket/ar-AA9GA9i?ocid=ansAtlantic11 On this MSN article, it has a link to http://feedproxy.google.com/~r/TheAtlantic/~3/PK4bw0Tkkps/story01.htm (a link under Google.com), which then forwards to my domain under a subdomain http://theatlantic.feedsportal.com/c/34375/f/..... I have many hundreds of subdomains like this. I have a feeling redirecting all non-existent subdomains to the homepage would be a bad idea for SEO. Does anyone see a way to do this without harming my SEO? I suppose the only way to do it properly would be to write articles about each subdomain. For example, for http://theatlantic.feedsportal.com, write an article about The Atlantic, then forward all traffic meant for theatlantic.feedsportal.com to feedsportal.com/10-reasons-why-the-atlantic-is-great/ Does anyone have an idea of how to at least get a list of the non-existent subdomains that have links so I can maybe create articles for each subdomain? Or is there a simpler way to do this? Thanks!
Intermediate & Advanced SEO | thinkingdif
URL Too Long vs. 301 Redirect
We have a small number of content pages where the URL paths were set up before we started looking really hard at SEO. The paths are longer than recommended (but not super crazy, IMHO) and some of the pages get a decent amount of traffic. Moz suggests updating the URLs to make them shorter, but I wonder if anyone has experience with the tradeoffs here. Is it better to mark those issues to be ignored and just use good URLs going forward, or would you suggest updating the URLs to something shorter and implementing a 301 redirect?
Intermediate & Advanced SEO | russell_ms
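If the decision is to shorten the URLs, the 301 route mentioned in the question could be sketched in Apache config roughly like this (the paths are hypothetical placeholders, not the poster's actual URLs):

```apache
# Hypothetical example: permanently redirect an old, overly long path
# to its shortened replacement so existing traffic and links carry over
Redirect 301 /products/widgets/blue/small/overview.html /small-blue-widgets
```

The 301 preserves most of the old URL's equity, which is why it's usually preferred over simply launching the short URL and leaving the long one to 404.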
Subdomain Place Holder
So long story short - we are rolling out a new website earlier than expected. Unfortunately, we are being rushed, and in order to make the deadline we have decided to create a www2 subdomain and release the HTML-only version of the site for the next 2 weeks. During that time, the HTML site will be ported over to a Drupal 8 instance and will resume its www domain. My question is - will a temporary (302) from www to www2 and then back to www screw the proverbial pooch? Is there a better way to implement a temporary site? Feel free to probe with some questions - I know I could be clearer here 😉 Thanks community!
Intermediate & Advanced SEO | BDS2016
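For what it's worth, the temporary hop described above is usually implemented as a host-level 302. A rough .htaccess sketch, assuming Apache and the placeholder domain example.com:

```apache
# Temporary (302) hop from www to www2 during the two-week window.
# A 302 signals the move is temporary, so the www URLs should stay indexed;
# using a 301 here would risk engines adopting www2 as the canonical host.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www2.example.com/$1 [R=302,L]
```

Reversing it later is just the mirror image: remove this rule and 301 www2 back to www once Drupal is live.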
Lower quality new domain link vs higher quality repeat domain link
First-time poster here with a dilemma that head-scratching and spreadsheets can't solve! I'm trying to work out whether to focus on getting links from new domains or to nurture relationships with the bigger sites in our business and get more links from them. Of the two links below, which does the community here think would send the more valuable signal to Google? Both would be links from within relevant text/post copy. Link 1: site DA 30; no links currently from this domain. Link 2: site DA 60; many links over the last 12 months already from this domain. I suspect link 1, but given the enormous disparity in ranking power, am I correct?! Thanks for any considered opinions out there! Matthew
Intermediate & Advanced SEO | mat2015
What has a better chance of ranking alongside my main site for my company name, a subdomain or new domain?
Hi Moz, Do search engines really treat subdomains as separate domains in this regard? Or are we more likely to get more real estate on the first page with a new domain? Our goal is to have our main site and this new subdomain or domain ranking in positions 1 and 2 for our company name. This is going to be a careers site/portal. Thanks for reading!
Intermediate & Advanced SEO | DA2013
Does having a file type on the end of a url affect rankings (example www.fourcolormagnets.com/business-cards.php VS www.fourcolormagnets.com/business-cards)????
Intermediate & Advanced SEO | JHSpecialty
Which link url placement to buy - High PR vs. High PA?
I'm about to buy one directory link (just the one!) but can't decide which URL to place my link on in that directory because of the varying metrics. Which of the below is better (bearing in mind my own site is still a PR0 sitewide)?
www.exampledirectory.com/categoryA/subcategory1/ (Metrics: 21 linking domains, PA 44, DA 59, PR0)
www.exampledirectory.com/categoryA/ (Metrics: 1 linking domain, PA 35, DA 59, PR5)
I know PR is no longer relevant and usually ignore this metric (except for possible penalties) and just focus on SEOmoz toolbar metrics, but as my own site is PA 37 and DA 28 on the homepage yet PR0 completely sitewide (over 6 months old but relatively new), I thought this might help to balance things. Thanks for your advice.
Intermediate & Advanced SEO | emerald
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us:
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice, and b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus
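For reference, restrictions of the kind Kurus describes typically look something like this in robots.txt (the parameter names here are hypothetical examples, not his actual rules; `*` wildcards are supported by Google's crawler):

```text
# Block parameter-based variants of search results (sort order, pagination)
User-agent: *
Disallow: /*?sort=
Disallow: /*&sort=
Disallow: /*?page=
Disallow: /*&page=
```

One caveat worth noting, in line with the concern raised above: pages blocked this way can't pass the link equity of anything they link to, which is why rel="canonical" or parameter handling in Webmaster Tools is often suggested as a gentler alternative.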