Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Microsite on subdomain vs. subdirectory
-
Based on this post from 2009, it's recommended in most situations to set up a microsite as a subdirectory rather than a subdomain: http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites. The primary argument seems to be that the search engines view the subdomain as a separate entity from the domain, and therefore the subdomain doesn't benefit from any of the trust rank, quality scores, etc. Rand made a comment that seemed to suggest the subdomain could SOMETIMES inherit some of these factors, but he didn't expand on those instances.
What determines whether the search engine will view your subdomain-hosted microsite as part of the main domain vs. a completely separate site? I've read that it has to do with the interlinking between the two.
-
I think the footer is the best way to interlink the websites in a way that's unobtrusive for users. This should make your main corporate site the top linking site to each subdomain, which is something you should be able to verify in a tool like Google Webmaster Tools. I don't have any specific examples to support this, but it is a common web practice.
This is not 100% related, but Google recently suggested using footer links as one way to associate your web content with your Google profile account:
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=1408986
So you can figure that if Google looks to footer links to associate authorship, it would likely do the same to relate sites to one another.
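As a rough illustration of the kind of check a crawler or audit tool could run — using hypothetical company.com hostnames — this stdlib-only Python sketch pulls the links out of a footer and keeps only the ones pointing back at the main domain:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag fed to the parser."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def links_to_domain(page_html, target_host):
    """Return the links in page_html whose host matches target_host."""
    collector = LinkCollector()
    collector.feed(page_html)
    return [h for h in collector.hrefs
            if urlparse(h).hostname == target_host]

# Hypothetical footer markup on a subdomain-hosted microsite.
footer = '''
<footer>
  <a href="http://micro.company.com/about">About this site</a>
  <a href="http://www.company.com/">Company home</a>
  <a href="http://www.company.com/contact">Contact</a>
</footer>
'''

print(links_to_domain(footer, "www.company.com"))
# -> ['http://www.company.com/', 'http://www.company.com/contact']
```

Run against every page of the microsite, a check like this would confirm that the footer consistently links back to the corporate domain.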
-
Hi Ryan,
Your question is quite interesting. I went through the article one more time myself. I have no facts to back up the following, but I hope it will contribute. First, I would validate the subdomains in Webmaster Tools. If they are intended to target a certain market, I would select that geographic location. Also, I think you have little to worry about. I imagine Google won't pass a certain amount of trust to subdomains, depending on the site. If the number of subdomains is considerable, I would say they have pretty slim chances of getting a push from the main site. Take free web hosting services, for example: a particular subdomain could rank and earn decent PageRank if people show interest in it, but that is highly unlikely to be caused by the authority of the main site.
I haven't seen a free-hosting subdomain rank well in a long time. On the other hand, you have student and academic accounts on university sites; those all use subfolders and rank pretty well for highly specific topics. If I have to give a short answer, I would say it is the type of site that makes the difference for Google. If your site is considered a normal business website and you are developing a new market, you might not have a problem. If you use subdomains to separate products, you might be OK as well.
Google uses subdomains for all of its major products. For Google Pages they used a separate domain, which now redirects to a subdomain, sites.google.com. However, they never hand out subdomains for personal use — there might be something to that. They do a 301 redirect from a subdomain on googlepages.com to sites.google.com/site/. So what they offer is a 301 redirect to a sub-subfolder located on a subdomain of Google.
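The redirect pattern described above can be sketched as a simple URL rewrite. This is only an illustration of the mapping (the username `alice` and the exact target layout are assumptions), not Google's actual implementation:

```python
from urllib.parse import urlparse

def resolve_legacy_url(url):
    """Map an old username.googlepages.com URL onto the
    sites.google.com/site/username/ layout, mimicking the 301
    rewrite described above. Illustration only."""
    parsed = urlparse(url)
    host = parsed.hostname or ""
    if host.endswith(".googlepages.com"):
        username = host[: -len(".googlepages.com")]
        # 301 target: a sub-subfolder on a subdomain of google.com
        return "https://sites.google.com/site/%s%s" % (username, parsed.path)
    return url  # not a legacy address; no redirect

print(resolve_legacy_url("http://alice.googlepages.com/projects"))
# -> https://sites.google.com/site/alice/projects
```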
-
Ok, that makes sense. The way our company would use it is to have a microsite for specific, focused topics — ones large enough to warrant their own site. They are clearly part of our overall brand, unlike the Disney properties example. On each of these sites there will almost always be a link back to the main corporate website, usually in the footer.
Do you think having one or two links on every page pointing back to company.com would be sufficient to signal to search engines that the two are associated, and ultimately pass some search value from the main domain to the subdomain-hosted microsite?
Are there any studies or evidence supporting any of this?
-
Interlinking is definitely a factor - but content is what matters.
Take the Disney brands that live on Go.com: they all share the same domain, but Google surely knows they are really separate sites covering different topics. The same goes for any blog hosted on blogspot.com, typepad.com, etc. The millions of blogs there cover a wide range of topics, and search engines understand that they are not related just because they share a host domain.
On the other end of the spectrum: if your site has just two subdomains — say, www.website.com and blog.website.com — which cover the same topics and link to one another, search engines would be more likely to associate the two addresses.
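One way to see why a shared host alone can't be the signal: a naive "same root domain" check, sketched below in Python, groups the Go.com brands together just as readily as the www/blog subdomains of a single business. The hostnames are examples only, and real tools would consult the Public Suffix List rather than just take the last two labels:

```python
def root_domain(host):
    """Naive registrable-domain guess: the last two labels of the host.
    Real implementations use the Public Suffix List; this is a sketch."""
    labels = host.lower().rstrip(".").split(".")
    return ".".join(labels[-2:]) if len(labels) >= 2 else host

def same_site(host_a, host_b):
    """True when two hosts share the same (naively computed) root domain."""
    return root_domain(host_a) == root_domain(host_b)

print(same_site("www.website.com", "blog.website.com"))  # True: one business
print(same_site("espn.go.com", "abcnews.go.com"))        # also True, yet these
                                                         # are separate sites
```

The second check returning True is exactly the point above: the domain relationship looks identical to an algorithm, so the engines must rely on content and interlinking to tell the cases apart.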
-
I don't have an answer to your question, but if you're looking for some more reading about subdomains vs. TLDs, here is a presentation given at MozCon: http://www.distilled.net/blog/seo/mozcon-international-seo/. The slideshow has some info about it, and a bunch of other good stuff.