Microsite on subdomain vs. subdirectory
-
Based on this post from 2009 (http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites), it's recommended in most situations to set up a microsite as a subdirectory rather than a subdomain. The primary argument is that search engines view the subdomain as a separate entity from the root domain, so the subdomain doesn't benefit from any of the domain's trust, quality scores, etc. Rand made a comment suggesting the subdomain could SOMETIMES inherit some of these factors, but he didn't expand on those instances.
What determines whether search engines will view your subdomain-hosted microsite as part of the main domain or as a completely separate site? I've read it has to do with the interlinking between the two.
-
I think the footer is the best way to interlink the websites in a way that is unobtrusive for users. This should make your main corporate site the top linking site to each subdomain, which is something you should be able to verify in a tool like Google Webmaster Tools. I don't have any specific examples to support this, but it is a common web practice.
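As a rough sketch (hypothetical domains, not a real site), the footer cross-link could be as simple as:

    <!-- Footer on the microsite at microsite.example.com -->
    <footer>
      <p>
        An <a href="http://www.example.com/">Example Corp</a> website.
        Visit our <a href="http://www.example.com/about/">corporate site</a>.
      </p>
    </footer>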
This is not 100% related, but Google recently suggested using footer links as one way to associate your web content with your Google Profile account:
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=1408986
So you can figure that if Google looks at footer links to associate authorship, it would likely do the same to relate sites to one another.
-
Hi Ryan,
Your question is quite interesting. I went through the article one more time myself. I have no facts to back up the following, but I hope it will contribute. First, I would validate the subdomains in Webmaster Tools. If they are intended to hit a certain market, I would select that geographic location. Also, I think you have little to worry about. I imagine that Google won't pass full trust to subdomains; it depends on the site. If the number of subdomains is considerable, I would say they have pretty slim chances of getting a push from the main site. Take free web hosting services, for example. A subdomain there could rank and have decent PageRank if people show interest in that particular subdomain, but that is highly unlikely to be caused by the authority of the main site.
I haven't seen a free hosting subdomain rank well for a long time now. On the other hand, you have student and academic accounts on university sites. They all go with subfolders and rank pretty well for highly specific topics. If I had to give a short answer, I would say it's the type of site that makes the difference for Google. If your site is considered a casual business website and you are developing a new market, then you might not have a problem. If you use subdomains to specify products, then you might be OK again.
Google uses subdomains for all their major products. For Google Pages they used a separate domain, which now redirects to a subdomain, sites.google.com. However, they will never give subdomains for personal use; there might be something to that. They do a 301 redirect from a subdomain on googlepages.com to sites.google.com/site/. So what they offer is a 301 redirect to a sub-subfolder, located on a subdomain of Google.
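As an aside, here is a minimal sketch of how that kind of subdomain-to-subfolder 301 redirect is commonly configured with Apache mod_rewrite (hypothetical hostnames; an illustration, not Google's actual setup):

    # .htaccess served for the old subdomain
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^user\.oldpages\.example$ [NC]
    # Send every path to the same path under the new sub-subfolder
    RewriteRule ^(.*)$ http://sites.newhost.example/site/user/$1 [R=301,L]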
-
OK, that makes sense. The way our company would use it is to have a microsite for each specific, focused topic that is large enough to warrant its own site. These microsites are clearly part of our overall brand, unlike the Disney properties example. On each of these sites, there will almost always be a link back to the main corporate website, usually in the footer.
Do you think having one or two links on every page pointing back to company.com would be sufficient to signal to search engines that the two are associated, and ultimately pass some search value from the main domain to the subdomain-hosted microsite?
Are there any studies or evidence supporting any of this?
-
Interlinking is definitely a factor - but content is what matters.
Take the Disney brands that live on Go.com:
They all live on Go.com, but Google surely knows they are really separate sites that cover different topics. The same goes for any blogspot.com, typepad.com, etc. hosted blog: the millions of blogs there cover a wide range of topics, and search engines understand that they are not related just because they share the same host domain.
On the other end of the spectrum, if your site has just two subdomains - say, www.website.com and blog.website.com - which cover the same topics and link to one another, search engines would be more likely to associate those two addresses.
-
I don't have an answer to your question, but if you're looking for some more reading about subdomains vs. TLDs, here is a presentation given at MozCon: http://www.distilled.net/blog/seo/mozcon-international-seo/. The slideshow has some info about it, and a bunch of other good stuff.
Related Questions
-
Robots.txt: How to block a specific file type in several subdirectories?
Hello everyone! I need help setting up a robots.txt file. I'm trying to block all PDF files in particular directories. The example line below blocks all .gif files across the entire site:

Disallow: /*.gif$

Two questions: can I use this command to target one particular directory in which I want to block PDF files, and will the following line be recognized by Googlebot?

Disallow: /fileadmin/xxxxxxx/xxx/xxxxxxx/*.pdf$

Then I realized I would have to write as many lines as there are directories in which I want to block PDF files. Let's say I want to block PDF files in all three of these directories:

/fileadmin/directory1
/fileadmin/directory1/sub1
/fileadmin/directory1/sub1/pdf

Is there a pattern-matching rule I could use to block access to PDF files in all subdirectories, instead of writing the line above once per subdirectory? For example:

Disallow: /fileadmin/directory1*/

Many thanks in advance for any insight you may have.
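For what it's worth, a minimal sketch of the wildcard approach, based on Google's documented robots.txt pattern matching (the directory name is the hypothetical one from the question):

    User-agent: *
    # * matches any sequence of characters, including further slashes, so this
    # single rule covers /fileadmin/directory1/ and all of its subdirectories.
    # $ anchors the match to the end of the URL.
    Disallow: /fileadmin/directory1/*.pdf$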
Technical SEO | LabeliumUSA
-
Google Indexed a version of my site w/ MX record subdomain
We're doing a site audit and found "internal" links in Search Console that appear to come from a subdomain of our site based on our MX record (we use Google Mail internally). The links ultimately redirect to our correct preferred subdomain, "www", but I am concerned as to why this is happening and whether it can have any negative SEO implications. Example of one of the links: aspmx3.googlemail.com.sullivansolarpower.com/about/solar-power-blog/daniel-sullivan/renewable-energy-and-electric-cars-are-not-political-footballs. I did a site: operator search (site:aspmx3.googlemail.com.sullivansolarpower.com) on Google and it returned several results.
Technical SEO | SS.Digital
-
How much difference does .co.uk vs .com for SEO make?
My website has a .com domain. However, I have noticed that local businesses all have a .co.uk (UK business) TLD (check "plumbers southampton", for example). I have also noticed that when checking my SERP rankings, I'm on page 1 if searched on Google.com but page 2 if searched on google.co.uk. Being UK-based, I would assume most of my customers end up on google.co.uk, so I'm wondering how much of an impact this actually makes. Would it be worth purchasing the .co.uk domain and transferring my website to it? Or running both at the same time and setting up a 301 redirect from my .com to the .co.uk? Thanks
Technical SEO | Marvellous
-
"Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
Hey Moz, A new client has a site that uses: subdomains ("third-level" stuff like location.business.com) and "fourth-level" subdomains (location.parent.business.com). Are these fourth-level addresses at risk of being treated differently than the other subdomains? Screaming Frog, for example, doesn't return these fourth-level addresses when crawling business.com except in the External tab, but maybe I'm just configuring the crawls incorrectly. These addresses rank, but I'm worried that we're losing some link juice along the way. Any thoughts would be appreciated!
Technical SEO | jamesm5i
-
Noindex vs. page removal - Panda recovery
I'm wondering whether there is a consensus within the SEO community as to whether noindexing pages vs. actually removing pages is different from Google Panda's perspective. Does noindexing pages have less value when removing poor-quality content than physically removing them, i.e. either 301ing or 404ing the pages and removing the links to them from the site? I presume that removing pages has a positive impact on the amount of link juice that reaches some of the remaining pages deeper in the site, but I also presume this doesn't have any direct impact on the Panda algorithm? Thanks very much in advance for your thoughts, and corrections to my assumptions 🙂
Technical SEO | agencycentral
-
Internal search: rel=canonical vs. noindex vs. robots.txt
Hi everyone, I have a website with a lot of internal search results pages indexed. I'm not asking whether they should be indexed or not - I know they should not be, according to Google's guidelines, and they create a bunch of duplicate pages, so I want to solve this problem. The thing is, if I noindex them, the site is going to lose a non-negligible chunk of traffic: nearly 13% according to Google Analytics! I thought of blocking them in robots.txt. This solution would not keep them out of the index, but the pages appearing in Google SERPs would then look empty (no title, no description), so their CTR would plummet and I would lose a bit of traffic too. The last idea I had was to use a rel=canonical tag pointing to the original search page (which is empty, without results), but it would probably have the same effect as noindexing them, wouldn't it? (I've never tried it, so I'm not sure.) Of course I did some research on the subject, but each source recommended only one of the three methods! One even recommended noindex plus a robots.txt block, which is self-defeating: the robots.txt block stops Google from crawling the pages, so the noindex tag would never be seen. Is there somebody who can tell me which option is the best way to keep this traffic? Thanks a million
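For reference, a minimal sketch of the three mechanisms being compared (hypothetical URLs):

    Option 1 - meta robots noindex, placed in the <head> of each results page
    (keeps the pages out of the index while still letting crawlers follow links):
      <meta name="robots" content="noindex, follow">

    Option 2 - rel=canonical pointing at the base search page:
      <link rel="canonical" href="http://www.example.com/search">

    Option 3 - robots.txt (stops crawling, but blocked URLs can still appear
    in the index as the empty-looking listings described above):
      User-agent: *
      Disallow: /search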
Technical SEO | JohannCR
-
Exact match subdomains
Hi, I have seen significant SEO benefits from owning exact-match domains and was wondering whether exact-match subdomains hold the same (or some) of these benefits? e.g. halloweencostumes.co.uk vs. halloween [dot] costumes.co.uk. Many thanks.
Technical SEO | martyc
-
Ror.xml vs sitemap.xml
Hey Mozzers, I've been reading some things lately, and some say that the top search engines do not use the ror.xml sitemap and focus just on sitemap.xml. Is that true? Do you use ROR? If so, for what purpose - products, "special articles", other uses? Can sitemap.xml be sufficient for all of those? Thank you, Vadim
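For reference, a minimal sitemaps.org-style sitemap.xml (hypothetical URL) is just this; ROR was an RDF-based alternative format, and as the question notes, the common report is that the major engines work from the sitemaps.org protocol instead:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
      </url>
    </urlset>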
Technical SEO | vijayvasu