Subdomains vs. Subfolders for unique categories & topics
-
Hello,
We are in the process of redesigning and migrating 5 previously separate websites (all on different niche topics: dining, entertainment, retail, real estate, etc.) under one umbrella site for the property in which they exist. From the property homepage, visitors will now be able to access all of the individual category sites.
As each niche microsite will focus on a different topic, I am wondering whether it is better for SEO to use subdomains such as category.mainsite.com or subfolders such as mainsite.com/category.
I have seen it done both ways on large corporate sites (e.g., Ikea uses subdomains for different country sites, while Apple uses subfolders), so I am wondering what makes the most sense for this particular umbrella site.
Any help is greatly appreciated.
Thanks, Melissa
-
It's a no-questions-asked answer: SUBFOLDERS. By creating subdomains you're essentially saying each section is a separate entity, and by doing that you will not be sharing all the value from the main domain and all the sections combined, i.e. PageRank, pages indexed, links, etc.
-
Yeah, I'm a subfolders kind of guy too, but here's SEOmoz weighing in on the subject of microsites and addressing the same sort of question you're asking: http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites
-
Definitely use subfolders!
By using subfolders you pass all link juice and PageRank to the main domain, which is mainsite.com. If you have, say, 10 subdomains, you are splitting that authority/link juice/PageRank across them.
I know there are others who say Google passes juice between a domain and its subdomains, and it does pass some, but not all.
You cannot go wrong using subfolders. You build up one main domain, and it's clean, user-friendly, etc.
SUBFOLDERS!
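One practical note on the migration itself: when the five standalone sites move into subfolders, each old URL should 301 to its new location so the links they've earned keep passing value to the umbrella domain. Here's a minimal Apache sketch, assuming one of the old sites lives at dining-example.com (a hypothetical domain) and is moving to mainsite.com/dining:

    # .htaccess on the old standalone site (hypothetical domains)
    RewriteEngine On
    # Match the old host and forward every path to the new subfolder,
    # preserving the requested path so deep links still resolve
    RewriteCond %{HTTP_HOST} ^(www\.)?dining-example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.mainsite.com/dining/$1 [R=301,L]

Done per-URL like this, rather than redirecting everything to the homepage, the old sites' link value consolidates into the one main domain, which is the core argument for subfolders above.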
Related Questions
-
No Control Over Subdomains - What Will the Effect Be?
Hello all, I work for a university, and my small team is responsible for the digital marketing, website, etc. We recently had a big initiative on SEO and generating traffic to our website. The issue I am having is that my department only "owns" the www subdomain. There are lots of other subdomains out there. For example, a specific department can have its own subdomain at department.domain.com, and students can have their own webpages at students.domain.com, etc. I know about the possibility of domain cannibalization, but has anyone run into long-term problems with a similar situation, or had success in altering the views of a large organization? If I do get the opportunity to help some of these other subdomains, what is the best way to help our overall domain authority? Should the focus be on removing content that duplicates the www subdomain, or on cleaning up errors? Some of these subdomains have hundreds of 4XX errors.
Intermediate & Advanced SEO | Jeff_Bender
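On the hundreds of 4XX errors mentioned in the question above: before trying to change minds across a large organization, it can help to hand each subdomain owner a concrete list of broken URLs. A minimal Python sketch, assuming you already have a list of URLs from a crawl or sitemap export (the URLs below are hypothetical):

    import requests

    # Hypothetical URLs gathered from a crawl or sitemap export
    urls = [
        "https://department.domain.com/old-page",
        "https://students.domain.com/archive/2012",
    ]

    for url in urls:
        try:
            # HEAD keeps the check lightweight; some servers only answer GET
            resp = requests.head(url, allow_redirects=True, timeout=10)
            if 400 <= resp.status_code < 500:
                print(resp.status_code, url)
        except requests.RequestException as exc:
            print("ERROR", url, exc)

Either cleanup (duplicate content or crawl errors) helps the overall domain; a list like this just makes the error work easier to delegate.
-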
Site with both subfolders and subdomains
Hi everyone,
I'm working on a website that has a quite extensive subfolder structure for product and multilingual purposes.
domain.com/en
domain.com/it
domain.com/fr
domain.com/en/category
domain.com/it/category
domain.com/fr/category
domain.com/en/category/product
domain.com/it/category/product
domain.com/fr/category/product
domain.com/en/category/product/region
domain.com/it/category/product/region
domain.com/fr/category/product/region
and so on... We will soon be launching a completely different service, which would make the subfolder structure even more complex. Since John Mueller recently stated that subdomains and subfolders are treated the same by Google, I am now considering building that new service on subdomains, for product reasons and for the sake of clarity.
1. Would my subdomains inherit the authority of my main domain?
2. Do I have to keep the language folders within the subdomain structure?
e.g.:
new-service.domain.com/en
nouveau-service.domain.com/fr
nuovo-servizio.domain.com/it
OR
new-service.domain.com
nouveau-service.domain.com
nuovo-servizio.domain.com
Looking forward to reading your thoughts!
Intermediate & Advanced SEO | medi_
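On question 2 above: whichever structure is chosen, the language versions mainly need to reference each other consistently. A minimal hreflang sketch for the subdomain option, using the hypothetical service subdomains from the question:

    <!-- In the <head> of each language version (hypothetical URLs from the question) -->
    <link rel="alternate" hreflang="en" href="https://new-service.domain.com/" />
    <link rel="alternate" hreflang="fr" href="https://nouveau-service.domain.com/" />
    <link rel="alternate" hreflang="it" href="https://nuovo-servizio.domain.com/" />
    <link rel="alternate" hreflang="x-default" href="https://new-service.domain.com/" />

Since each subdomain in that option carries exactly one language, the /en, /fr, /it folders would be redundant there; the annotations above do the work the folders were doing.
-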
Sub-domain vs Root domain
I have recently taken over a website (website A) that has a domain authority of 33/100 and is linked to from 39 root domains. I have not yet selected any keywords to target so am currently unsure of ranking positions. However, website A is for a division of a company that has its own separate website (website B) which has a domain authority of 58/100 and over 1000 legitimate linking root domains. I have the option of moving website A to a sub-domain of website B. I also have the option of having website B provide a followed link to website A. So, my question is, for SEO purposes, is my website better off remaining on its own existing domain or is it likely to rank higher as a sub-domain of website B? I am sure there are pros and cons for both options but some opinions would be much appreciated.
Intermediate & Advanced SEO | BallyhooLtd
-
Heading Tags & Content Count
Hi everyone, I am looking into this page on our site: http://www.key.co.uk/en/key/sack-trucks
Just comparing it against competitors in SEMrush, the tool shows a word count of over 4,089 words for this page, compared with http://www.wickes.co.uk/Wickes-Green-General-Purpose-Sack-Truck-200kg/p/500302 which only has 2,658, even though that page has a lot more written content than ours. Where is this word count coming from? Also, looking at the same page on our site, WooRank suggests we have the phrase 'sack truck' in the H1 and title too many times; it's only there once, so is it flagged because it's an exact-match keyword? I'm just wondering if there is something wrong with the HTML or with how the page is being crawled.
Intermediate & Advanced SEO | BeckyKey
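One way to see where a tool's number comes from is to count the words a parser actually "sees", since navigation menus, template text, and hidden elements are often included. A rough Python sketch using requests and BeautifulSoup (counting rules vary by tool, so expect the numbers to differ):

    import requests
    from bs4 import BeautifulSoup

    url = "http://www.key.co.uk/en/key/sack-trucks"
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Drop elements that inflate naive word counts
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()

    words = soup.get_text(" ", strip=True).split()
    print("visible words:", len(words))

    # Check how often the phrase actually appears in the H1s and the title
    h1_text = " ".join(h1.get_text(" ", strip=True) for h1 in soup.find_all("h1"))
    title = soup.title.get_text(strip=True) if soup.title else ""
    print("h1 mentions:", h1_text.lower().count("sack truck"))
    print("title mentions:", title.lower().count("sack truck"))

A large gap between this count and a tool's count usually means the tool is treating mega-menus, footer links, or repeated product-card text as page copy.
-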
Subdomains vs. Subfolders vs. New Site
Hello geniuses!!! Here's my Friday puzzle: we have a plastic surgery client who already has a website that's performing fairly well and driving in leads. She is going to be offering a highly specialized skincare program for cancer patients, and wants a new logo, new website, and new promo materials, all for this new skincare program. So here's the thing: my gut reaction says NO NEW WEBSITE! NO SUBDOMAIN! because of everything I've read about moving things on and off subdomains, etc. (I just studied this: http://moz.com/blog/subdomains-vs-subfolders-rel-canonical-vs-301-how-to-structure-links-optimally-for-seo-whiteboard-friday). And why wouldn't we want to use the authority of her current site, right? While she doesn't necessarily have a high-authority domain (we're not talking WebMD here), she does have some authority that we've built over time. But because this is a pretty separate product from her general plastic surgery practice, what would you guys do? Since we'll be creating a logo and skincare "look and feel" for this product, and there will likely be a lot of information involved, I don't think we'll be able to just create one page. Is it smart to: a) build a separate site in a subfolder of her current site (plasticsurgerypractice.com/skincare), b) build a subdomain (skincare.plasticsurgerypractice.com), or c) build her a new site (plasticsurgeryskincare.com)?
Intermediate & Advanced SEO | RachelEm
-
Links to www vs non-www
I was having speed issues when I ran a test with Google's PageSpeed tool and, as a result, switched to using Google Page Speed Service. This meant I had to switch my site from the non-www to the www version. Since the switch my pages are loading faster, but my ranking has dropped. What I'm trying to find out is whether the drop is due to all of my previous links pointing to the non-www version, or whether the site is being treated as new and it's more of a temporary issue. If it is a link issue, I will contact everyone I can to see who will update the site address. Thanks everyone!
Intermediate & Advanced SEO | toddmatthewca
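For reference, the standard safety net after a host switch like this is a site-wide 301 from the non-www host to the www host, so links pointing at the old URLs keep resolving and passing value. A minimal Apache sketch (example.com is a placeholder for the real domain):

    # .htaccess at the site root (example.com is hypothetical)
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

With the redirect in place, updating the most important external links is still worthwhile, but the long tail of links will follow the 301.
-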
News sites & Duplicate content
Hi SEOmoz,
I would like to know, in your opinion and according to industry best practice, how you get around duplicate content on a news site if all news sites buy their "news" from a central place in the world. Let me give you some more insight into what I am talking about. My client has a website that focuses purely on news; local news in one of the African countries, to be specific. What we noticed over the past few months is that the site is not ranking to its full potential. We investigated: we checked our keyword research, our site structure, interlinking, site speed, code-to-HTML ratio; you name it, we checked it. What we did pick up when looking at duplicate content is that the site is flagged by Google as duplicated, BUT so are most news sites, because they all get their content from the same place. News gets sold by big companies in the US (no, I'm not from the US so can't say specifically where it's from), and they usually attach disclaimers to these content pieces saying you can't change the headline or story significantly. So we do have quite a few journalists who rewrite the news stories; they try to keep them as close to the original as possible, but they still change them to fit our targeted audience, which is where my second point comes in. Even though the content has been duplicated, our site is more relevant to what our users are searching for than the bigger news-related websites in the world, because we do hyper-local everything: news, jobs, property, etc. All we need to do is get past this duplicate content issue. In general we rewrite content completely to be unique if a site has duplication problems, but on a media site I'm a little bit lost, because I haven't dealt with something like this before. Would like to hear some thoughts on this. Thanks,
Chris Captivate
Intermediate & Advanced SEO | 360eight-SEO
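On the syndicated-copy problem above: the usual patterns are to point near-verbatim wire stories at the original, and to let the substantially rewritten, hyper-local versions stand on their own. A hedged HTML sketch (both URLs are hypothetical):

    <!-- Near-verbatim wire copy: hint that the original should rank instead -->
    <link rel="canonical" href="https://wire-service-example.com/world/original-story" />

    <!-- Substantially rewritten local version: self-referencing canonical -->
    <link rel="canonical" href="https://local-news-example.co.za/news/rewritten-story" />

Cross-domain canonicals are a hint rather than a directive, and syndication agreements don't always allow them, so check those disclaimers first; the rewritten hyper-local stories are the safer asset either way.
-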
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place in our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index, by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us... http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice, and b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement and eliminate inadvertently low-quality pages, but we have yet to find 'the fix'... Thoughts?
Kurus
Intermediate & Advanced SEO | kurus
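To make the trade-off concrete: robots.txt rules like the ones described stop crawling outright, which also stops link discovery through the blocked pages. A hedged sketch of that kind of rule (the URL patterns are hypothetical):

    # robots.txt - blocks crawling of paginated/parameter variants entirely;
    # Google can't see or follow any links on these blocked pages
    User-agent: *
    Disallow: /*?sort=
    Disallow: /*?page=

The alternative that addresses both concerns a) and b) is to allow crawling but add <meta name="robots" content="noindex, follow"> to the variant pages: they stay out of the index, yet their links are still crawled and can pass value. The cost is that Google spends crawl budget on them again, which is the original trade-off in reverse.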