Subdomain vs. root: which is better for SEO?
-
We run a network of sites that we are considering consolidating into one main site with multiple categories. Which would be better: having each topic/site reside on a subdomain, or in a subfolder off the root? Pros and cons of each would be great.
Thanks,
TR
-
This might shed a little more light on the subject.
The sites are each dedicated to a single video game, so they could be completely different from one another.
I like the way a subdomain looks, and for this purpose I've seen other sites doing it this way for a long time.
From where I sit, subdomains seem to be the "right" way to go about this, but if subdirectories are going to be BETTER for SEO then I want to do that.
I don't mind SEOing multiple sites, and I would somewhat prefer to keep things separated out like that.
I guess what I really want to know is: if workload isn't my concern, is this a personal preference thing, or is one going to be objectively better than the other?
Thanks for the replies so far. Much appreciated.
-
Agree with Irving: it's better to have subfolders most of the time. The essential question is who will be controlling the SEO. If it's one person or company, use subfolders. If multiple people will each control their own content, use subdomains. Think about how WordPress.com handles this.
Subdomains are largely treated as separate websites, so if you split your content across them, you end up doing the SEO work more than once. Subfolders are considered part of the domain: http://mysite.wordpress.com is separate from WordPress.com, while http://www.wordpress.com/info/index.html is considered part of WordPress.com. Make sense?
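A quick way to see why search engines can treat these differently: the hostname changes for a subdomain but not for a subfolder. A small sketch with Python's standard `urllib.parse` (the WordPress.com URLs are just the illustrative ones from above):

```python
from urllib.parse import urlparse

subfolder = urlparse("http://www.wordpress.com/info/index.html")
subdomain = urlparse("http://mysite.wordpress.com/info/index.html")

# A subfolder page shares its host with the root site...
print(subfolder.hostname)  # www.wordpress.com

# ...while a subdomain is a different host entirely, which is
# why it can be evaluated as a separate site.
print(subdomain.hostname)  # mysite.wordpress.com
```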
So if you use subdomains, each subdomain needs its own SEO. If you use subfolders, all the work can be done on one site, which is usually much more efficient. The best use of subdomains is for something like a franchise (boston.yoursite.com, chicago.yoursite.com, melbourne.yoursite.com), where each franchisee is responsible for their own marketing, or a site like WordPress.com that lets anyone build their own content (thus under many people's control): site.wordpress.com, site2.wordpress.com, etc.
WordPress doesn't "care" to do SEO for all those sites, nor does it necessarily want to pass its own SEO juice on to them as subfolders, so it has made them subdomains.
-
Directories are better than subdomains. Subdomains are seen as different sites; a directory structure lets you have one main site and pass PageRank throughout the entire site, creating a stronger whole.
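If you do consolidate the per-game sites into subfolders, each old URL should 301-redirect to its new home so existing links keep passing value. Here is a minimal sketch of that URL mapping, using hypothetical domains (per-game subdomains folding into a consolidated example.com); your server or application would return "301 Moved Permanently" with the computed URL in the Location header:

```python
from urllib.parse import urlparse, urlunparse

def subdomain_to_subfolder(url: str, main_host: str = "www.example.com") -> str:
    """Map a per-game subdomain URL to its subfolder equivalent,
    e.g. http://halo.example.com/news -> http://www.example.com/halo/news
    (all domain names here are hypothetical)."""
    parts = urlparse(url)
    game = parts.hostname.split(".")[0]  # leading label = the game/site name
    new_path = f"/{game}/" if parts.path in ("", "/") else f"/{game}{parts.path}"
    return urlunparse(
        (parts.scheme, main_host, new_path, parts.params, parts.query, parts.fragment)
    )

# Query strings and the rest of the URL survive the rewrite:
print(subdomain_to_subfolder("http://halo.example.com/news?page=2"))
# http://www.example.com/halo/news?page=2
```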