SubDomain vs. SubFolder
-
I know this subject has been discussed many, many times before. But it is now 2013, and Google continues to tweak its algorithm to deliver the best results for users.
So the questions are:
Does Google still treat subdomains as a completely separate and unique domain from the root?
If so, is it a good SEO strategy to split a website into subdomains, where it fits, with links pointing back to the root or main domain?
As a company we have several subdomains with some of our categories. For example our main site is www.iboats.com. This site has all our boat products. But we set up subdomains several years ago for the following:
And we have our forums on a subdomain: forums.iboats.com
Splitting them out was originally done for SEO reasons, but now it's more about better managing our main categories.
It appears that Google is treating our subdomains as part of our main root domain anyway, so I don't see the SEO value anymore. If we were to move the subdomains into subfolders of the root, I'm wondering if we might see a boost in SEO value from having more pages within the main website?
I'd be interested in everyone's thoughts on this subject.
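If you do fold the subdomains in, every old URL needs a 301 to its subfolder equivalent. A minimal sketch of that mapping logic, assuming a simple one-to-one move (the forums.iboats.com hostname is from the question above, but the folder path and function name are hypothetical choices):

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical mapping of legacy subdomains to subfolders on the root domain.
SUBDOMAIN_TO_FOLDER = {
    "forums.iboats.com": "/forums",
}

def fold_into_subfolder(url: str) -> str:
    """Return the www.iboats.com URL a legacy subdomain URL should 301 to.

    URLs on hosts outside the mapping are returned unchanged.
    """
    parts = urlsplit(url)
    folder = SUBDOMAIN_TO_FOLDER.get(parts.netloc)
    if folder is None:
        return url
    return urlunsplit(
        ("https", "www.iboats.com", folder + parts.path, parts.query, parts.fragment)
    )
```

In practice you'd express the same mapping as server-level redirect rules rather than application code, but the path logic is the same.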
-
Are there no opinions on this topic?
Related Questions
-
M.ExampleSite vs mobile.ExampleSite vs ExampleSite.com
Hi, I have a call with a potential client tomorrow where all I know is that they are wigged-out about canonicalization, indexing and architecture for their three sites: m.ExampleSite.com mobile.ExampleSite.com ExampleSite.com The sites are pretty large... 350k for the mobiles and 5 million for the main site. They're a retailer with endless products. Their main site is not mobile-responsive, which is evidently why they have the m and mobile sites. Why two, I don't know. This is how they currently handle this: What would you suggest they do about this? The most comprehensive fix would be making the main site mobile-responsive and 301ing the old mobile subdomains to the main site. That's probably too much work for them. So, what more would you suggest and why? Your thoughts? Best... Mike P.S., Beneath my hand-drawn portrait avatar above it says "Staff" at this moment, which I am not. Some kind of bug I guess.
Intermediate & Advanced SEO | | 945010 -
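Short of going responsive, the documented pattern for separate mobile URLs is a bidirectional annotation: the desktop page declares a rel=alternate pointing at the mobile URL, and the mobile page points a rel=canonical back at the desktop page. A small sketch of the tag pair (helper name and URLs are hypothetical):

```python
def mobile_annotations(desktop_url: str, mobile_url: str) -> tuple[str, str]:
    """Build the <link> tags for a desktop/mobile URL pair.

    The desktop page gets rel=alternate with a media query pointing at the
    mobile URL; the mobile page gets rel=canonical pointing back.
    """
    desktop_tag = (
        f'<link rel="alternate" media="only screen and (max-width: 640px)" '
        f'href="{mobile_url}">'
    )
    mobile_tag = f'<link rel="canonical" href="{desktop_url}">'
    return desktop_tag, mobile_tag
```

With two mobile subdomains, you'd still want to pick one as canonical for mobile and 301 the other to it, so each desktop page maps to exactly one mobile URL.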
Null Alt Image Tags vs Missing Alt Image Tags
Hi, Would it be better for organic search to have a null alt attribute programmatically added to thousands of images without alt text, or just leave them as is? The option of adding tailored alt text to thousands of images is not possible. Is having sitewide alt text really important to organic search overall, or what? Right now, probably 10% of the site's images have alt attributes. A huge number of those images are pages that aren Thanks!
Intermediate & Advanced SEO | | 945010 -
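For what it's worth, a programmatic pass that adds a null alt only where the attribute is missing entirely could look like this rough sketch (a naive regex over rendered HTML; fixing the templates that emit the images is the better route):

```python
import re

# Match an <img> tag that contains no alt attribute anywhere before its ">".
IMG_WITHOUT_ALT = re.compile(r"<img(?![^>]*\balt=)([^>]*)>", re.IGNORECASE)

def add_null_alt(html: str) -> str:
    """Add alt="" to any <img> tag that has no alt attribute at all.

    Tags that already carry an alt attribute (even an empty one) are untouched.
    """
    return IMG_WITHOUT_ALT.sub(r'<img alt=""\1>', html)
```

A null alt at least marks the image as decorative for screen readers and crawlers, which is generally better than no attribute at all, though it adds no keyword signal.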
International Subdomain Headache
My client set up a separate domain for their international clients, then set up separate subdomains for each country where they're active (so, for example, the original site is xx.com and the global is xxworldwide.com, with subdomains like mx.xxworldwide.com). They auto-translated a large amount of content and put the translations on those international sites. The idea was to draw in native speakers. Now, I don't think this is a great practice, obviously, and I'm worried that it could hurt their original site (the xx.com in the example above). My concern is that Google will see through the translated text, since it was handled with Google Translate, and penalize both sites. I don't think the canonical tag applies here, since Google recommends a no-follow for auto-translated text, but I've also never dealt with this type of situation before. Anyways, if you made it through all of that, congratulations. My question is whether xx.com is getting any negative effects other than a potential loss of link juice -- and whether there's any legitimate way to present auto-translated text with a few minor changes without incurring a penalty.
Intermediate & Advanced SEO | | Ask44435230 -
Domain.com/postname vs. Domain.com/blog/postname
I am wondering what best practice is regarding blogs. I read that it would be best to structure a website like a pyramid instead of a flat pancake, but I have seen many blogs where the post shows right after the domain name: Domain.com/postname instead of Domain.com/blog/postname My point is that if a website has many posts, then the structure will get very flat, and this will maybe make your most optimized and important pages, like domain.com/page, less important to Google. a) What do you think about this; which of the two blog solutions do you prefer and why? b) In the context of the blog: if, for instance, you had a keyword like Copenhagen property, would you then consider renaming your blog to realestateagent.com/Copenhagen-property-news/post-name? c) Would you write a little intro, around 200 words, for page 1 of your blog and add in some keywords?
Intermediate & Advanced SEO | | nm19770 -
Cross Domain Rel Canonical tags vs. Rel Canonical Tags for internal webpages
Today I noticed that one of my colleagues was pointing rel canonical tags to a third party domain on a few specific pages on a client's website. This was a standard rel canonical tag. Up to this point I haven't seen too many webmasters point a rel canonical to a third party domain. However, after doing some reading on the Google Webmaster Tools blog I realized that cross-domain rel canonicals are indeed a viable strategy to avoid duplicate content. My question is this: should rel canonical tags be written the same way when dealing with internal duplicate content vs. external duplicate content? Would a rel=author tag be more appropriate when addressing 3rd party website duplicate content issues? Any feedback would be appreciated.
Intermediate & Advanced SEO | | VanguardCommunications0 -
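On the syntax point: the canonical tag is written identically whether the target is on the same site or a third-party domain; only the href changes (and rel=author is an unrelated authorship markup, not a duplicate-content tool). A trivial sketch with a hypothetical helper:

```python
def canonical_tag(target_url: str) -> str:
    """Build a rel=canonical link tag.

    The markup is the same for internal and cross-domain targets;
    the only difference is whether the href points off-site.
    """
    return f'<link rel="canonical" href="{target_url}">'
```

So `canonical_tag("/original-page")` and `canonical_tag("https://www.thirdparty.com/page")` differ only in the URL, not the tag structure.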
Create new subdomain or new site for new Niche Product?
We have an existing large site with strong, relevant traffic, including excellent SEO traffic. The company wants to launch a new business offering, specifically targeted at the "small business" segment. Because the "small business" customer is substantially different from the traditional "large corporation" customer, the company has decided to create a completely independent microsite for the "small business" market. Purely from a Marketing and Communications standpoint, this makes sense. From an SEO perspective, we have 2 options: Create the new "small business" microsite on a subdomain of the existing site, and benefit from the strong domain authority and trust of the existing site. Build the microsite on a separate domain with exact primary keyword match in the domain name. My sense is that option #1 is by far the better option in the short and long run. Am I correct? Thanks in advance!
Intermediate & Advanced SEO | | axelk0 -
Subdomain v. subdirectory v. other domain for blogs
I have a good amount of content on our main domain ( http://m00.biz/w4Ljfr ) let's say for discussion it's doctors.com and as you can see, much of it is in subdirectories. Traditionally this was the approach. Now I have some other content on subdomains, but it's primarily directories and databases. Now I see that Google is giving subdomains their own SERP listings as if they are a separate site, and competitors are locking in the top few results merely by having their content on subdomains. Now I have an opportunity to do two things: 1. Current content: moving all the content of the past few years onto their own subdomain (forum, blog), and I'll be moving forum software anyways. Not sure about our own guide, which has been up there for a while. 2. New content: putting up some new blogs/magazines such as "Doctor's Handbook." Let's say that is a common phrase. I can choose between the following: (a) www.doctors.com/handbook/ (b) handbook.doctors.com/ (c) www.doctorshandbook.com I've got a bit of a quandary here, not sure of the best course of action, and am curious to hear from many of you who have handled situations like this before.
Intermediate & Advanced SEO | | attorney0 -
Subdomains - duplicate content - robots.txt
Our corporate site provides MLS data to users, with the end goal of generating leads. Each registered lead is assigned to an agent, essentially in a round robin fashion. However we also give each agent a domain of their choosing that points to our corporate website. The domain can be whatever they want, but upon loading it is immediately directed to a subdomain. For example, www.agentsmith.com would be redirected to agentsmith.corporatedomain.com. Finally, any leads generated from agentsmith.easystreetrealty-indy.com are always assigned to Agent Smith instead of the agent pool (by parsing the current host name). In order to avoid being penalized for duplicate content, any page that is viewed on one of the agent subdomains always has a canonical link pointing to the corporate host name (www.corporatedomain.com). The only content difference between our corporate site and an agent subdomain is the phone number and contact email address where applicable. Two questions: Can/should we use robots.txt or robot meta tags to tell crawlers to ignore these subdomains, but obviously not the corporate domain? If question 1 is yes, would it be better for SEO to do that, or leave it how it is?
Intermediate & Advanced SEO | | EasyStreet0
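On question 1's mechanics: since every agent subdomain resolves to the same application, robots.txt would have to be generated per requested hostname. A rough sketch of that dispatch, with hypothetical hostnames; note the trade-off that blocking crawl on the subdomains also hides their canonical tags from Google, so the existing canonical approach may already be the safer answer to question 2:

```python
def robots_txt_for_host(host: str, corporate_host: str = "www.corporatedomain.com") -> str:
    """Return robots.txt content based on the requested Host header.

    The corporate domain stays fully crawlable; every agent subdomain
    gets a blanket Disallow.
    """
    if host == corporate_host:
        return "User-agent: *\nDisallow:\n"
    return "User-agent: *\nDisallow: /\n"
```

A meta robots noindex on subdomain pages would have the same hiding problem: a page blocked by robots.txt can't even be crawled to see the noindex or canonical in the first place.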