Blog on a subdomain vs subfolder?
-
Hi,
Does anyone have data to show that a subfolder is better than a subdomain for a blog? From what I've read, it sounds like both are viable options, but you choose a subdomain if you want to build your blog as a distinct entity. Do you get ranked more quickly with a subfolder? Do you see X% more lift? Has anyone tested, or seen tests, around this subject?
Any input is appreciated! Thanks in advance.
-
Thanks for the insight, I appreciate the well-thought-out response.
-
A subdomain is treated as a separate domain. A subfolder is treated as part of your existing domain.
If you owned "domain.com" you could sell a subdomain to others for any and all possible combinations. This is basically what wordpress.com does, for example. When you make yoursite.wordpress.com, you don't get any the benefits of wordpress' domain authority.
From an SEO perspective, there isn't much difference between yoursite.wordpress.com and www.yoursite.com: each is treated as its own domain. It's difficult to say whether Google gives a subdomain even an extremely minor boost from its parent domain, but I am not aware of any evidence that it does.
When you offer a blog in a subfolder, it is part of your site. It inherits your DA, and any links to the blog can add to your site's overall DA.
The bottom line is, if this blog is directly related to your site and will be focused on the same topics as your site, you would most likely prefer it to be a subfolder. If this blog is not related to your site, or will discuss off-topic issues, then you would prefer it as a subdomain.
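One practical note on the subfolder option: the blog doesn't have to run inside the same application as the rest of the site. If it's a separate install, a reverse proxy on the main web server can expose it under /blog/ on the existing domain. A minimal sketch using nginx, assuming the blog is hosted at a separate internal address (the host, port, and paths below are hypothetical examples, not a prescription):

# Inside the main site's server { } block
location /blog/ {
    # Hand /blog/... requests to the separately hosted blog application
    proxy_pass http://127.0.0.1:8080/;
    # Preserve the public hostname and protocol so the blog builds its
    # links on the main domain rather than the internal address
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-Proto $scheme;
}

Whichever way it is wired up, the blog should also generate its internal links and canonical URLs with /blog/ as the base, so that the links it earns count toward the main domain.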
This topic has been discussed many times and Google is your friend (in this case). You can take a look at the SEOmoz article: http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites or use Google to search the many articles on this topic: http://lmgtfy.com/?q=blog+subdomain+vs+subfolder
Do you get ranked more quickly with a subfolder?
Your site's ranking would be based on your DA factors. The higher your DA, the more important your site's content is to Google and the more often it will be crawled. The subdomain would be seen as a brand-new site; even to submit a sitemap for it, you would first have to verify it as a new site with Google. Based on these factors I would say yes: if you have an established site, your blog will be ranked faster as a folder on the existing site than as a new subdomain.
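To make the sitemap point concrete: with a subfolder, the blog's URLs can simply be added to the sitemap your existing robots.txt already advertises, whereas a subdomain needs its own robots.txt, its own sitemap, and its own verification. A hypothetical sketch of the subfolder case (filenames and URLs are examples only):

# https://www.yoursite.com/robots.txt
User-agent: *
Disallow:

# The existing sitemap can now also list the blog's URLs,
# e.g. https://www.yoursite.com/blog/first-post/
Sitemap: https://www.yoursite.com/sitemap.xml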
Do you see X% more lift?
That is essentially the definition of DA: how likely pages on your existing site are to rank well. If you have an established site with good DA and you add a blog to it, then yes, your pages should rank better on the main site compared to the same article on a newly created subdomain.
Related Questions
-
Robots.txt in subfolders and hreflang issues
A client recently rolled out their UK business to the US. They decided to deploy with 2 WordPress installations:
UK site - https://www.clientname.com/uk/ - robots.txt location: https://www.clientname.com/uk/robots.txt
US site - https://www.clientname.com/us/ - robots.txt location: https://www.clientname.com/us/robots.txt
We've had various issues with /us/ pages being indexed in Google UK, and /uk/ pages being indexed in Google US. They have hreflang tags in place across all pages. We changed the x-default page to .com 2 weeks ago (we've tried both /uk/ and /us/ previously). Search Console says there are no hreflang tags at all. Additionally, we have a robots.txt file on each site which links to the corresponding sitemap files, but when viewing the robots.txt tester in Search Console, each property shows the robots.txt file for https://www.clientname.com only, even though when you actually navigate to this URL (https://www.clientname.com/robots.txt) you’ll get redirected to either https://www.clientname.com/uk/robots.txt or https://www.clientname.com/us/robots.txt depending on your location. Any suggestions how we can remove UK listings from Google US and vice versa?
Technical SEO | lauralou82
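For reference, hreflang markup for a /uk/ and /us/ split like the one described above usually looks something like this on every page, with each version carrying the full set including a self-referencing tag (hypothetical tags built from the URLs in the question, not the client's actual markup):

<link rel="alternate" hreflang="en-gb" href="https://www.clientname.com/uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.clientname.com/us/" />
<link rel="alternate" hreflang="x-default" href="https://www.clientname.com/" />

-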
SSL for subdomain is good or bad?
Hello, we have an SSL certificate for our domain only for *.website.com. We now have a few subdomains, and as you know we have two choices: 1. Use HTTPS for the subdomain, https://me.website.com, which has a problem with https://www.me.website.com (SSL error), or 2. Use HTTP for the subdomain, with www and non-www redirects. Which one is good for us?
Technical SEO | Anetwork
-
Removal of date archive pages on the blog
I'm currently building a site which has an archive of blog posts by month/year, but from a design perspective I would rather not have these on the new website. Is the correct practice to 301 these to the main blog index page? Allow them to 404? Or actually keep them after all? Many thanks in advance, Andrew
Technical SEO | AndieF
-
Subdomains Issue
Hi, we have created subdomains of our site to target various geos, for example uk.site.com and de.site.com, and all these subdomains have the same content as the main domain. Will it affect our SEO rankings? How can we solve this if it affects our rankings?
Technical SEO | mikerbrt24
-
Secure Vs Non-Secure Redirects
I have a client who has a lot of duplicate pages on their site. The pages are secure and non-secure counterparts. Not sure why they have this in place, but I recommended that they redirect one to the other (or vice versa) using 301 redirects. I am getting some questions as to why they should do this. Does anyone have a good document outlining the reasoning behind this? For me it's just a matter of cleaning up duplicate content, but wondering if there is any technical data out there.
Technical SEO | gkellyiii
-
How to block google robots from a subdomain
I have a subdomain that lets me preview the changes I put on my site. The live site URL is www.site.com; the working preview version is www.site.edit.com. The contents of both are almost identical. I want to block the preview version (www.site.edit.com) from Google's robots, so that they don't penalize me for duplicated content. Is this the right way to do it: User-Agent: * Disallow: .edit.com/*
Technical SEO | Alexey_mindvalley
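For context on the snippet above: robots.txt rules only apply to the hostname they are served from, so a Disallow pattern on the live site's file can't block a different hostname. The conventional approach is a separate robots.txt at the root of the preview hostname itself; a minimal sketch (hostname taken from the question):

# Served at www.site.edit.com/robots.txt on the preview host only
User-agent: *
Disallow: /

Note that Disallow only stops crawling; if the preview URLs must stay out of search results entirely, an authentication wall (or a noindex response header on the preview host, without the Disallow) is the more reliable option.

-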
Promoting a blog or a blog article
Hi, what is the best way to promote a blog or a blog article? What I want to do is find a site where I can put part of an article and then have a link going to my blog for the full article. Can anyone recommend any sites that do this, or the best ways to promote a new article from a blog?
Technical SEO | ClaireH-184886
-
External Microsite VS Internal Folder
We would like to create either a new website or a new section of our existing website that will feature (in time) a lot of content, including a forum, video training, tutorials and downloadable resources. Logistically, it would be much easier to create this as a new site (we'll call it newproduct.com) and refer people to the new site. We would, however, like to keep all of that content on our existing site for the sake of content building and SEO. Should we: 1. Duplicate the content and use noindex/nofollow and/or rel canonical? 2. Host all of the content on our site and set up a vanity domain (www.newproduct.com) to point people to the deep-linked area (www.mainsite.com/product/newproductinfo)? 3. Host the content only on an external site with the occasional link back to our main site? I realize there are other options, but they're mostly variants of the above. Our main objectives are to make it easy for people to get to while leveraging the new content for SEO purposes. What are the pros and cons of these different approaches? What seems to make the most sense? Thank you!
Technical SEO | BeijerElectronics