Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Blog.mysite.com or mysite.com/blog?
-
Hi, I'm just curious what the majority think is the best way to start a blog on your website for SEO benefits. Is it better to have it under a subdomain or a directory? Or does it even matter?
-
From everything I've read, I agree that your safest bet is to go with the subfolder.
-
I agree with Tim and Adam: as a general rule of thumb, subfolders are better.
You might also want to refer to other similar questions here on SEOmoz:
-
- http://www.seomoz.org/q/blogs-are-best-when-hosted-on-domain-subdomain-or
- http://www.seomoz.org/q/setting-up-a-company-blog-subdomain-or-new-url
- http://www.seomoz.org/q/blog-vs-blog
Also see the post from Matt Cutts and the article from Rand that Adam mentioned.
-
I think Adam has hit the nail on the head. We recently moved our blog from a subdomain to a subfolder and 301-redirected all the old URLs, with the intention that any entries users find genuinely useful or interesting may attract links, thereby benefiting the root domain.
As long as your blog is tightly related to your core business activity, I would go down the subfolder route, although, in all honesty, I think subdomains can look a little more professional.
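For anyone doing a similar subdomain-to-subfolder move, the 301 redirects are usually handled at the old subdomain. A minimal sketch, assuming an Apache server with mod_rewrite and illustrative hostnames (substitute your own domain and blog path):

```apache
# .htaccess served for blog.mysite.com
RewriteEngine On

# Match requests arriving on the old blog subdomain...
RewriteCond %{HTTP_HOST} ^blog\.mysite\.com$ [NC]

# ...and permanently redirect each one to the same path under /blog
# on the root domain, preserving the original URL path.
RewriteRule ^(.*)$ https://www.mysite.com/blog/$1 [R=301,L]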
-
Hi Tim,
I generally prefer the subfolder option (mysite.com/blog) over the subdomain (blog.mysite.com), because having the blog in a subfolder means it benefits from the value of the root domain. In other words, links earned by the root domain pass value to its subfolders, whereas a subdomain is treated as a separate site, so little value is passed from the root.
Rand provides an excellent answer in a previous Q&A on a similar topic:
Hope that helps,
Adam.
Related Questions
-
Why are http and https pages showing different domain/page authorities?
My website www.aquatell.com was recently moved to the Shopify platform. We chose to use the http domain because we didn't want to change too much, too quickly by moving to https; only our shopping cart uses the https protocol. We noticed, however, that https versions of our non-cart pages were being indexed, so we created canonical tags pointing the https version of a page to the http version. What's got me puzzled, though, is that when I use Open Site Explorer to look at domain/page authority values, I get different scores for the http vs. https version, and the https version is always better. Example: http://www.aquatell.com DA = 21 and https://www.aquatell.com DA = 27. Can somebody please help me make sense of this? Thanks
On-Page Optimization | Aquatell
-
How does Indeed.com make it to the top of every single search despite having aggregated or duplicate content?
How does Indeed.com make it to the top of every single search despite having duplicate content? Google says it prefers original content and will give preference to those who have it, but that statement seems contradicted when I look at Indeed.com: they aggregate content from other sites yet still rank higher than the original content providers.
On-Page Optimization | vivekrathore
-
How to handle duplicate pages/titles in WordPress
The WordPress blog causes problems with page titles. If you go to the second page of blog posts, there's a different URL but the same page title. For example: page 1: site/blog, page 2: site/blog/page/2. Each page gets flagged for duplicate page titles. Thanks in advance for your thoughts
On-Page Optimization | heymarshall
-
How to overcome blog page 1, 2, 3, etc having no or duplicate meta info?
As above, what is the best way to overcome having the same meta info on your blog pages (not blog posts)? If you have 25 blog posts per page, once you exceed that number you move onto a second blog page, then at 50 onto a third blog page, and so on. So if you have thousands of blog pages, what is the best method to deal with this, rather than having to write hundreds of different meta titles and descriptions? Cheers
On-Page Optimization | webguru2014
-
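A common pattern for paginated archives is to generate a page-numbered title for each archive page and link the pages in a series, so they are no longer flagged as duplicates. A sketch with illustrative URLs and title text (not taken from any specific site):

```html
<!-- <head> of the third archive page, e.g. site.com/blog/page/3 -->
<title>Blog – Page 3 | MySite</title>
<link rel="prev" href="https://site.com/blog/page/2" />
<link rel="next" href="https://site.com/blog/page/4" />
```

Note that Google has since stated it no longer uses rel="prev"/"next" as an indexing signal, so the per-page numbered titles are the durable part of this approach.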
Best SEO Extension/Plugin for NOPCommerce Site?
Hi, I am working for a client who is using NOPCommerce. It doesn't look like they have an SEO plugin: you can add meta descriptions to products, which works fine, and the product categories have SEO components too, but they don't seem to work, and all 'other' content/CMS pages have no SEO components whatsoever. Does anyone know of a plugin which would resolve this? (PS: never used NOPCommerce before!)
On-Page Optimization | AllieMc
-
How do i block an entire category/directory with robots.txt?
Does anyone have any idea how to block an entire product category, including all the products in that category, using the robots.txt file? I'm using WooCommerce in WordPress and I'd like to prevent bots from crawling every single one of my product URLs for now. The confusing part is that I have several different URL structures linking to every one of my products, for example www.mystore.com/all-products, www.mystore.com/product-category, etc. I'm not really sure how I'd type it into the robots.txt file, or where to place the file. Any help would be appreciated, thanks
On-Page Optimization | bricerhodes
-
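For reference, a robots.txt sketch for this situation; the paths are taken from the question and are assumptions about the store's actual URL structure. The file must sit in the site root (e.g. www.mystore.com/robots.txt):

```
# Block crawling of the category listing and product archive paths.
# Disallow rules are prefix matches, so these also cover deeper URLs
# such as /product-category/widgets/blue-widget.
User-agent: *
Disallow: /product-category/
Disallow: /all-products/
```

Keep in mind that robots.txt only blocks crawling; pages that are already indexed or linked externally can still appear in results, so a noindex directive is the right tool if the goal is removal from the index.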
Google Page Rank of my site has dropped from 4/10 to 3/10
The Google PageRank of my website dropped after the Panda update. Can anyone help me understand the possible reasons? We have tried to make our website more lively and user-friendly, and added some graphics to make it more attractive, but it seems to have backfired. My site is http://www.myrealdata.com, and the Google PageRank of my QuickBooks hosting page has dropped as well. It would be great if someone could help me out with expert suggestions.
On-Page Optimization | SangeetaC
-
Best SEO structure for blog
What is the best SEO page/link structure for a blog with, say, 100 posts that grows at a rate of 4 per month? Each post is 500+ words with charts/graphics; they're not simple one-paragraph postings.

Rather than use a CMS, I have a hand-crafted HTML/CSS blog (for tighter integration with the parent site, some dynamic data effects, and in general to have total control). I have a sidebar with headlines from all prior posts, and my blog home page is a one-line summary of each article. I feel that after 100 articles the sidebar and home page have too many links on them. What is the optimal way to split them up? They all cover the same niche topic that my site is about.

I thought of making the sidebar and home page show only the most recent 25 postings, and then creating an archive directory for older posts. But categorizing by time doesn't really help someone looking for a specific topic. I could tag each entry with 2-3 keywords and then make the sidebar a sorted list of tags. Clicking on a tag would show an intermediate index of all articles with that tag, and you could then click on an article title to read the whole article.

Or is there some other strategy that is optimal for SEO and the indexing robots? Is it bad to have a blog that is too hierarchical (where articles are three levels down from the root domain) or too flat (if there are hundreds of entries)? Thanks for any thoughts or pointers.
On-Page Optimization | scanlin