Subdomain blog vs. subfolder blog in 2013
-
So I've read the posts here:
http://moz.com/community/q/subdomain-blog-vs-subfolder-blog-in-2013
and many others, watched the Matt Cutts video, etc.
Does anyone have direct experience showing that it's still best practice to use the subfolder? (Hopefully a Moz employee can chime in.)
I have a client looking to use HubSpot, and they keep citing the Matt Cutts video. I'm in charge of SEO/marketing and am now at odds with them. I'd like to present the client with more than "in my experience in the past I've seen subdirectories work."
Any help? Articles? Etc.?
-
I'm associated with a site that ranked fairly well. Earlier in the summer, the blog was moved from a subfolder to a subdomain for various reasons. While the reasons seemed valid at the time, the site's traffic plummeted about one to two weeks later. We're still analyzing, since several other changes were made a few weeks prior; however, the arrows point to the subfolder-to-subdomain move as the likely cause of this plague. We're now looking into moving it back to see if that resolves the problem.
-
This does not influence my opinion about anything.
-
Google does not calculate DA. DA is a Moz metric, not a Google one.
-
I have first-hand experience that merging a subdomain into a folder on a domain can have a kickass effect on your rankings.
-
I just tested blog.hubspot.com and hubspot.com: both have the same DA in OSE. I also tested support.hostgator.com and hostgator.com; those have the same DA as well.
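If you want to script that check instead of plugging domains into OSE one at a time, here's a rough Python sketch against Moz's Links API. The v2 url_metrics endpoint, the response field names, and the credentials shown are assumptions to verify against the current API docs.

```python
import requests

# Placeholder credentials; real values come from your Moz account.
ACCESS_ID = "mozscape-xxxxxx"
SECRET_KEY = "your-secret-key"

# Compare a subdomain against its root domain in one request.
targets = ["blog.hubspot.com", "hubspot.com"]

resp = requests.post(
    "https://lsapi.seomoz.com/v2/url_metrics",  # assumed endpoint
    auth=(ACCESS_ID, SECRET_KEY),
    json={"targets": targets},
)
resp.raise_for_status()

# Results are assumed to come back in request order.
for target, result in zip(targets, resp.json()["results"]):
    print(target, "DA:", result.get("domain_authority"))
```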
-
If you got Jesse and PhD sayin' something, best go with it.
-
Well, yes. It's quite simple: linking to a subdomain does not pass authority to the root domain. It's easy to test on any site with a subdomain: plug it into OSE and you'll see two different DAs for exactly that reason.
It's something I don't see ever changing. There's a reason subdomains are treated separately in terms of incoming links: they are their own entity, and I believe that will always be the case. I can't think of why it wouldn't be.
-
Thanks, guys. I know everyone in our industry is pro-subdirectory. I guess what I'm looking for is an irrefutable case study or hard facts. Have you guys tested this post-2012? Is there any evidence from 2013 that this is still the case?
-
I second that. You use the blog to build the authority of the main domain.
-
Using a subdirectory causes all of the potential link juice to flow to your root domain. If you go with a subdomain, the links gained from awesome blog content won't do your actual domain any good as far as ranking organically for your targeted keywords.
That's the short version. Subdirectories all the way (assuming that's what you're aiming at, of course).
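One practical note on the HubSpot angle: a hosted blog can often be surfaced in a subdirectory by reverse-proxying it from your own server. A minimal nginx sketch, where the upstream hostname is a placeholder rather than HubSpot's actual infrastructure; whether a given vendor supports being proxied this way needs checking first.

```nginx
# Serve a hosted blog under /blog/ on the main domain.
# "blog-upstream.example.net" is a placeholder for wherever
# the hosted blog actually lives.
location /blog/ {
    proxy_pass https://blog-upstream.example.net/;
    proxy_set_header Host blog-upstream.example.net;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
```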
Related Questions
-
Why does Moz recommend subdomains for language-specific websites?
In Moz's domain recommendations, they recommend subdirectories instead of subdomains (which agrees with my experience), but they make an exception for language-specific websites: "Since search engines keep different metrics for domains than they do subdomains, it is recommended that webmasters place link-worthy content like blogs in subfolders rather than subdomains (i.e., www.example.com/blog/ rather than blog.example.com). The notable exceptions to this are language-specific websites (i.e., en.example.com for the English version of the website)." Why are language-specific websites excepted from this advice? Why are subdomains preferable for language-specific websites? Google's advice says subdirectories are fine for language-specific websites, and GSC allows geographic settings at the subdirectory level (which may not even be needed, since language-specific sites aren't necessarily geography-specific), so I'm unsure why Moz would suggest using subdomains in this case.
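(Side note: whichever structure is chosen, the relationship between language versions is usually declared with hreflang annotations, which work across subdomains and subdirectories alike. A minimal sketch with placeholder URLs.)

```html
<!-- On each language version's pages; URLs here are placeholders. -->
<link rel="alternate" hreflang="en" href="https://en.example.com/" />
<link rel="alternate" hreflang="de" href="https://de.example.com/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```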
Intermediate & Advanced SEO | AdamThompson
-
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi guys, we have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similarly to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings Pages: the page where the user applies various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details Pages: the page where the user actually views the details about a given vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Page. Example functionality: http://screencast.com/t/kArKm4tBo
We do want the Vehicle Listings pages (#1) indexed and ranking. These pages have additional content besides the vehicle listings themselves, the results are randomized or sliced/diced in different and unique ways, and they're updated twice per day.
We do not want the Vehicle Details pages (#2) indexed, as these pages appear and disappear all the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query.
We did not originally think that Google would even be able to index these pages, since they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right.
Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages:
- Super easy to implement
- Conserves crawl budget for large sites
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would put 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?).
Noindex advantages:
- Does prevent Vehicle Details pages from being indexed
- Allows ALL pages to be crawled (advantage?)
Noindex disadvantages:
- Difficult to implement: the Vehicle Details pages are served via Ajax, so they have no <head> tag of their own to hold a meta robots tag. The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex header based on query-string variables, similar to this stackoverflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "forces" because of the crawl budget required. The crawler could get stuck or lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it's blocked by robots.txt.
Hash (#) URL advantages:
- By using hash (#) URLs for the links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow/crawl those links.
- Best of both worlds: crawl budget isn't overtaxed by thousands of noindexed pages, and the internal links that were getting robots.txt-disallowed pages indexed are gone.
- Accomplishes the same thing as nofollowing these links, but without looking like PageRank sculpting (?)
- Does not require complex Apache stuff
Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?
Initially, we implemented robots.txt, the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate Vehicle Details pages, and we wanted it to be as if these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on the internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that.
If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we would have to remove the robots.txt disallow in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of Vehicle Details pages, all of which are noindexed: it could easily get stuck or lost, it seems like a waste of resources, and in some shadowy way it feels bad for SEO.
My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and it conserves crawl budget while keeping Vehicle Details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like internal links structured like this.
Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive
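(For the noindex route described in the question, a minimal sketch of the Apache piece: this assumes Apache 2.4 with mod_headers enabled, and a hypothetical vehicle_id query-string parameter that identifies Vehicle Details requests.)

```apache
# Send a noindex header on vehicle-detail responses.
# "vehicle_id" is a placeholder; match whatever query-string
# variable actually identifies a Vehicle Details request.
<If "%{QUERY_STRING} =~ /(^|&)vehicle_id=/">
    Header set X-Robots-Tag "noindex"
</If>
```

Note this only takes effect once the robots.txt disallow is removed, since a blocked crawler never sees the header.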
-
Short URLs vs Medium URLs?
Hello Moooooooooooz! I got into an SEO fight today and thought the best thing would be to involve more people in the fight! 😛 Do you think it's better to have:
A. company.com/services/service1.html
B. company.com/service1.html
I was for A, as "services" is also googled to find service1, and I think it helps Google understand where the service sits on the website. My friend was for B, as URLs should stay as short as possible. What do you think?
ps: I can create the URL I want using Joomla and Sh404. The website has 4 different categories: /about, /services, /products, /projects. Thanks! 🙂
Intermediate & Advanced SEO | AymanH
-
Changing a subdomain to a full domain to rank for a keyword
We have been attempting to get our blog to rank for our business name (Instabill). We are now considering changing the URL from blog.instabill.com to something like instabillblog.com. I have the following concerns about the change:
1. Will changing the domain really be that helpful? (i.e., will the change get our blog onto page one for the term "Instabill"?)
2. We have over 350 pages of content on our blog. Will changing the domain have possible negative effects? (I was thinking of using the URL updater in Webmaster Tools and creating permanent 301 redirects from the old URLs to the new ones.)
Having never changed the URL of a site with this much content and SEO value for my company, I would like to know the following from someone who has made mistakes here before: what not to do, and what steps you would take to make the transition easier. Any help here will be greatly appreciated. Cheers, Instabill
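(A minimal sketch of the redirect piece mentioned above, assuming the old blog is served by Apache with mod_rewrite; the rules would live in the vhost or .htaccess for blog.instabill.com.)

```apache
# 301-redirect every blog.instabill.com URL to the same path
# on the new domain. Sketch only; adjust for the actual setup.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^blog\.instabill\.com$ [NC]
RewriteRule ^(.*)$ http://instabillblog.com/$1 [R=301,L]
```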
Intermediate & Advanced SEO | Instabill
-
Subdomain for every US state?
Hi, one of our clients has the idea of creating subdomains off his main website to sell his online advertisements in every US state, e.g. texas.web.com, atlanta.web.com. He wants a subdomain for every state, each containing information related only or mainly to that state. I am not sure whether this is a good idea. What is your opinion?
Intermediate & Advanced SEO | vladokan
-
Use of subdomains, subdirectories or both?
Hello, I would like your advice on a dilemma I am facing. I am working on a new project that is going to launch soon: a network of users with personal profiles, separated into categories. For example, let's say the categories are colors. Say I am a member belonging to the red color category: I get a page where I update my personal information/CV/resume, and that page also hosts a personal blog. The main site gives visitors the option to search for members by color.
My first idea was that every user should own a subdomain (and this is how it's developed so far). That's easy to use, and since the domain name is really short (just 3 letters), I believe subdomains are worthwhile, as each personal site will be easy to remember.
My dilemma: should all users own a subdomain, a subdirectory, or both, and if both, which one should be the canonical? Since search engines are said to treat subdomains as separate stand-alone sites, what's best for the main site: to show multiple search results with profiles on subdomains, or in subdirectories? What if I use both, meaning the search results use the directory URL for each profile while each profile owns a subdomain as well? If so, which one should be the canonical? Thanks in advance, C
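(Sketch of the mechanism for the "both" case: the duplicate variant carries a rel=canonical tag pointing at the preferred URL. The domain and paths below are placeholders.)

```html
<!-- On the subdomain copy of a profile, declaring the
     subdirectory version as the preferred URL (or the
     reverse, if the subdomain is preferred). -->
<link rel="canonical" href="http://www.example.com/red/member-name" />
```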
Intermediate & Advanced SEO | HaCos
-
Does Google prefer WordPress blogs?
In creating a regular brochure website, such as one for a dentist or doctor, do you see any SEO benefit to building it on WordPress? I do see the SEO benefit of having an actual blog on the site and continually updating it, but does simply using the WordPress platform as a CMS give the site any benefit? If there is a benefit, is there a way to duplicate that advantage without going through the trouble of creating a WordPress template for the site? Maybe just publishing a sitemap.xml, a feed, etc.? Thanks! Tom
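(On the sitemap point: a sitemap is platform-independent, so any CMS or hand-built site can publish one. A minimal sitemap.xml sketch with placeholder URLs.)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled; URLs are placeholders. -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-08-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/services</loc>
  </url>
</urlset>
```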
Intermediate & Advanced SEO | TomBristol
-
Is Putting A Blog On A Subdomain The Right Thing To Do?
Going to set up a blog for a 4-year-old ecommerce website and was wondering if it would be a good idea to put the blog on a subdomain or just in a folder like www.domain.com.au/blog. I'll be using the blog for:
- Link-bait articles
- Social bookmark traffic
- Linking keywords to products on the ecommerce site
I wanted to know:
- Would the link juice be greater if we cross-link from the subdomain to the main domain?
- Are there any major disadvantages to having it on a subdomain vs. a folder?
- Any other major differences?
Cheers!
Intermediate & Advanced SEO | upick-162391