The Great Subdomain vs. Subfolder Debate: what is the best answer?
-
Recently, one of my clients was hesitant to move their new store locator pages to a subdomain. They have some SEO knowledge and cited the Whiteboard Friday article at https://moz.com/blog/subdomains-vs-subfolders-rel-canonical-vs-301-how-to-structure-links-optimally-for-seo-whiteboard-friday.
While it is very possible that Rand Fishkin has a valid point, I was hesitant to let this be the final verdict. John Mueller from Google Webmaster Central claims that Google is indifferent towards subdomains vs. subfolders.
https://www.youtube.com/watch?v=9h1t5fs5VcI#t=50
Another SEO also disagreed with Rand Fishkin's post about using subfolders instead of subdomains. He claims that Rand Fishkin ran only 3 experiments over 2 years, while he has run multiple subdomain vs. subfolder experiments over 10 years and observed no difference.
http://www.seo-theory.com/2015/02/06/subdomains-vs-subfolders-what-are-the-facts-on-rankings/
Here is another post, from Website Magazine. They too believe that there are no SEO benefits to a subdomain vs. subfolder infrastructure; proper SEO and solid site architecture are what matter most.
Again, Rand might be right, but I would rather base my recommendation to my client on an authoritative source, such as a Google engineer like John Mueller.
Does anybody else have any thoughts and/or insight about this?
-
I think Mueller's main point may be that if you treat your subdomains separately from your main site, Google will treat them differently as well. For example, if you have three subdomains - www, blog and cloud - but they have different navigation and CSS, limited interlinking, and little keyword-theme commonality, Google will treat them as separate sites and you will suffer the dreaded subdomain issue.
BUT if you integrate the three subdomains well - same nav, same look & feel, and lots of good contextual anchor-text interlinking - Google will treat them as one site and the subdomain issue becomes moot.
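If you want to gauge how well-integrated your subdomains actually are, here's a rough sketch (Python with the third-party `requests` and `beautifulsoup4` packages; the hostnames are made up, purely for illustration) that tallies where a page's internal links point:

```python
from collections import Counter
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

# Hypothetical set of subdomains that should behave as one site.
OUR_HOSTS = {"www.example.com", "blog.example.com", "cloud.example.com"}

def internal_link_hosts(page_url):
    """Count which of our hosts the page's links point at."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tally = Counter()
    for a in soup.find_all("a", href=True):
        # Resolve relative hrefs against the page URL before parsing.
        host = urlparse(urljoin(page_url, a["href"])).hostname
        if host in OUR_HOSTS:
            tally[host] += 1
    return tally

# e.g. internal_link_hosts("https://blog.example.com/") - if the blog only
# ever links to itself, the "integration" Google sees is thin.
```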
Has anyone done any testing with those variables?
-
Yup! All the case studies I shared in this thread (and plenty since) have demonstrated that you can boost traffic by moving from a subdomain to a subfolder.
-
Great thread! What about a situation where a blog already sits on a subdomain (bearing in mind it hasn't been driving a significant amount of traffic, as the site is fairly new)? My recommendation would be to move it to a subfolder - would you agree?
Thank you!
-
This is my new favorite quote... "I understand that Google's representatives have the authority of working at Google going for them, but I also believe they're wrong." (Rand Fishkin)
-
Greetings All,
So the debate goes on, and I personally think the case for subfolders over subdomains makes sense, especially from a linking, age, and link-juice perspective. I do notice that most articles talk about the benefits of subfolders as they relate to blogs. In your past tests and studies, have you gained any insight into how this may affect ecommerce as it relates to countries?
We currently have each country on a subdomain, which lets us run each one through Webmaster Tools and geotarget the country; however, we are considering switching to subfolders based on all the articles we've read. Since the majority of our links point to "www", this would in turn put each new subfolder behind all the link equity "www" has accumulated. It would seem to make sense to switch, and it would be especially helpful as new subfolders are launched.
I was just wondering if the same argument can be made when it comes to ecommerce and country-specific sites. Each site (currently on its own subdomain) uses a different language and currency, and the meta data and content are different for each. We launched "www" over 15 years ago, but in the past 2 years we have introduced various subdomains (i.e. new languages). As we enter new countries, we are considering switching everything over to subfolders (obviously 301'ing the old subdomain URLs over to the new subfolder URLs so we don't lose all our existing links).
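For what it's worth, here is a minimal sketch of how we'd plan to spot-check those 301s once they go live (Python with the `requests` library; the URLs are hypothetical placeholders, not our real ones):

```python
import requests

# Hypothetical mapping of old subdomain URLs to their new subfolder homes.
REDIRECT_MAP = {
    "https://fr.example.com/": "https://www.example.com/fr/",
    "https://de.example.com/produkte/": "https://www.example.com/de/produkte/",
}

for old_url, expected_target in REDIRECT_MAP.items():
    # Don't follow redirects - we want to inspect the first hop itself.
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    permanent = resp.status_code == 301  # a 302 would not consolidate links
    correct = resp.headers.get("Location") == expected_target
    status = "OK" if permanent and correct else "CHECK"
    print(f"{status}  {old_url} -> {resp.headers.get('Location')} "
          f"({resp.status_code})")
```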
I'm assuming, given what your studies indicate, that you'd think this a good idea; however, the discussion hasn't focused much on countries and ecommerce. Does anyone have any insight or information they can share on the topic?
Thanks
-
Hi Rosemary - thankfully, I have data, not just opinions, to back up my arguments:
- In 2014, Moz moved our Beginner's Guide to SEO from guides.moz.com to moz.com itself. Rankings rose immediately, with no other changes. We ranked higher not only for "seo guide" (outranking Google themselves) but also for "beginners guide," a very broad phrase.
- Check out https://iwantmyname.com/blog/2015/01/seo-penalties-of-moving-our-blog-to-a-subdomain.html - it goes into very clear detail about how what Google says about subdomains doesn't match up with reality.
- Check out some additional great comments in this thread, including a number from site owners who moved away from subdomains and saw ranking benefits, or who moved to them and saw ranking losses: https://inbound.org/discuss/it-s-2014-what-s-the-latest-thinking-on-sub-domains-vs-sub-directories
- There's another good thread (with some more examples) here: https://inbound.org/blog/the-sub-domain-vs-sub-directory-seo-debate-explained-in-one-flow-chart
Ultimately, it's up to you. I understand that Google's representatives have the authority of working at Google going for them, but I also believe they're wrong. It could be that no specific element penalizes subdomains, and maybe they're viewed the same in Google's systems, but there are real ways in which subdomains accrue authority that stays unique to them, and it IS NOT passed between multiple subdomains evenly or equally. I have no horse in this race other than wanting to keep you and other site owners from struggling against ranking losses - and we've simply seen too many losses when moving to a subdomain, and too many gains when moving to a subfolder, not to be wary.
-
Hi,
I've not seen any comment from Googlers regarding this debate. I realize I'm keeping this in the Moz-sphere, which isn't quite what you're looking for, but this quote is from Moz's domain setup guide:
"Since search engines keep different metrics for domains than they do subdomains, it is recommended that webmasters place link-worthy content like blogs in subfolders rather than subdomains. (i.e. www.example.com/blog/ rather than blog.example.com) The notable exceptions to this are language-specific websites. (i.e., en.example.com for the English version of the website)."
I think that quote is pretty compelling for the subdirectory side of this quandary. I also recommend checking out the comments on the Whiteboard Friday link you posted; there is plenty of evidence there as well.
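One way to see the technical crux behind that quote: a subdomain is a distinct host, while a subfolder is just a path on a host you already own. A tiny sketch (Python, assuming the third-party `tldextract` package; the example.com URLs are placeholders):

```python
import tldextract  # pip install tldextract

for url in ("https://blog.example.com/post", "https://www.example.com/blog/post"):
    ext = tldextract.extract(url)
    print(f"{url}\n  subdomain: {ext.subdomain!r}  "
          f"registered domain: {ext.registered_domain!r}")

# Both URLs share the registered domain "example.com", but any metric keyed
# to the full hostname is tracked separately for blog.example.com, while
# /blog/ simply rides on whatever www.example.com has already earned.
```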
Unfortunately, this debate will probably rage on until we get a definitive word from Google.
-
Can you share some details about why you want to "move" the store locator to a subdomain? That wording makes me think it is already operational in a subfolder at the moment. In general, I would recommend not moving content unless there is a very good reason for it.