Subdomains vs directories on existing website with good search traffic
-
Hello everyone,
I operate a website called Icy Veins (www.icy-veins.com), which provides gaming advice for World of Warcraft and Hearthstone, two titles from Blizzard Entertainment. Until recently, we had articles for both games on the main subdomain (www.icy-veins.com), without a directory structure. The article URLs for World of Warcraft ended in -wow and those for Hearthstone ended in -hearthstone, and that was it.
We are planning to cover more games from Blizzard Entertainment soon, so we hired an SEO consultant to figure out whether we should use directories (www.icy-veins.com/wow/, www.icy-veins.com/hearthstone/, etc.) or subdomains (www.icy-veins.com, wow.icy-veins.com, hearthstone.icy-veins.com). For a number of reasons, the consultant was adamant that subdomains were the way to go.
So I implemented the subdomains, with 301 redirects from all the old URLs to the new ones, and over the two weeks since, our search traffic has been slowly decreasing as the new URLs were getting indexed. We are now getting about 20-25% less search traffic. For example, the week before the subdomains went live we received 900,000 visits from search engines (11-17 May); this week, we only received 700,000.
All of our new URLs are indexed, but they rank slightly lower than the old URLs used to, so I am wondering whether this is to be expected and will improve in time, or whether I should just go back to directories.
Thank you in advance.
-
Hi Damien,
So, if I'm reading this correctly, the consultant is saying that, due to the size of the site (tens of thousands of pages) and the need to categorise its content, subdomains are the best choice.
I would say that there are far bigger websites using categories within subfolders, notably big retailers, e.g.
http://www.marksandspencer.com/c/beauty, http://www.marksandspencer.com/c/food-and-wine, http://www.marksandspencer.com/c/mands-bank
http://www.waitrose.com/home/inspiration.html, http://www.waitrose.com/home/wine.html, http://www.waitrose.com/content/waitrose/en/home/tv/highlights.html (<-- the last one being a crappy URL, but a subfolder nonetheless)
as do websites that provide content for very different audiences:
http://www.ncaa.com/schools/tampa, http://www.ncaa.com/championships/lacrosse-men/d1/tickets, http://www.ncaa.com/news/swimming-men/article/2014-03-29/golden-bears-and-coach-david-durden-earn-third-national-title, http://www.ncaa.com/stats/football/fbs
Has the consultant provided examples of other websites that have adopted the same structure they're proposing?
There are hundreds of examples of websites whose structure / categories are properly understood despite existing in subdirectories, so I'm still sceptical that this is a necessity.
This is not to say that a subdomain approach wouldn't work, or that it's definitively bad; I'm just not convinced that the reasoning is strong enough to justify moving content away from the root domain.
I disagree about user experience - from a user's perspective, the only difference between subfolders and subdomains is the URL they can see in the address bar. The rest is aesthetic. Anything you can do with the design of a site that uses subdomains, you can also do (or choose not to do) with a site that uses subdirectories. For example, just because content sits on www.icy-veins.com/wow/, its navigation wouldn't have to link to www.icy-veins.com/hearthstone/ or mention the other brand in any way if you don't want it to. You can still have separate conversion funnels, newsletter sign-ups, advertising pages, etc.
-
Thank you for shedding more light on the matter. Here are the reasons why our consultant thought that subdomains would be better:
In the case of Icy Veins the matter is clear: subdomains will be the best course of action, and I will quickly explain why.
- The domain has over 10,000 pages (my scan is still running and has already found 66,000+ addresses), which puts it in a whole new category. For smaller sites, and even local business sites, subdirectories will always be the better choice.
- Subdomains will allow you to categorize the different areas of your website. The subdomains in mind all relate to the gaming industry, so they remain relevant to the global theme of the website.
- Splitting the different categories into subdomains will allow search engines to better differentiate the areas of your website (see the attached image named icy-veins SERP – Sitelink.png). At the moment Google does not properly categorize the areas of your website and uses your most-visited areas as the sitelinks in the search engine results page.
- However, noting that you already have the subdirectory /hearthstone, a .htaccess 301 redirect for that whole directory will have to be put in place for the current URLs (a rough example of such a rule is sketched below this list). This will ensure that any inbound links from other sites are automatically redirected to the correct subdomain and index page. Failing to implement the redirect will cause the Page Authority and Domain Authority not to carry over to the subdomain. Technically, hearthstone.icy-veins.com and icy-veins.com are two separate domains as far as DNS is concerned, which is why it is important to ensure that the redirects are in place to carry over any “SEO juice” that the old directory had.
- Subdomains enhance the user experience of your visitors by keeping separate themes and topics apart. This will have a positive impact on your bounce rate (which is currently sitting at 38% for the last 30 days) and allow better funnelling for goal conversions (i.e. donate | newsletter signup | advertise on our website).
- Essentially, you are focusing on different products under the same brand.
At the end of the day it comes down to your personal preference, although subdomains will be the better choice to ensure that your different products are split up and are reflected better in the search engine results pages.
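For reference, the kind of .htaccess rule being described there would look roughly like this (just a sketch on my part, assuming Apache with mod_rewrite and that the new subdomain mirrors the old paths one-to-one; the real rule would depend on the server setup):

    # Sketch: 301 everything under /hearthstone/ on the main host
    # to the matching path on the new hearthstone subdomain.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^www\.icy-veins\.com$ [NC]
    RewriteRule ^hearthstone/(.*)$ http://hearthstone.icy-veins.com/$1 [R=301,L]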
-
Hi Damien,
There are cases where subdomains are necessary or inevitable, usually because of technical limitations (and even then, they can usually be worked around with something like a reverse proxy). When you see subdomains in the wild and aren't sure why they're being used, they will often just be legacies - old set-ups that no one wants to change, because changing them would require redirecting old URLs, which is inadvisable if those URLs rank well and don't otherwise need to be redirected.
In this case, I'd be really interested to know why the SEO was adamant that the new structure should use subdomains and not subdirectories. Google is much better at working with new subdomains now than it was in years past, but if there is no good reason to use them, subdirectories are still the safer option for SEO purposes, and the content housed on subdirectories should automatically inherit authority from the parent domain. New subdomains seem to be far less likely to inherit this authority, as other responders have said above.
Find out exactly why the SEO wanted subdomains - if their reasoning isn't solid, you may want to move this content into subdirectories and put 301 redirects in place from the subdomains to the subdirectories. If you are going to do these redirects, doing them sooner rather than later is advisable, as redirection usually comes with a short period of lower rankings and traffic.
On that note, redirection does usually result in a short period of traffic loss, but that should happen quite quickly and be fixing itself by the two-week mark, not getting worse.
-
Unfortunately, yes - you will need to 301 the subdomains back to the folder structure.
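To illustrate, a minimal sketch of that redirect in a subdomain's .htaccess might look something like this (assuming Apache with mod_rewrite and that the old paths map one-to-one onto the new /wow/ folder - adjust for your actual setup):

    # Sketch: send every request on the wow subdomain back to the
    # matching path under /wow/ on the main domain, with a 301.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^wow\.icy-veins\.com$ [NC]
    RewriteRule ^(.*)$ http://www.icy-veins.com/wow/$1 [R=301,L]

The same pattern would be repeated for hearthstone.icy-veins.com, and it's worth spot-checking a handful of old URLs afterwards to confirm each one returns a single 301 hop to the right folder URL.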
-
Thank you, Dean and Caitlin! So I guess the next step would be to revert the change and switch to directories (using 301 redirects from the subdomains), right?
-
I agree with Dean above. Subdomains split your authority. Basically, this means that Google treats wow.icy-veins.com and hearthstone.icy-veins.com as two separate websites in its book. For this reason, folders would have been the far better solution - the site's authority would have remained intact, and any additional folders added to the site, along with the links they attract, would have continued to build up the website's authority.
Don't get me wrong, there are a number of websites that utilize subdomains (typically very large sites). In fact, it used to be very common in years past. However, subdomains are no longer seen as SEO best practice. ^Caitlin
-
The advice to use subdomains is a wow in itself from an SEO point of view. Subdomains do not pass authority, so it's basically like having a new domain for each subdomain. Folders would have been a far better solution, in my opinion.
There's an interesting debate regarding Moz's own learning page on subdomains vs subfolders here: http://moz.com/community/q/moz-s-official-stance-on-subdomain-vs-subfolder-does-it-need-updating