Subdomains vs directories on existing website with good search traffic
-
Hello everyone,
I operate a website called Icy Veins (www.icy-veins.com), which gives gaming advice for World of Warcraft and Hearthstone, two titles from Blizzard Entertainment. Up until recently, we had articles for both games on the main subdomain (www.icy-veins.com), without a directory structure. The articles for World of Warcraft ended in -wow and those for Hearthstone ended in -hearthstone and that was it.
We are planning to cover more games from Blizzard Entertainment soon, so we hired an SEO consultant to figure out whether we should use directories (www.icy-veins.com/wow/, www.icy-veins.com/hearthstone/, etc.) or subdomains (www.icy-veins.com, wow.icy-veins.com, hearthstone.icy-veins.com). For a number of reasons, the consultant was adamant that subdomains were the way to go.
So I implemented subdomains, with 301 redirects from all the old URLs to the new ones. In the 2 weeks since, the amount of search traffic we get has been slowly decreasing as the new URLs were getting indexed, and we are now getting about 20%-25% less search traffic. For example, the week before the subdomains went live we received 900,000 visits from search engines (11-17 May); this week, we received only 700,000 visits.
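(For reference, a minimal sketch of the kind of .htaccess rules this involves - it assumes the new subdomain URLs simply drop the old -wow/-hearthstone suffix, which is an illustrative assumption rather than our exact setup:)

```apache
# Illustrative sketch only, not the exact production rules: redirect the old
# flat URLs on www.icy-veins.com to the new game subdomains, assuming the
# new URLs just drop the "-wow"/"-hearthstone" suffix.
RewriteEngine On

# e.g. /frost-mage-guide-wow -> http://wow.icy-veins.com/frost-mage-guide
RewriteRule ^(.+)-wow/?$ http://wow.icy-veins.com/$1 [R=301,L]

# e.g. /priest-deck-hearthstone -> http://hearthstone.icy-veins.com/priest-deck
RewriteRule ^(.+)-hearthstone/?$ http://hearthstone.icy-veins.com/$1 [R=301,L]
```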
All our new URLs are indexed, but they rank slightly lower than the old URLs used to, so I was wondering whether this is to be expected and will improve in time, or whether I should just go back to directories.
Thank you in advance.
-
Hi Damien,
So if I'm reading this correctly, the consultant is saying that due to the size of the site (tens of thousands of pages) and the need to categorise its content, subdomains are the best choice.
I would say that there are far bigger websites using categories within subfolders, notably big retailers, e.g.
http://www.marksandspencer.com/c/beauty, http://www.marksandspencer.com/c/food-and-wine, http://www.marksandspencer.com/c/mands-bank
http://www.waitrose.com/home/inspiration.html, http://www.waitrose.com/home/wine.html, http://www.waitrose.com/content/waitrose/en/home/tv/highlights.html (<-- the last one being a crappy version, but a subfolder nonetheless)
and so do websites that deal with providing content for very different audiences:
http://www.ncaa.com/schools/tampa, http://www.ncaa.com/championships/lacrosse-men/d1/tickets, http://www.ncaa.com/news/swimming-men/article/2014-03-29/golden-bears-and-coach-david-durden-earn-third-national-title, http://www.ncaa.com/stats/football/fbs
Has the consultant provided examples of other websites doing this that would take on the same structure?
There are hundreds of examples of websites whose structure / categories are properly understood despite existing in subdirectories, so I'm still sceptical that this is a necessity.
This is not to say that a subdomain approach wouldn't work and is definitively bad or anything, I'm just not really convinced that the reasoning is strong enough to move content away from the root domain.
I disagree about user experience - from a user's perspective, the only difference between subfolders and subdomains is the URL they can see in the address bar. The rest is aesthetic. Everything you can do (or choose not to do) with the design of a site that uses subdomains, you can also do with a site that uses subdirectories. For example, just because content sits on www.icy-veins.com/wow/, its navigation wouldn't have to link to www.icy-veins.com/hearthstone/ or mention the other brand in any way if you don't want it to. You can still have separate conversion funnels, newsletter sign-ups, advertising pages, etc.
-
Thank you for shedding more light on the matter. Here are the reasons why our consultant thought that subdomains would be better:
In the case of ICY VEINS the matter is clear: subdomains will be the best course of action, and I will quickly explain why
- The domain has over 10,000 pages (my scan is still running and is already looking at 66,000+ addresses), which puts it in a whole new category. For smaller sites, and even local business sites, subdirectories will always be the better choice
- Subdomains will allow you to categorize the different sections of your website. The subdomains in mind all relate to the gaming industry, so they remain relevant to the global theme of the website.
- Splitting up the different categories into subdomains will allow search engines to better differentiate the areas of your website (see attached image named icy-veins SERP – Sitelink.png). At the moment Google does not properly categorize the areas of your website and uses your most visited areas as the sitelinks in the search engine results page.
- However, noting that you already have the subdirectory /hearthstone, a .htaccess 301 redirect for that whole directory will have to be put in place (a sketch of this follows the list below). This will ensure that any inbound links from other sites are automatically redirected to the correct subdomain and index page. Failing to implement the redirect will mean the Page Authority and Domain Authority do not carry over to the subdomain. Technically, hearthstone.icy-veins.com and icy-veins.com are two separate domains as far as DNS is concerned, which is why it is important to ensure the redirects are in place to carry over any “seo juice” the old directory had.
- Subdomains enhance the user experience of your visitors by keeping separate themes and topics apart. This will have a positive impact on your bounce rate (which is currently sitting at 38% for the last 30 days) and allow better funnelling for goal conversions (i.e. donate | newsletter signup | advertise on our website).
- Essentially you are focusing on different products for the same brand
At the end of the day it comes down to your personal preference, although subdomains will be the better choice to ensure that your different products are split up and reflected better in the search engine results pages.
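The whole-directory redirect mentioned in the list above would look something along these lines - a hypothetical sketch, not the consultant's actual rule:

```apache
# Hypothetical sketch of the whole-directory redirect described above: send
# everything under /hearthstone/ on the root domain to the matching path on
# the new subdomain, so existing inbound links are 301-redirected.
RewriteEngine On
RewriteRule ^hearthstone/(.*)$ http://hearthstone.icy-veins.com/$1 [R=301,L]
```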
-
Hi Damien,
There are cases where subdomains are very necessary or inevitable, usually because of technical limitations (and even then, they can usually be worked around via practices like reverse proxy). When you see subdomains in the wild and aren't sure why they're being used, they will often just be legacies - old set-ups that no one wants to change because it would require redirecting old URLs, which is inadvisable if those URLs don't need to be redirected and if they rank well.
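To illustrate the reverse-proxy workaround mentioned above - a hedged sketch with made-up backend details, not a recommendation specific to this site:

```apache
# Sketch of a reverse-proxy set-up (Apache vhost config, not .htaccess;
# requires mod_proxy and mod_proxy_http). A separate application is surfaced
# under a subfolder on the main domain instead of needing its own subdomain.
# The backend host and port are placeholders.
ProxyPass        /wow/ http://wow-app.internal:8080/
ProxyPassReverse /wow/ http://wow-app.internal:8080/
```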
In this case, I'd be really interested to know why the SEO was adamant that the new structure should use subdomains and not subdirectories. Google is much better at working with new subdomains now than it was in years past, but if there is no good reason to use them, subdirectories are still the safer option for SEO purposes, and the content housed on subdirectories should automatically inherit authority from the parent domain. New subdomains seem to be far less likely to inherit this authority, as other responders have said above.
Find out exactly why the SEO wanted subdomains - if their reasoning isn't solid, you may want to move this content into subdirectories and put 301 redirects in place from the subdomains to the subdirectories. If you are going to do these redirects, doing them sooner rather than later is advisable, as redirection usually comes with a short period of lower rankings / traffic.
On that note, redirection does usually result in a short period of traffic loss, but that should happen quite quickly and be fixing itself at 2+ weeks in, not getting worse.
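If you do go back, the subdomain-to-subfolder redirects would look roughly like this - a minimal sketch assuming a straight one-to-one path mapping between the subdomains and the new folders:

```apache
# Minimal sketch: 301 every URL on the game subdomains back into a folder on
# the root domain, assuming paths map across one-to-one. This would live in
# the .htaccess (or vhost config) serving the subdomains.
RewriteEngine On

RewriteCond %{HTTP_HOST} ^wow\.icy-veins\.com$ [NC]
RewriteRule ^(.*)$ http://www.icy-veins.com/wow/$1 [R=301,L]

RewriteCond %{HTTP_HOST} ^hearthstone\.icy-veins\.com$ [NC]
RewriteRule ^(.*)$ http://www.icy-veins.com/hearthstone/$1 [R=301,L]
```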
-
Unfortunately yes you will need to 301 the subdomains back to the folder structure.
-
Thank you Dean and Caitlin! So, I guess the next step would be to revert the change and switch to directories (using 301-redirects from the subdomains), right?
-
I agree with Dean above. Subdomains split your authority. Basically, this means that Google considers wow.icy-veins.com and hearthstone.icy-veins.com as two separate websites in their book. For this reason, folders would have been the far better solution - the site's authority would have remained the same, and any additional folders added to the site, plus the resulting links to those folders, would have continued to build up the website's authority.
Don't get me wrong, there are a number of websites that utilize subdomains (typically very large sites). In fact, it used to be very common in years past. However, subdomains are no longer seen as SEO best practice. ^Caitlin
-
The advice to use subdomains is a wow in itself from an SEO point of view. Subdomains do not pass authority, so it's basically like having a new domain for each subdomain. Folders would have been a far better solution, in my opinion.
Interesting debate regarding the learning page re domains on Moz here: http://moz.com/community/q/moz-s-official-stance-on-subdomain-vs-subfolder-does-it-need-updating