Subdomains vs directories on existing website with good search traffic
-
Hello everyone,
I operate a website called Icy Veins (www.icy-veins.com), which gives gaming advice for World of Warcraft and Hearthstone, two titles from Blizzard Entertainment. Up until recently, we had articles for both games on the main subdomain (www.icy-veins.com), without a directory structure. The articles for World of Warcraft ended in -wow, those for Hearthstone ended in -hearthstone, and that was it.
We are planning to cover more games from Blizzard Entertainment soon, so we hired an SEO consultant to figure out whether we should use directories (www.icy-veins.com/wow/, www.icy-veins.com/hearthstone/, etc.) or subdomains (www.icy-veins.com, wow.icy-veins.com, hearthstone.icy-veins.com). For a number of reasons, the consultant was adamant that subdomains were the way to go.
So, I implemented subdomains and set up 301 redirects from all the old URLs to the new ones. Over the 2 weeks since, as the new URLs were getting indexed, our search traffic has been slowly decreasing. Now, we are getting about 20%-25% less search traffic. For example, the week before the subdomains went live we received 900,000 visits from search engines (11-17 May). This week, we only received 700,000 visits.
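For anyone curious, the redirects are conceptually along these lines (a simplified sketch only, assuming Apache with mod_rewrite in the site root's .htaccess; the example paths and exact patterns are illustrative, not our real rules):

# Simplified sketch (Apache mod_rewrite assumed; patterns illustrative).
# Old suffixed article URLs on the root domain are 301'd to the new per-game subdomains.
RewriteEngine On

# e.g. www.icy-veins.com/some-guide-wow -> wow.icy-veins.com/some-guide-wow
RewriteCond %{HTTP_HOST} ^www\.icy-veins\.com$ [NC]
RewriteRule ^(.+-wow)$ http://wow.icy-veins.com/$1 [R=301,L]

# e.g. www.icy-veins.com/some-guide-hearthstone -> hearthstone.icy-veins.com/some-guide-hearthstone
RewriteCond %{HTTP_HOST} ^www\.icy-veins\.com$ [NC]
RewriteRule ^(.+-hearthstone)$ http://hearthstone.icy-veins.com/$1 [R=301,L]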
All our new URLs are indexed, but they rank slightly lower than the old URLs used to. I was wondering whether this is to be expected and will improve with time, or whether I should just go back to directories.
Thank you in advance.
-
Hi Damien,
So if I'm reading this correctly, the consultant is saying that, due to the size of the site (tens of thousands of pages) and the need to categorise its content, subdomains are the best choice.
I would say that there are far bigger websites using categories within subfolders, notably big retailers, e.g.
http://www.marksandspencer.com/c/beauty, http://www.marksandspencer.com/c/food-and-wine, http://www.marksandspencer.com/c/mands-bank
http://www.waitrose.com/home/inspiration.html, http://www.waitrose.com/home/wine.html, http://www.waitrose.com/content/waitrose/en/home/tv/highlights.html (<-- the last one being a crappy version, but a subfolder nonetheless)
and so do websites that provide content for very different audiences:
http://www.ncaa.com/schools/tampa, http://www.ncaa.com/championships/lacrosse-men/d1/tickets, http://www.ncaa.com/news/swimming-men/article/2014-03-29/golden-bears-and-coach-david-durden-earn-third-national-title, http://www.ncaa.com/stats/football/fbs
Has the consultant provided examples of other websites that have adopted the same structure?
There are hundreds of examples of websites whose structure / categories are properly understood despite existing in subdirectories, so I'm still sceptical that this is a necessity.
This is not to say that a subdomain approach wouldn't work, or that it's definitively bad or anything; I'm just not really convinced that the reasoning is strong enough to move content away from the root domain.
I disagree about user experience - from a user's perspective, the only difference between subfolders and subdomains is the URL they see in the address bar. The rest is aesthetic. Anything you can do with the design of a site that uses subdomains, you can do (or choose not to do) with a site that uses subdirectories. For example, just because content sits on www.icy-veins.com/wow/, its navigation wouldn't have to link to www.icy-veins.com/hearthstone/ or mention the other brand in any way if you don't want it to. You can still have separate conversion funnels, newsletter sign-ups, advertising pages, etc.
-
Thank you for shedding more light on the matter. Here are the reasons why our consultant thought that subdomains would be better:
In the case of Icy Veins the matter is clear: subdomains will be the best course of action, and I will quickly explain why.
- The domain has over 10,000 pages (my scan is still running and has already found 66,000+ addresses), which puts it in a whole new category. For smaller sites, and even local business sites, subdirectories will always be the better choice.
- Subdomains will allow you to categorize the different areas of your website. The subdomains in mind all relate to the gaming industry, so they remain relevant to the global theme of the website.
- Splitting the different categories into subdomains will allow search engines to better differentiate the areas of your website (see the attached image named icy-veins SERP – Sitelink.png). At the moment Google does not properly categorize the areas of your website and uses your most visited areas as the sitelinks in the search engine results page.
- However, noting that you already have the subdirectory /hearthstone, a .htaccess 301 redirect for that whole directory will have to be put in place. This will ensure that any inbound links from other sites are automatically redirected to the correct subdomain and index page. Failing to implement the redirect will cause the correct Page Authority and Domain Authority not to carry over to the subdomain. Technically, hearthstone.icy-veins.com and icy-veins.com are two separate domains as far as DNS is concerned, which is why it is important to have the redirects in place to carry over any "SEO juice" the old directory had. (A sketch of this kind of directory-level redirect is shown after this list.)
- Subdomains enhance the user experience of your visitors by keeping separate themes and topics apart. This will have a positive impact on your bounce rate (currently sitting at 38% for the last 30 days) and allow better funnelling for goal conversions (i.e. donate | newsletter signup | advertise on our website).
- Essentially, you are focusing on different products for the same brand.
At the end of the day it comes down to your personal preference, although subdomains will be the better choice to ensure that your different products are split up and are better reflected in the search engine results pages.
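For reference, the directory-level 301 described above is essentially a one-liner (a sketch assuming Apache mod_alias; the path and target are illustrative):

# Sketch only (Apache mod_alias assumed; path illustrative). Anything under
# /hearthstone is redirected, path preserved, to the hearthstone subdomain.
Redirect 301 /hearthstone http://hearthstone.icy-veins.com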
-
Hi Damien,
There are cases where subdomains are genuinely necessary or unavoidable, usually because of technical limitations (and even then, they can usually be worked around via practices like a reverse proxy). When you see subdomains in the wild and aren't sure why they're being used, they will often just be legacies - old set-ups that no one wants to change because it would require redirecting old URLs, which is inadvisable if those URLs don't need to be redirected and if they rank well.
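To illustrate the reverse-proxy point: content that has to live on a separate system can still be surfaced under a subdirectory of the main domain, so visitors and search engines only ever see the subfolder URL. A minimal sketch, assuming Apache with mod_proxy / mod_proxy_http in the main site's virtual host config (the hostnames are illustrative, not anyone's actual setup):

# Minimal sketch (Apache mod_proxy / mod_proxy_http assumed; hostnames illustrative).
# Requests to www.example.com/wow/... are fetched from a separately hosted backend,
# so the subdirectory URL is all that is ever exposed publicly.
ProxyPass        /wow/ http://wow-backend.internal/
ProxyPassReverse /wow/ http://wow-backend.internal/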
In this case, I'd be really interested to know why the SEO was adamant that the new structure should use subdomains and not subdirectories. Google is much better at working with new subdomains now than it was in years past, but if there is no good reason to use them, subdirectories are still the safer option for SEO purposes, and the content housed on subdirectories should automatically inherit authority from the parent domain. New subdomains seem to be far less likely to inherit this authority, as other responders have said above.
Find out exactly why the SEO wanted subdomains - if their reasoning isn't solid, you may want to move this content into subdirectories and set up 301 redirects from the subdomains to the subdirectories. If you are going to do these redirects, doing them sooner rather than later is advisable, as redirection usually comes with a short period of lower rankings / traffic.
On that note, redirection does usually come with a short period of traffic loss, but that dip should happen quickly and be recovering within a couple of weeks, not getting worse.
-
Unfortunately yes you will need to 301 the subdomains back to the folder structure.
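A rough sketch of what those redirects could look like, assuming Apache with mod_rewrite serving the subdomains (patterns are illustrative; map each URL one-to-one to its new subfolder equivalent rather than sending everything to the homepage, so each old URL is a single redirect hop):

# Rough sketch (Apache mod_rewrite assumed; patterns illustrative).
# Every URL on a game subdomain is 301'd to the equivalent path in a subfolder on the root domain.
RewriteEngine On

# wow.icy-veins.com/anything -> www.icy-veins.com/wow/anything
RewriteCond %{HTTP_HOST} ^wow\.icy-veins\.com$ [NC]
RewriteRule ^(.*)$ http://www.icy-veins.com/wow/$1 [R=301,L]

# hearthstone.icy-veins.com/anything -> www.icy-veins.com/hearthstone/anything
RewriteCond %{HTTP_HOST} ^hearthstone\.icy-veins\.com$ [NC]
RewriteRule ^(.*)$ http://www.icy-veins.com/hearthstone/$1 [R=301,L]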
-
Thank you Dean and Caitlin! So, I guess the next step would be to revert the change and switch to directories (using 301-redirects from the subdomains), right?
-
I agree with Dean above. Subdomains split your authority. Basically, Google treats wow.icy-veins.com and hearthstone.icy-veins.com as two separate websites. For this reason, folders would have been the far better solution - the site's authority would have remained consolidated, and any additional folders added to the site (and the links they attract) would have continued to build up the website's authority.
Don't get me wrong, there are a number of websites that utilize subdomains (typically very large sites). In fact, it used to be very common in years past. However, subdomains are no longer seen as SEO best practice. ^Caitlin
-
The advice to use subdomains is a wow in itself from an SEO point of view. Subdomains do not pass authority, so it's basically like having a new domain for each subdomain. Folders would have been a far better solution in my opinion.
There's an interesting debate regarding Moz's own learning page on domains here: http://moz.com/community/q/moz-s-official-stance-on-subdomain-vs-subfolder-does-it-need-updating