Subdomains vs directories on existing website with good search traffic
-
Hello everyone,
I operate a website called Icy Veins (www.icy-veins.com), which gives gaming advice for World of Warcraft and Hearthstone, two titles from Blizzard Entertainment. Up until recently, we had articles for both games on the main subdomain (www.icy-veins.com), without a directory structure. The articles for World of Warcraft ended in -wow and those for Hearthstone ended in -hearthstone and that was it.
We are planning to cover more games from Blizzard Entertainment soon, so we hired an SEO consultant to figure out whether we should use directories (www.icy-veins.com/wow/, www.icy-veins.com/hearthstone/, etc.) or subdomains (www.icy-veins.com, wow.icy-veins.com, hearthstone.icy-veins.com). For a number of reasons, the consultant was adamant that subdomains were the way to go.
So, I implemented subdomains and set up 301 redirects from all the old URLs to the new ones. In the two weeks since, as the new URLs were getting indexed, our search traffic has been slowly decreasing, and we are now getting about 20%-25% less of it. For example, the week before the subdomains went live (11-17 May), we received 900,000 visits from search engines. This week, we only received 700,000 visits.
All our new URLs are indexed, but they rank slightly lower than the old URLs used to, so I was wondering whether this is to be expected and will improve in time, or whether I should just go back to directories.
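In case it helps, this is the kind of spot-check I can run to confirm the 301s are behaving; a minimal Python sketch using the requests library, where the page URLs are made-up examples of the old -wow/-hearthstone suffix pattern rather than our real articles:

# Spot-check that old suffix-style URLs return a 301 pointing at the expected subdomain.
import requests
from urllib.parse import urlparse

# Hypothetical old URL -> expected new host (example pages, not real ones).
CHECKS = {
    "https://www.icy-veins.com/example-guide-wow": "wow.icy-veins.com",
    "https://www.icy-veins.com/example-guide-hearthstone": "hearthstone.icy-veins.com",
}

for old_url, expected_host in CHECKS.items():
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and urlparse(location).netloc == expected_host
    print(old_url, "->", resp.status_code, location, "OK" if ok else "CHECK")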
Thank you in advance.
-
Hi Damien,
So if I'm reading this correctly, the consultant is saying that, due to the size of the site (tens of thousands of pages) and the need to categorise its content, subdomains are the best choice.
I would say that there are far bigger websites using categories within subfolders, notably big retailers, e.g.
http://www.marksandspencer.com/c/beauty, http://www.marksandspencer.com/c/food-and-wine, http://www.marksandspencer.com/c/mands-bank
http://www.waitrose.com/home/inspiration.html, http://www.waitrose.com/home/wine.html, http://www.waitrose.com/content/waitrose/en/home/tv/highlights.html (<-- the last one being a crappy version, but a subdirectory nonetheless)
and so do websites that provide content for very different audiences:
http://www.ncaa.com/schools/tampa, http://www.ncaa.com/championships/lacrosse-men/d1/tickets, http://www.ncaa.com/news/swimming-men/article/2014-03-29/golden-bears-and-coach-david-durden-earn-third-national-title, http://www.ncaa.com/stats/football/fbs
Has the consultant provided examples of other websites with a similar structure that are doing this?
There are hundreds of examples of websites whose structure / categories are properly understood despite existing in subdirectories, so I'm still sceptical that this is a necessity.
This is not to say that a subdomain approach wouldn't work, or that it's definitively bad or anything - I'm just not really convinced that the reasoning is strong enough to justify moving content away from the root domain.
I disagree about user experience - from a user's perspective, the only difference between subfolders and subdomains is the URL they see in the address bar. The rest is aesthetic. Everything you would do with the design of a site employing subdomains, you can do (or choose not to do) with a site using subdirectories. For example, just because content sits on www.icy-veins.com/wow/, its navigation doesn't have to link to www.icy-veins.com/hearthstone/ or mention the other brand in any way if you don't want it to. You can still have separate conversion funnels, newsletter sign-ups, advertising pages, etc.
-
Thank you for shedding more light on the matter. Here are the reasons why our consultant thought that subdomains would be better:
In the case of ICY VEINS the matter is clear: subdomains will be the best course of action, and I will quickly explain why
- The domain has over 10,000 pages (my scan is still running and is already looking at 66,000+ addresses), which puts it in a whole new category. For smaller sites, and even local business sites, subdirectories will always be the better choice
- Subdomains will allow you to categorize the different areas of your website. The subdomains in mind all relate to the gaming industry, so they remain relevant to the global theme of the website.
- Splitting the different categories into subdomains will allow search engines to better differentiate the areas of your website (see attached image named icy-veins SERP – Sitelink.png). At the moment, Google does not properly categorize the areas of your website and uses your most visited areas as the sitelinks in the search engine results page.
- However, noting that you already have the subdirectory /hearthstone, a .htaccess 301 redirect for that whole directory will have to be put in place. This will ensure that any inbound links from other sites are automatically redirected to the correct subdomain and index page. Failing to implement the redirect will cause the Page Authority and Domain Authority not to carry over to the subdomain. Technically, hearthstone.icy-veins.com and icy-veins.com are two separate domains as far as DNS is concerned, which is why it is important to ensure that the redirects are in place to carry over any “SEO juice” that the old directory had.
- Subdomains enhance the user experience of your visitors by keeping separate themes and topics apart. This will have a positive impact on your bounce rate (which is currently sitting at 38% for the last 30 days) and allow better funnelling for goal conversions (i.e. donate | newsletter signup | advertise on our website).
- Essentially, you are focusing on different products for the same brand
At the end of the day it comes down to your personal preference, although subdomains will be the better choice to ensure that your different products are split up and reflected better in the search engine results pages.
-
Hi Damien,
There are cases where subdomains are very necessary or inevitable, usually because of technical limitations (and even then, they can usually be worked around via practices like reverse proxy). When you see subdomains in the wild and aren't sure why they're being used, they will often just be legacies - old set-ups that no one wants to change because it would require redirecting old URLs, which is inadvisable if those URLs don't need to be redirected and if they rank well.
In this case, I'd be really interested to know why the SEO was adamant that the new structure should use subdomains and not subdirectories. Google is much better at working with new subdomains now than it was in years past, but if there is no good reason to use them, subdirectories are still the safer option for SEO purposes, and the content housed on subdirectories should automatically inherit authority from the parent domain. New subdomains seem to be far less likely to inherit this authority, as other responders have said above.
Find out exactly why the SEO wanted subdomains - if their reasoning isn't solid, you may want to place this content in subdirectories and place 301 redirects from the subdomains to the subdirectories. If you are going to do these redirects, doing them sooner rather than later is advisable as redirection usually comes with a short period of lower rankings / traffic.
On that note, redirection does usually result in that short period of traffic loss, but it should happen quite quickly and be fixing itself in 2+ weeks, not getting worse.
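If you do go back to subdirectories, the mapping itself is mechanical - each subdomain URL translates to the same path under a folder on the root domain, and that map is what your redirect rules would encode. A rough sketch of the translation (Python; the example page URL is made up, and this is just an illustration rather than anything server-specific):

# Map a subdomain URL to its subdirectory equivalent on the root domain.
from urllib.parse import urlparse, urlunparse

SUBDOMAIN_TO_FOLDER = {
    "wow.icy-veins.com": "/wow",
    "hearthstone.icy-veins.com": "/hearthstone",
}

def to_subdirectory_url(subdomain_url):
    parts = urlparse(subdomain_url)
    folder = SUBDOMAIN_TO_FOLDER[parts.netloc]
    return urlunparse(parts._replace(netloc="www.icy-veins.com", path=folder + parts.path))

print(to_subdirectory_url("https://wow.icy-veins.com/example-guide"))
# -> https://www.icy-veins.com/wow/example-guide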
-
Unfortunately, yes, you will need to 301 the subdomains back to the folder structure.
-
Thank you Dean and Caitlin! So, I guess the next step would be to revert the change and switch to directories (using 301-redirects from the subdomains), right?
-
I agree with Dean above. Subdomains split your authority. Basically, this means that Google considers wow.icy-veins.com and hearthstone.icy-veins.com to be two separate websites. For this reason, folders would have been the far better solution - the site's authority would have remained the same, and any additional folders added to the site, and the links pointing to those folders, would have continued to build up the website's authority.
Don't get me wrong, there are a number of websites that utilize subdomains (typically very large sites). In fact, it used to be very common in years past. However, subdomains are no longer seen as SEO best practice. ^Caitlin
-
The advice to use subdomains is a wow in itself from an SEO point of view. Subdomains do not pass authority, so it's basically like having a new domain for each subdomain. Folders would have been a far better solution in my opinion.
Interesting debate regarding Moz's learning page on subdomains vs subfolders here: http://moz.com/community/q/moz-s-official-stance-on-subdomain-vs-subfolder-does-it-need-updating