Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Subdomains vs directories on existing website with good search traffic
-
Hello everyone,
I operate a website called Icy Veins (www.icy-veins.com), which gives gaming advice for World of Warcraft and Hearthstone, two titles from Blizzard Entertainment. Up until recently, we had articles for both games on the main subdomain (www.icy-veins.com), without a directory structure. The articles for World of Warcraft ended in -wow and those for Hearthstone ended in -hearthstone and that was it.
We are planning to cover more games from Blizzard Entertainment soon, so we hired an SEO consultant to figure out whether we should use directories (www.icy-veins.com/wow/, www.icy-veins.com/hearthstone/, etc.) or subdomains (www.icy-veins.com, wow.icy-veins.com, hearthstone.icy-veins.com). For a number of reasons, the consultant was adamant that subdomains were the way to go.
So, I implemented subdomains with 301 redirects from all the old URLs to the new ones, and over the 2 weeks since then, the amount of search traffic we get has been slowly decreasing as the new URLs were getting indexed. We are now getting about 20%-25% less search traffic. For example, the week before the subdomains went live we received 900,000 visits from search engines (11-17 May). This week, we only received 700,000 visits.
All our new URLs are indexed, but they rank slightly lower than the old URLs used to, so I was wondering whether this was to be expected and will improve in time, or whether I should just go back to directories.
Thank you in advance.
-
Hi Damien,
So if I'm reading this correctly, the consultant is saying that due to the size of the site (tens of thousands of pages) and the need to categorise its content, subdomains are the best choice.
I would say that there are far bigger websites using categories within subfolders, notably big retailers, e.g.
http://www.marksandspencer.com/c/beauty, http://www.marksandspencer.com/c/food-and-wine, http://www.marksandspencer.com/c/mands-bank
http://www.waitrose.com/home/inspiration.html, http://www.waitrose.com/home/wine.html, http://www.waitrose.com/content/waitrose/en/home/tv/highlights.html (<-- the last one being a crappy version, but a subdirectory nonetheless)
and so do websites that deal with providing content for very different audiences:
http://www.ncaa.com/schools/tampa, http://www.ncaa.com/championships/lacrosse-men/d1/tickets, http://www.ncaa.com/news/swimming-men/article/2014-03-29/golden-bears-and-coach-david-durden-earn-third-national-title, http://www.ncaa.com/stats/football/fbs
Has the consultant provided examples of other websites doing this that would take on the same structure?
There are hundreds of examples of websites whose structure / categories are properly understood despite existing in subdirectories, so I'm still sceptical that this is a necessity.
This is not to say that a subdomain approach wouldn't work and is definitively bad or anything, I'm just not really convinced that the reasoning is strong enough to move content away from the root domain.
I disagree about user experience - from a user's perspective, the only difference between subfolders and subdomains is the URL they can see in the address bar. The rest is aesthetic. Anything you would do with the design of a site employing subdomains, you can do (or choose not to do) with a site using subdirectories. For example, just because content sits on www.icy-veins.com/wow/, its navigation wouldn't have to link to www.icy-veins.com/hearthstone/ or mention the other brand in any way if you don't want it to. You can still have separate conversion funnels, newsletter sign-ups, advertising pages, etc.
-
Thank you for shedding more light on the matter. Here are the reasons why our consultant thought that subdomains would be better:
In the case of ICY VEINS the matter is clear: subdomains will be the best course of action, and I will quickly explain why
- The domain has over 10,000 pages (my scan is still running and is already looking at 66,000+ addresses), which puts it in a whole new category. For smaller sites, and even local business sites, subdirectories will always be the better choice
- Subdomains will allow you to separate out the different categories of your website. The subdomains in mind all relate to the gaming industry, so they remain relevant to the global theme of the website.
- Splitting the different categories into subdomains will allow search engines to better differentiate the areas of your website (see attached image named icy-veins SERP – Sitelink.png). At the moment Google does not properly categorize the areas of your website, and uses your most-visited areas as the sitelinks in the search engine results page
- However, noting that you already have the subdirectory /hearthstone, a .htaccess 301 redirect for that whole directory will have to be put in place. This will ensure that any inbound links from other sites are automatically redirected to the correct subdomain and index page. Failing to implement the redirect will cause the Page Authority and Domain Authority not to carry over to the subdomain. Technically, hearthstone.icy-veins.com and icy-veins.com are two separate domains as far as DNS is concerned, which is why it is important to ensure that the redirects are in place to carry over any "SEO juice" the old directory had.
- Subdomains enhance the user experience of your visitors by keeping separate themes and topics apart. This will have a positive impact on your bounce rate (currently sitting at 38% for the last 30 days) and allow better funnelling for goal conversions (i.e. donate, newsletter sign-up, advertise on our website)
- Essentially, you are focusing on different products for the same brand
At the end of the day it comes down to your personal preference, although subdomains will be the better choice to ensure that your different products are split up and are reflected better in the search engine results pages.
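For what it's worth, the directory-to-subdomain redirect the consultant describes could be sketched roughly like this - a minimal example assuming Apache with mod_rewrite enabled; the hostnames and paths are illustrative, not taken from the actual site configuration:

```apache
# Hypothetical sketch: 301-redirect everything under /hearthstone/
# on the root domain to the new subdomain, preserving the rest of
# the path so inbound links land on the equivalent page.
# Assumes Apache with mod_rewrite; hostnames/paths are illustrative.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?icy-veins\.com$ [NC]
RewriteRule ^hearthstone/(.*)$ http://hearthstone.icy-veins.com/$1 [R=301,L]
```

With a rule like this, a request for www.icy-veins.com/hearthstone/some-guide would return a 301 pointing at hearthstone.icy-veins.com/some-guide.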
-
Hi Damien,
There are cases where subdomains are very necessary or inevitable, usually because of technical limitations (and even then, they can usually be worked around via practices like reverse proxy). When you see subdomains in the wild and aren't sure why they're being used, they will often just be legacies - old set-ups that no one wants to change because it would require redirecting old URLs, which is inadvisable if those URLs don't need to be redirected and if they rank well.
In this case, I'd be really interested to know why the SEO was adamant that the new structure should use subdomains and not subdirectories. Google is much better at working with new subdomains now than it was in years past, but if there is no good reason to use them, subdirectories are still the safer option for SEO purposes, and the content housed on subdirectories should automatically inherit authority from the parent domain. New subdomains seem to be far less likely to inherit this authority, as other responders have said above.
Find out exactly why the SEO wanted subdomains - if their reasoning isn't solid, you may want to place this content in subdirectories and place 301 redirects from the subdomains to the subdirectories. If you are going to do these redirects, doing them sooner rather than later is advisable as redirection usually comes with a short period of lower rankings / traffic.
On that note, redirection does usually result in a short period of traffic loss, but that dip should pass quite quickly and be fixing itself within a couple of weeks - not getting worse.
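If you do end up reversing course as described above, the subdomain-to-subdirectory move could be sketched like this in the subdomain's .htaccess or vhost config - again a hedged example assuming Apache with mod_rewrite; each game's subdomain would need its own equivalent rule, and the hostnames are illustrative:

```apache
# Hypothetical sketch of the reverse move: 301 every URL on the
# wow subdomain back into a /wow/ subdirectory on the root domain,
# preserving the path. Assumes Apache with mod_rewrite.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^wow\.icy-veins\.com$ [NC]
RewriteRule ^(.*)$ http://www.icy-veins.com/wow/$1 [R=301,L]
```

Doing this sooner rather than later, as suggested, limits how long the interim subdomain URLs have to accumulate links and rankings that would then need to be redirected a second time.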
-
Unfortunately yes you will need to 301 the subdomains back to the folder structure.
-
Thank you Dean and Caitlin! So, I guess the next step would be to revert the change and switch to directories (using 301-redirects from the subdomains), right?
-
I agree with Dean above. Subdomains split your authority. Basically, this means that Google considers wow.icy-veins.com and hearthstone.icy-veins.com two separate websites in its book. For this reason, folders would have been the far better solution - the site's authority would have remained the same, and any additional folders added to the site, and the resulting links to those folders, would have continued to build up the website's authority.
Don't get me wrong, there are a number of websites that utilize subdomains (typically very large sites). In fact, it used to be very common in years past. However, subdomains are no longer seen as SEO best practice. ^Caitlin
-
The advice to use subdomains is a "wow" in itself from an SEO point of view. Subdomains do not pass authority, so it's basically like having a new domain for each subdomain. Folders would have been a far better solution in my opinion.
Interesting debate regarding the learning page re domains on Moz here: http://moz.com/community/q/moz-s-official-stance-on-subdomain-vs-subfolder-does-it-need-updating