Turning off a subdomain
-
Hi! I'm currently working with http://www.muchbetteradventures.com/. They have a previous version of the site, http://v1.muchbetteradventures.com, running as a subdomain. I've noticed a whole bunch of indexing issues which I think are caused by this. The v1 site has several thousand pages and ranks organically for a number of terms, but those pages are no longer relevant to the business. The main site has just over 100 pages, yet more than 28,400 URLs are currently indexed.
We are considering turning off the v1 site and noindexing it. There are no real backlinks to it. The only worry is that removing it will be seen as a massive drop in content. Rankings for the main site are currently quite poor, despite good content, a decent link profile and high domain authority.
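For reference, "noindexing" a whole subdomain is usually done with either a meta robots tag in each page's head or an X-Robots-Tag response header. A minimal sketch of the header approach, assuming the v1 subdomain runs on Apache with mod_headers (the exact setup will vary):

```apache
# Sketch, assuming Apache with mod_headers enabled.
# Placed in the v1 subdomain's config or .htaccess, this sends a
# noindex signal with every response from that host.
Header set X-Robots-Tag "noindex"
```

The equivalent per-page alternative is `<meta name="robots" content="noindex">` in each page's `<head>`; the header version just avoids editing several thousand templates.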
Any thoughts would be much appreciated!
-
This is a fantastic answer, thank you.
-
Sounds like the indexing issues are causing some drops in ranking, even though good content and domain authority are present.
Also, could the v1 site be a testing platform? I recently had an enterprise client with very similar issues: multiple testing versions of the domain were live and indexable, causing massive amounts of duplicate content, indexed content and indexing issues.
I would plan to assess any content that could be migrated over to the main site from the v1 site, and 301 redirect (and rel-canonical) the old v1 pages. Keep those redirects in place for a few months to ensure the full value of the 301s takes effect.
When migrating some (or all) of this valuable content over, make sure you use properly executed 301 redirects, and, to take it a step further, apply the canonical tag on the v1 pages pointing to the existing, correct pages on the main domain. That way we know all the value is being passed.
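A minimal sketch of the redirect, assuming the v1 subdomain runs on Apache with mod_rewrite (hostnames taken from the question; the real setup may differ):

```apache
# Sketch, assuming Apache/mod_rewrite: .htaccess at the v1 subdomain's
# document root.
RewriteEngine On
# Only act on requests arriving on the old v1 hostname.
RewriteCond %{HTTP_HOST} ^v1\.muchbetteradventures\.com$ [NC]
# Permanently redirect every path to the same path on the main site.
RewriteRule ^(.*)$ http://www.muchbetteradventures.com/$1 [R=301,L]
```

Where a v1 page maps to a different URL on the main site, a one-to-one rule per page (old path to new path) passes value more cleanly than the blanket catch-all above.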
SIDE NOTE: Having that many pages indexed doesn't mean the site will do well. In fact, with this poor setup, the site's massive number of URLs might be causing more damage: too many pages with bad page quality scores can and will bring a site down. Plan to migrate the pages or sections that hold the most value to the main domain, 301 and rel-canonical the others, and remove the low-value pages that may be causing site-wide damage in search indexing.
When dumping lots of content from the site, point the URLs being dumped at a helpful 404 page that tries to salvage any user hitting it by guiding them back into relevant sections of the site. Also make sure that page has a clear search option, so visitors can search for whatever they originally tried to land on through organic results.
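One implementation note: the helpful 404 page should still return a real 404 status code, not a 200 (a 200 would be treated as a soft 404 and keep the dead URLs in the index). A sketch, assuming Apache and a hypothetical /helpful-404.html page:

```apache
# Sketch, assuming Apache. Serves the custom error page for removed URLs;
# Apache still sends the 404 status code alongside it.
ErrorDocument 404 /helpful-404.html
```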
Finally, once reporting over the coming weeks or months shows indexing improving and those pages dropping out of the SERPs in favour of the redirect targets, you can shut down the old v1 pages without fear of losing any value you had.
It's a lengthy process and a big project, but the client (and site) should see huge value in the time you are taking to manage it. It will maintain value for the site in the long run and help build a better platform going forward.
Cheers!
Related Questions
-
Does a root domain get SEO power from its subdomains?
Hi there! I'd appreciate your help with the following case:
a) The current 10-year-old website (a community) on the root domain "example.com" (250,000 incoming quality backlinks) will move to the new subdomain "newsub.example.com" (301 redirects to the new subdomain for all current subfolders).
b) A new website (a shop) will launch on the root domain "example.com".
Question: Will the new website on "example.com" get SEO power from the old website on "newsub.example.com"? SEO power = link juice/authority/trust/history/etc. from the 250,000 backlinks. What I'm trying to achieve: maintain the built-up SEO power for the root domain "example.com". Thanks for sharing your thoughts on this!
P.S. Plenty has been written about subdomains inheriting from their root domains (so please don't share input on the subdomain vs. subfolder debate), but I can't find satisfactory info about the other way around (root domains inheriting from their subdomains), e.g. whether wikia.com gets SEO power from its subdomains superman.wikia.com, starwars.wikia.com, etc.
Should we host our magazine on a subdomain of E-com site or its own domain?
We host an online fashion magazine on a subdomain of our e-commerce site. The blog is WordPress-based and currently lives on a subdomain, e.g. stylemag.xxxxxxx.com. First question: are all the links from our blog considered internal links? They do not show in the backlink profile. Also, would it be better to host this on its own domain? Second question: is my main URL getting credit for the unique content published to the blog on the subdomain, and is it helping the overall SEO of my website more than if it and the links were hosted on its own wordpress.com site?
Disavowin a sitewide link that has Thousands of subdomains. What do we tell Google?
Hello, I have a hosting company that partnered up with a blog template developer, allowing users to download blog templates that placed my footer links sitewide on their websites. Sitewide links, I know, are frowned upon, which is why I went through a rigorous link audit months ago and emailed every webmaster who made a "WEBSITENAME.blogspot.com" site three times each to remove the links. I'm at a point where I have 1,000 sub-users left on "blogspot.com" domains; I used to have 3,000! Question: when I disavow these links in Webmaster Tools for Google and Bing, should I upload all 1,000 subdomains of "blogspot.com" individually and show Google proof that I emailed all of them, or is it wiser to include just one domain name (www.blogspot.com) so Google sees one big mistake instead of 1,000? This has been on my mind for a year now and I'm open to hearing your intelligent responses.
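For what it's worth, Google's disavow file format can express both options. A sketch (filenames and hostnames hypothetical; whether a single domain: rule cascades to every subdomain is worth verifying against Google's current documentation before relying on it):

```text
# disavow.txt -- lines starting with # are comments.
# Option A: one domain: rule per offending subdomain (1,000 lines here):
domain:websitename1.blogspot.com
domain:websitename2.blogspot.com
# Option B: a single rule for the parent domain. Note this would also
# cover links from every other blogspot.com subdomain, good or bad:
domain:blogspot.com
```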
301 Redirected URL to new subdomain, now the rank appears to be completely gone...
In an attempt to not feel bad for not blogging, I set up a new subdomain on my site to host a "coming soon" style page and a "best of" section for my blog and video show properties. All the pages on the relaunch subdomain are done in Unbounce. http://relaunch.tommy.ismy.name The idea was that I would then take the pages on my regular domain and, one by one, create landing pages that test out new design ideas (instead of going into full production web design), redirecting the traffic from the top-ranked pages to the new, redesigned pages. At first, I set up the 301 through a WordPress plugin, and for the first week or so it was great. As far as I know, I set my canonical tags up properly on that page too. However, just a couple of days ago, I wasn't getting the same traffic, and my top-ranked keyword, which accounts for over half my traffic, is nowhere to be found in at least the first 15 pages of search results. Which stinks, because I'd maintained that rank for well over 2 years 😞 Clearly, something I did wasn't liked by Google, and I wonder: what did I do "wrong", and is there anything I could do to get that rank back?
SEOMOZ crawler is still crawling a subdomain despite disallow
This is for our client with a subdomain. We only want to analyze their main website, as this is the one we want to SEO. The subdomain is not optimized, so we know it's bound to have lots of errors. We added the disallow code when we started and it was working fine: we only saw the errors for the main domain and we were able to fix them. However, just a month ago, the errors and warnings spiked, and the errors we saw were for the subdomain. As far as our web guys are concerned, the disallow code is still there and was not touched: User-agent: rogerbot Disallow: / We would like to know if there's anything we might have unintentionally changed, or something we need to do, so that the SEOMOZ crawler will stop going through the subdomain. Any help is greatly appreciated!
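One thing worth double-checking here: robots.txt is fetched per hostname, so the block only applies to the subdomain if it is served from the subdomain's own file (hostname below is a placeholder):

```text
# Must be reachable at http://subdomain.example.com/robots.txt --
# a rogerbot block in the main domain's robots.txt does not cover the subdomain.
User-agent: rogerbot
Disallow: /
```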
Looking for re-assurance on this one: Sitemap approach for multi-subdomains
Hi All: Just looking for a bit of "yeah it'll be fine" reassurance on this before we go ahead and implement. We've got a main accommodation listing website under www.* and a separate travel content site, on a completely different platform, at blog.* (same domain, different subdomain). We pull snippets of content from blog.* into www.* using a feed, and we have cross-links going both ways, e.g. links to find accommodation in blog articles and links to blog articles from accommodation listings. Look-and-feel wise they're fully integrated; the blog.* site is a tab under the main nav. What I'd like to do is get Google (and others) to view this whole thing as one site, and attribute any SEO benefit of content on blog.* pages to the www.* domain. Make sense? So, done a bit of reading, and here's what I've come up with: separate sitemaps for each, both located in the root of the www site (www.example.com/sitemap-www and www.example.com/sitemap-blog); robots.txt in the root of the www site with a single sitemap entry (sitemap: www.example.com/sitemap-www); robots.txt in the root of the blog site with a single sitemap entry (sitemap: www.example.com/sitemap-blog); submit both sitemaps to Webmaster Tools. Does this sound reasonable? Any better approaches? Anything I'm missing? All input appreciated!
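The plan described reads as standard cross-host sitemap submission. A sketch of the two robots.txt files (example.com as a placeholder for your domain, and sitemap filenames assumed):

```text
# www.example.com/robots.txt
Sitemap: http://www.example.com/sitemap-www.xml

# blog.example.com/robots.txt
# Cross-host submission: per the sitemaps.org protocol, a sitemap hosted
# on www.* may list blog.* URLs when it is referenced from blog.*'s
# own robots.txt, as here.
Sitemap: http://www.example.com/sitemap-blog.xml
```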
Create new subdomain or new site for new Niche Product?
We have an existing large site with strong, relevant traffic, including excellent SEO traffic. The company wants to launch a new business offering, specifically targeted at the "small business" segment. Because the "small business" customer is substantially different from the traditional "large corporation" customer, the company has decided to create a completely independent microsite for the "small business" market. Purely from a Marketing and Communications standpoint, this makes sense. From an SEO perspective, we have 2 options: Create the new "small business" microsite on a subdomain of the existing site, and benefit from the strong domain authority and trust of the existing site. Build the microsite on a separate domain with exact primary keyword match in the domain name. My sense is that option #1 is by far the better option in the short and long run. Am I correct? Thanks in advance!
Robots.txt disallow subdomain
Hi all, I have a development subdomain, which gets copied to the live domain. Because I don't want this dev domain to get crawled, I'd like to implement a robots.txt for this domain only. The problem is that I don't want this robots.txt to disallow the live domain. Is there a way to create a robots.txt for this development subdomain only? Thanks in advance!
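robots.txt is read per hostname, so a blocking file on the dev subdomain never affects the live domain by itself; the risk here is the deploy copying the file across. One common sketch, assuming Apache with mod_rewrite and a hypothetical dev.example.com host, is to serve a blocking file only when robots.txt is requested on the dev hostname:

```apache
# Sketch, assuming Apache/mod_rewrite and a hypothetical dev.example.com host.
RewriteEngine On
# When robots.txt is requested on the dev hostname, serve the
# blocking file robots-dev.txt instead of the normal robots.txt.
RewriteCond %{HTTP_HOST} ^dev\.example\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-dev.txt [L]
```

Here robots-dev.txt would contain just `User-agent: *` and `Disallow: /`; because the same config ships to both hosts, the live domain's robots.txt is untouched when the dev site is copied over.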