Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Turning off a subdomain
-
Hi! I'm currently working with http://www.muchbetteradventures.com/. They have a previous version of the site, http://v1.muchbetteradventures.com, as a subdomain on their site. I've noticed a whole bunch of indexing issues which I think are caused by this. The v1 site has several thousand pages and ranks organically for a number of terms, but the pages are not relevant for the business at this time. The main site has just over 100 pages, yet more than 28,400 URLs are currently indexed.
We are considering turning off the v1 site and noindexing it. There are no real backlinks to it. The only worry is that removing it will be seen as a massive drop in content. Rankings for the main site are currently quite poor, despite good content, a decent link profile and high domain authority.
Any thoughts would be much appreciated!
-
This is a fantastic answer, thank you.
-
Sounds like the indexing issues are causing some drops in ranking, even though good content and domain authority are present.
Also, the v1 site looks to be a testing platform - could that be possible? I recently had an issue with an enterprise client site with very similar symptoms: multiple testing versions of the domain were live and indexable, causing massive amounts of duplicate content and indexing issues.
I would plan to assess any content that could be migrated over to the main site from v1, and 301 redirect (and rel-canonical) the old v1 site pages. Keep those in place for a few months to ensure the full value of the 301s takes effect.
When migrating some (or all) of this valuable content over, make sure you use properly executed 301 redirects, and to take it a step further, apply a canonical tag on the v1 pages pointing to the corresponding pages on the main domain. This way, we know for sure all the value is being passed.
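A minimal sketch of that old-to-new mapping (the domains and paths below are placeholders, not the actual site; in practice the v1 URL list would come from a crawl or server logs). Each v1 URL gets a single 301 target on the main domain, and that same target doubles as the canonical URL for the old page:

```python
from urllib.parse import urlparse, urlunparse

V1_HOST = "v1.example.com"      # hypothetical old subdomain
MAIN_HOST = "www.example.com"   # hypothetical main domain

def redirect_target(v1_url, overrides=None):
    """Map a v1 URL to its 301 target (and canonical) on the main domain.

    `overrides` holds hand-mapped pages whose path changed during the
    migration; every other URL keeps its original path.
    """
    parts = urlparse(v1_url)
    new_path = parts.path
    if overrides and parts.path in overrides:
        new_path = overrides[parts.path]
    return urlunparse(("https", MAIN_HOST, new_path, "", "", ""))

def build_redirect_map(v1_urls, overrides=None):
    """Return {old URL: new URL} for the whole v1 site."""
    return {u: redirect_target(u, overrides) for u in v1_urls}
```

In a real migration, `overrides` would be filled in from the content audit, so pages whose closest equivalent lives at a different path still point somewhere relevant instead of the homepage.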
SIDE NOTE: Having that many pages indexed doesn't mean the site will do well. In fact, with this poor setup, the site's massive number of URLs might be causing more damage. Too many pages with bad page quality scores can and will bring a site down. Plan to migrate the pages or sections of the site that hold the most value to the main domain, 301 and rel-canonical the others, and remove the bad pages with little to no value that may be causing site-wide damage in search indexing.
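One way to make that triage concrete is to score every page and bucket it. The scoring metric (organic sessions, inlinks, conversions) and the thresholds below are illustrative assumptions to be tuned per site, not fixed rules:

```python
def triage_pages(page_values, migrate_min=100, redirect_min=10):
    """Split {url: value score} into migrate / redirect / remove buckets.

    The value score could be organic sessions, inlinks, or conversions
    per page; the thresholds here are placeholders.
    """
    migrate, redirect, remove = [], [], []
    for url, value in sorted(page_values.items()):
        if value >= migrate_min:
            migrate.append(url)    # worth moving to the main domain
        elif value >= redirect_min:
            redirect.append(url)   # 301 to the closest main-site page
        else:
            remove.append(url)     # thin pages dragging quality down
    return migrate, redirect, remove
```

The output of the "remove" bucket feeds straight into the next step: those are the URLs to drop rather than redirect.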
When dumping lots of content from the site, send the removed URLs to a helpful 404 page, which will try to salvage any user hitting it by pointing them back into relevant sections of the site. Also, make sure that page has a clear, useful search option so visitors can look for whatever they originally tried to land on through organic search.
Finally, once your reporting shows indexing improving and the redirects reflected in the SERPs over the coming weeks or months, you can shut down the old v1 pages without fear of losing any value you had.
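Before that final shutdown, it's also worth confirming the redirect map contains no chains (A to B to C) or loops (A to B to A), since extra hops dilute the value being passed. A small offline check over the mapping - no live HTTP involved, and the URLs in the test are placeholders:

```python
def find_redirect_problems(mapping):
    """Flag chains and loops in a {source: destination} redirect map.

    Every destination should be a final URL, i.e. never itself a source.
    Note a loop is reported once per URL that participates in it.
    """
    chains, loops = [], []
    for src, dst in mapping.items():
        seen = {src}
        hops = [src]
        while dst in mapping:       # destination redirects again
            if dst in seen:
                loops.append(hops + [dst])
                break
            seen.add(dst)
            hops.append(dst)
            dst = mapping[dst]
        else:                       # no break: path ended at a final URL
            if len(hops) > 1:
                chains.append(hops + [dst])
    return chains, loops
```

Any chain it finds should be collapsed so the old URL points directly at the final destination before the v1 site is switched off.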
It's a lengthy process and a big project, but the client (and site) should see huge value in the time you are taking to manage it. It will maintain value for the site in the long run and help build a better platform going forward.
Cheers!