I have consolidated my subdomains into subfolders; should I remove the subdomains as well?
-
Hi,
I have consolidated my website's many sub-domains into sub-folders. We had ~20 sub-domains. The sub-domains and the root domain shared the same code base, and sub-domain-specific features were controlled using Apache's SetEnv directive in the httpd.conf includes.
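For context, the old per-sub-domain setup was along these lines (the virtual-host and variable names here are illustrative, not the actual config):

```apache
# httpd.conf include for one sub-domain (names illustrative)
<VirtualHost *:80>
    ServerName sub1.mysite.com
    DocumentRoot /var/www/mysite

    # Feature flags read by the shared code base
    SetEnv SITE_SECTION sub1
    SetEnv ENABLE_FEATURE_X 1
</VirtualHost>
```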
Now they are consolidated. I have redirected all of the sub-domains to the relevant sub-folder, so http://sub1.mysite.com now 301-redirects to http://www.mysite.com/sub1. The redirects happen in the .htaccess file, and all sub-domains and the root still point to the same code base.
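For reference, the .htaccess rules are along these lines (the host names and paths are illustrative, not my exact rules):

```apache
RewriteEngine On

# Send each old sub-domain to its matching sub-folder with a 301
RewriteCond %{HTTP_HOST} ^sub1\.mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/sub1/$1 [R=301,L]
```

One block like this per sub-domain, so the path portion of each old URL is preserved under the new sub-folder.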
However, the Moz campaign tracker still occasionally tells me that I have optimisation opportunities at http://sub1.mysite.com, and both Bing and Google webmaster tools still report traffic to those sub-domains.
Should I delete the sub-domains?
Cheers
-
Robots.txt! That makes sense. I'll do that.
Thanks for your response.
-
If everything is redirecting properly, you shouldn't have an issue, and the subdomain still being present will actually be helpful to any folks with an outdated link or bookmark.
You may want to add noindex to the pages on those subdomains, or block them with robots.txt. Over time, that would get them out of the search results in favor of the new, up-to-date subfolders. Removing them, though, isn't necessary.
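If you go the robots.txt route, note that robots.txt is per-host, so each old subdomain would need to serve its own file (which also means excluding /robots.txt from the redirect rules), e.g. at http://sub1.mysite.com/robots.txt:

```
User-agent: *
Disallow: /
```

One caveat: a robots.txt block also stops crawlers from following the 301s, so if the redirects are already in place, noindex or simply letting the redirects do their work may clear the old URLs out of the index faster.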
-
Hi Jimmy,
Thanks for responding and for the advice. When I hit the old sub-domain with Screaming Frog I get a 301 Moved Permanently and nothing further is crawled. I crawled a number of the old URLs with Screaming Frog and they are all redirecting correctly with 301s. It looks like there is no duplicate content, but I am still a bit uneasy that the old sub-domain shows up in Moz and the webmaster tools.
-
Hi,
To avoid any duplicate content, not having the subdomains is the safest approach.
Check the subdomains with something like Screaming Frog to see what is returned; if any content is still being served directly rather than redirecting, you will need to redirect it.
Kind Regards
Jimmy