I have consolidated my subdomains into subfolders; should I remove the subdomains as well?
-
Hi,
I have consolidated my website's many subdomains into subfolders. We had ~20 subdomains. The subdomains and the root domain shared the same codebase, and subdomain-specific features were controlled using Apache's SetEnv directive in the httpd.conf includes.
Now they are consolidated. I have redirected all of the subdomains to the relevant subfolder, so http://sub1.mysite.com now 301 redirects to http://www.mysite.com/sub1. The redirects happen in the .htaccess file, and all subdomains and the root still point to the same codebase.
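For reference, each redirect is a mod_rewrite rule along these lines (a sketch only; the real host and folder names differ):

    # One host-to-subfolder redirect in the shared .htaccess,
    # repeated once per subdomain (~20 blocks in total).
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^sub1\.mysite\.com$ [NC]
    RewriteRule ^(.*)$ http://www.mysite.com/sub1/$1 [R=301,L]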
However, the Moz campaign tracker still occasionally tells me that I have optimisation opportunities at http://sub1.mysite.com, and both Bing and Google webmaster tools still report traffic to those subdomains.
Should I delete the subdomains?
Cheers
-
Robots.txt! That makes sense. I'll do that.
Thanks for your response.
-
If everything is redirecting properly, you shouldn't have an issue, and the subdomain still being present will actually be helpful to any folks with an outdated link or bookmark.
You may want to add noindex to the pages on those subdomains, or block them with robots.txt. Over time, that would get them out of the search results in favor of the new, up-to-date subfolders. Removing them, though, isn't necessary.
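Since all of the hosts share one codebase, the robots.txt would have to be chosen per host. A minimal sketch of that in Apache, assuming mod_rewrite (the filename robots-disallow.txt is made up), placed above the 301 rules so the robots.txt request isn't itself redirected:

    # Serve a blocking robots.txt only on the old subdomains.
    # Keep this above the redirect rules in .htaccess.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www\.mysite\.com$ [NC]
    RewriteRule ^robots\.txt$ robots-disallow.txt [L]

where robots-disallow.txt contains just:

    User-agent: *
    Disallow: /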
-
Hi Jimmy,
Thanks for responding and for the advice. When I hit the old subdomain with Screaming Frog, I get a 301 Moved Permanently and nothing further is crawled. I crawled a number of the old URLs with Screaming Frog and they are all redirecting correctly with 301s. It looks like there is no duplicate content, but I am still a bit uneasy that the old subdomains show up in Moz and the webmaster tools.
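For a quick spot check outside Screaming Frog, a header-only request shows the same thing (the URL here is just an example):

    # Fetch only the response headers for one old URL.
    curl -I http://sub1.mysite.com/some-page
    # Expect: HTTP/1.1 301 Moved Permanently
    #         Location: http://www.mysite.com/sub1/some-page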
-
Hi,
To avoid any duplicate content, removing the subdomains is the safest option.
Check the subdomains with something like Screaming Frog to see what is returned; if any content is still being served directly, you will need to redirect it.
Kind Regards
Jimmy