We switched the domain from www.blog.domain.com to domain.com/blog.
-
We switched the domain from www.blog.domain.com to domain.com/blog. We did this so that backlinks would benefit our main website as well as our blog. Since the move, our organic traffic has dropped sharply and we appear to have lost the value of our backlinks. All old URLs are being redirected with a 301 code. Kindly suggest changes to bring back all the traffic.
-
Hi Arun
It's certainly best practice to move to a directory on the root domain. As you say, visitors then come to one domain, not a subdomain. All you need to do is redirect page by page with 301s. When you say the old URLs are being 'redirected to 301 code', that is perfectly OK: the 301 status code just tells Google that the page has moved permanently.
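For example, on an Apache server the page-by-page rules could look something like the sketch below. The paths and domain are placeholders, and the catch-all variant assumes mod_rewrite is available, so adapt it to your own setup:

```apache
# Page-by-page 301s (mod_alias); these paths are hypothetical examples
Redirect 301 /my-first-post https://domain.com/blog/my-first-post
Redirect 301 /category/old-post https://domain.com/blog/old-post

# Or, if every old URL keeps the same path under /blog, one blanket
# mod_rewrite rule on the old subdomain can handle everything instead:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?blog\.domain\.com$ [NC]
RewriteRule ^(.*)$ https://domain.com/blog/$1 [R=301,L]
```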
It takes Google a short while to recognise the new pages as replacements for the old ones, and during that period you can see old and new URLs in Google, causing a short spell of duplication which could affect the rankings.
You just need to sit it out - by all means, do a Fetch in Search Console to help speed up the process.
Search Console > Crawl > Fetch as Google.
Regards
Nigel
-
Let's break your case into parts so the problem is easier to tackle.
From my point of view, you made the right choice. But like any choice, it needs some level of preparation.
(This is a personal opinion based on my experience and knowledge. You probably already know some or all of the tools and processes I mention, but I prefer to spell them out rather than assume you know them.)
1- Subdomains are Essentially a Different Website
When you use the blog.website.com subdomain solution, you are essentially setting up an entirely different website. And while it is true that Google will crawl and index both of them, you are limiting the full potential of your online marketing efforts.
When you separate your website and blog, you create two separate entities that each need your attention. And with engagement signals like time on site and bounce rate widely thought to contribute to your website rankings, you don't want users spending their time on pages that Google sees as a different domain.
When your blog and website are properly integrated, on the other hand, Google will see that the traffic to your website as a whole continues to go up. This, to Google, translates as a website that has some obvious authority and deserves higher rankings.
As long as you keep your blog in a subdirectory or subfolder, it will keep the Google bots coming to your main website to recrawl and index your site over and over.
2- SEO Considerations For Any Website Migration
In my case, to ensure that any website migration goes smoothly and leads to improved business, I follow these essential recommendations. To improve the user experience of your website, make sure you're putting your analytics data to use by reviewing:
- Top-viewed website content – Make sure you aren't cutting content your audience loves.
- Least-viewed website content – Even the best sites have some junk; take this opportunity to drop it or improve it.
- Click maps – Looking at where people are clicking (or trying to click) can help you design an intuitive, frustration-free navigation interface.
- Paths to conversion – Whatever your website goals are (e.g. building subscribers, generating leads), understanding the paths visitors take to key conversion points helps you optimize those paths, making it easier and more enticing for visitors to convert into customers.
All of this data should be available in your web analytics tool (Google Analytics, for example).
Map URL Redirects
If your website has been around for any amount of time, there's a good chance you've built up search equity in the form of links and social shares. Along with tight keyword optimization, these are the primary factors that increase the visibility of your content in search engines. And since they are tied to the URLs on your site, a migration in domain or URL structure can snuff out the valuable search equity you've spent time and effort building.
To avoid starting from SEO square one with your new website, it’s important to strategically implement 301 redirects from your old page URLs to the new ones, as this will effectively tell search engines where your new site pages are and that they are replacements for the old versions. In addition, it will ensure that people and bots who follow links to your old URLs will end up in the right place rather than an error page.
In order to map redirects effectively, start by documenting the following for all of your existing pages:
- URL
- Page topic
- Target keyword
- Organic search traffic (I recommend looking at a minimum of 6 months of data)
- Links to page
- Keyword rank
Also document for your planned new site pages:
- URL
- Page topic
- Target keyword
Once you have these two lists compiled, the next step is to map each page on your current site to its planned new location on your soon-to-be-launched site. Redirect mapping isn't rocket science, but it does take some thought (when done correctly). Fortunately, the previous exercise should give you all the information you need.
Of primary concern is topic relevance, particularly for highly trafficked and linked-to pages. When planning redirects, always consider what the experience would be for a visitor who lands on the redirect's target page rather than the original. Would it serve their needs as well as, or better than, the old page? Would it feel confusing? Ideally, the transition to the new page should be so seamless that people don't even notice the switch.
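If your page lists are long, a small script can take a first pass at the mapping for you. Here's a rough sketch in Python, assuming you've exported the two lists above as CSVs with hypothetical column names (url, topic, keyword); every match still deserves a human sanity check, especially for the highly trafficked pages:

```python
import csv

# Hypothetical inputs: the two lists documented above, exported as CSVs.
# old_pages.csv columns: url, topic, keyword (plus traffic, links, rank)
# new_pages.csv columns: url, topic, keyword
with open("old_pages.csv", newline="") as f:
    old_pages = list(csv.DictReader(f))
with open("new_pages.csv", newline="") as f:
    new_pages = list(csv.DictReader(f))

# Index the new pages by target keyword, with page topic as a fallback.
by_keyword = {p["keyword"].lower(): p["url"] for p in new_pages}
by_topic = {p["topic"].lower(): p["url"] for p in new_pages}

redirect_map, unmatched = [], []
for page in old_pages:
    target = (by_keyword.get(page["keyword"].lower())
              or by_topic.get(page["topic"].lower()))
    if target:
        redirect_map.append((page["url"], target))
    else:
        unmatched.append(page["url"])  # needs a manual mapping decision

with open("redirect_map.csv", "w", newline="") as f:
    csv.writer(f).writerows(redirect_map)

print(f"{len(redirect_map)} pages mapped, {len(unmatched)} need manual review")
```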
Redirect mapping tools:
- Open Site Explorer – Links and social shares
- Google Analytics – Traffic
- SEMRush – Keyword rankings
- Microsoft Excel
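Once the redirects are live, it's also worth confirming that each old URL really returns a 301 and lands on the page you mapped it to. A minimal sketch, assuming the requests library is installed and using the redirect_map.csv format from the mapping step above:

```python
import csv
import requests

# Hypothetical input: redirect_map.csv rows of (old_url, expected_new_url)
with open("redirect_map.csv", newline="") as f:
    pairs = list(csv.reader(f))

for old_url, expected in pairs:
    # Don't follow the redirect automatically; we want to see the raw
    # status code and Location header that search engine bots will see.
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code != 301:
        print(f"NOT A 301 ({resp.status_code}): {old_url}")
    elif location.rstrip("/") != expected.rstrip("/"):
        print(f"WRONG TARGET: {old_url} -> {location}")
```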
Choose Ideal Timing
Even the best planned and executed website migrations come with some downtime and a temporary decrease in traffic (approx. 30%) and search rankings. It's a price worth paying, as a new and improved website can drive significant improvements in business over an outdated and clunky site. However, it's important to time the transition for when it's likely to have the least amount of negative impact on your business.

The best time of year to implement a website migration is when business is likely to be the slowest. Companies vary in the degree of seasonality they experience, but most have a 'slow season'. You probably already know when this is, but if not, take a look at your historic yearly web traffic or revenue patterns to determine when your slow season typically occurs.
As with time of year, it also makes sense to migrate your site on a slow day of the week during off hours. For many B2B focused websites, this is late on Friday or Saturday, but make sure to make the decision based on your own analytics, as every site and audience is different.
Analytics
As mentioned earlier, a temporary decrease of approximately 30% in website search traffic and visibility can be expected in the period immediately following a migration, but it's very important to monitor closely to make sure it is indeed temporary and that things are headed in the right direction. Make sure to keep a close eye on:
- Organic search traffic
- Visit bounce rate
- Conversion rates
- Keyword rankings
Crawl Errors
Generally, crawl errors like broken links, 404 Not Found pages, or duplicate content will be at their lowest levels on a brand new site, but it's still important to check and fix any errors, especially as this can be an indicator of a mistake during the migration. There are many good automated crawl tools available, but make sure you use one that can find the following (for a quick do-it-yourself check, see the sketch after this list):
- Broken links and 4xx client error pages (especially 404s)
- 5xx server error pages
- Duplicate content
- Inaccessible content
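A dedicated crawl tool will cover all of these more thoroughly, but a quick same-site crawl like the sketch below can catch the obvious breakage (broken links, 4xx and 5xx pages) right after launch. It assumes the requests and beautifulsoup4 libraries are installed, and the start URL is a placeholder:

```python
from urllib.parse import urljoin, urlparse, urldefrag

import requests
from bs4 import BeautifulSoup

START = "https://domain.com/"  # placeholder: your new site's homepage
host = urlparse(START).netloc

seen, queue = set(), [START]
while queue:
    url = queue.pop()
    if url in seen or urlparse(url).netloc != host:
        continue  # stay on our own domain and skip revisits
    seen.add(url)
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        print(f"FAILED: {url} ({exc})")
        continue
    if resp.status_code >= 400:
        print(resp.status_code, url)  # 4xx broken page or 5xx server error
        continue
    if "text/html" not in resp.headers.get("Content-Type", ""):
        continue  # only parse HTML pages for further links
    for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
        link, _ = urldefrag(urljoin(url, a["href"]))  # drop #fragments
        queue.append(link)
```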
In Summary
A website migration may seem like a lot of work, and it most certainly is (when done correctly). But the potential payoffs in an improved experience for your site visitors and increased business for you are more than worth the investment.