Merge 2 websites into one, using a brand-new domain.
-
I need to merge https://www.WebsiteA.com and https://www.WebsiteB.com into a fresh new domain with no existing content, https://www.WebsiteC.com. I want to do it in the way that best preserves the existing SEO juice.
Website A is the company's home page and is built with WordPress.
Website B is the company's product site and is built with WordPress.
Website C will be the new site containing both Website A and Website B, also on WordPress.
What is the best way to do this? I have researched a lot and keep hitting walls on how to do it.
It's a little trickier because it's two different domains moving to a brand-new domain.
Thanks
-
Thanks Andy!
I'm still a little confused about how I will copy the data over from Site A to Site C. Is it just a matter of creating an empty WordPress shell with no template and copying the data from Site A into it? (And how is that usually done?) Then would I redirect all of Site A's pages to the matching pages on Site C, with only the domain changed? E.g. www.SiteA.com/large-dogs redirects to www.SiteC.com/large-dogs.
I don't want all of Site B either - just about 10 pages from it. Would I manually copy those over as well, and how would that be done?
Thanks
-
I did just this type of thing a little over a year ago and organic traffic is up over 300% now. We made the change mainly to improve the structure of the website(s), with more logical organization and better internal linking. We did make the move all at once (thousands of pages), but it took a lot of behind-the-scenes planning to be ready for that.
First came the decisions about what sections and categories made sense for our site. (Using the URL structure to guide users around the site makes it easier for them to find what they are looking for and interlinking between related posts as appropriate is also good—and this helps a lot with search engines.)
Then came the organization of posts into their new categories. To make things easier, we kept the individual path names the same (so www.siteA.com/old-category/old-post-string became www.siteC.com/new-category/old-post-string) and uploaded them into their new categories when the time came.
We also used this time to do a limited content review (posts with the most traffic), and we updated a lot of these. We made the choice to keep most of our old posts, even though in our market they can get outdated quickly, to conserve any links we may have acquired. (The main site that we were redirecting to the new site was pretty old and had picked up a lot of links over time.)
We could have done a more complete content review before the changeover, but in part we wanted to see how these posts did under the new structure—we did get renewed life out of some of them, and we further updated and optimized those.
In conjunction with the export of the old sites to the new one, we made sure to 301 redirect all of the old posts to their counterparts on the new site. For the posts we chose not to bring over, we 301 redirected them to a related post in the same category.
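To make that concrete, here is roughly what those page-level 301s could look like in an old site's .htaccess, assuming the old sites run on Apache - the paths below are made-up examples, not our real URLs:

# .htaccess on the OLD domain (e.g. www.sitea.com) - example paths only
# Posts that moved: send each old URL to its counterpart on the new site
Redirect 301 /old-category/large-dogs https://www.sitec.com/new-category/large-dogs
Redirect 301 /old-category/small-dogs https://www.sitec.com/new-category/small-dogs
# Posts we chose not to bring over: send them to a related post in the same category
Redirect 301 /old-category/retired-post https://www.sitec.com/new-category/large-dogs

A redirect plugin on the old WordPress installs (or the equivalent rules in nginx) does the same job if you would rather not edit .htaccess directly.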
We still occasionally come across things that need to be fixed - old posts that need redirecting/updating, or 404 errors that need to be tracked down (one big issue we found was that a lot of old pages had hard-coded links pointing at the old websites' root domains, causing a bunch of nasty internal not-found errors - not good!) - but overall we are happy with the change. (Up 308%!)
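For tracking down those hard-coded internal links in bulk, one option - assuming you have WP-CLI on the new server; the domains below are placeholders - is a database search-replace, dry-run first:

# Preview what would change before touching anything
wp search-replace 'https://www.sitea.com' 'https://www.sitec.com' --dry-run
# Run for real once the report looks right, then repeat for the second old domain
wp search-replace 'https://www.sitea.com' 'https://www.sitec.com'
wp search-replace 'https://www.siteb.com' 'https://www.sitec.com'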
-
Hi,
One way to do this is to decide which site is going to be the main site (Site A) sitting at the root of Site C, and copy that data over. If you are bringing in Site B, it can then sit at another level of the structure - you will end up with this...
Site A --> Site C, main pages
Site B --> Site C, product pages

That then brings in everything from both sites to the new domain.
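As for actually copying the content across, the usual route is the built-in WordPress exporter and importer (Tools > Export on the old sites, Tools > Import on Site C), or the same thing from the command line if you have WP-CLI. A rough sketch - the paths are placeholders:

# On Site A's install: export all content to WXR (XML) files
wp export --dir=/tmp/site-a-export
# On Site C's install: add the importer plugin, then pull the files in
wp plugin install wordpress-importer --activate
wp import /tmp/site-a-export/*.xml --authors=create

Themes, plugins and their settings don't travel with the export, so Site C needs to be set up separately before you import.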
You then want to redirect both of the old sites to the new one, but don't redirect everything to the root. That isn't a good use of 301 mapping. You need to be mapping on a page level so that you will see...
www.sitea.com/about-us -301-> www.sitec.com/aboutus
www.siteb.com/newproducts/hammers -301-> www.sitec.com/newproducts/hammers

There can be differences in the URLs - you don't need to stick with the same structure as the old sites if it doesn't make sense, but always map each page to something very similar.
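If the old sites sit on Apache, those mappings can be a handful of mod_rewrite rules in each old domain's .htaccess. A minimal sketch using the example URLs above (one block per old domain, shown together here):

# On www.sitea.com - the slug changes, so it gets an explicit rule
RewriteEngine On
RewriteRule ^about-us/?$ https://www.sitec.com/aboutus [R=301,L]

# On www.siteb.com - a section that keeps its structure can be mapped with one pattern
RewriteEngine On
RewriteRule ^newproducts/(.+)$ https://www.sitec.com/newproducts/$1 [R=301,L]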
Page level is the only way to go if you want to maintain a seamless transition for users as well.
Also, don't expect to flip a switch and do this all at once. You can do this over a period of time, because users will simply be redirected to the new pages as you go. You will retain more link juice like this.
This is quite a lengthy process and I am sure I have missed the in-between bits, but this is the basis of what you want to be doing.
Others might chip in with other suggestions for you.
-Andy