Best practices when merging two domains with different themes and CMSes?
-
I have a client with 2 sites - one for an external audience and one for their ~2,000-3,000 employees. The external site (call it acme.com), built on WP with a custom theme, is pretty small. The internal site (call it acmeinternal.com) has TONS of high quality content with incredible engagement metrics, but it's built on a separate CMS with an entirely different custom theme.
The problem we're trying to solve now: Can we bring the internal site over to the external domain (acme.com and acme.com/internal, for example) so that acme.com can benefit from the quantity and quality of content and behavioral metrics associated with the internal content?
The external and internal audiences, and the corresponding content for each, are both entirely mutually exclusive. A potential client of theirs who would come to acme.com would have no reason to visit acme.com/internal (we'd actually prefer to not provide navigation to it for them), and the internal audience would treat acme.com/internal as their landing page, and all the posts would then live at acme.com/internal/news/post-name.
I'm assuming there are reasons why we couldn't have half of the site on one template using one CMS, with certain SEO tags, a certain HTML structure, etc., while the other half of the site uses a completely different template on a different CMS, with different SEO tags, a different URL structure, etc.? To reap the reward of the great content, would we have to essentially recreate the internal site's content on the external site's CMS and template? Is it even possible for the domain authority of acme.com to improve based on the engagement on acme.com/internal/xxxx if there's virtually zero linking back and forth between acme.com and /internal/?
Any advice would be much appreciated!
-
No problem Alex, always happy to help!
For acme.com to benefit from the content migration, it will need to be linked to in some way, preferably from the nav. Perhaps this could be done with a less obvious link in the header, similar to where you'd expect to find a Login link. Without a link to the content, search engines have no way of finding it.
This link can be external but strong internal linking practices are important too. Moz does a great job of covering this here.
In terms of improving overall site strength, it will help. As you said, the engagement metrics will send some positive signals that people actually like the site, but more significantly, it adds a lot of niche-specific, high-quality content to the site that helps paint the picture of exactly what you do.
-
Thanks Chris, I appreciate the thorough response!
As for the UX concern, I was actually in the process of updating the question to address this when your response came in. The external and internal audiences, and the corresponding content for each, are both entirely mutually exclusive. A potential customer of theirs who would come to acme.com would have no reason to visit acme.com/internal (we'd actually prefer to not provide navigation to it for them), and the internal audience would treat acme.com/internal as their landing page, and all the posts (in the thousands) would then live at acme.com/internal/news/post-name, as an example.
Your example of a domain being split between an ecomm platform and WP for the blog is great; I've worked on sites like that in the past but didn't even think of it here for some reason. I guess part of my concern also comes from wondering if it's even possible for the domain authority of acme.com to improve based on the engagement on acme.com/internal/xxxx if there's absolutely zero linking back and forth between acme.com and /internal/?
-
Hi Alex,
You can run two different CMSes on the same domain and, from what I've seen, it doesn't really have any impact on your ranking ability if it's handled correctly. I've certainly seen plenty of websites that use one CMS to handle their products and landing pages and WordPress for their /blog section, but in all cases a deliberate effort has been made to keep them looking identical.
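In case it helps to picture it, the usual way to run two CMSes under one domain is a reverse proxy that routes by path. Here's a minimal nginx sketch of that idea — the backend ports and the /internal path are assumptions for illustration, not details from your setup:

```nginx
# acme.com - one domain, two CMS backends, routed by path
server {
    listen 80;
    server_name acme.com;

    # Everything under /internal/ is served by the internal CMS (assumed port)
    location /internal/ {
        proxy_pass http://127.0.0.1:8081;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $remote_addr;
    }

    # Everything else is served by WordPress (assumed port)
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $remote_addr;
    }
}
```

To search engines this all looks like one site, which is exactly why the domain-level benefit can accrue.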
The biggest problem I see with that plan is that it sounds like you want the two different sections to look like two sites. While technically I suppose this would be ok, it would make for a confusing user experience!
Changing up the aesthetics slightly isn't a huge deal, you can use that to distinguish different parts of your website if that works for the UX. Moz actually does this, if you look at their Home Page, Blog and About Page you'll see that the primary nav stays exactly the same but the colour scheme and layout is slightly different on each.
Either way, I see there being two options and both involve a bit of work:
1. Update the aesthetics of your acmeinternal.com site before combining them so the UX is seamless
2. Migrate the content from one site to the other. Which direction you push it should depend on which is the most engaging in terms of aesthetics and usability. I would expect that moving your blog/article content from acmeinternal.com to acme.com would be mostly copy/paste and some basic style tweaks; it just depends on the volume you're talking about. Doing that for 20 posts makes it an attractive solution, but if you've accumulated 2,000... not so much. There are always options like Fiverr at your disposal for this sort of grunt work.
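At that volume, the copy/paste step is also scriptable. If the internal CMS can export its posts to something like JSON, you can push them into WordPress through its standard REST API. A rough stdlib-only sketch — the export field names, user, and Application Password here are assumptions you'd adapt:

```python
import base64
import json
import urllib.request

# Standard WordPress REST endpoint; auth via an Application Password (assumed setup)
WP_API = "https://acme.com/wp-json/wp/v2/posts"
WP_AUTH = base64.b64encode(b"migration-user:app-password").decode()

def to_wp_payload(post):
    """Map one exported post (field names are assumptions) to a WP REST payload."""
    return {
        "title": post["title"],
        "content": post["body_html"],
        "slug": post["slug"],  # keeps acme.com/internal/news/post-name stable
        "status": "publish",
        "date": post.get("published_at"),
    }

def migrate(export_path):
    """POST every post in a JSON export to WordPress, one request per post."""
    with open(export_path) as f:
        posts = json.load(f)
    for post in posts:
        req = urllib.request.Request(
            WP_API,
            data=json.dumps(to_wp_payload(post)).encode(),
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Basic {WP_AUTH}",
            },
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            print(f"migrated {post['slug']}: HTTP {resp.status}")

# Usage: migrate("internal_export.json")
```

Keeping the slugs identical to the old site makes the redirect mapping afterwards almost mechanical.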
The final point to be cautious of is your internal linking. Just make sure you spend a good amount of time mapping out your new site structure and be meticulous about its application, testing it with something like Screaming Frog when you're done. In the end, every item should return a 200 status: no links pointing to 301s, and obviously no 404s.
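You can also spot-check that "everything is a 200" rule with a small script: fetch each URL from your mapped structure without following redirects and bucket the result. A minimal stdlib sketch (the URLs in the usage comment are placeholders):

```python
import urllib.error
import urllib.request

def classify(status):
    """Bucket an HTTP status code the way the post-migration audit cares about it."""
    if status == 200:
        return "ok"
    if status in (301, 302, 307, 308):
        return "redirect"  # internal links should never point at these
    if status == 404:
        return "broken"
    return "other"

def check_status(url):
    """Return the raw status code for url, without following redirects."""
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, *args, **kwargs):
            return None  # suppress redirect handling so we see the 3xx itself

    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as e:
        return e.code

def audit(urls):
    return {url: classify(check_status(url)) for url in urls}

# Usage: audit(["https://acme.com/", "https://acme.com/internal/news/post-name/"])
```

It's no replacement for a full Screaming Frog crawl, but it's handy for re-testing a known list of URLs after each round of fixes.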
EDIT: I forgot to drop some helpful resources into this one for the migration - these posts have been helpful for us in the past.
- Website Migration Guide - Tips For SEOs
- Achieving an SEO-Friendly Domain Migration - The Infographic
- Domain Migrations: Surviving the "Perfect Storm" of Site Changes
- 301 Redirects - Migrating a New Site From Development To Live
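One more sketch on the redirect side: once the content lives under acme.com/internal, every old acmeinternal.com URL should 301 to its new home. Assuming the old host runs Apache (adjust for whatever it actually runs), a catch-all rewrite like this preserves the path:

```apache
# On the old acmeinternal.com host: send every path to its new location
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?acmeinternal\.com$ [NC]
RewriteRule ^(.*)$ https://acme.com/internal/$1 [R=301,L]
```

This only works cleanly if the slugs were kept identical during the migration; otherwise you'd need an explicit one-to-one redirect map instead of a single pattern.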