Domain migration strategy
-
Imagine you have a large site on an aged and authoritative domain.
For commercial reasons the site has to be moved to a new domain, and in the process it's going to be revamped significantly. Obviously not an ideal starting scenario to be biting off so much all at once, but it's unavoidable.
The plan is to run the new site in beta for about 4 weeks, giving users the opportunity to play with it and provide feedback. After that there will be a hard cutover, with all URLs permanently redirected to the new domain.
The hard cutover is necessary for business continuity reasons, and because of the real complexity of trying to maintain a complex UI and client reporting across multiple domains. Of course we'll endeavour to mitigate the impact of the change by telling Google about it in Webmaster Central and monitoring crawl errors, etc.
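For reference, the cutover itself boils down to a site-wide, path-preserving 301. A minimal sketch, assuming the old site runs on Apache with mod_rewrite (the domain names are placeholders):

```apache
# .htaccess on the old domain, enabled only at cutover
# olddomain.com / newdomain.com are placeholders for the real domains
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
```

If URLs are also changing as part of the revamp, each old URL needs its own rule (or a rewrite map) pointing at its specific new equivalent rather than a blanket host swap.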
My question is whether we should allow the new site to be indexed during the beta period?
My gut feeling is yes for the following reasons:
- It's only 4 weeks, and until such time as we start redirecting the old site the new domain won't have much whuffie, so there's next to no chance it will rank for much of anything.
- It gives Googlebot a head start on indexing a lot of URLs, so they won't all be new when we cut over the redirects.
Is that sound reasoning? Is the duplication during that 4 week beta period likely to have some negative impact that I am underestimating?
-
I wouldn't sweat it. We left up www.bulwarkpest.com for several months while moving to www.bulwarkpestcontrol.com .... I know that there is some risk in it. But I think Google is pretty understanding of site migrations. Of course I am just a small pest control guy so they may not have ever noticed. Sooo.. take that with a grain of salt.
It does make it easier to have the other site live so that you can redirect on a per-page basis and know that it's working. I would rather make sure the redirects are correct and working prior to moving the entire site over. But be warned... site redirects may not always give you the same authority; research what happened to the online Yellow Pages.
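To make "check the redirects are correct and working" concrete, here's a minimal per-page check, a sketch in Python using only the standard library (the URL pairs are placeholders for whatever old-to-new mapping you actually have):

```python
import http.client
from urllib.parse import urlparse

# Placeholder mapping: old URL -> the new URL it should 301 to.
URL_MAP = {
    "http://www.olddomain.com/": "http://www.newdomain.com/",
    "http://www.olddomain.com/widgets/blue-widget/": "http://www.newdomain.com/widgets/blue-widget/",
}

def first_hop(url):
    """Request url without following redirects; return (status code, Location header)."""
    parts = urlparse(url)
    conn_cls = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
    conn = conn_cls(parts.netloc)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

for old_url, expected in URL_MAP.items():
    status, location = first_hop(old_url)
    ok = (status == 301 and location == expected)
    print(f"{'OK  ' if ok else 'FAIL'} {old_url} -> {status} {location}")
```

Checking for a 301 specifically (rather than just "it redirects somewhere") matters, since 302s and multi-hop chains are exactly the sort of thing that slips through in a full-site move.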
-
My assessment of the risk goes up considerably if this is a directory rather than a site with original content articles.
-
Hi Aran, thanks for your response.
My thinking has also evolved a bit and I'm now thinking we ought to exclude the new site until we're ready to cut over as @EGOL suggested.
The critical info I didn't mention before is that there are important client ROI and reporting reasons why we need the current site to keep performing right up until the cutover, at which point the 301s will be implemented. The cross-domain canonical would address the duplication, but it would also start to devalue the current pages prematurely.
The thing that I was underestimating before was the negative impression that the new domain would give Google when it suddenly appeared with 1M+ pages of duplicate content plus no real link profile of its own (until we implement the 301s)...all the hallmarks of a scraper.
Better I think to avoid this by excluding the beta until we cut over, and make sure we prep well for that.
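If we go that route, the simple way to keep the beta out of the index is a robots meta tag on every page of the new domain until cutover; a sketch (nothing here is specific to our setup):

```html
<!-- In the <head> of every page on the beta domain, removed at cutover -->
<meta name="robots" content="noindex" />
```

The thing to be careful of is remembering to strip it at cutover; otherwise the 301s would be pointing at pages Google has been told not to index.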
-
Agreed, though Charles could use canonical tags to tell Google that the new pages are the authoritative versions. These may take a while to be picked up, but they should prevent any detrimental effects from the duplicate content.
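For completeness, a cross-domain canonical is just a link element in the head of a page pointing at the URL that should be treated as the authoritative version; in this scenario, each old page pointing at its new equivalent (placeholder URLs):

```html
<!-- In the <head> of http://www.olddomain.com/some-page/ -->
<link rel="canonical" href="http://www.newdomain.com/some-page/" />
```

It's a hint rather than a directive, and as Charles notes above it would start shifting value to the new pages straight away, which is exactly what he wants to avoid before the cutover.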
-
Thanks very much for your thoughts. The root of my uncertainty is indeed the way Google in particular views duplicated content today.
What if I told you that the site was a business directory and that the new site would be a big improvement in terms of on-page optimization? By which I mean new/different (and much better) page titles and improved internal linking. I mention this only because the new site won't be a direct replica of the old one. Does that make a difference?
-
I have no factual data on this... just going with my gut....
Based upon how Google is acting these days I would not take chances with having two copies of the same site in the SERPs for an entire month. I would not want to see any pages on the new site filtered for being duplicates.
Most people don't get the new site indexed first, and those redirected domains normally do fairly well. So, I would be pleased with that and not take chances.
Safety might be better than going for some unknown gain.