We sold our site's domain and have a new one. Where do we go from here?
-
We recently sold our established domain -- for a compelling price -- and now face the task of transitioning to our new domain. What steps would you recommend to lessen the anticipated decline in search engine traffic in this scenario?
-
You should have migrated the sold domain's "authority" to the new site using Google Webmaster Tools, or 301 redirected each link to the new site. Doing either would have transferred your SEO value to the new site. Since this was not done, I'm sorry to say that you'll need to start over.
-
Thanks! We have a grace period of roughly 120 days before the new domain owners can switch the DNS on the domain, so it sounds like we should jump on the 301s immediately.
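For anyone implementing the interim 301s during such a grace period, here is a minimal sketch of the mapping logic (in production this would typically be an Apache or nginx rewrite rule; the domain names below are hypothetical placeholders):

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical new hostname -- substitute your own.
NEW_DOMAIN = "www.newdomain.example"

def redirect_target(old_url: str) -> str:
    """Map a URL on the old domain to the same URL on the new domain.

    Preserving the path and query string means each old page 301s to
    its direct equivalent, rather than dumping every visitor (and
    every inbound link's value) onto the new homepage.
    """
    scheme, _old_netloc, path, query, _fragment = urlsplit(old_url)
    return urlunsplit((scheme, NEW_DOMAIN, path, query, ""))

if __name__ == "__main__":
    print(redirect_target("http://www.olddomain.example/widgets?color=blue"))
```

The key design point is the one-to-one mapping: a blanket redirect of everything to the new homepage is widely considered far less effective than page-to-page 301s.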
-
As mentioned above, if I had just bought your domain, I wouldn't leave any 301s in place -- in fact, removing them would probably be the first thing I did. So unless your contract requires the new owners to keep the 301s in place, it's basically a brand new domain.
What you could do is contact the webmasters who are linking to your old domain, explain that you have changed domain names, and ask whether they would update their sites. Point out that this helps their users: otherwise visitors get the bad experience of landing on a 404 page, and being seen to link off to dead pages could hurt the webmaster's own brand.
I would also recommend a lot of social messaging -- "same great website, just a new name" -- otherwise people will keep searching for your old name.
But the bad news is that with a new domain you are essentially starting from scratch (with the small advantage of being able to contact those webmasters and get links updated). You will suffer a drop in rankings, and recovery will take a while.
-
If you were not able to put any redirects on the old domain, it won't be so much a decline as starting over from scratch. Were you able to put any 301 redirects on the old domain that the new owners will leave in place?