Which domain should we continue with?
-
Hello All,
We are working with a client who had a manual penalty from Google. We worked on it and the penalty has now been removed.
The client had already started working on a new domain, and now the big dilemma is: which domain should we continue with, the old or the new?
We are suggesting they continue with the old one, as that domain has good PR, good backlinks, better visibility on their social profiles, etc.
What do you suggest? Any input is highly appreciated.
Thanks
-
Hey guys!
I agree with James and Bruce. We have a client who ran into a similar issue about a year ago, where he was issued a manual penalty on his site, from work performed prior to us of course ; ). While we were working to correct the issue, we developed a new site (on a new URL) that began ranking fairly well with little effort. Once the penalty was removed from the older site, we simply kept the two sites separate and focused our efforts primarily on the older site. This proved to be an effective strategy in our situation since the new site didn't have much authority or credibility, but it was ranking well, so the client decided to leave it on its own and not redirect or connect it to the older site in any way.
So in my experience, I would have to agree that using the older domain would be best, assuming of course that the site (content, URL structure, UX, link profile and social signals) is at a higher level than the newer site.
Regarding the redirect from the new site to the old, this is totally a judgement call you will have to make based on the amount and quality of off-site signals (links and social mentions) each site has. IMO the on-page and UX stuff can and should always be tweaked and improved upon, so I don't think that should really be a deciding factor.
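If you do decide the redirect is worth it, the implementation itself is the easy part. As a rough sketch, and assuming the new domain sits on Apache (olddomain.com below is just a placeholder), a single line in the new site's .htaccess handles it:

    Redirect 301 / https://www.olddomain.com/

That forwards every path on the new domain to the same path on the old one, so it works best when the old site has (or gets) equivalent pages for whatever is ranking on the new site; otherwise redirect the key pages individually rather than sending everything to the old homepage.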
Hope this helps in one way or another!
-
Wow, that's tough if the new site has different, new content.
Yes, I would.
But I would make sure the old site is up to a high standard too. Any significant improvements on the new site, i.e. better quality content, better responsive design, better internal structure and navigation, should be mirrored across to the old site. I would review and merge the better aspects and content onto the older site, as it has the better platform with the link/social profile, etc.
I wouldn't want you to redirect what could be the better new site if the old site is not up to scratch. I would hope to get a second opinion from another Moz user on this.
James
-
Hi James,
Thanks for your quick response.
Would you suggest a redirect from the new domain to the old one? The new domain is also ranking now and has good content too.
-
I vote Old:
If you start with a new domain, you will have to build all the rankings basically from scratch, give or take a few possible redirects, etc. Provided the penalty has gone, you should be fine on this. From what we know, Google doesn't keep a token penalty on a site because of a past problem; once the main penalty has been lifted, it's lifted.
A new site can take quite a while to settle in, so you could expect many months of patience waiting for it to get to the level of the old one.
Bruce.
-
I would think the old, depending on its current state.
If you can make any necessary changes to the old site without any technical limitations, and maintain the backlinks and social profile, then that's better than starting afresh.
I wouldn't have thought the penalty would affect you long term if it has been fully rectified and was not for an overly serious offence.
James