Redirecting old domains to new domains
-
I previously had a domain with good rankings, but I had to redirect it to a new domain for branding purposes, and the new domain has only been around for 1 year instead of 10 years like the previous one.
Does the full PageRank weight of the old domain get transferred to the new domain? How does Google handle this? The old domain had a good keyword in its name, which helped it rank for that keyword... does that keyword benefit also transfer to the new domain with Google?
-
Hi SEOCM (apologies for the lack of paragraphing here; it doesn't work on an iPad). Note: Keri added paragraph marks, and the iPad issue is in our bug queue. (Thank you)
Q1) and Q2) If the redirect was done correctly (a 301 permanent redirect is in place), then most of the previous website's strength will be passed to your new website. There is some loss, though; estimates of that loss usually range between 5% and 20%. A 301 redirect tells the search engines that the move is permanent, so after a fairly short while Google recognises the new site as the new home of the previous one.
Q3) Keywords in domains are believed to carry less ranking weight now than they used to. What transfers is most of the old website's overall strength, rather than, say, the benefit of an individual keyword in the previous domain name.
So don't worry: as long as the migration from the old site to the new one was done correctly, you'll likely get the maximum benefit that you can.
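If you want to double-check that side of things, here is a minimal sketch of the kind of check you can run (the domains and paths are placeholders, so swap in your own); it asks a few of the old URLs what they return and flags anything that isn't a 301 pointing at the new domain:

```
# Minimal redirect check: the domain names and sample paths are placeholders.
import urllib.error
import urllib.request

OLD_DOMAIN = "https://www.old-domain.example"    # placeholder: your previous domain
NEW_DOMAIN = "https://www.new-domain.example"    # placeholder: your new domain
SAMPLE_PATHS = ["/", "/services/", "/contact/"]  # a few URLs worth spot-checking


class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from silently following redirects so we can inspect the response."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None makes urllib raise the 3xx as an HTTPError


opener = urllib.request.build_opener(NoRedirect)

for path in SAMPLE_PATHS:
    url = OLD_DOMAIN + path
    try:
        opener.open(url)
        print(f"{url}: returned 200, no redirect -- the old page is still live")
    except urllib.error.HTTPError as err:
        location = err.headers.get("Location", "")
        if err.code == 301 and location.startswith(NEW_DOMAIN):
            print(f"{url}: OK, 301 -> {location}")
        else:
            print(f"{url}: {err.code} -> {location or 'no Location header'} "
                  f"(expected a 301 to {NEW_DOMAIN})")
```

Spot-checking a handful of your most important old URLs this way is usually enough to catch a 302 that slipped in, a redirect pointing at the wrong place, or an old page that is still being served directly.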
Regards, Simon
Related Questions
-
Best way to handle outdated, years-old blog posts?
Hi all, we have almost 1,000 blog pages or posts indexed in Google. A few of them are years old but still have relevant, credible content that appears in search results. I'm more worried about the hundreds of other non-relevant posts that are years old. By hosting hundreds of them, our website is holding lots of useless indexed pages, which might be having a small negative impact on us because they don't rank. What's the best way to handle them? Are these pages okay as they are, or should they be de-indexed or deleted? Thanks
-
New feature in SEO results with icon?
I have never seen this before in the search results: an icon in the title. Do you guys know how to get this icon in the title? See here: http://snag.gy/e7BiI.jpg
-
Domains dominating SERPs w/multiple listings
I know Cutts addressed this as a potential future update to the Google algo, but it's driving me bonkers. My primary targeted keyword has one of our competitors listed 4 times in a row at the top of page 2. Some of the pages have duplicate page titles and the content is relatively thin. The site has a PR of 2 and a DA of 35. Why on earth are they able to suck up a whole half of a results page?! I don't know that there's anything anyone can tell me that will help, but if there's something I missed about this update please let me know. 'snot fair. 😞
-
Sub-domain or sub-directory for mobile version
Sub-domain or sub-directory for the mobile version: what are the advantages or disadvantages?
-
Sub-domains and keyword-rich domains
Hello all, I'm hoping for some opinions, as I'm confused about the best course of action. The problem:
Although I say the below, we have never been penalised by Google, have not taken part in any bad link building, and don't do too badly in the SERPs, but I worry that Google may not like what I do these days.
We have one main site that is broken down into areas/cities (i.e. London, Manchester, etc.), so the URL looks like www.domain.co.uk/London. In addition to this we also use sub-domains to target popular areas (i.e. http://London.domain.co.uk). These sub-domains take the content from the main site but of course only display results relevant to London and are optimised for "London + Keyword". Any page that gets duplicated (i.e. London.domain.co.uk/profile123 and www.domain.co.uk/profile123 are ALMOST the same content) gets a rel="canonical" link that points to the main domain + page on www.
All these sites have a large number of links back to www.domain.co.uk/?Page so the user can also search in areas other than London, etc. This method has worked well for us and is popular with both users and Google search results. All sites/sub-domains are added to GWT under the same account, and all sites have unique sitemaps. I do, however, worry that Google may class this as link manipulation, owing to the number of links pointing back to the main domain and its pages (this is not the reason we use the sub-domains, though).
In addition to the above sub-domains, we have a few (5/6) keyword-rich domain names that we also place the same content on (i.e. www.manchester-keyword.co.uk would show only content relevant to Manchester), and again these sites have links back to the main domain so users can navigate other areas of the UK. I worry that these additional domains may also not be liked by Google. What do people think?
I have started to reduce/replace some of the additional keyword-rich domains with sub-domains from the main site and then 301 the keyword-rich domain (i.e. www.manchester-Keyword.co.uk now goes to http://Manchester.domain.co.uk), as I feel sub-domains may not be penalised as much as unique domains are. There are domains that I don't really want to 301, as they bring in good amounts of traffic and users have bookmarked them, etc.
Any opinions on what you think I should do would be great, as I really worry that if Google stops giving us good results I'm in real trouble, although I'm not sure whether what we do is wrong in Google's eyes or not.
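As an aside, a quick way to sanity-check a setup like the one described above is to confirm that each duplicated sub-domain page really does expose a rel="canonical" pointing at its www counterpart. A minimal sketch of that check (the profile URLs below are placeholders modelled on the question's examples; swap in real ones):

```
# Minimal canonical check -- the URLs below are placeholders modelled on the
# question's examples, not real pages.
from html.parser import HTMLParser
import urllib.request


class CanonicalParser(HTMLParser):
    """Collects the href of the page's <link rel="canonical"> tag, if any."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and "canonical" in (attrs.get("rel") or "").lower():
            self.canonical = attrs.get("href")


def canonical_of(url):
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical


# Placeholder pair: a sub-domain page and its near-duplicate on www.
pages = [
    "http://London.domain.co.uk/profile123",
    "http://www.domain.co.uk/profile123",
]

for page in pages:
    # Both should report the www URL if the cross-domain canonical is in place.
    print(page, "-> canonical:", canonical_of(page))
```

If the sub-domain page reports its own URL instead of the www one, the canonical hint isn't doing what the question assumes.
-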
How can we start to improve Domain MozRank & MozTrust for our website?
A simple question maybe, but how and where do we start if we want to improve our Domain MozRank & MozTrust, assuming of course that by improving both of these we will improve our rankings with Google, plus sales?
-
Domain Deindexed because of Redirect
I think this is an interesting topic to discuss, though I'm looking for answers too. One of my well-performing domains was deindexed by Google today. Reason: a redirect from a 9-year-old deindexed domain (it must be penalised). I believe this was done by one of my competitors. What do you suggest I do now? And don't you think that if this is how Google treats redirects after Penguin, anybody can use this technique to harm their competitors?
-
Google new update question
I was just reading this: http://www.entrepreneur.com/blog/220662. We have our official site, which has 200+ service pages that we wrote once and keep doing SEO for, so they rank high all the time. Now my question is, how does Google handle site freshness? If the service pages stay static but we keep adding blog items, does Google still consider the site fresh? So we don't have to update those service pages, right?