Why is the old site not being deindexed post-migration?
-
We recently migrated to a new domain (16 days ago), and the new domain is being indexed at a normal rate (2-3k pages per day). The issue is that the old domain hasn't seen any drop in indexed pages. I was expecting the old domain's indexed page count to fall roughly in step with the rise in indexed pages on the new site. Any advice?
-
Jarred,
Whenever you move to a new domain name, Google will keep the old domain indexed for up to a year (or longer!). That's just how Google handles it; I suspect it's because you might change your mind and go back to the old domain.
Having the old domain indexed in Google isn't a problem, as users should be redirected to the content on the new domain.
It will take up to a year for Google to stop indexing the old domain.
By the way, make sure you use the Change of Address tool in Google Search Console - it will really help.
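If the old domain isn't already redirecting everything, a catch-all 301 is the usual fix. In practice you'd normally do this at the web-server level (.htaccess, nginx, or your host's settings), but purely as an illustration of the idea, here's a minimal sketch as a tiny Flask app sitting on the old domain - the domain name is a placeholder, not yours:

```python
# Rough sketch only: a tiny Flask app serving the OLD domain that
# 301-redirects every path to the same path on the new domain.
# "newdomain.com" is a placeholder.
from flask import Flask, redirect, request

app = Flask(__name__)

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def to_new_domain(path):
    # Keep the path and query string so each old URL maps to its new equivalent
    target = "https://www.newdomain.com/" + path
    if request.query_string:
        target += "?" + request.query_string.decode()
    return redirect(target, code=301)

if __name__ == "__main__":
    app.run()
```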
-
We did some 301 redirects in early February. There are still some pages on the old domain hanging in the SERPs - however, the 301s are sending the traffic to the right place.
The more powerful your domain, the longer it can take for pages to drop from the SERPs, because lots of spiders keep arriving through existing links and re-finding those URLs. Weak domains can also take a long time to drop - they have lots of pages that are rarely crawled, so Google is slow to revisit them and notice the change.
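A quick way to sanity-check that the 301s really are "sending the traffic to the right place" is to follow the redirect chain for a few old URLs and confirm it's a single 301 hop ending at the matching new-domain page. A rough sketch, assuming Python's requests library and a placeholder URL:

```python
# Follow the redirect chain for one old URL and print each hop.
import requests

old_url = "https://www.olddomain.com/some-page/"  # placeholder
resp = requests.get(old_url, allow_redirects=True, timeout=10)

for hop in resp.history:  # each redirect hop users and crawlers will follow
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final:", resp.status_code, resp.url)
# Ideally: one hop, status 301, Location pointing at the equivalent new-domain URL.
```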
-
Hi Jarred,
Regarding advice on this topic: what are you trying to accomplish?
Is the issue that you are using the same content for both sites and are worried about duplicate content?
If this is the case, a 301 redirect should solve your problems.
Have you stopped hosting on the old site?
If not, it still exists as far as Google is concerned, and you aren't going to see de-indexation. Even if you have stopped hosting, it can take months for Google to realize the site isn't there. Normally you start by seeing a few pages return 404 errors before they are removed completely. This isn't ideal, as you lose the link profile for those pages - hence the value of 301s.
Is it a 301 redirect situation?
If you are redirecting to the new domain, you are not going to de-index the old one. As far as Google is concerned it still exists and will continue to exist as long as it retains hosting.
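If you're not sure which of those situations you're in, a quick status-code sweep over a handful of old URLs will tell you. A rough sketch, again assuming the requests library and placeholder URLs:

```python
# Classify old-domain URLs: still hosted (200), redirecting (3xx), or gone (4xx/5xx).
import requests

old_urls = [
    "https://www.olddomain.com/",
    "https://www.olddomain.com/pricing/",
    "https://www.olddomain.com/blog/some-post/",
]

for url in old_urls:
    try:
        # Don't follow redirects - we want the old domain's own answer
        resp = requests.head(url, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print(url, "-> request failed:", exc)
        continue

    if resp.status_code in (301, 308):
        print(url, "-> permanent redirect to", resp.headers.get("Location"))
    elif 300 <= resp.status_code < 400:
        print(url, "-> temporary redirect", resp.status_code, "(consider making it a 301)")
    elif resp.status_code == 200:
        print(url, "-> still serving content, so Google will keep it indexed")
    else:
        print(url, "-> status", resp.status_code)
```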
In addition to the above, de-indexation of a website can take months. We had this issue with a client for whom we were transferring 300 domains, and it took about 2-3 months for Google to recognize the pages on the new websites and disregard the old ones. That said, we were running redirects, and the old pages never truly disappeared or de-indexed.
In short, 16 days probably isn't a long enough time frame to see significant changes - and if you are using 301s, the old URLs may never drop out of the index entirely. Nothing you've described here points to a problem.
If you want to fill me in on more details I'm happy to help as best I can.
Cheers,
Rob