Hi
I'm in a similar situation with a client's site.
Their dev site is on a subdomain (i.e. staging.domain.com), and they want to keep the staging area active for demonstrating future development work, so the situation may be slightly different from yours.
They have now blocked it via robots.txt, but that's like shutting the stable door after the horse has bolted.
I asked in the Moz Q&A a few months ago and got the answer below from a few very helpful and wise Mozzers:
1. Set up a completely separate Webmaster Tools account, unrelated to the main site, so that there is a new WMT account specific to the staging subdomain.

2. Add a robots.txt on the staging subdomain that disallows all crawlers from all pages, OR use the noindex meta tag on every page (though Google much prefers the robots.txt approach for this). Note: it's very important that when you update the main site you do not push out these files and directives with it, since that would result in the main site being de-indexed.
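For the robots.txt part, a blanket disallow is only a couple of lines; the file sits at the root of the subdomain (e.g. staging.domain.com/robots.txt):

```
# robots.txt at the root of staging.domain.com
# Block every crawler from every page on this subdomain
User-agent: *
Disallow: /
```

The meta-tag alternative is `<meta name="robots" content="noindex">` in the `<head>` of every page, but pick one or the other: a robots.txt disallow stops crawlers from fetching the pages at all, so they'd never see a noindex tag anyway.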
3. Request removal of all the pages in GWT. Leave the form field for the page to be removed blank, since that will remove all of the subdomain's pages.

4. After about a month, or once you can see that the pages are all out of the search engine listings (SERPs) and Google has spidered and seen the robots.txt, put a password on the entire staging site.
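For the password step, if the staging box happens to run Apache, basic auth across the whole subdomain is just a few lines of config (the file paths and realm name below are placeholders, not anything from the client's setup):

```
# .htaccess at the root of staging.domain.com (Apache, paths are examples)
AuthType Basic
AuthName "Staging - authorised users only"
# Create the password file first with: htpasswd -c /path/to/.htpasswd someuser
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Once the password is in place, crawlers get a 401 on every request, so the staging content can't creep back into the index even if the robots.txt gets lost in a later deploy.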
Hope that helps.
All the best,
Dan