Development site is live (and has been indexed) alongside the live site - what's the best course of action?
-
Hello Mozzers,
I am undertaking a site audit and have just noticed that the developer has left the development site up and it has been indexed. They 301'd from pages on the old site to equivalent pages on the new site, but seem to have allowed the development site to be indexed, and they haven't switched off the development site. So would the best option be to redirect the development site pages to the homepage of the new site (there is no PR on the dev site and there are no incoming links to it, so nothing much to lose...)? Or should I request equivalent-to-equivalent page redirection?
Alternatively I can simply ask for the dev site to be switched off and the URLs removed via WMT, I guess...
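For what it's worth, if equivalent-to-equivalent redirection is the route taken, a minimal sketch for an Apache-hosted dev site with mod_rewrite enabled might look like this (the hostnames are placeholders, not the actual sites):

```apache
# Hypothetical .htaccess on the dev site: 301 every dev URL to the
# same path on the live site. dev.example.com and www.example.com
# are placeholder hostnames; requires mod_rewrite.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^dev\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```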
Thanks in advance for your help!
-
Very pleased to have been of assistance
Here are links to older threads where I asked something similar before, for further verification and credit to those who originally helped me:
-
Thanks Amelia - yes, you're definitely on the right lines - Dan's response below is v helpful too, that's for sure. I do struggle with developers from time to time, so I'm teaching myself coding and so on via Codecademy, etc. - I learnt it at uni many years ago but it's v out of date! It will come in useful for SEO too.
-
Many thanks Dan - much appreciated - that process makes perfect sense in my case too :)))) I will report back on progress in a month or so...
-
Yes a great answer there from Dan - and thanks for your useful input - good point re: not relying on robots.txt alone!
-
Thanks Robert, and for the extra comments too!
I can't remember which Mozzer originally helped me with the above and should be credited, but I'll track down the original thread and add it to this post, since it also contains further info and discussion.
All Best
Dan
-
Dan,
This is a very good answer. Just to emphasize, probably the most important piece with a "dev" site is the last one Dan mentions: password protection. Once you clean up the issue, add password protection and you should not have the issue going forward.
Even with robots.txt on our dev sites and our design studio, we have had pages end up in the SERPs. Because of the DA of our design studio (where clients go to approve a comp, etc.), we recently had a new political client's comp ranking for a search term on page one, ahead of their actual site (we were building a new one to replace it). So, even with robots.txt, there is still no guarantee a page will not be crawled.
Adding password protection will assist with that. Lastly, if you have someone building you a site and they say they do not want to take down the dev version after your launch, tell them you do not wish to pay them. It will go down. That is unreasonable. I cannot think of a reason to keep the dev version live once the client site launches.
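For reference, password protection on an Apache dev site can be as little as HTTP Basic Auth in the dev docroot's .htaccess - a minimal sketch, with the password-file path and realm name as placeholders:

```apache
# Hypothetical .htaccess in the dev site's document root.
# The AuthUserFile path is a placeholder; create the file with:
#   htpasswd -c /etc/apache2/.htpasswd-dev someuser
AuthType Basic
AuthName "Development site - authorised users only"
AuthUserFile /etc/apache2/.htpasswd-dev
Require valid-user
```

This keeps out crawlers and casual visitors alike, which is exactly why it is the one measure that works even when robots.txt is ignored.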
Again, good job Dan.
-
Hi
I'm in a similar-ish situation with a client's site.
Their situation is that the dev site is on a subdomain, i.e. staging.domain.com, and they want to keep the staging area active for demonstrating future development work, so the situation may be slightly different from yours.
They have now blocked it via robots.txt, but that's like shutting the stable door after the horse has already bolted.
I asked Moz Q&A a few months ago and got the answer below from a few very helpful and wise Mozzers:
-
1. Set up a completely separate Webmaster Tools account, unrelated to the main site, so that there is a new WMT account specific to the staging-area subdomain.
2. Add a robots.txt on the staging-area subdomain that disallows all pages to all crawlers, OR use the noindex meta tag on all pages (though Google much prefers robots.txt usage for this). Note: it's very important that when you update the main site it does not include or push out these files and instructions too (since that would result in the main site being de-indexed).
3. Request removal of all pages in GWT. Leave the form field for the page to be removed blank, since that will remove all subdomain pages.
4. After about 1 month, OR once you see that the pages are all out of the search engine listings (SERPs) and Google has spidered and seen the robots.txt, put a password on the entire staging site.
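As a sketch, the disallow-all robots.txt in step 2 need only be two lines, served from the staging subdomain and nowhere else:

```
# robots.txt on staging.domain.com ONLY - blocks all compliant
# crawlers from every page. Make sure deployments never copy
# this file to the main site.
User-agent: *
Disallow: /
```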
Hope that helps
All Best
Dan
-
Hi Luke,
I'm interested in other responses to this question...
If I were in your position, after seriously berating the dev, I would make sure you disallow the dev site in its own robots.txt and use Webmaster Tools to remove the URLs from the index. Then I would password-protect the dev site so the search engines couldn't get there even if they tried.
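As a quick sanity check (not from the original thread, just a sketch using Python's standard library; the dev hostname is a placeholder), you can confirm that a disallow-all robots.txt really does block compliant crawlers:

```python
# Hypothetical check: parse a disallow-all robots.txt (as a dev
# subdomain would serve it) and confirm no compliant crawler may
# fetch any page. dev.example.com is a placeholder hostname.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Every user agent should be blocked from every path.
print(parser.can_fetch("Googlebot", "https://dev.example.com/"))        # False
print(parser.can_fetch("Googlebot", "https://dev.example.com/page-1"))  # False
```

Bear in mind this only tells you what well-behaved crawlers will do; as noted above, robots.txt alone is no guarantee, which is why the password step matters.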
Like I say, I'm interested in other responses! This is what I would do, but I don't really know if it's definitely the right thing to do. Does anyone else have anything to add?
Best of luck - it's crappy when someone else's error cocks up your work: when our site launched for the first time, our IT department screwed up on a monumental scale by getting the DNS settings wrong.
Amelia