Dev Site Was Indexed By Google
-
Two of our dev sites (subdomains) were indexed by Google. They have since been made private, once we found the problem. Should we take a further step to remove the subdomains, such as through robots.txt, or just let it ride out?
From what I understand, to remove a subdomain from Google we would verify the subdomain in GWT, give the subdomain its own robots.txt, and disallow everything.
Any advice is welcome, I just wanted to discuss this before making a decision.
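For reference, the "disallow everything" robots.txt described above is just two lines, served at the subdomain's root (the hostname here is only an illustration):

```
# https://dev.example.com/robots.txt
User-agent: *
Disallow: /
```

Note that this blocks crawling, but URLs that are already indexed can linger until the removal request goes through.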
-
We ran into this in the past, and one thing that we think happened is that links to the dev site were sent via email to several Gmail accounts. We suspect this is how Google found and indexed the site, as there were no inbound links posted anywhere.
I think the main issue is how it's perceived by the client, and whether they are freaking out about it. In that case, putting the site behind an access-control password will prevent anyone from seeing it.
The robots.txt file should flush it out of the index, but yes, it takes a little bit of time.
-
I've had this happen before. On the dev subdomain, I added a robots.txt that excluded everything, verified the subdomain as its own site in GWT, then requested that the site (the dev subdomain) be removed.
I then used a free code-monitoring service that checked a URL for changes once a day. I set it up to check the live site's robots.txt and the robots.txt of each dev site, so I'd know within 24 hours if the developers had tweaked them.
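A homegrown version of that daily check is straightforward. This is a minimal sketch, not any particular service's API — the function names and the choice of a SHA-256 fingerprint are my own assumptions:

```python
import hashlib
import urllib.request


def fingerprint(content: bytes) -> str:
    """Return a hash of the file content, used to detect changes."""
    return hashlib.sha256(content).hexdigest()


def has_changed(previous_fingerprint: str, content: bytes) -> bool:
    """True if the fetched content no longer matches the stored fingerprint."""
    return fingerprint(content) != previous_fingerprint


def check_robots(url: str, previous_fingerprint: str) -> bool:
    """Fetch a robots.txt URL and report whether it has changed."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return has_changed(previous_fingerprint, resp.read())
```

Run `check_robots` from a daily cron job for each robots.txt URL, store the new fingerprint, and alert whenever it returns True.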
-
Hi Tyler,
You definitely don't want to compete with yourself over duplicate content. If the current subdomains have little link juice (inbound links) pointing to them, I would simply block the domain from being indexed further. If there are a couple of pages of high value, it may be worth the time to 301 redirect them to prevent losing any links / juice.
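As a sketch, such a 301 can go in the dev subdomain's .htaccess; the path and hostname here are placeholders, not real URLs from this thread:

```apache
# Permanently redirect a high-value dev page to its live counterpart
Redirect 301 /important-page/ http://www.example.com/important-page/
```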
Using robots.txt or noindex meta tags may work, but in my personal experience the easiest and most efficient way to block indexing is to use .htaccess / .htpasswd. This will prevent anybody without credentials from even viewing your site, effectively blocking all spiders / bots and unwanted snoopers.
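For anyone unfamiliar, a minimal Apache basic-auth setup looks like this; the file path is an illustration and should point wherever your .htpasswd actually lives, ideally outside the web root:

```apache
# .htaccess on the dev subdomain
AuthType Basic
AuthName "Dev site - authorized users only"
AuthUserFile /home/user/.htpasswd
Require valid-user
```

The credentials file itself is created with `htpasswd -c /home/user/.htpasswd username`.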
-
Hey Tyler,
We would follow the same protocol in your shoes. Remove any instance of the indexed dev subdomain(s), then create new robots.txt files for each subdomain and, as an extra step, disavow any indexed content/links. Also, double-check and even resubmit your root domain's XML sitemap so Google can reindex your main content/links as a precautionary measure.
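If you're rebuilding that sitemap before resubmitting, a minimal valid XML sitemap has this shape (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
</urlset>
```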
PS - We develop on a separate server and domain for any new work for our site or any client sites. Doing this allows us to block Google from everything.
Hope this was helpful! - Patrick