Getting subdomains unindexed
-
If I turn an application off so that it displays a 503 error, will that get my site unindexed from search engines?
-
Subdomains can be verified as their own site in GWT. Verify the subdomain in GWT, put a robots.txt on that subdomain that excludes the entire subdomain (see the example below), then request removal of the whole subdomain in GWT. I've had to remove staging and dev sites a couple of times myself.
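For reference, the robots.txt that excludes the whole subdomain is only two lines. It has to live at the subdomain's root (the hostname in the example is just a placeholder, e.g. dev.example.com/robots.txt):

```
User-agent: *
Disallow: /
```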
A couple of things I've found useful in this situation: make the robots.txt files for both the dev and live sites read-only, so you don't accidentally overwrite one with the other when pushing a site live. You can also sign up for a free tool like Pole Position's Code Monitor, which will look at the code of a page (including your robots.txt URL) once a day and email you if there are any changes, so you can fix the file and then go hunt down whoever changed it.
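If you'd rather roll your own, a minimal sketch of that kind of monitor is below (Python; the URL, state file, and alert are placeholders to adapt to your own setup, run daily from a scheduler like cron):

```python
# Sketch of a robots.txt change monitor: fetch the file once a day,
# compare a hash against the last run, and alert if it changed.
import hashlib
import urllib.request

ROBOTS_URL = "https://dev.example.com/robots.txt"  # placeholder subdomain
STATE_FILE = "robots_hash.txt"

def fetch_hash(url: str) -> str:
    with urllib.request.urlopen(url) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

def main() -> None:
    current = fetch_hash(ROBOTS_URL)
    try:
        with open(STATE_FILE) as f:
            previous = f.read().strip()
    except FileNotFoundError:
        previous = ""  # first run, nothing to compare against
    if previous and current != previous:
        # Replace with an email or chat notification in practice.
        print(f"robots.txt at {ROBOTS_URL} changed - go investigate!")
    with open(STATE_FILE, "w") as f:
        f.write(current)

if __name__ == "__main__":
    main()
```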
-
GWT was the first place I checked; unfortunately, you can only remove directories or pages there. I need entire subdomained sites to be removed (in fact, they shouldn't have been indexed in the first place).
We use subdomains as our development testing environment when creating client sites, and once a site is approved we push it live, replacing the old site. Somehow these testing sites are getting indexed, which creates a risk of duplicate content across different domains. So I am trying to find a solution to get the subdomains (hundreds of them) unindexed.
I understand a 301 redirect is best, but that isn't really applicable since these test sites still need to be reachable by clients.
-
With a robots.txt blocking it, you can then go into Google Webmaster Tools and request removal of that particular page or folder from Google's index.
-
A noindex tag on the pages works, and putting up a robots.txt that disallows everyone should work as well.
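For reference, the noindex tag is a single line in each page's head section:

```html
<meta name="robots" content="noindex">
```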
-
Thanks for the quick reply; I will have to try that. Essentially I am trying to get the site unindexed, but I wasn't sure if a 503 would do the trick.
-
Eventually, but a 503 is the code Google recommends returning when your site is having downtime, so I would expect them to be lenient and not remove things right away. I wouldn't expect it to be as efficient as returning a 404 or a 410.
If you're really keen on getting content removed immediately, the best way to get it de-indexed is to return a page with a meta noindex tag on it.
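To make the difference concrete, here is a minimal sketch (using Flask purely as an illustration; the routes are hypothetical) of the two kinds of responses discussed, a temporary-downtime 503 versus a page served with a noindex directive:

```python
from flask import Flask

app = Flask(__name__)

@app.route("/maintenance")
def maintenance():
    # 503 + Retry-After signals temporary downtime, not removal,
    # so crawlers will keep coming back rather than de-indexing.
    return "Down for maintenance", 503, {"Retry-After": "3600"}

@app.route("/old-page")
def old_page():
    # The X-Robots-Tag header is equivalent to the meta noindex tag
    # and tells crawlers to drop the page from the index.
    return "This page should be de-indexed", 200, {"X-Robots-Tag": "noindex"}
```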