Getting subdomains unindexed
-
If I turn an application off so that it displays a 503 error, will that get my site unindexed from search engines?
-
Subdomains can be verified as their own sites in GWT. Verify the subdomain in GWT, put a robots.txt on that subdomain excluding the entire subdomain, then request removal of the whole subdomain in GWT. I've had to remove staging and dev sites a couple of times myself.
A couple of things I've found useful in this situation: make the robots.txt files for both the dev and live sites read-only, so you don't accidentally overwrite one with the other when pushing a site live. You can also sign up for a free tool like Pole Position's Code Monitor, which will look at the code of a page (including your robots.txt URL) once a day and email you if there are any changes, so you can fix the file and then go hunt down whoever changed it.
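For example, the robots.txt at the root of the staging subdomain only needs two lines to exclude everything:

    User-agent: *
    Disallow: /

On a Unix host, something like chmod 444 robots.txt is one way to make the file read-only so a deploy script can't silently overwrite it.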
-
GWT was the first place I checked; unfortunately you can only remove directories or pages. I need entire subdomained sites to be removed (in fact they shouldn't have been indexed in the first place).
We use subdomains for our development testing environment when creating client sites, and once a site is approved we push it live, replacing the old site. Somehow these testing sites are getting indexed, and that may pose a duplicate content threat across different domains. So I am trying to find a solution to get the subdomains (hundreds of them) unindexed.
I understand a 301 redirect is best, but that isn't really applicable since these test sites still need to be reachable by clients.
-
With a robots.txt blocking it, you can then go into Google Webmaster Tools and request removal of that particular page or folder from Google's index.
-
A noindex tag on it works, and putting up a robots.txt that disallows everyone should work as well.
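For reference, the noindex tag is a single line in the <head> of each page:

    <meta name="robots" content="noindex">

One caveat worth noting: a crawler has to be able to fetch a page to see that tag, so if robots.txt is already blocking the whole subdomain, Google won't see the noindex; the robots.txt route relies on the GWT removal request instead.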
-
Thanks for the quick reply, I will have to try that. Essentially I am trying to get the site unindexed, but I wasn't sure if a 503 would do the trick.
-
Eventually, but that's the code Google recommends returning when your site is having downtime, so I would expect them to be more lenient and not remove things right away. I wouldn't expect it to be as efficient as returning a 404 or a 410.
The best way to get content de-indexed is to return a page with a meta noindex tag on it, if you're really keen on getting it removed immediately.
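If the aim is de-indexing whole staging subdomains without editing every page, the same noindex signal can also be sent as an HTTP response header. A minimal sketch for Apache, assuming mod_headers is enabled and the staging sites have their own vhost or .htaccess:

    # Tell crawlers not to index anything served from this staging site
    Header set X-Robots-Tag "noindex, nofollow"

Google honors X-Robots-Tag the same way it honors the meta tag, and it applies to non-HTML files too.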
Related Questions
-
I'm struggling to understand (and fix) why I'm getting a 404 error. The URL includes this "%5Bnull%20id=43484%5D" but I cannot find that anywhere in the referring URL. Does anyone know why please? Thanks
Can you help with how to fix this 404 error please? It appears that I have a redirect from one page to the other, and although the referring page URL works, it appears to be linking to another URL with this code at the end of the URL - %5Bnull%20id=43484%5D - that I'm struggling to find and fix. Thanks
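For what it's worth, that fragment is just URL-encoding; decoding it shows a literal bracketed string, which usually points at a broken template tag or script output rather than a normal link. A quick check in Python:

    from urllib.parse import unquote

    # Decode the percent-encoded fragment from the 404 URL
    print(unquote("%5Bnull%20id=43484%5D"))  # prints: [null id=43484]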
-
Getting a ton of "not found" errors in Webmaster Tools stemming from /plugins/feedback.php
So recently Webmaster Tools showed a million "not found" errors with the URL "plugins/feedback.php/blah blah blah." A little googling helped me find that this comes from the Facebook comment box plugin; apparently some recent changes have made this start happening. The question is, what's the right fix? The thread I was reading suggested adding "Disallow: /plugins/feedback.php" to the robots.txt file and marking them all fixed. Any ideas?
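For reference, that suggested fix would amount to a robots.txt entry like this (assuming the errored URLs all share the /plugins/feedback.php prefix, as they appear to):

    User-agent: *
    Disallow: /plugins/feedback.php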
-
2 sites versus a subdomain: Which is better?
I have a client that sponsors a couple of events during the year. They currently have pages within a single website for these events but are interested in creating a separate website so they can brand the events differently. I'm not sure this is the most effective way to do it, for fear of losing the "google juice" already there for these pages. Here's what I'm thinking is a better strategy:
1) Host the content both on the main domain and the sub-domain.
2) Make sure there is a canonical tag on each page of the sub-domain version that points to the main version.
That will give them the branding they are seeking while pushing all the juice across to the main domain. What are your thoughts?
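For reference, the tag in step 2 would be a rel="canonical" link in the <head> of each sub-domain page, pointing at the matching page on the main domain (URL here is hypothetical):

    <link rel="canonical" href="https://www.example.com/events/annual-gala">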
-
Getting a 404 error when opening the cache link of my site
My site is hazanstadservice.se, and when I try to open it to check the cache date I get a 404 error from Google. I don't know why. The cache page URL is http://webcache.googleusercontent.com/search?q=cache:j99uW96RuToJ:www.hazanstadservice.se/+&cd=1&hl=en&ct=clnk.
-
How to recover from duplicate subdomain penalty?
Two and a half weeks ago, my site was slapped with a penalty -- 60% of organic traffic disappeared over 2-3 days. After investigating, we discovered that our site was serving the same content for all subdomains, and Google somehow had two additional subdomains it was crawling and indexing. We solved the issue with 301 redirects to our main site (www) a couple of days after the drop -- about two weeks ago. Our rankings have not recovered, and the subdomains are still indexed per Webmaster Tools. Yesterday we submitted a Reconsideration Request. Will that help? Is there any other way to speed up the process of lifting the penalty? This is the site: http://goo.gl/3DCbl Thank you!
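For context, the redirect fix described above amounts to forcing every stray hostname onto the canonical www host. A sketch in Apache .htaccess terms, with example.com standing in for the real domain:

    RewriteEngine On
    # Send any host other than www.example.com to the www version, keeping the path
    RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]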
-
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:

    User-agent: *
    Disallow: /

for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
-
Why is a 301 redirected URL still getting indexed?
We recently fixed a redirect issue on a website, and although the redirection appears to be working fine, the URL in question keeps getting crawled, indexed, and cached by Google. The redirect was done a month ago, yet Google shows a cached version of it from as recently as a couple of days ago. Manual checking shows that it's being redirected, and a couple of online tools I checked also report a 301 redirect. Do you have any idea why this could be happening? The website I'm talking about is www.hotelmajestic.gr and it's being redirected to www.hotel-majestic.gr
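One quick way to confirm what crawlers actually receive from the old URL is to request just the response headers, e.g. with curl:

    curl -I http://www.hotelmajestic.gr/
    # expect something like:
    #   HTTP/1.1 301 Moved Permanently
    #   Location: http://www.hotel-majestic.gr/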
-
If non-paying customers only get a 2 min snippet of a video, can my video length in sitemap.xml be the full length?
I am working on a website whose primary content is video. It has an assortment of free videos, but the majority are viewable only with a subscription to the site. Without a subscription, you can see a 2-minute clip of a video's contents, but the full videos run anywhere from 10 minutes to 1.5 hours. When auto-generating the sitemap.xml, can I put the full length of the members-only videos in the video:duration property? Or, because only 2 minutes is publicly available (unless you pay for a membership), is that frowned upon?
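For context, video:duration in a Google video sitemap is given in seconds per video, inside the video namespace (xmlns:video="http://www.google.com/schemas/sitemap-video/1.1" on the urlset element). A minimal sketch with hypothetical URLs, where 5400 seconds is the full 90-minute video:

    <url>
      <loc>https://www.example.com/videos/lesson-1</loc>
      <video:video>
        <video:thumbnail_loc>https://www.example.com/thumbs/lesson-1.jpg</video:thumbnail_loc>
        <video:title>Lesson 1</video:title>
        <video:description>Full-length video; members only.</video:description>
        <video:player_loc>https://www.example.com/player/lesson-1</video:player_loc>
        <video:duration>5400</video:duration>
      </video:video>
    </url>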