Best way to block a sub-domain from being indexed
-
Hello,
The search engines have indexed a couple of sub-domains I did not want indexed: old.domain.com and dev.domain.com. I was going to password-protect them, but is there a best-practice way to block them?
My main domain's default robots.txt says:
Sitemap: http://www.domain.com/sitemap.xml
# global
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /trackback/
Disallow: /feed/
Disallow: /comments/
Disallow: /category/*/*
Disallow: */trackback/
Disallow: */feed/
Disallow: /comments/
Disallow: /?
-
Hi,
CleverPhD has some interesting ideas with robots.txt and Google Webmaster Tools, but simply password-protecting all dev pages should keep them out of Google's index. There's no fancier best practice needed, since a password wall will keep Googlebot out on its own.
To be doubly safe, you can also include a meta noindex tag on dev pages.
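For example, each dev page's <head> could carry the standard robots meta tag:

<meta name="robots" content="noindex, nofollow">

The noindex part is what keeps the page out of the index; nofollow is optional, but common on dev instances where you don't want any of the links crawled either.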
Keep in mind that once a page is in Google's index, it's going to take a while for it to leave (unless you use CleverPhD's method). But having a blank page in Google's index really isn't all that bad. It's there, but it won't rank for much.
Hope this helps,
Kristina
-
I've never tried a method like this - FreshFireOne, did you?
-
First and foremost, when you finish all this: password-protect your dev instances. A URL will leak out eventually, and then this happens. I know it is a pain, but it is worth it.
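If your dev instances run on Apache, password protection is only a few lines of config. A minimal sketch, assuming an htpasswd file already exists (the /etc/apache2/.htpasswd path is illustrative; use whatever path fits your server):

AuthType Basic
AuthName "Dev - authorized users only"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user

Googlebot can't authenticate, so everything behind the prompt stays uncrawlable.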
To remove subdomains: go into GWT and register each subdomain as a separate website. Create a robots.txt for each subdomain (not the one you mention; you need a robots.txt specific to that subdomain that disallows all files, as shown below). If you can't do that, have your subdomains include a noindex meta tag on all pages. You have to be careful with this, as you do not want to push the dev robots.txt or the noindex meta tags out to your production server, but it can be done. Talk to your devs. Then go into GWT and use the URL removal tool. Just leave it blank and it will remove the whole site.
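For reference, the disallow-all robots.txt served at the root of each dev subdomain only needs two lines:

User-agent: *
Disallow: /

And if editing the dev templates for the meta tag is awkward, the same noindex signal can be sent as an HTTP header instead. A sketch for Apache, assuming mod_setenvif and mod_headers are enabled (the ^dev\. hostname pattern is illustrative; match your own subdomains):

SetEnvIfNoCase Host ^dev\. IS_DEV
Header set X-Robots-Tag "noindex, nofollow" env=IS_DEV

Because the header is only added when the Host matches, the same config can ship to production without noindexing the live site, which addresses the push-to-production worry above.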
Poof. Gone. You can then watch the GWT accounts. They will show errors for the dev site like "Severe health issues are found on your site - Some important page has been removed by request." This is a good error, as it confirms that the subdomain is removed.
We actually used this not on a dev site, but on our www1 server that had been indexed. We use a load balancer with multiple copies of the site, and www1 was competing with www. The approach above did the trick.