Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Best way to block a sub-domain from being indexed
-
Hello,
The search engines have indexed sub-domains I did not want indexed: old.domain.com and dev.domain.com. I was going to password-protect them, but is there a best-practice way to block them?
My main domain's default robots.txt says:
Sitemap: http://www.domain.com/sitemap.xml
# global
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /trackback/
Disallow: /feed/
Disallow: /comments/
Disallow: /category/*/*
Disallow: */trackback/
Disallow: */feed/
Disallow: */comments/
Disallow: /*?*
-
Hi,
CleverPhD has some interesting ideas with robots.txt and Google Webmaster Tools, but simply password-protecting all dev pages should keep them out of Google's index. There's no single best practice here, since a password wall will keep Googlebot out on its own.
To be doubly safe, you can also include a meta noindex tag on dev pages.
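For reference, the standard robots meta tag (placed in the <head> of each dev page) looks like this:

<meta name="robots" content="noindex">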
Keep in mind that once a page is in Google's index, it's going to take a while for it to leave (unless you use CleverPhD's method). But having a blank page in Google's index really isn't all that bad. It's there, but it won't rank for much.
Hope this helps,
Kristina
-
I've never tried a method like this - FreshFireOne, did you?
-
First and foremost, when you finish all this: password-protect your dev instances. A URL will leak out eventually, and then this happens. I know it's a pain, but it's worth it.
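As a rough sketch, assuming the dev sites run on Apache (typical for a WordPress stack), password protection can be added with HTTP basic auth in an .htaccess file; the file path and username below are just placeholders:

AuthType Basic
AuthName "Development site"
AuthUserFile /path/to/.htpasswd
Require valid-user

The matching password file can be created with Apache's htpasswd utility, e.g. htpasswd -c /path/to/.htpasswd devuser.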
To remove the subdomains, go into GWT and register each subdomain as a separate website. Create a robots.txt for each subdomain (not the one you mention - you need a robots.txt specific to that subdomain that disallows all files). If you can't do that, have your subdomains include a noindex meta tag on all pages. You have to be careful with either approach, as you do not want to push the dev robots.txt or the noindex meta tags out to your production server, but it can be done. Talk to your devs. Then go into GWT and use the URL removal tool. Just leave it blank and it will remove the whole site.
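To illustrate, a subdomain-specific robots.txt that disallows everything is just two lines:

User-agent: *
Disallow: /

And one way to keep the noindex tag off production (assuming WordPress, given the wp-admin entries in the robots.txt above) is to emit it conditionally by host. This is only a sketch; the function name and hostnames are hypothetical:

<?php
// Hypothetical sketch: print a robots noindex tag only on dev/old hosts,
// so the production site at www.domain.com is unaffected.
function maybe_dev_noindex() {
    $host = isset( $_SERVER['HTTP_HOST'] ) ? $_SERVER['HTTP_HOST'] : '';
    if ( in_array( $host, array( 'dev.domain.com', 'old.domain.com' ), true ) ) {
        echo '<meta name="robots" content="noindex, nofollow">' . "\n";
    }
}
add_action( 'wp_head', 'maybe_dev_noindex' );

Dropped into the theme's functions.php, the same codebase then behaves correctly on every host, so nothing dev-specific has to be pushed separately.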
Poof. Gone. You can then watch the GWT accounts. They will show errors for the dev site like "Severe health issues are found on your site - Some important page has been removed by request." This is a good error, as it confirms that the subdomain has been removed.
We actually used this not on a dev site but on our www1 server, which had been indexed. We use a load balancer with multiple copies of the site, and www1 was competing with www. The steps above did the trick.
Related Questions
-
Best way to set up URL structure for reviews off of PDP pages.
We are adding existing customer reviews to Product Detail Pages (PDPs). There are about 300 reviews per product, so we're going to have to paginate reviews off of the PDP page. I'm wondering what the best URL structure for review pages is to get the most SEO benefit. For example, would it be something like this? site.com/category/product/reviews/page-1 Or something that used parameters, such as: site.com/reviews?product=a Also, what is the best way to show that the internal link on the PDP page to "All Reviews" is a higher-priority link than the other links on the page?
Intermediate & Advanced SEO | katseo10
-
Move domain to new domain, for how much time should I keep forwarding?
I'm not sure, but my website doesn't seem to be getting the juice it's supposed to. As we already know, Google prefers https sites, and that's what happened to mine: it was being crawled as https. But when the time came to move to a new domain, I used a 301 / domain forwarding service. Unfortunately, they didn't have a way to forward from https to the new https, only regular http to https, so when users clicked my old domain in Google search, the site returned "site does not exist". I used hreflang so that at least Google would detect my new domain being forwarded, and it worked. But now I'm wondering: for how long should I keep forwarding the old domain to the new one? My site doesn't seem to be climbing, and I have already changed all the external links. Any help would be appreciated. Thanks!
Intermediate & Advanced SEO | Fulanito1
-
Best way to "Prune" bad content from large sites?
I am in the process of pruning my sites for low-quality/thin content. The issue is that I have multiple sites with 40k+ pages and need a more efficient way of finding the low-quality content than looking at each page individually. Is there an ideal way to find the pages worth noindexing that will speed up the process but not potentially harm any valuable pages? The current plan of action is to pull data from analytics; if a URL hasn't brought any traffic in the last 12 months, it is safe to assume the page is not beneficial to the site. My concern is that some of these pages might have links pointing to them, and I want to make sure we don't lose that link juice. But, assuming we just noindex the pages, we should still have the authority pass along... and in theory, the pages that haven't brought any traffic to the site in a year probably don't have much authority to begin with. Any recommendations on the best way to prune content efficiently on sites with hundreds of thousands of pages? Also, is there a benefit to noindexing the pages vs. deleting them? What is the preferred method, and why?
Intermediate & Advanced SEO | atomiconline0
-
How long to re-index a page after being blocked
Morning all! I am doing some research at the moment and am trying to find out, just roughly, how long you have ever had to wait to have a page re-indexed by Google. For this purpose, say you had blocked a page via meta noindex or disallowed access by robots.txt, and then opened it back up. No right or wrong answers, just after a few numbers 🙂 Cheers, -Andy
Intermediate & Advanced SEO | Andy.Drinkwater0
-
Is it safe to 301 redirect old domain to new domain after a manual unnatural links penalty?
I have recently taken on a client that has been manually penalised for spammy link building by two previous SEOs. Having just read this excellent discussion, http://www.seomoz.org/blog/lifting-a-manual-penalty-given-by-google-personal-experience I am weighing up the odds of whether it's better to cut losses and recommend moving domains. I had thought under these circumstances it was important not to 301 the old domain to the new domain but the author (Lewis Sellers) comments on 3/4/13 that he is aware of forwards having been implemented without transferring the penalty to the new domain. http://www.seomoz.org/blog/lifting-a-manual-penalty-given-by-google-personal-experience#jtc216689 Is it safe to 301? What's the latest thinking?
Intermediate & Advanced SEO | Ewan.Kennedy0
-
Yoast SEO Plugin: To Index or Not to index Categories?
Taking a poll out there... In most cases, would you want to index or NOT index your category pages using the Yoast SEO plugin?
Intermediate & Advanced SEO | webestate0
-
Best way to merge 2 ecommerce sites
Our client owns two ecommerce websites. Website A sells 20 related brands; its search rank is improving, but it normally lands on the second to fourth page of Google. Website B was purchased from a competitor. It sells one brand (also sold on site A), and its search results are normally high on the first page of Google. The client wants to consider merging the two sites, and we are looking at options. Option 1: do nothing; site B dominates its brand, but this does nothing to boost site A. Option 2: keep both sites running, but put lots of canonical tags on site B pointing to site A. Option 3: close down site B and make a lot of 301 redirects to site A. Option 4: ??? Any thoughts on this would be great. We want to do this in a way that boosts site A as much as possible without losing sales on the one brand that site B sells.
Intermediate & Advanced SEO | EugeneF0
-
Any way to find which domains are 301 redirected to competitors' websites?
Looking at the work of an SEO colleague, it became clear that his weak link-building graph is probably not the cause of his good rankings for a pretty competitive keyword (also, no social mentions were found). I was wondering what it could be. Site structure and other on-page optimization factors seem to be OK, and I don't think there will be exceptionally good or bad user behavior... Finally, I looked at the competitors and found that they have more links, better content and better design, so I got a little stuck. The only reason I can think of is that he is doing 301 redirects (or using rel=canonical tags). Is there a way to trace these redirects back to the source in order to include this important variable in your competitor research? thnx
Intermediate & Advanced SEO | djingel10