What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
-
Now that Google considers subdomains part of the root domain, I'm a little leery of testing robots.txt on staging.domain.com with something like:

User-agent: *
Disallow: /

for fear it might get the www.domain.com pages blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
-
Just make sure that when/if you copy the staging site over to the live domain, you don't also copy the robots.txt, .htaccess, or whatever means you used to block that site from being indexed, and end up with your shiny new site blocked.
-
I agree. The name of your subdomain being "staging" didn't register at all with me until Matt brought it up. I was offering a generic response to the subdomain question whereas I believe Matt focused on how to handle a staging site. Interesting viewpoint.
-
Matt/Ryan-
Great discussion, thanks for the input. The staging.domain.com is just one of the subdomains we don't want indexed. Some of them still need to be accessed by the public; others, like staging, could be restricted to specific IPs.
I realize after your discussion I probably should have used a different example of a sub-domain. On the other hand, it might not have sparked the discussion, so maybe it was a good example.
-
.htaccess files can be placed at any directory level of a site, so you can apply the restriction to just the subdomain or even a single directory of a domain.
-
Staging URLs are typically only used for testing, so rather than using a deny rule I would recommend a specific allow for only the IP addresses that should have access.
I would imagine you don't want it indexed because you don't want the rest of the world knowing about it.
You can also use .htaccess to require a username/password. It is simple, and you can give the credentials to clients if that is a concern/need.
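For example, a minimal .htaccess sketch (assuming Apache 2.4+; the IP addresses and the .htpasswd path are placeholders, not real values):

# Option 1: allow only specific IP addresses
<RequireAny>
Require ip 203.0.113.10
Require ip 198.51.100.0/24
</RequireAny>

# Option 2: require a username/password instead
AuthType Basic
AuthName "Staging"
AuthUserFile /path/to/.htpasswd
Require valid-user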
-
Correct.
-
Toren, I would not recommend that solution. There is nothing to prevent Googlebot from crawling your site via almost any IP. If you found 100 IPs used by the crawler and blocked them all, there is nothing to stop the crawler from using IP #101 next month. Once the subdomain's content is located and indexed, it will be a headache fixing the issue.
The best solution is always going to be a noindex meta tag on the pages you do not wish to be indexed. If that method is too much work or otherwise undesirable, you can use the robots.txt solution. There is no circumstance I can imagine where you would modify your .htaccess file to block Googlebot.
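For reference, the meta tag is a single line in each page's head:

<meta name="robots" content="noindex">

And if editing every page is too much work, the same directive can be sent as an HTTP response header from the subdomain's .htaccess instead (assuming Apache with mod_headers enabled):

Header set X-Robots-Tag "noindex"

That keeps the pages reachable by visitors while telling crawlers not to index them.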
-
Hi Matt.
Perhaps I misunderstood the question, but I believe Toren only wishes to prevent the subdomain from being indexed. If you restrict subdomain access by IP, it would prevent visitors from accessing the content, which I don't believe is the goal.
-
Interesting, I hadn't thought of using .htaccess to block Googlebot. Thanks for the suggestion.
-
Thanks Ryan. So you don't see any issues with de-indexing the main site if I created a second robots.txt file, e.g. at
http://staging.domain.com/robots.txt
User-agent: *
Disallow: /

That was my initial thought, but when Google announced they consider sub-domains part of the root domain I was afraid it might affect the http://www.domain.com versions of the pages. So you're saying the subdomain is basically treated like a folder you block on the primary domain?
-
Use an .htaccess file to allow access only from certain IP addresses or ranges.
Here is an article describing how: http://www.kirupa.com/html5/htaccess_tricks.htm
-
Place a robots.txt file in the root of the subdomain.
User-agent: *
Disallow: /

This method will block the subdomain while leaving your primary domain unaffected.
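If you want to sanity-check which file each host actually serves, you can fetch both from the command line (hypothetical hostnames):

curl http://staging.domain.com/robots.txt
curl http://www.domain.com/robots.txt

Crawlers request robots.txt separately for every hostname, so a Disallow on the staging subdomain never applies to the www pages.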