Robots.txt disallow subdomain
-
Hi all,
I have a development subdomain which gets copied to the live domain. Because I don't want the dev domain to get crawled, I'd like to implement a robots.txt for it. The problem is that the file would get copied across with everything else, and I don't want it to disallow the live domain. Is there a way to have a robots.txt apply to the development subdomain only?
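For clarity, what I'd want served on the dev subdomain (and only there) is the standard block-everything robots.txt:

```
User-agent: *
Disallow: /
```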
Thanks in advance!
-
I would suggest you talk to your developers, as Theo suggests, about excluding visitors from your test site.
-
The copying is a manual process, and I don't want any risks for the live environment. An HttpHandler for robots.txt could be a solution; I'm going to discuss this with one of our developers. Other suggestions are still welcome, of course!
-
Do you FTP-copy one domain to the other? If it's a manual process, then keeping the test domain's robots.txt off the live site would be as simple as excluding that one file from the copy.
If you automate the copy and want the code to behave differently per base URL, you could create an HttpHandler for robots.txt that delivers a different version based on the host in the HTTP request.
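A minimal sketch of such a handler in C# (the class name and the "dev." prefix are assumptions for illustration; you'd also need to map robots.txt to the handler in web.config and have IIS pass .txt requests to ASP.NET):

```csharp
using System;
using System.Web;

public class RobotsHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; } // stateless, so one instance can serve all requests
    }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/plain";

        // Decide based on the host of the incoming request.
        string host = context.Request.Url.Host;

        if (host.StartsWith("dev.", StringComparison.OrdinalIgnoreCase))
        {
            // Development subdomain: keep every crawler out.
            context.Response.Write("User-agent: *\nDisallow: /\n");
        }
        else
        {
            // Live domain: allow everything.
            context.Response.Write("User-agent: *\nDisallow:\n");
        }
    }
}
```

Because the exact same code ships to both environments and branches on the host at request time, copying it to live is harmless.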
-
You could use environment variables (for example in your env.ini or config.ini file) that are set to DEVELOPMENT, STAGING, or LIVE depending on the environment the code finds itself in.
With the exact same code, your website would either be limiting visitors by IP address (in the development environment) or allowing all IP addresses (in the live environment). With this setup you can also vary other settings per environment, such as the level of detail shown in error reporting, connecting to a testing database rather than the live one, etc.
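A minimal sketch of that switch in C# (the file name, key, and IP addresses are placeholders, not from any real setup):

```csharp
using System;
using System.IO;
using System.Linq;

public static class EnvironmentGate
{
    // Reads a line like "ENVIRONMENT=DEVELOPMENT" from env.ini;
    // defaults to LIVE so a missing file never locks out real visitors.
    public static string CurrentEnvironment()
    {
        if (!File.Exists("env.ini"))
            return "LIVE";
        var line = File.ReadAllLines("env.ini")
                       .FirstOrDefault(l => l.StartsWith("ENVIRONMENT="));
        return line == null ? "LIVE" : line.Substring("ENVIRONMENT=".Length).Trim();
    }

    // In DEVELOPMENT only the allowlisted IPs get in; everywhere else, everyone does.
    public static bool IsAllowed(string visitorIp)
    {
        if (CurrentEnvironment() != "DEVELOPMENT")
            return true;

        string[] allowed = { "203.0.113.10", "203.0.113.11" }; // example office IPs
        return allowed.Contains(visitorIp);
    }
}
```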
[This was supposed to be a reply, but I accidentally clicked the wrong button. Hitting 'Delete reply' results in an error.]
-
Thanks for your quick reply, Theo. Unfortunately, the .htpasswd setup will also get copied to the live environment, so our website would end up password-protected live. Could there be any other solution for this?
-
I'm sure there is, but I'm guessing you don't want human visitors viewing your development subdomain either. I'd suggest you limit access by IP address (thereby effectively blocking out Google in one move) and/or implement an .htpasswd solution where developers log in with their credentials to reach the development area (which blocks out Google as well).
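For example, a minimal sketch in Apache 2.2-style .htaccess (the .htpasswd path and the IP range are placeholders):

```apache
# Require a login...
AuthType Basic
AuthName "Development area"
AuthUserFile /path/to/.htpasswd
Require valid-user

# ...unless the visitor comes from an allowed IP range (example range).
Order deny,allow
Deny from all
Allow from 203.0.113.0/24

# "Satisfy Any" means a valid login OR an allowed IP is enough,
# so developers on a known IP never see the password prompt.
Satisfy Any
```

Just make sure this file is excluded from the copy to live, or it will carry the same risk as the .htpasswd itself.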
-
Related Questions
-
No Control Over Subdomains - What Will the Effect Be?
Hello all, I work for a university and my small team is responsible for the digital marketing, website, etc. We recently had a big initiative on SEO and generating traffic to our website. The issue I am having is that my department only "owns" the www subdomain. There are lots of other subdomains out there. For example, a specific department can have its own subdomain at department.domain.com and students can have their own webpage at students.domain.com, etc. I know about the possibility of domain cannibalization, but has anyone run into long-term problems with a similar situation, or had success in altering the views of a large organization? If I do get the opportunity to help some of these other subdomains, what is best for our overall domain authority? Should the focus be on removing content that duplicates the www subdomain, or on cleaning up errors? Some of these subdomains have hundreds of 4XX errors.
-
Are the TLD and numbers in a subdomain ranking factors?
Several years ago my firm migrated our domain from a very lengthy 3point7designs.com to 3.7designs.co (we couldn't get 7designs.com at the time), thinking this would be a clever way to brand the name 3.7 Designs. Ever since that change we've had a dramatic reduction in search rankings which has lasted years. https://monosnap.com/file/adJUdkX9YCXQaODcXype4qza70pMCE You can see the drop in early 2011; we made the switch in February. I've read some discussion about Google changing weights when there are numbers in the subdomain, as it appears spammy. I've also heard speculation about .co vs. .com. Further evidence: we're being outranked by a competitor for a term we previously dominated, despite having higher domain authority, more inbound links, and the exact-match keyword in our title and content. We now own 37designs.com and 7designs.com and are contemplating a switch. Any insight into these being ranking factors, or is the site being penalized for other reasons?
-
Subdomain Blog Sitemap link - Add it to regular domain?
Example of setup:
www.fancydomain.com
blog.fancydomain.com
Because of certain limitations, I'm told we can't put our blogs at the subdirectory level, so we are hosting our blogs at the subdomain level (blog.fancydomain.com). I've been asked to incorporate the blog's sitemap link on the regular domain, or even in the regular domain's sitemap.
1. Putting a link to blog.fancydomain.com/sitemap_index.xml in www.fancydomain.com/sitemap.xml -- isn't this against sitemaps.org protocol?
2. Is there even a reason to do this? We do have a link to the blog's home page from the www.fancydomain.com navigation, and the blog is set up with its sitemap and a link to the sitemap in the footer.
3. What about just including a text link "Blog Sitemap" (linking to blog.fancydomain.com/sitemap_index.html) in the footer of www.fancydomain.com, adjacent to the existing "Sitemap" text link for www.fancydomain.com's own sitemap?
Just trying to make sense of this, and figure out why or if it should be done. Thanks!
-
Subdomain vs. root: which is better for SEO?
We run a network of sites that we are considering consolidating into one main site with multiple categories. Which would be better: having each of the topics/sites reside on a subdomain, or as a subfolder off the root? Pros and cons of each would be great. Thanks, TR
-
If I disallow an unfriendly URL via robots.txt, will its friendly counterpart still be indexed?
Our not-so-lovely CMS loves to render pages regardless of the URL structure, just as long as the page name itself is correct. For example, it will render the following as the same page:
example.com/123.html
example.com/dumb/123.html
example.com/really/dumb/duplicative/URL/123.html
To help combat this, we are creating mod_rewrite rules with friendly URLs, so all of the above would simply render as example.com/123. I understand robots.txt respects the wildcard (*), so I was considering adding this to our robots.txt:
Disallow: */123.html
If I move forward, will this block all of the potential permutations of the directories preceding 123.html yet not block our friendly example.com/123? Oh, and yes, we do use the canonical tag religiously - we're just mucking with the robots.txt as an added safety net.
-
Subdomain or directory path?
Hi Mozzers,
Client: an important player in the carpet cleaning industry
Main Goal: creating good content to get more organic traffic to our main site
Structure of the extra content: it will act like a blog but will be differentiated from the regular site by not selling anything, just providing good content. The look and design will be different from the client's site.
SEO Question: which option is more beneficial, creating a subdomain or adding a regular page within the website under a directory path URL? If possible, please state the advantages and disadvantages of these two options in terms of SEO. Thank you and have a great weekend everyone,
-
Subdomains and SEO - Should we redirect to subfolder?
A new client has mainsite.com and a large number of city-specific subdomains, i.e. albany.mainsite.com. I think these subdomains would actually work better as subfolders, i.e. mainsite.com/albany rather than albany.mainsite.com. The majority of links on the subdomains link to the main site anyway, i.e. mainsite.com/contactus rather than albany.mainsite.com/contactus. Having mostly main-domain links on a subdomain doesn't seem like clever link architecture to me, and maybe even spammy. I'm not overly familiar with redirecting subdomains to subfolders. If we go the route of 301'ing subdomains to subfolders, any advice/warnings?
-
Old pages still crawled by search engines returning 404s. Better to 301 or block with robots.txt?
Hello guys, a client of ours has thousands of pages returning 404s, visible in Google Webmaster Tools. These are all old pages which don't exist anymore, but Google keeps detecting them. They belong to sections of the site which no longer exist, are not linked externally, and didn't provide much value even when they existed. What do you suggest we do:
(a) do nothing,
(b) redirect all these URLs/folders to the homepage through a 301, or
(c) block these pages through robots.txt?
Are we inappropriately using part of the crawl budget set by search engines by not doing anything? Thanks