Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Subdomain Removal in Robots.txt with Conditional Logic??
-
I would like to know if there is a way to add conditional logic to the robots.txt file, so that when we push from DEV to PRODUCTION and the robots.txt file goes with it, we don't have to remember to either exclude the robots.txt file from the push or edit it once it's live.
My specific situation is this:
I have www.website.com, dev.website.com, and new.website.com, and somehow Google has indexed DEV.website.com and NEW.website.com. I'd like these removed from Google's index, as they are causing duplicate content.
Should I:
a) Add two new GWT entries for DEV.website.com and NEW.website.com and verify ownership. If I do this, then when the files are pushed to LIVE, won't they contain the verification meta code for the DEV version even though it's now LIVE? (Hope that makes sense.)
b) Write a robots.txt file that specifies "Disallow: DEV.website.com/". Is that possible? I have only seen examples of Disallow rules that start with a "/"...
Hope this makes sense; I could really use the help! I'm on a Windows Server 2008 box running ColdFusion websites.
-
Here's how I dealt with a similar situation in the past.
I put a robots.txt on each of the dev subdomains and on the live domain. The dev subdomains' robots.txt excluded the entire subdomain, and the subdomains were verified in GWT and removed as needed.
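For reference, the blocking file on each dev subdomain was just the standard deny-all robots.txt; a minimal sketch, using the hostnames from the question above:

```
# Served at dev.website.com/robots.txt and new.website.com/robots.txt
# Blocks every compliant crawler from the entire subdomain
User-agent: *
Disallow: /
```

The live domain's file, by contrast, should never contain that blanket Disallow, which is exactly why the two copies must not get swapped during a push.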
I made the live domain's robots.txt read-only so it didn't get overwritten. I should have made the dev subdomains' robots.txt read-only as well, since they sometimes got refreshed with the live content (there was a UGC database that would occasionally get copied to a dev subdomain, and robots.txt would get copied over too, leaving the dev subdomain indexable).
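On Windows (the asker is on Server 2008), setting that read-only attribute is a single stock command, nothing site-specific:

```bat
REM Run from the web root of each site; +R sets the read-only attribute
attrib +R robots.txt
```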
I set up a code monitor that checks the contents of all of the robots.txt files daily and sends me an email if anything has changed.
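A minimal sketch of such a monitor in Python; the URLs, state file, and mail settings below are placeholders to swap for your own, and it assumes a local SMTP relay. Schedule it daily with Task Scheduler or cron:

```python
import hashlib
import json
import smtplib
import urllib.request
from email.message import EmailMessage
from pathlib import Path

# Hypothetical hosts and mail settings -- replace with your own.
URLS = [
    "https://www.website.com/robots.txt",
    "https://dev.website.com/robots.txt",
    "https://new.website.com/robots.txt",
]
STATE_FILE = Path("robots_hashes.json")
ALERT_TO = "you@example.com"

def fetch_hash(url):
    """Fetch a robots.txt and return the SHA-256 digest of its body."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

def main():
    old = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}
    new, changed = {}, []
    for url in URLS:
        new[url] = fetch_hash(url)
        if old.get(url) not in (None, new[url]):  # seen before and different
            changed.append(url)
    STATE_FILE.write_text(json.dumps(new, indent=2))
    if changed:
        msg = EmailMessage()
        msg["Subject"] = "robots.txt changed: " + ", ".join(changed)
        msg["From"] = msg["To"] = ALERT_TO
        msg.set_content("These robots.txt files changed:\n" + "\n".join(changed))
        with smtplib.SMTP("localhost") as server:  # assumes a local mail relay
            server.send_message(msg)

if __name__ == "__main__":
    main()
```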
Not perfect, but I was at least able to catch changes soon after they happened, and the setup prevented a few altogether.
-
You can't put logic in robots.txt, and subdomains are treated as separate sites, so you need to create a separate robots.txt file for each subdomain and block the dev ones in their respective files.
You'll also need to add the Google verification code and verify each subdomain; then in GWT you can request to have the subdomain removed from Google's index. That's the fastest way.
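For completeness, the meta-tag form of that verification code looks like this; the token below is a made-up placeholder, since each GWT property gets its own, and it goes in the head of pages on the subdomain being verified:

```html
<!-- In the <head> of pages served from dev.website.com; token is a placeholder -->
<meta name="google-site-verification" content="AbCdEf123456_placeholder" />
```

Regarding concern (a) in the question: a verification token only proves account ownership, so a dev token that accidentally gets pushed to the live site shouldn't affect rankings, though using the HTML-file or DNS verification methods would sidestep the push problem entirely.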
Related Questions
-
Homepage was removed from Google and got deranked
Hello experts, I have a problem. The main page of my site got deranked severely, and now I am not sure how to get the ranking back. It started when I accidentally canonicalized the main page "https://kv16.dk" to a page that did not exist. Four months later the page got deranked, and you could not see the main page in the search results at all, not even when searching for "kv16.dk". Then we discovered the canonicalization mistake and fixed it, and were able to get the main page back in the search results when searching for "kv16.dk". For some weeks after we made the correction, the ranking didn't get better. Google Search Console recommended uploading a sitemap, so we did that. However, this sitemap contained a lot of thin-content pages: one for each WordPress attachment, e.g. for every image in an article. More exactly, there were 91 of these attachment pages, while the rest of the site consists of only two pages, the main page and an extra landing page. After that, Google began returning the attachment URLs in some searches. We tried fixing it by redirecting all the attachments to their simple form, e.g. if it was an attachment page for an image, we redirected straight to the image. Google has not yet removed these attachment pages, so the question is whether you think it will help to remove the attachments via Google Search Console, or whether that will not help at all. For example, when we search "kv16", an attachment URL named "birksø" is one of the first results.
Technical SEO | Christian_T
-
Bulk URL Removal in Webmaster Tools
One of my WordPress sites was hacked (for about 10 hours), and Google picked up 4000+ URLs in the index. The site is fixed, but I'm stuck with all those URLs in the index. All the URLs are of the form: walkerorthodontics.com/index.php?online-payday-cash-loan.htmloncewe The only bulk removal option I could find was to remove an entire folder, but I can't do that, as it would only leave the homepage and kill off everything else. For some crazy reason, the removal tool doesn't support wildcards, so that obvious solution is right out. So, how do I get rid of 4000 results? And no, waiting around for them to 404 out of the index isn't an option.
Technical SEO | MichaelGregory
-
Robots.txt for pages with a 301 redirect
We currently have a series of help pages that we would like to disallow in our robots.txt. The thing is that these help pages are located on our old website, which now has a 301 redirect to the current site. What is the proper way to go about this? 1) Add the pages we want to disallow to the robots.txt of the new website? 2) Break the redirect momentarily and add the pages to the robots.txt of the old one? Thanks
Technical SEO | Kilgray
-
Removing a large number of unnecessary pages from a site
Hi all, I have a big problem with my website. I have a lot of pages, duplicate pages made from various combinations of selects, and for all this duplicate content we were hit by a Panda update 2 years ago. I don't want to bring new content to all of these pages, about 3,000,000, because most of them are unnecessary. Google has indexed all of them (3,000,000), and I want to redirect the pages that I don't need anymore to the most important ones. My question: is there any problem in how Google will see this change, given that afterwards only 5,000-6,000 relevant pages will remain?
Technical SEO | Silviu
-
I accidentally blocked Google with Robots.txt. What next?
Last week I uploaded my site and forgot to remove the robots.txt file with this text: "User-agent: * Disallow: /". I dropped from page 11 on my main keywords to past page 50. I caught it 2-3 days later and have now fixed it. I re-imported my sitemap with Webmaster Tools, and I also did a Fetch as Google through Webmaster Tools. I tweeted out my URL to hopefully get Google to crawl it faster too. Webmaster Tools no longer says that the site is experiencing outages, but when I look at my blocked URLs it still says 249 are blocked. That's actually gone up since I made the fix. In the Google search results, it still no longer has my page title, and the description still says "A description for this result is not available because of this site's robots.txt – learn more." How will this affect me long-term? When will I recover my rankings? Is there anything else I can do? Thanks for your input! www.decalsforthewall.com
Technical SEO | Webmaster123
-
Exact match subdomains
Hi, I have seen significant SEO benefits from owning exact-match domains and was wondering whether exact-match subdomains hold all (or some) of these benefits? E.g. halloweencostumes.co.uk vs. halloween [dot] costumes.co.uk. Many thanks.
Technical SEO | martyc
-
How to remove a subdomain from Google's index!
Hello, I have a website with many subdomains carrying the same copy of the content. I think it's harming my SEO for that site, since the abc and xyz subdomains have the same content. I have already deleted the DNS records for those subdomains; now, how do I have those pages removed from Google's index as well? The DNS records for those subdomains no longer exist.
Technical SEO | anand2010
-
Robots.txt File Redirects to Home Page
I've been doing some site analysis for a new SEO client, and it has been brought to my attention that their robots.txt file redirects to their homepage. I was wondering: is there a benefit to setting up your robots.txt file to do this? Will this affect how their site gets indexed? Thanks for your response! Kyle Site URL: http://www.radisphere.net/
Technical SEO | kchandler