User-agent: dotbot
Disallow: /

User-agent: rogerbot
Disallow: /
If you want to truly prevent robots from crawling your site, you will need to use either password protection or a bot-mitigation service similar to this:
http://www.distilnetworks.com/
If you look at what Google and Moz say, a robots.txt file cannot guarantee that a page stays blocked if it is linked to from elsewhere. If you want to do that, you will have to reject the requests at the server level, for example with a WAF like Distil Networks:
http://moz.com/help/guides/search-overview/crawl-diagnostics
https://moz.com/researchtools/ose/dotbot
&
https://support.google.com/webmasters/answer/6062608?rd=2
Also, blocking link-analysis user agents that are nothing but a drain on your resources is a good idea. It is simple enough to do in .htaccess with something like the rules between BEGIN and END further down this answer.
Here is how Moz's crawl diagnostics help describes it:

Search Engine Blocked by Robots.txt
This page cannot be crawled by search engines due to the robots.txt protocol. If you're seeking to remove this page from search results, we recommend that you use meta robots (with noindex, follow values) instead of robots.txt. This will ensure that the page does not appear in the results but allows link juice to flow through the page's links and count towards the relevance/popularity of other pages on your site.
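The meta robots approach described above goes in the `<head>` of the page itself rather than in robots.txt; a minimal sketch of the tag (the surrounding markup is just illustrative):

```html
<!-- Place inside the <head> of the page you want kept out of results.
     noindex: do not show this page in search results.
     follow: still pass link equity through this page's links. -->
<meta name="robots" content="noindex, follow">
```

Note that for this tag to be seen, the page must NOT be blocked in robots.txt; a crawler that is disallowed from fetching the page can never read the tag.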
How to block DotBot from crawling your site
If you don't want dotbot crawling your site, we always respect the standard Robots Exclusion Protocol (aka robots.txt). If you would like to block dotbot, all you need to do is add our user-agent string to your robots.txt file.
If you want to ban dotbot from most areas of your site, it looks a little something like this:
User-agent: dotbot
Disallow: /admin/
Disallow: /scripts/
Disallow: /images/
Below the next snippet, I have placed rules that somebody else created and states work. I do not know firsthand whether they do; I told you Distil Networks will work, but I cannot guarantee the very bottom block. I think you will not have any trouble if you set up robots.txt as configured at the top of this answer.
If you want to ban dotbot from crawling any part of your site, add this text instead:
User-agent: dotbot
Disallow: /
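If you want to sanity-check a robots.txt like the ones above before deploying it, Python's standard-library urllib.robotparser can evaluate the rules locally. A small sketch (the user-agent names and paths are just example inputs):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules from this answer, as a literal string.
ROBOTS_TXT = """\
User-agent: dotbot
Disallow: /

User-agent: rogerbot
Disallow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# dotbot and rogerbot are fully blocked; agents with no matching
# group (and no * group) are unaffected.
print(rp.can_fetch("dotbot", "/admin/"))     # False
print(rp.can_fetch("rogerbot", "/"))         # False
print(rp.can_fetch("Googlebot", "/admin/"))  # True
```

This only checks the robots.txt logic itself; it obviously cannot tell you whether a given bot will actually honor the file.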
BEGIN
# Block via mod_rewrite: return 403 Forbidden to matching user agents.
# The patterns are unanchored with [NC] because real UA strings usually
# look like "Mozilla/5.0 (compatible; DotBot/1.1; ...)".
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} rogerbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} exabot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} MJ12bot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} dotbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} gigabot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} AhrefsBot [NC]
RewriteRule .* - [F,L]

# Belt and braces: flag the same agents with SetEnvIf and deny them.
SetEnvIfNoCase User-Agent "rogerbot" bad_bot
SetEnvIfNoCase User-Agent "exabot" bad_bot
SetEnvIfNoCase User-Agent "mj12bot" bad_bot
SetEnvIfNoCase User-Agent "dotbot" bad_bot
SetEnvIfNoCase User-Agent "gigabot" bad_bot
SetEnvIfNoCase User-Agent "ahrefsbot" bad_bot
SetEnvIfNoCase User-Agent "sitebot" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot
END
Thomas