Hi Stefan,
You can block the Baidu crawler in your robots.txt file.
There should be no adverse effect on your site. As this is not an area you are targeting and it has no long-term benefit to your business, blocking the crawler will simply mean your server has less load to deal with from the unnecessary traffic you have been receiving.
You can block the spiders in the following ways:
- Robots.txt (below is code for Baidu)
User-agent: Baiduspider
User-agent: Baiduspider-video
User-agent: Baiduspider-image
Disallow: /
- Blocking Spiders via the Apache Configuration File httpd.conf
See the article below for more details on this method:
http://searchenginewatch.com/article/2067357/Bye-bye-Crawler-Blocking-the-Parasites
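If you go down the Apache route, a minimal sketch of what the httpd.conf (or .htaccess) rules might look like is below. Note this is just an illustration: the environment variable name and the Directory path are placeholders you would adapt to your own server, and the Require syntax shown is for Apache 2.4 (older 2.2 servers use Order/Allow/Deny instead).

# Flag any request whose User-Agent contains "Baiduspider" (mod_setenvif)
BrowserMatchNoCase "Baiduspider" block_baidu

# Replace the path below with your actual document root
<Directory "/var/www/html">
    <RequireAll>
        Require all granted
        # Refuse requests flagged above (Apache 2.4 syntax)
        Require not env=block_baidu
    </RequireAll>
</Directory>

Flagged requests will then receive a 403 Forbidden response instead of being served, which saves more server resources than robots.txt alone (robots.txt relies on the crawler choosing to obey it).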
You may also want to check out:
http://www.robotstxt.org/
I hope this helps,
Ally