Understand that robots.txt is the official way to instruct crawlers which parts of a site they may and may not read. However, many bots are known to ignore that file. That's why Clark recommends the more reliable option of closing the community off from the public: as long as an anonymous visitor can see your community, so can bots.
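For reference, a minimal robots.txt that asks all crawlers to stay out of the entire site looks like this (well-behaved bots such as major search engines honor it, but nothing enforces it):

```
User-agent: *
Disallow: /
```

This file must be served at the root of the domain (e.g. /robots.txt). Again, compliance is voluntary, which is exactly why restricting access at the login level is the stronger option.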