I think this article overstates the case. We can’t explain how people work either, and they fuck up all the time. Should we refuse to use air traffic controller AIs that crash planes a tenth as often as their human counterparts, just because we don’t know what they’re thinking?