There's a provocative interview with the philosopher Daniel Dennett in Living on Earth.
The topic is Dennett's latest book — From Bacteria to Bach and Back: The Evolution of Minds — and his idea that Charles Darwin and Alan Turing can be credited, in a way, with the same discovery: that you don't need comprehension to achieve competence.
Darwin showed how you can get the appearance of purpose and design out of the blind processes of natural selection. And Turing, one of the pioneers in the field of computation, offered evidence that any problem specified precisely enough to be computed at all can be computed by a mechanical device — that is, a device without an iota of insight or understanding.
But the part of the interview that particularly grabbed my attention comes at the end. Living on Earth host Steve Curwood raises the by-now hoary worry that as AI advances, machines will come to lord over us. This is a staple of science fiction, and it has recently become the focus of considerable attention among opinion-makers (witness discussion of the so-called "singularity"). Dennett acknowledges that the risk of takeover is a real one. But he says we've misunderstood it: The risk is not that machines will become autonomous and come to rule over us — the risk is, rather, that we will come to depend too much on machines.