When I began this series of posts, I wasn’t aware of the 2015 Edge.org/Harper Perennial book What to Think About Machines That Think, edited by John Brockman. Brockman has produced a series of such books in which the views of different scientists and other thinkers are laid out side by side – for example, This Idea Must Die, This Explains Everything, and This Will Make You Smarter, all of which became bestsellers.
I’ve read a few of these collections of current thought, which generally bring together 20-30 minds. But the subject of AI brought them out in far greater numbers – believe it or not, looking over the Contents, I count 186 of them! Here are just a few, with their provocative titles:
- Antony Garrett Lisi – I, for One, Welcome Our Machine Overlords
- John Markoff – Our Masters, Slaves, or Partners?
- John C. Mather – It’s Going to Be a Wild Ride
- Maria Popova – The Umwelt of the Unanswerable
- Marcelo Gleiser – Welcome to Your Transhuman Self
- Sean Carroll – We Are All Machines That Think
- Arnold Trehub – Machines Cannot Think
- César Hidalgo – Machines Don’t Think, but Neither Do People
- Joscha Bach – Every Society Gets the AI It Deserves
- Lawrence M. Krauss – What, Me Worry?
- Beatrice Golomb – Will We Recognize It When It Happens?
Well, I’ve barely started, but I can see that this book is going to produce more than a few posts on this blog.
One entry I didn’t list above is physicist Max Tegmark’s Let’s Get Prepared! Tegmark laments the sensationalist coverage that AI often gets, then shows, one by one, the weaknesses of the arguments that superintelligence will never happen. In his last paragraph he concludes:
> The advent of machines that truly think will be the most important event in human history. Whether it will be the best or worst thing ever to happen to humankind depends on how we prepare for it, and the time to start preparing is now. One doesn’t need to be a superintelligent AI to realize that running unprepared toward the biggest event in human history would be just plain stupid.
Who can argue with that?