...has finally been published.
Contents:
- Uziel Awret - Introduction
- Susan Blackmore - She Won’t Be Me
- Damien Broderick - Terrible Angels: The Singularity and Science Fiction
- Barry Dainton - On Singularities and Simulations
- Daniel Dennett - The Mystery of David Chalmers
- Ben Goertzel - Should Humanity Build a Global AI Nanny to Delay the Singularity Until It’s Better Understood?
- Susan Greenfield - The Singularity: Commentary on David Chalmers
- Robin Hanson - Meet the New Conflict, Same as the Old Conflict
- Francis Heylighen - Brain in a Vat Cannot Break Out
- Marcus Hutter - Can Intelligence Explode?
- Drew McDermott - Response to ‘The Singularity’ by David Chalmers [the linked version incorporates McDermott's corrections and is therefore preferred to the version that was published in JCS]
- Jürgen Schmidhuber - Philosophers & Futurists, Catch Up!
- Frank Tipler - Inevitable Existence and Inevitable Goodness of the Singularity
- Roman Yampolskiy - Leakproofing the Singularity: Artificial Intelligence Confinement Problem
The issue consists of responses to Chalmers (2010). Future volumes will contain additional articles from Shulman & Bostrom, Igor Aleksander, Richard Brown, Ray Kurzweil, Pamela McCorduck, Chris Nunn, Arkady Plotnitsky, Jesse Prinz, Susan Schneider, Murray Shanahan, Burt Voorhees, and a response from Chalmers.
McDermott's article should be supplemented with this, which he says he didn't have space to include in his JCS article.
I don't think the FAI / UFAI distinction is particularly helpful in this case. That framework implies that friendliness is a property of the machine itself. Here we are talking about the widespread release of a machine with a programmable utility function. Its effects will depend on the nature and structure of the society into which it is released (and the utility functions that are used with it) - rather than being solely attributes of the machine itself.
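To make the point concrete, here is a minimal, purely hypothetical sketch in Python (none of the names or numbers come from the original discussion): the same decision procedure behaves quite differently depending on which utility function its users plug into it.

```python
# Illustrative only: the same decision procedure ("the machine") paired with
# different utility functions produces different behaviour, so "friendly" or
# "unfriendly" is not a property of the machine alone. All names are made up.

def best_action(actions, predict, utility):
    """Choose the action whose predicted outcome scores highest under `utility`."""
    return max(actions, key=lambda action: utility(predict(action)))

# Toy model of the world: each action leads to a predicted outcome.
outcomes = {
    "cooperate": {"human_welfare": 10, "owner_profit": 3},
    "exploit":   {"human_welfare": -5, "owner_profit": 9},
}

# Two users of the same released machine, plugging in different utilities:
public_spirited = lambda o: o["human_welfare"]
self_interested = lambda o: o["owner_profit"]

print(best_action(outcomes, outcomes.get, public_spirited))  # -> cooperate
print(best_action(outcomes, outcomes.get, self_interested))  # -> exploit
```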
If you are dealing with a secretive monopolist, nobody on the outside is going to know what kind of machine they have built. The fact that they are a secretive monopolist doesn't bode well, though. Failing to share is surely one of the most reliable ways to signal that you don't have the interests of others at heart.
Industrial espionage or reverse engineering can't shut organisations down - but it may be able to liberate their technology for the benefit of everyone.
So we estimate the machine's effects based on what we anticipate about the state of the society it will be released into.
If it's expected that sharing an AGI design would result in everyone dying, then not sharing it can't signal bad intentions.