
[LINK] Article in the Guardian about CSER, mentions MIRI and paperclip AI

19 Sarokrae 30 August 2014 02:04PM

http://www.theguardian.com/technology/2014/aug/30/saviours-universe-four-unlikely-men-save-world

The article is titled "The scientific A-Team saving the world from killer viruses, rogue AI and the paperclip apocalypse", and features interviews with Martin Rees, Huw Price, Jaan Tallinn and Partha Dasgupta. The author takes a rather positive tone about CSER and MIRI's endeavours, and mentions x-risks other than AI (bioengineered pandemic, global warming with human interference, distributed manufacturing).

I find it interesting that the inferential distance from the layman to the concept of a paperclipping AI is much reduced by talking about paperclipping America rather than the entire universe, though the author admits he still struggles with the concept. Unusually for a journalist who starts off unfamiliar with these ideas, he writes in a tone suggesting that he takes them seriously, without the sort of "this is very far-fetched and thus I will not lower myself to seriously considering it" countersignalling usually seen in x-risk coverage. The comments section, though, currently shows the usual degree of incredulity.

For those unfamiliar with The Guardian, it is a British left-leaning newspaper with a heavy focus on social justice and left-wing political issues. 

[Link] First talk by CSER

3 NoSuchPlace 11 March 2014 03:01PM

The Centre for the Study of Existential Risk (CSER) has recently held its first public lecture, which can be found here:

Existential Risk: Surviving the 21st Century

The talk's blurb:

"In the coming century, the greatest threats to human survival may come from our own technological developments. However, if we can safely navigate the pitfalls, the benefits that technology promises are enormous. A philosopher, an astronomer, and an entrepreneur have come together to form the Centre for the Study of Existential Risk. The goal: to bring a fraction of humanity’s talents to bear on the task of ensuring our long-term survival. In this lecture, Huw Price, Martin Rees and Jaan Tallinn will outline humanity’s greatest challenge: surviving the 21st century."

From CSER's about page:

"An existential risk is one that threatens the existence of our entire species.  The Cambridge Centre for the Study of Existential Risk (CSER) — a joint initiative between a philosopher, a scientist, and a software entrepreneur — was founded on the conviction that these risks require a great deal more scientific investigation than they presently receive.  CSER is a multidisciplinary research centre dedicated to the study and mitigation of risks that could lead to human extinction.

Our goal is to steer a small fraction of Cambridge’s great intellectual resources, and of the reputation built on its past and present scientific pre-eminence, to the task of ensuring that our own species has a long-term future."

The philosopher, scientist and entrepreneur in question are Huw Price, Martin Rees and Jaan Tallinn respectively.

In case you are looking for the talk that Jaan Tallinn referred to, I think it is this.

The Singularity Wars

52 JoshuaFox 14 February 2013 09:44AM

(This is an introduction, for those not immersed in the Singularity world, to the history of and relationships between SU, SIAI [SI, MIRI], SS, LW, CSER, FHI, and CFAR. It also includes some opinions, which are strictly my own.)

The good news is that there were no Singularity Wars. 

The Bay Area had a Singularity University and a Singularity Institute, each going in a very different direction. You'd expect to see something like the People's Front of Judea and the Judean People's Front, burning each other's grain supplies as the Romans moved in.


Centre for the Study of Existential Risk (CSER) at Cambridge makes headlines.

19 betterthanwell 26 November 2012 08:56PM

As of an hour ago, I had not yet heard of the Centre for the Study of Existential Risk.

Luke announced it to Less Wrong as the University of Cambridge announced it to the world, back in April:

CSER at Cambridge University joins the others.

Good people involved so far, but the expected output depends hugely on who they pick to run the thing.

CSER is scheduled to launch next year.

Here is a small selection of CSER press coverage from the last two days:

http://www.bbc.co.uk/news/technology-20501091

http://www.guardian.co.uk/education/shortcuts/2012/nov/26/cambridge-university-terminator-studies

http://www.dailymail.co.uk/news/article-2238152/Cambridge-University-open-Terminator-centre-study-threat-humans-artificial-intelligence.html

http://www.theregister.co.uk/2012/11/26/new_centre_human_extinction_risks/

http://www.slashgear.com/new-ai-think-tank-hopes-to-get-real-on-existential-risk-26258246/

http://www.techradar.com/news/world-of-tech/super-brains-to-guard-against-robot-apocalypse-1115293

http://www.hindustantimes.com/world-news/Europe/Cambridge-to-study-risks-from-robots-at-Terminator-Centre/Article1-964746.aspx

http://economictimes.indiatimes.com/news/news-by-industry/et-cetera/cambridge-to-study-risks-from-robots-at-terminator-centre/articleshow/17372042.cms

http://www.extremetech.com/extreme/141372-judgment-day-update-disneys-grenade-catching-robot-and-the-burger-flipping-robot-that-could-replace-2-million-us-workers

http://slashdot.org/topic/bi/cambridge-university-vs-skynet/

http://www.businessinsider.com/researchers-robots-risk-human-civilization-2012-11

http://www.newscientist.com/article/dn22534-megarisks-that-could-drive-us-to-extinction.html

http://news.cnet.com/8301-11386_3-57553993-76/killer-robots-cambridge-brains-to-assess-ai-risk/

http://www.globalpost.com/dispatches/globalpost-blogs/weird-wide-web/cambridge-university-opens-so-called-termintor-centre-stu

http://www.washingtonpost.com/world/europe/cambridge-university-to-open-center-studying-the-risks-of-technology-to-humans/2012/11/25/e551f4d0-3733-11e2-9258-ac7c78d5c680_story.html

http://www.foxnews.com/tech/2012/11/26/terminator-center-to-open-at-cambridge-university/

Google News: All 119 news sources...

Here's an excerpt from one quite typical story appearing today in the tech tabloid theregister.co.uk:

Cambridge boffins fear 'Pandora's Unboxing' and RISE of the MACHINES

Boffins at Cambridge University want to set up a new centre to determine what humankind will do when ultra-intelligent machines like the Terminator or HAL pose "extinction-level" risks to our species.

A philosopher, a scientist and a software engineer are proposing the creation of a Centre for the Study of Existential Risk (CSER) to analyse the ultimate risks to the future of mankind - including bio- and nanotech, extreme climate change, nuclear war and artificial intelligence.

Apart from the frequent portrayal of evil - or just misguidedly deadly - AI in science fiction, actual real scientists have also theorised that super-intelligent machines could be a danger to the human race.

Jaan Tallinn, the former software engineer who was one of the founders of Skype, has campaigned for serious discussion of the ethical and safety aspects of artificial general intelligence (AGI).

Tallinn has said that he sometimes feels he is more likely to die from an AI accident than from cancer or heart disease, CSER co-founder and philosopher Huw Price said.
[...]

The source for these stories appears to be a press release from the University of Cambridge:

Humanity’s last invention and our uncertain future

In 1965, Irving John ‘Jack’ Good sat down and wrote a paper for New Scientist called Speculations concerning the first ultra-intelligent machine. Good, a Cambridge-trained mathematician, Bletchley Park cryptographer, pioneering computer scientist and friend of Alan Turing, wrote that in the near future an ultra-intelligent machine would be built. [...]

Four quick observations:

1: That's a lot of Terminator II photos.
2: Neither FHI at Oxford nor the Singularity Institute often gets this kind of attention.
3: CSER doesn't appear to have published anything yet.
4: The number of people who have heard the term "existential risk" must have doubled a few times today.