timtyler comments on Singularity Institute Executive Director Q&A #2 - Less Wrong

20 Post author: lukeprog 06 January 2012 03:40AM


Comment author: timtyler 07 January 2012 02:17:55PM *  1 point [-]

Computer programs and aeroplanes crash "by default". That has little to do with what computer programs and aeroplanes actually do.

What happens "by default" typically has precious little to do with what actually happens - since agent preferences get involved in between.

Comment author: FAWS 09 January 2012 12:43:22AM 3 points [-]

What happens "by default" typically has very much to do with what actually happens the first few times, unless extraordinary levels of caution and preparation are applied.

Comment author: timtyler 09 January 2012 12:19:53PM *  0 points [-]

Things like commercial flight and the space program show that reasonable levels of caution can be routinely applied when lives are at stake.

The usual situation with engineering is that you can have whatever level of safety you are prepared to pay for.

As I understand it, most lives are currently lost to engineering through society-approved tradeoffs - in the form of motor vehicle accidents. We know how to decrease the death rate there, but getting from A to B rapidly is widely judged to be more important.

It is easy to imagine how machine intelligence is likely to produce similar effects - via unemployment. We could rectify such effects via a welfare state, but it isn't clear how popular that will be. We can - pretty easily - see this one coming. If the concern is with human lives, we can see now that we will need to make sure that unemployed humans have a robust safety net - and that's a relatively straightforward political issue.

Comment author: FAWS 09 January 2012 09:43:06PM 1 point [-]

If you accept that a single failure could mean extinction or worse, the history of rockets and powered flight isn't exactly inspiring.

Comment author: timtyler 10 January 2012 11:22:52AM -1 points [-]

If you have a system in which a single failure could mean the extinction of anything very important, then it seems likely that there must have been many failures in safety systems and backups leading up to that situation - which would seem to count against the idea of a single failure. We have had many millions of IT failures so far already.