Fluttershy comments on Open thread, Jan. 25 - Jan. 31, 2016 - Less Wrong

3 Post author: username2 25 January 2016 09:07PM




Comment author: Fluttershy 28 January 2016 10:55:30AM 0 points [-]

Oops. I've tried to clarify that he's only interested in FAI research, not AI research on the whole.

Comment author: username2 28 January 2016 11:08:58AM 2 points [-]

I think that interest in AI research in general would help to demystify the whole topic a bit; it would make it look a little less like magic.

Comment author: Lumifer 28 January 2016 03:25:59PM *  1 point [-]

he's only interested in FAI research

There's no such thing, any more than there is research into alien flying saucers with nanolasers of doom. There's a lot of fiction and armchair speculation, but that's not research.

Any reason he's not trying to fix his phobia by conventional means?

Comment author: Fluttershy 28 January 2016 09:18:24PM 0 points [-]

What I mean is that, if he overcomes his phobia, he'd be interested in working for MIRI, but not, say, for OpenAI, or for a startup where he'd be doing lots of deep learning.

Comment author: Lumifer 28 January 2016 09:53:22PM 1 point [-]

It might be that his interest in FAI is tied to his phobia so if the phobia goes away, so may the interest...

Comment author: RichardKennaway 28 January 2016 11:41:56AM 1 point [-]

FAI is only a problem because of AI. The imminence of the problem depends on where AI is now and how rapidly it is progressing. To know these things, one must know how AI is actually done (real, current, and past AI, not future, hypothetical AI, still less speculative, magical AI), and to know this in technical terms, not fluff.

I don't know how much your friend knows already, but perhaps a crash course in Russell and Norvig, plus technical papers on developments since then (e.g. deep learning), would be appropriate.