torekp comments on A cynical explanation for why rationalists worry about FAI - Less Wrong

25 Post author: aaronsw 04 August 2012 12:27PM




Comment author: torekp 04 August 2012 06:38:37PM 10 points

If I want to signal how much I care, I'll stick with puppies or local soup kitchens, thank you very much. That will get me a lot more warm fuzzies - and respect - from my neighbors and colleagues than making hay about a robot apocalypse.

Comment author: ModusPonies 04 August 2012 07:22:08PM 8 points

If you want to maximize respect from a broad, nonspecific community (e.g. neighbors and colleagues), that's a good strategy. If you want to maximize respect from a particular subculture, you could do better with a more specific strategy. For example, to impress your political allies, worry about upcoming elections. To impress members of your alumni organization, worry about the state of your sports team or the university president's competence. To impress folks on LessWrong, worry about a robot apocalypse.

Comment author: [deleted] 05 August 2012 01:21:34AM 4 points

That's a fully general argument: to impress [people who care about X], worry about [X]. But it doesn't explain why for rationalists X equals a robot apocalypse as opposed to [something else].

Comment author: ModusPonies 05 August 2012 04:41:38AM 5 points

My best guess is that it started because Eliezer worries about a robot apocalypse, and he's got the highest status around here. By now, a bunch of other respected community members are also worried about FAI, so it's about affiliating with a whole high-status group rather than imitating a single leader.

Comment author: Jonathan_Graehl 05 August 2012 07:04:25AM 4 points

I wouldn't have listened to EY if he weren't originally talking about AI. I realize others' EY origin stories may differ (e.g. HPMOR).

Comment author: David_Gerard 04 August 2012 08:40:13PM 9 points

Humans are adaptation-executers, not fitness maximisers - and they evolved in tribes of not more than 100 or so, so they are exquisitely sensitive to status. As such, they will happily work far too hard to increase their status ranking in a small group, whether or not it makes sense from the outside view. (This may or may not follow a failure to increase their status ranking in more mainstream groups.)

Comment author: timtyler 04 August 2012 11:39:30PM 1 point

Much depends on who you are trying to impress. Around here, lavishing care on cute puppies won't earn you much status or respect at all.

Comment author: ShardPhoenix 05 August 2012 10:07:44AM 0 points

That raises the question of why people care about getting status from Less Wrong in the first place. There are many other more prominent internet communities.

Comment author: timtyler 05 August 2012 12:25:07PM 3 points

Other types of apocalyptic phyg also acquire followers without being especially prominent. Basically, the internet has a long tail, offering many special-interest groups space to exist.

Comment author: DanielLC 04 August 2012 07:07:55PM 1 point

Yeah, but how much respect will they get you from LessWrong?