Vaniver comments on The Goal of the Bayesian Conspiracy - Less Wrong

-9 Post author: Arandur 16 August 2011 06:40PM


Comment author: Vaniver 16 August 2011 02:05:04AM 3 points [-]

It seems pretty obvious that Eliezer's view is that FAI is the quick ticket to world domination (in the sense of world states that you talk about), and he seems to be structuring his conspiracy accordingly.

It is demonstrable that one's level of strength as a rationalist has a direct correlation to the probability that the one will make correct decisions.

Really? How would one demonstrate this? What does it mean for a decision to be "correct"? If something is true by definition, is it really demonstrable?

we have a moral obligation to work our hardest on this project

Really? Your plan is to get people interested in world domination by guilting them?

Comment author: Arandur 16 August 2011 06:29:09AM 2 points [-]

It seems pretty obvious that Eliezer's view is that FAI is the quick ticket to world domination...

I hadn't considered that, but now I see it clearly. How interesting.

Really? Your plan is to get people interested in world domination by guilting them?

Ha! If that would work, maybe it'd be a good idea. But no, pointing out a moral obligation is not the same as guilting. Guilting would be me messaging you and saying, "See that poor starving African woman? If you had listened to my plan, she'd be happier." But I won't be doing that.