timtyler comments on Evaluating the feasibility of SI's plan - Less Wrong

Post author: JoshuaFox, 10 January 2013 08:17AM




Comment author: timtyler, 12 January 2013 02:12:51PM

Let's try a simple calculation. What is the expected FAI/UFAI ratio when friendliness is not proven? According to Eliezer's reply in this thread, it's close to zero:

your conditionally independent failure probabilities add up to 1 and you're 100% doomed.

So let's overestimate it as 1 in a million, as opposed to a more EY-like estimate of 1 in a gazillion.
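The arithmetic being invoked can be sketched in a few lines. This is an assumed illustration, not part of the thread: if an unproven design has n conditionally independent failure modes, each triggering with probability p, then the chance of avoiding all of them is (1 - p)^n, which collapses toward zero as n grows.

```python
def success_probability(p, n):
    """Probability that none of n conditionally independent
    failure modes (each of probability p) is triggered."""
    return (1 - p) ** n

# Even modest per-mode failure rates compound quickly:
print(success_probability(0.1, 50))  # roughly 0.005 -- near-certain doom
```

Under this toy model, pushing the success probability as low as "1 in a million" or "1 in a gazillion" only requires enough independent ways to fail, which is the shape of Eliezer's "100% doomed" claim.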

Ignoring the issue of massive overconfidence, why do you even think these concepts are clearly enough defined to assign probability estimates to them like this? It seems pretty clear that they are not. Before discussing the probability of a poorly-defined class of events, it is best to try to say what it is that you are talking about.

Comment author: shminux, 13 January 2013 07:16:23PM

Feel free to explain why it is not OK to assign probabilities in this case. Clearly EY does not shy away from doing so, as the quote indicates.

Comment author: timtyler, 13 January 2013 08:48:38PM

Well, obviously you can assign probabilities to anything, but if the event is sufficiently vague, doing so in public is rather pointless, since no one else will know what event you are talking about.

I see that others have made the same complaint in this thread - e.g. Richard Loosemore:

before deciding exactly how many angels can dance on the head of a pin, you have to make sure the "angel" concept is meaningful enough that questions about angels are meaningful.