Pierre-André_Noël comments on Crisis of Faith - Less Wrong

57 Post author: Eliezer_Yudkowsky 10 October 2008 10:08PM

Comment author: Pierre-André_Noël 14 October 2008 12:06:00AM 1 point

First of all, great post, Eliezer. If somebody holding that kind of standard thinks that cryonics is a good investment, I should someday take the time to investigate the question more deeply than I have.

Now, without lessening the previous praise, I would like to make the following remarks about Friendly AI:
- The belief has long remained in your mind;
- It is surrounded by a cloud of known arguments and refutations;
- You have sunk costs in it (time, money, public declarations).
I do not know whether it has emotional consequences for you or whether it has become mixed up with your personality more generally.

I think the following questions convey my line of thought better than any explanation I could formulate. Given EY's limited resources:
- is Friendly AI the best bet for "saving mankind"?
- if this "crisis of faith" technique (or similar rational approaches) were more popular, could alternatives other than FAI be envisioned?
- if FAI is required (the sole viable alternative), would it be worth the cost to invest time in "educating people" in such rational approaches (writing books, publicizing, etc.) in order to gather resources/manpower to achieve FAI?

Maybe you have already gone through such reasoning and concluded that the time you currently invest in OB is the optimal amount of publicity...