DanArmak comments on Anticipation vs. Faith: At What Cost Rationality? - Less Wrong

Post author: Wei_Dai 13 October 2009 12:10AM




Comment author: DanArmak 13 October 2009 04:07:24PM * 1 point

To have the "pure rational" version serve the human one? Have it achieve goals that aren't defined by ownership (e.g., 'reduce total suffering'), because it's better at those things? This sounds reasonable, and I don't see why we shouldn't do it - assuming it's easier or otherwise preferable to creating a GAI from scratch to achieve those goals.

Comment author: aausch 16 October 2009 02:02:06AM * 0 points

Would you enter a lottery where a small fraction (0.01%) of the entrants is selected at random as losers and transformed without being copied?

Comment author: DanArmak 16 October 2009 02:07:14AM * 0 points

Being modified like this is much the same as death. I would lose my personhood, and the modified remains would serve not my original goals but those of the lottery's winners.

So this is equivalent to asking: would I enter a lottery with a 0.01% chance of instant death, and a prize otherwise? I might, depending on the prize. If the prize is service by the modified individuals, and they aren't based on me but on other people (so they're not well suited to advancing my personal goals), and I have to timeshare them with all the other winners (each winner receives about 0.01 / 99.99 ≈ 10^-4 of one servant's time), then it doesn't seem worthwhile.
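
For concreteness, a minimal sketch of that arithmetic in Python (the entrant count of 1,000,000 is an illustrative assumption, not from the thread):

    # Per-winner share of the losers' service time in the lottery above.
    entrants = 1_000_000        # hypothetical lottery size (assumption)
    loser_fraction = 0.0001     # 0.01% of entrants are transformed

    losers = entrants * loser_fraction   # 100 people
    winners = entrants - losers          # 999,900 people

    # All available service time comes from the losers; each winner
    # gets an equal timeshare of it.
    share_per_winner = losers / winners
    print(f"service per winner: {share_per_winner:.2e} of one servant")
    # -> about 1.00e-04, matching 0.01 / 99.99 ≈ 10^-4 above

Note that the share doesn't depend on the entrant count at all; it is just the ratio 0.01% / 99.99%.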

Comment author: aausch 16 October 2009 02:16:55AM 0 points

Modified question in place.

Comment author: DanArmak 16 October 2009 01:22:35PM 0 points

Modified answer in place...