Desrtopa comments on Rationality Quotes: April 2011 - Less Wrong

6 Post author: benelliott 04 April 2011 09:55AM




Comment author: Desrtopa 04 April 2011 01:43:12PM 9 points [-]

Part of me wants to say that it was foolish of Tony to take so much less money than he could have gotten simply for getting the guy to profess that it was a piece of quartz rather than a power crystal, but I'm not sure I would feel comfortable exploiting a guy's delusions to that degree either.

Comment author: benelliott 04 April 2011 03:57:44PM 4 points [-]

There's no guarantee the guy would have bought it at all for $150. The impression I get is that this was ultimately a case of belief in belief; Tony knew he couldn't get much more than $15 and just wanted to win the argument.

Comment author: Desrtopa 04 April 2011 04:04:28PM 2 points [-]

I doubt he would have bought it for $150, but after making a big deal of its properties as a power crystal, he'd have had limited leverage to haggle it down; he'd probably have bought it at three times the asking price, if not ten.

Comment author: zaph 04 April 2011 02:27:26PM *  10 points [-]

I thank Tony for not taking the immediately self-benefiting path of profit and instead doing his small part to raise the sanity waterline.

Comment author: Giles 04 April 2011 03:10:37PM *  13 points [-]

Was the buyer sane enough to realise that it probably wasn't a power crystal, or just sane enough to realise that if he pretended it wasn't a power crystal he'd save $135?

Is that amount of raising-the-sanity waterline worth $135 to Tony?

I would guess it's guilt-avoidance at work here.

(EDIT: your thanks to Tony are still valid though!)

Comment author: childofbaud 04 April 2011 08:55:09PM *  7 points [-]

And with that in mind, how would it have affected the sanity waterline if Tony had donated that $135 to an institution that's pursuing the improvement of human rationality?

Comment author: Eliezer_Yudkowsky 05 April 2011 04:35:44AM 40 points [-]

Look, sometimes you've just got to do things because they're awesome.

Comment author: thomblake 07 April 2011 08:27:45PM 2 points [-]

But would you feel comfortable with that maxim encoded in an AI's utility function?

Comment author: Alicorn 07 April 2011 08:38:20PM 13 points [-]

For a sufficiently rigorous definition of "awesome", why not?

Comment author: benelliott 08 April 2011 07:54:21AM 4 points [-]

If it's a terminal value, then CEV should converge to it.

Comment author: DanielLC 05 April 2011 12:25:47AM 5 points [-]

I think he would have been better off taking the money and donating it to a good charity.