Wei_Dai comments on What Curiosity Looks Like - Less Wrong

31 Post author: lukeprog 06 January 2012 09:28PM


Comment author: Wei_Dai 10 January 2012 02:16:17AM *  6 points [-]

On a grand scale, my hunger for truths is probably as limited and easy to satisfy as my hunger for cheeseburgers.

I have very good reasons to think that my hunger for cheeseburgers is limited and easy to satisfy (e.g., ample evidence from past consumption/satiation of various foods including specifically cheeseburgers). On the other hand, there seems good reason to suspect that if my appetite for truths is limited, the satiation level comes well after what can be achieved at human intelligence level and within a human lifetime (e.g., there are plenty of questions I want answers to that seem very hard, and every question that gets answered seems to generate more interesting and even harder questions).

(It's an interesting question whether all my questions could be answered within 1 second after the Singularity occurs, or if it would require more than the resources in our entire light cone, or something in between, but the answer to that doesn't affect my point that a curious person would seek to become superintelligent.)

I do feel that in a post-Singularity world I'd want to enhance my intelligence, but the underlying motivation seems to be status-seeking, a desire to be significant.

If Omega offered to enhance your intelligence and/or answer all your questions, but for your private benefit only (i.e., you couldn't tell anyone else or otherwise use your improved intelligence/knowledge to affect the world), would you not be much interested?

Comment author: wedrifid 10 January 2012 09:53:02AM 4 points [-]

It's an interesting question whether all my questions could be answered within 1 second after the Singularity occurs, or if it would require more than the resources in our entire light cone, or something in between, but the answer to that doesn't affect my point that a curious person would seek to become superintelligent.

Are you at all curious about what the 3^^^3rd digit of Pi is?
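(The "3^^^3" here is Knuth's up-arrow notation, where each extra arrow iterates the previous operation: a↑b is exponentiation, a↑↑b is a tower of exponents, and so on. A minimal sketch of the recursion, just to show why 3↑↑↑3 is hopelessly beyond computation:)

```python
def arrow(a, n, b):
    """Knuth's up-arrow: a ↑^n b. One arrow (n=1) is plain
    exponentiation; each extra arrow iterates the level below."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return arrow(a, n - 1, arrow(a, n, b - 1))

# 3↑↑3 = 3^(3^3) = 3^27 = 7625597484987, already 13 digits.
print(arrow(3, 2, 3))

# 3↑↑↑3 = 3↑↑(3↑↑3): a power tower of about 7.6 trillion threes.
# Evaluating it (let alone finding the 3^^^3rd digit of pi) is
# physically impossible, which is the force of wedrifid's question.
```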

Comment author: wedrifid 10 January 2012 04:31:14AM 2 points [-]

If Omega offered to enhance your intelligence and/or answer all your questions, but for your private benefit only (i.e., you couldn't tell anyone else or otherwise use your improved intelligence/knowledge to affect the world), would you not be much interested?

Nice. The opposite of the premise of a lot of fantasy worlds!

Comment author: Normal_Anomaly 16 January 2012 05:39:33PM 0 points [-]

If Omega offered to enhance your intelligence and/or answer all your questions, but for your private benefit only (i.e., you couldn't tell anyone else or otherwise use your improved intelligence/knowledge to affect the world), would you not be much interested?

I would be interested, but I wouldn't take it unless I got a solid technical explanation of "affect the world" that allowed me to do at least as much as I am doing now.

Comment author: cousin_it 10 January 2012 11:15:07AM *  0 points [-]

No, I wouldn't be much interested, I'd even pay to refuse the offer because I don't want the frustration of being unable to tell anyone.

Comment author: wedrifid 10 January 2012 12:03:47PM 0 points [-]

No, I wouldn't be much interested, I'd even pay to refuse the offer because I don't want the frustration of being unable to tell anyone.

You aren't willing to just console yourself with all the hookers, cars, drugs, holidays and general opulence you have been able to buy with the money you earned with your 'personal benefit only' intelligence? Or are we to take it that we can't even use the intelligence to benefit ourselves materially and can only use it to sit in a chair and think to ourselves?

Comment author: cousin_it 10 January 2012 12:39:05PM 5 points [-]

I think that counts as "using your improved intelligence to affect the world". If it's allowed, then sure, sign me up.

Comment author: wedrifid 10 January 2012 01:29:35PM 2 points [-]

I think that counts as "using your improved intelligence to affect the world". If it's allowed, then sure, sign me up.

And if even personal use is not allowed then I rapidly become indifferent between the choices (and to the question itself).

Comment author: Vladimir_Nesov 12 January 2012 10:49:56AM 0 points [-]

Affecting your mind is still a discernible effect...

Comment author: wedrifid 12 January 2012 01:01:20PM 0 points [-]

Affecting your mind is still a discernible effect...

Yes, you can reduce Wei's counterfactual to nonsense if you try to pick it apart too far. Yet somehow I think that misses his point.

Comment author: Document 10 January 2012 12:46:50PM 0 points [-]

Worst-case (and probable) scenario, you get trapped inside your head and forced to watch your body act like an idiot. If you could engage in transactions, you could make lots of money and then selectively do business with people you like.

Comment author: wedrifid 10 January 2012 01:25:20PM 0 points [-]

you could make lots of money and then selectively do business with people you like.

This part isn't the case. You can't game "can't use powers for personal gain" laws of magic - the universe always catches you. The reversed case would be analogous.