RobinZ comments on Welcome to Heaven - Less Wrong

Post author: denisbider | 25 January 2010 11:22PM | 23 points

Comment author: RobinZ | 26 January 2010 03:46:21PM | 1 point

> But wire-heading is not death. It is the opposite - the most fulfilling experience possible, to which everything else pales in comparison.

..."fulfilling"? Wire-heading only fulfills "make me happy" - it doesn't fulfill any other goal that a person may have.

"Fulfilling" - in the sense of "To accomplish or carry into effect, as an intention, promise, or prophecy, a desire, prayer, or requirement, etc.; to complete by performance; to answer the requisitions of; to bring to pass, as a purpose or design; to effectuate" (Webster 1913) - is precisely what wire-heading cannot do.

Comment author: denisbider | 26 January 2010 03:51:04PM | -2 points

Your other goals are immaterial and pointless to the outside world.

Nevertheless, suppose the FAI respects such a desire. This is questionable, because in the FAI's mind it is tantamount to letting a depressed patient stay depressed simply because a neurotransmitter imbalance makes them want to stay depressed. But suppose it respects this preference anyway.

In that case, the cheapest way to satisfy your desire, in terms of consumption of resources, is to create a simulation where you feel like you are thinking, learning and exploring, though in reality your brain is in a vat.

You'd probably be better off just being happy and sharing in the FAI's infinite wisdom.

Comment author: RobinZ | 26 January 2010 03:57:44PM | 2 points

Would you do me a favor and refer to this hypothesized agent as a DAI (Denis Artificial Intelligence)? Such an entity is nothing I would call Friendly, and, given the widespread disagreement on what is Friendly, I believe any rhetorical candidates should be referred to by other names. In the meantime:

> Your other goals are immaterial and pointless to the outside world.

I reject this point. Let me give a concrete example.

Recently I have been playing a lot of Forza Motorsport 2 on the Xbox 360, and I have made some gaming buddies who are more experienced in the game than I am - both better at driving and better at tuning cars. (Like Magic: the Gathering, Forza 2 is explicitly played on both the preparation and performance levels, although it is tilted more towards the latter.) I admire the skills they have developed in creating and controlling their vehicles and, wishing to be able to admire my own skills in the same way, want to develop them to a similar degree.

What is the DAI response to this?

Comment author: denisbider | 26 January 2010 04:09:07PM | 1 point

> What is the DAI response to this?

An FAI-enhanced World of Warcraft?

You can still interact with others even though you're in a vat.

Though as I commented elsewhere, chances are that FAI could fabricate more engaging companions for you than mere human beings.

And chances are that all this is inferior to being the ultimate wirehead.

Comment author: RobinZ | 26 January 2010 05:41:33PM | 1 point

> > What is the DAI response to this?
>
> An FAI-enhanced World of Warcraft?

That could be fairly awesome.

> You can still interact with others even though you're in a vat.

If it comes to that, I could see making the compromise.

> Though as I commented elsewhere, chances are that FAI could fabricate more engaging companions for you than mere human beings.
>
> And chances are that all this is inferior to being the ultimate wirehead.

This relates to subjects discussed in the other thread - I'll let that conversation stand in for my reply to it.

Comment author: denisbider | 26 January 2010 04:05:21PM | -1 points

Well...

Suppose you want to explore, learn, and build ad infinitum. Progress in these activities requires you to control increasing amounts of matter and consume increasing amounts of energy, until you come into conflict with others who also want to build and explore. When that point is reached, the only way the FAI can keep you all happy is to intervene while you all sleep, put you in separate vats, and from then on let each of you explore an instance of the universe that it simulates for you.

Should it let you wage Star Wars on each other instead? And how would that be different from having no AI to begin with?

Comment author: RobinZ | 26 January 2010 04:24:42PM | 2 points

You seem to be engaging in all-or-nothing thinking. That I want more X does not mean I want to maximize X to the exclusion of everything else. I want to explore and learn and build, but I also want to act fairly toward my fellow sapients/sentients. And I want to be happy, and I want my happiness to stem causally from exploring, learning, building, and fairness. And I want a thousand other things I'm not aware of.

An AI which examines my field of desires and maximizes one to the exclusion of all others is actively inimical to my current desires, and to all extrapolations of my current desires I can see.
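
A toy sketch of this last point (my own illustration, not from the thread; the desire names and weights are entirely made up): if overall utility is a weighted sum over several desires, an optimizer that drives the "happiness" component to its maximum while zeroing the others can score far worse than a balanced outcome.

```python
# Hypothetical weights over a person's desires (illustrative only).
desires = {"happiness": 0.3, "exploring": 0.25, "learning": 0.25, "fairness": 0.2}

def utility(satisfaction):
    """Weighted sum of per-desire satisfaction levels (each in [0, 1])."""
    return sum(weight * satisfaction[d] for d, weight in desires.items())

# Wirehead outcome: "make me happy" fully satisfied, every other desire zeroed.
wirehead = {"happiness": 1.0, "exploring": 0.0, "learning": 0.0, "fairness": 0.0}

# Balanced outcome: every desire satisfied at 0.7.
balanced = {d: 0.7 for d in desires}

print(utility(wirehead))  # 0.3
print(utility(balanced))  # ~0.7 - maximizing one desire loses overall utility
```

The numbers are arbitrary; the point is only that maximizing a single component is a poor way to serve the whole preference set.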