
Lumifer comments on Open thread, September 15-21, 2014 - Less Wrong Discussion

6 Post author: gjm 15 September 2014 12:24PM




Comment author: Lumifer 16 September 2014 03:38:20PM 2 points

> If people eventually have relationships with FAI-created humans rather than humans generated by other means, is this a problem?

This looks like wireheading lite, and if you got there, I don't see why you wouldn't take the next step as well -- the FAI will create an entire world for you to enjoy inside your head.

Comment author: NancyLebovitz 16 September 2014 03:41:35PM 2 points

I thought wireheading meant stable high pleasure without content rather than an enjoyable simulated world. What do other people think wireheading means?

Comment author: Lumifer 16 September 2014 03:49:45PM 1 point

Well, technically the term "wireheading" comes from experiments which involved inserting an electrode (a "wire") into a rat's pleasure center and giving the rat a pedal to apply electric current to this wire. So yes, in the narrow sense wireheading is just the direct stimulation of the pleasure center.

However, I use "wireheading" in the wide sense as well, and there it means, essentially, a focus on deriving pleasure from externally caused but internal experiences, along with a lack of interest in or concern with the outside world. Wireheading in the wide sense is, basically, purified addiction.

Comment author: NancyLebovitz 16 September 2014 04:28:35PM 2 points

If we're living inside an FAI, "outside world" might be getting a little vague. This might even be true if we're still living in our DNA-based bodies.

Do you think an FAI would let people have access to anything it isn't at least monitoring, and more likely controlling?

Comment author: Lumifer 16 September 2014 04:32:49PM *  1 point

> If we're living inside an FAI

Uploads/ems are a bit of a different case.

> Do you think an FAI would let people have access to anything it isn't at least monitoring, and more likely controlling?

I don't know, but in such a case I probably would not consider it an FAI.

Comment author: hyporational 17 September 2014 03:16:09PM *  1 point

> Uploads/ems are a bit of a different case.

How? Why does it matter in what substrate the information pattern called you resides in this case? I doubt the meat brain will have any connectivity issues once we have uploads.

Comment author: Lumifer 17 September 2014 03:24:11PM 1 point

> Why does it matter in what substrate the information pattern called you resides in this case?

I am not an information pattern, having, for example, a considerable somatic component :-D

Comment author: hyporational 17 September 2014 03:30:26PM 1 point

Depends. You could have a robotic somatic component, or a human body grown in a vat.

Comment author: Lumifer 17 September 2014 03:45:20PM *  0 points

I don't see much difference between a human body grown in a vat and one grown in a womb.

But, generally speaking, in the context of wireheading the somatic component matters.

Comment author: hyporational 17 September 2014 04:02:32PM 0 points

Does it matter to you for semantic or moral reasons? I fail to see any moral difference between living in a virtual world as a meat brain and living in a virtual world as a silicon brain. The semantic difference is obvious.