Alicorn comments on Why No Wireheading? - Less Wrong
I'd probably test such a thing for an hour, actually, and for all I know it would be so overwhelmingly awesome that I would choose to stay, but I expect that, assuming my preferences and memories remained intact, I would rather be out among real people. My desire to be among real people is related to, but not dependent on, my tendency toward loneliness, and guilt hadn't even occurred to me. (I suppose I'd think I was being a bit of a jerk if I abandoned everybody without saying goodbye, but presumably I could explain what I was doing first?) I want to interact with, say, my sister, not just with an algorithm that pretends to be her and elicits similar feelings without my sister actually being on the other end.
In a sense, emotions can be accurate in much the same way beliefs can. I would react similarly badly to the idea of holding pleasant but inaccurate beliefs. Given my preferences about the world, it would be a mistake to feel just as happy when someone I care about dies (or something else bad happens) as when someone I care about gets married (or something else good happens).
Lying is wrong.
I know. It is one of the many terrible things about reality. I hate it.
Memories are a way to access reality-tracking information. As I said, remembering things isn't consistently pleasant, but pleasantness isn't what memory is for.
Counterfactually.
Well, I wrote everything above that point in my comment, then noticed the pattern and couldn't immediately come up with a counterexample to it.
I think it's fine if you want to wirehead. I do not advocate interfering with your interest in doing so. But I still don't want it.