Comment author: ArisKatsaris 24 June 2011 09:06:23AM *  2 points

You are trying to say there is something other than pleasure, yet you concede that all of your examples cause pleasure.

If I were debating the structure of the atom, I could say that "there's more to atoms than their protons", and yet I would 'concede' that all atoms do contain protons. Or I'd say "there's more to protons than just their mass" (they also have an electric charge), but all protons do have mass.

Why are you finding this hard to understand? Why would I need to discover an atom without protons or a proton without mass for me to believe that there's more to atoms than protons (there's also electrons and neutrons) or more to protons than their mass?

That is exactly my point. There is nothing we seek that we don't expect to derive pleasure from.

You had made much stronger statements than that -- you said "You think you want more than pleasure, but what else is there?" You also said "But saying we want more than pleasure? That doesn't make sense."

Every atom may contain protons, but atoms are more than protons. Every object of our desire may contain pleasure in its fulfillment, but the object of our desire is more than pleasure.

Does this analogy help you understand how your argument is faulty?

Comment author: tyrsius 22 July 2011 10:07:43PM 0 points

No, it doesn't. I understand your analogy (parts vs the whole), but I do not understand how it relates to my point. I am sorry.

Is pleasure the proton in the analogy? Is the atom what we want? I don't follow here.

You are also making the argument that we want things that don't cause pleasure. Shouldn't this be, in your analogy, an atom without a proton? In that case yes, you need to find an atom without a proton before I will believe there is an atom without a proton. (This same argument works if pleasure is any of the other atomic properties: charge, mass, etc.)

Or is pleasure the atom? If that is the case, then I can't see where your argument is going. If pleasure is the atom, then your analogy supports my argument.

I am not trying to make a straw man, I genuinely don't see the connections.

Comment author: ArisKatsaris 17 June 2011 09:04:44AM *  3 points

Those things you cite are valued because they cause pleasure.

No, they cause pleasure because they're valued.

  • You are arguing that we seek things in accordance with, and in proportion to, the pleasure anticipated in achieving them. (Please correct me if I'm getting you wrong.)
  • I'm arguing that we can want stuff without anticipation of pleasure being necessary. And we can fail to want stuff where there is anticipation of pleasure.

How shall we distinguish between the two scenarios? What do we anticipate about the world if your hypothesis is true vs. if mine is true?

Here's a test. I think that if your scenario held, everyone would be willing to rewire their brains to get more pleasure from things they don't currently want, because then there'd be more anticipated pleasure. This doesn't seem to hold -- though we'll only know for sure when the technology actually becomes available.

Here's another test. I think that if my scenario holds, some atheists just before their anticipated deaths would still leave property to their offspring or to charities, instead of spending it all on prostitutes and recreational drugs in an attempt to cram in as much pleasure as possible before their death.

So I think the tests validate my position. Do you have some different tests in mind?

Comment author: tyrsius 24 June 2011 04:12:57AM *  0 points

Your argument isn't making any sense. Whether they are valued because they cause pleasure, or cause pleasure because they are valued, makes no difference.

Either way, they cause pleasure. Your argument is that we value them even though they don't cause pleasure. You are trying to say there is something other than pleasure, yet you concede that all of your examples cause pleasure.

For your argument to work, we need to seek something that does not cause pleasure. I asked you to name a few, and you named "Knowledge, memory, and understanding. Personal and collective achievement. Honour. Other people's pleasure."

Then in your next post, you say " they cause pleasure because they're valued."

That is exactly my point. There is nothing we seek that we don't expect to derive pleasure from.

I don't think your tests validate your position. The thought of leaving their belongings to others will cause pleasure. Many expect that pleasure to be deeper or more meaningful than prostitutes, and would therefore agree with your test while still holding to my position that people will seek the greatest expected pleasure.

I would place the standard at a Matrix-quality reality machine to accept lukeprog's offer. An orgasmium would not suffice, as I expect it to fail to live up to its promise. Wireheading would not work.

Double Edit to add a piece then fix the order it got put in.

Edit again: Apologies, I confused this response with one below. Edited to remove confusion.

Comment author: ArisKatsaris 16 June 2011 04:57:32PM 5 points

"If you gave me that option I would not take it, because it would be a lie that I would receive pleasure from the end of mankind."

Consider the package deal to include getting your brain rewired so that you would receive pleasure from the end of mankind. Now do you choose the package deal?

I wouldn't. Can you explain to me why I wouldn't, if you believe the only thing I can want is pleasure?

Stop moving the goalposts.

Giving additional examples, based on the same principle, isn't "moving the goalposts".

Why, to argue against me, do you have to bring murder or death into the picture?

Because the survival of your children and the community is the foremost example of a common value that's usually placed higher than personal pleasure.

You think you want more than pleasure, but what else is there?

Knowledge, memory, and understanding. Personal and collective achievement. Honour. Other people's pleasure.

I believe if you consider any answer you might give to that question, the reason will be because those things cause pleasure.

As an automated process we receive pleasure when we get what we want, that doesn't mean that we want those things because of the pleasure. At the conscious level we self-evidently don't want them because of the pleasure, or we'd all be willing to sacrifice all of mankind if they promised to wirehead us first.

Comment author: tyrsius 17 June 2011 02:42:02AM 0 points

All of your other examples are pleasure causing. Don't you notice that?

Again, getting my brain rewired is not in the original question. I would decline getting my brain rewired; that seems like carte blanche for a lot of things that I cannot predict.

Survival of the community and children, knowledge, and understanding all bring me pleasure. I think if those things caused me pain, I would fight them. In fact, I think I have good evidence for this.

When cultures have a painful response to the survival of OTHER cultures, they go to war. When people see "enemies" in pain, they do not sympathize. Only when it is something you self-identify with, your own culture, does it cause pleasure.

Those things you cite are valued because they cause pleasure. I don't see any evidence that when those things cause pain, that they are still pursued.

@CuSithBell: I agree.

--Sorry, I don't know how to get the quote blocks, or I would respond more directly.

Comment author: loup-vaillant 16 June 2011 08:54:03AM *  0 points

Of course it would. My question is, to what extent would you mind being alone? Not feeling alone, not even believing you are alone, just being alone.

Of course, once I'm plugged in to my Personal Matrix, I would not mind any more, for I would neither feel nor believe that I am alone. But right now I do mind. Whatever the real reasons behind it, being cut off from the rest of the world just feels wrong. Basically, I believe I want Multiplayer Fun badly enough to sacrifice some Personal Fun.

Now, I probably wouldn't want to sacrifice much personal fun, so given the choice between maximum Personal Fun and my present life (no third alternative allowed), I would probably take the blue pill. Though it would really bother me if everyone else weren't given the same choice.

Now to get back on topic, I suspect Luke did want to talk about a primitive system that would turn you into an Orgasmium. Something that would even sacrifice Boredom to maximize subjective pleasure and happiness. (By the way, I suspect that the "Eternal Bliss" promised by some belief systems is just as primitive.) Such a primitive system would exactly serve his point: do you only want happiness and pleasure? Would you sacrifice everything else to get it?

Comment author: tyrsius 16 June 2011 04:07:08PM 1 point

If this is indeed Luke's intended offer, then I believe it to be a lie. Without the ability to introduce varied pleasure, an Orgasmium would fail to deliver on its promise of "maximal pleasure."

For the offer to be true, it would need to be a Personal Matrix.

Comment author: ArisKatsaris 16 June 2011 10:25:20AM 5 points

But saying we want more than pleasure? That doesn't make sense.

Where is the point of your confusion? Why do you assume people only want pleasure? If you give me a choice between living a perfectly pleasurable life for a hundred years, after which the whole of humankind dies horribly, and living an average life while the rest of humankind keeps surviving and progressing indefinitely -- I WANT THE SURVIVAL OF MANKIND.

That's because I don't want just pleasure. I want more than pleasure.

We want pleasure, we are just not always sure how to get it.

No, even with perfect and certain knowledge, we would want more than pleasure. What's the hard thing to understand about that?

We are built to want more than a particular internal state of our own minds. Most of us aren't naturally built for solipsism.

If a machine knew what would give us pleasure, and gave us pleasure instead of what we "wanted," then we would always be getting pleasure.

Like e.g. a machine that kills a man's children, but gives him pleasure by falsely telling him they are living happily ever after and erasing any memories to the contrary.

In full knowledge of this, he doesn't want that. I wouldn't want that. Few people would want that. Most of us aren't built for solipsism.

Comment author: tyrsius 16 June 2011 04:01:52PM *  1 point

You are using a quite twisted definition of pleasure to make your argument. For most of us, the end of mankind causes great displeasure. This should factor into your equation. It's also not part of Luke's original offer. If you gave me that option I would not take it, because it would be a lie that I would receive pleasure from the end of mankind.

Killing a man's children has the same problem. Why, to argue against me, do you have to bring murder or death into the picture? Luke's original question has no such downsides, and introducing them changes the equation. Stop moving the goalposts.

Luke's article clearly separates want from pleasure, but you seem attached to "wanting." You think you want more than pleasure, but what else is there?

I believe if you consider any answer you might give to that question, the reason will be because those things cause pleasure (including the thought "mankind will survive and progress"). I am interested in your answers nonetheless.

Comment author: loup-vaillant 15 June 2011 11:36:05AM *  2 points

Then you're talking Friendly AI with the prior restriction that you have to live alone. Many¹ people will still run the "I would be subjected to a machine" cached thought, will still disbelieve that a Machine™ could ever understand our so-complex-it's-holy psyche, will still believe that even if it does, it will automatically be horrible, and that the whole concept is absurd anyway.

In that case they wouldn't reject the possibility because they don't want to live alone and happy, but because they positively believe FAI is impossible. My solution in that case is just to propose that they live a guaranteed happy life, but alone. For people who still refuse to answer on the grounds of impossibility, invoking the supernatural may help.

1: I derive that "many" from one example alone, but I suspect it extends to most enlightened people who treat philosophy as closer to literature than science (wanting to read the sources, and treating questions like "was Nietzsche/Kant/Spinoza plain wrong on such a point" as ill-typed: there are no truths or fallacies, only schools of thought). Michel Onfray appears to say that's typically European.

Comment author: tyrsius 15 June 2011 10:14:16PM 6 points

This machine, if it were to give you maximal pleasure, should be able to make you feel as if you are not alone.

The only way I can see this machine actually making good on its promise is to be a Matrix-quality reality engine, but with you in the king seat.

I would take it.

Comment author: tyrsius 15 June 2011 10:05:52PM *  2 points

I feel like I am missing something. You separated pleasure from wanting.

I don't see how this backs up your point though. Unless the machine offered is a desire-fulfilling machine and not a pleasure machine.

If it is a pleasure machine, giving pleasure regardless of the state of wanting, why would we turn it down? You said we usually want more than just pleasure, because getting what we want doesn't always give us pleasure. If wanting and pleasure are different, then of course this makes sense.

But saying we want more than pleasure? That doesn't make sense. You seem to be confusing the two terms your article sets out to separate. We want pleasure, we are just not always sure how to get it. We know we have desires, so we try to fill them, and that doesn't always work. But remember, pleasure and wanting are separate.

If a machine knew what would give us pleasure, and gave us pleasure instead of what we "wanted," then we would always be getting pleasure. Even when we don't get what we want.

Unless your machine doesn't work as advertised, of course.

Comment author: tyrsius 31 May 2011 09:27:38PM *  2 points

Hi LW,

I joined this site not too long ago, but I missed this page and its request for an introduction. Better late than never, I guess.

I am 24, a Jr. Software Developer, and I live in Portland, OR. I was raised in a Baptist family, and left the Church during my junior year in high school over their stance on the Oregon Gay marriage bill. Once outside of the daily Sunday indoctrination, it took only a few short weeks to reason my way to atheism. I only wish I could have seen the truth sooner. I spent the next year or so on forums gaining a real variety of philosophical knowledge, and engaging in as many debates as I could. This made me stupid; I learned how to tear apart many arguments, and defend my own, skillfully.

I started reading Less Wrong at work, during down time, and it quickly devoured several weeks. I have always been drawn to science and rationality (though I used to have another name for it) and have found this community to be a fantastic resource. I have learned how to say oops, and to update quickly. I have learned how to see bias in my own thinking. I have started to learn (though still fail to grasp intuitively) Bayesian probability. This community has had a significant impact on me.

PS: How do you pronounce Eliezer?

Edit: Spelling

Comment author: orthonormal 31 May 2011 03:52:11PM *  1 point

The practical point is that, if not all knowledge reduces to mathematical patterns of physical objects (the sort of thing that we can organize and learn from textbooks), then the actual project of reductionists becomes futile at a really early stage: we'd have to give up on fully understanding even a worm brain, since we could never have knowledge of its worm-qualia.

I want to respond to your claim more thoroughly, but my response essentially consists of the second and third posts here. If you want to pick up this conversation on those threads, I'm all for it.

Also, welcome to Less Wrong!

Comment author: tyrsius 31 May 2011 09:04:03PM 1 point

Your later posts do a better job of describing your position here. I don't think we disagree.

Comment author: Peterdjones 31 May 2011 12:26:34PM 0 points

The knowledge Mary has is all physical knowledge, where physical knowledge means the kind of thing that can be found in books. You deem the further, experiential knowledge she gains to be physical because sensory processing is physical, but that is a different sense of "physical". If you think she learns something on exiting the room, and it seems you do, then you are conceding part of the claim, the part about the incompleteness of physical explanation, even if you insist that the epistemic problem doesn't lead to a dualistic metaphysics.

Comment author: tyrsius 31 May 2011 02:51:45PM 2 points

Only insofar as the definition of physical is limited to things you can find in books. I wholly reject such a definition.

@Orthonormal: The conclusion seems to me to come very naturally from the thought experiment, if you allow for its assumptions. But that is what I think is silly: its assumptions. The thought experiment tries to define "all knowledge" in two different and contradictory ways.

If Mary has all knowledge, then there is nothing left for her to learn about red. If upon seeing red she learns something new, then she did not have all knowledge prior to seeing red.

It is their definition of knowledge, which is inconsistent, that leads to the entire thought experiment being silly.
