
Elo comments on Open Thread, May 25 - May 31, 2015 - Less Wrong Discussion

3 Post author: Gondolinian 25 May 2015 12:00AM




Comment author: Elo 25 May 2015 10:15:16AM 1 point [-]

A few rebuttals. A race to the bottom only works in a universe where there is a reason to keep getting lower. In a petri dish, if you don't replicate fast enough you die; that's a strong selective pressure. In human-world reality there is no equivalent pressure. (I also disagree with rat-island for this reason.) Where there is no benefit to getting lower, it won't happen. Evolution seems to proceed by two main pressures: slow selection of the fittest, and sudden selection by major environmental factors (with a range of both in between). For the sake of argument: the cutest of humans procreate, and the lesser ones less so; but no one survives the next meteor strike, only the cockroaches, which then have to evolve under slow pressures until the next big pressure.

As for a distinct mind: we (as humanity) would only go down the path of a non-distinct mind if we wanted to. It may seem like a bad thing from our perspective now, but it's a bit of a strawman to argue that it will certainly be something we do not want when the time comes that it is possible to do so. I am not concerned about such far-away situations that are framed as problems.

Was there any particular point that you would like refuted?

Comment author: Eitan_Zohar 25 May 2015 10:53:50AM *  2 points [-]

It may seem like a bad thing from our perspective now, but it's a bit of a strawman to argue that it will certainly be something we do not want when the time comes that it is possible to do so.

This is absolutely what I am afraid of. Values themselves will be selected for, and I don't want my values to be ground entirely to dust. Who's to say that I will want to exist under a different value system, even as part of some larger consciousness? What if consciousness is a waste of resources?

Comment author: Elo 28 May 2015 12:48:03AM 0 points [-]

Every day we wake up a slightly different version of the consciousness that went to sleep. In this way the entirety of our conscious existence is undergoing small changes. Each day we wouldn't even think to ask ourselves if we are the same person as yesterday, but if we could isolate the me of today and talk to the me of 10 years ago, we would notice the difference clearly.

It is a fact of life that we take changes day by day. If that's where we end up, I don't think the you of today has anything to complain about, because the you of every day in between gradually made the choices to end up there.

The you of today should contend with the you of every single day between now and the state that you dislike (lack of consciousness, or whatever) before being able to hold a complaint about it.

Comment author: Eitan_Zohar 28 May 2015 10:18:04AM *  1 point [-]

So? I don't think you're really getting my point here. If consciousness is fluid or imperfect, it doesn't mean that it is worthless.

Comment author: Elo 28 May 2015 11:18:26AM 0 points [-]

Yes, I don't think I was getting your point.

Also, I am not sure that you were getting my point. If in the future the choice to do away with consciousness is made, it will be made by future entities with much more information and clearer reasons for doing so. Without that future information and reasoning at our disposal, we can't really criticize the decision. I can confidently say that my consciousness (based on what I know) does not want to be gotten rid of right now. If overwhelmingly convincing reasons come along to change my mind, then I will make that decision at that time with the best information available.

My point was that the decision-making process is up to the future self and is dependent on future information. The future self will not be making worse decisions. It will not make decisions that do not benefit itself (based on a version of your values right now that is slightly different).

Does that make sense? Or should I try to explain it again?

Comment author: Eitan_Zohar 28 May 2015 11:52:12AM *  0 points [-]

You're definitely missing the point of the whole thing. Suppose that the optimal design for gaining knowledge is something like this (a vast supercomputer without the slightest bit of awareness or emotion).

I think it is very unlikely; even in the worst-case scenarios, I can't imagine that a superintelligence wouldn't inherit some sort of values.

Comment author: Elo 28 May 2015 09:03:04PM -1 points [-]

I don't see a problem with that being the eventual case. It would be the death of the state of the world as we know it, yes, but also the existence of a new entity. That's the way the cookie crumbles.

Comment author: RichardKennaway 25 May 2015 08:53:24PM 0 points [-]

Who's to say that I will want to exist under a different value system, even as a part of some larger consciousness?

Are you expecting these things to happen within your lifetime?

Comment author: Eitan_Zohar 26 May 2015 03:24:11AM 0 points [-]

Probably not within my own natural lifetime, no.