
DaFranker comments on The noncentral fallacy - the worst argument in the world?

Post author: Yvain, 27 August 2012 03:36AM




Comment author: DaFranker 20 September 2012 04:36:52PM 0 points

> I feel like asking this question is wrong, but I want the information:
>
> If I know that letting you have freedom will be hurtful (say, I tell you you're going to get run over by a train, you tell me you won't, but I know that you're in denial-of-denial and subconsciously seeking to walk on the train tracks, and my only way to prevent your death is to manacle you in a dungeon for a few days), would you still consider that freedom terminally important? More important than the hurt? Which other values can be traded off? Would it be possible to figure out an exchange rate with enough analysis and experiment?

> Yes. If this were many years ago and I weren't so conversant with the massive differences between the ways different humans see the world, I'd be very confused that you even had to ask that question.

Regarding this, what if I told you: "Earth was a giant prison all along. We just didn't know. Also, no one built the prison, and no one is actively working to keep us in here; there never was a jailor in the first place, we were just born inside the prison cell. We're incapable of taking off the manacles on our own, since we're already manacled." In fact, I do tell you this: it's pretty much true that we've been prisoners of many, many things.

Is your freedom node only triggered at the start of imprisonment, the taking away of a freedom once had? What if someone is born in the prison Raemon proposes? Is it still inherently wrong? Is it inherently wrong that we are stuck on Earth? If not, would it become inherently wrong if you knew that someone was deliberately keeping us here by actively preventing us from learning how to escape Earth?

The key point being: what, exactly, triggers your "Freedom" light? The causal action that removes a freedom? The intentions behind the constraints?

It seems logical to me to assume that if you have freedom as a terminal value, then being able to do anything, anywhere, be anything, anyhow, anywhen, and control time and space and the whole universe at will better than any god, without any possible restrictions or limitations of any kind, should be the Ultimately Most Supremely Good maximal possible utility optimization, and that therefore reality and physics would be your worst possible Enemy, seeing as how it is currently the strongest Jailer that restricts and constrains you the most. I'm quite aware that this is hyperbole and most likely a strawman, but it is, to me, the only plausible prediction for a terminal value of being free.

Comment author: thomblake 20 September 2012 05:18:49PM 5 points

> reality and physics would be your worst possible Enemy, seeing as how it is currently the strongest Jailer that restricts and constrains you the most.

This should answer most of the questions above. Yes, the universe is terrible. It would be much better if the universe were optimized for my freedom.

> Which other values can be traded off?

All values are fungible. The exchange rates are not easily inspected, and thought experiments are probably no good for figuring them out.
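To make "exchange rate" concrete: under a toy additive utility model (an assumption made purely for illustration, not anything thomblake commits to), the rate at which two values trade is just the ratio of their weights. A minimal sketch:

```python
# A minimal sketch, assuming (purely for illustration) an additive utility
# model in which values are fungible; the value names and weights are made up.

def utility(values: dict[str, float], weights: dict[str, float]) -> float:
    """Total utility as a weighted sum over terminal values."""
    return sum(weights[name] * amount for name, amount in values.items())

def exchange_rate(weights: dict[str, float], give: str, get: str) -> float:
    """Units of `get` that compensate for losing one unit of `give`."""
    return weights[give] / weights[get]

w = {"freedom": 3.0, "comfort": 1.0}           # hypothetical weights
values = {"freedom": 2.0, "comfort": 5.0}      # hypothetical current amounts

print(utility(values, w))                      # 3.0*2 + 1.0*5 = 11.0
print(exchange_rate(w, "freedom", "comfort"))  # 3.0 units of comfort per unit of freedom
```

Under this toy model the rate is trivially inspectable; the point above is that real human values come with no such readable weights.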

Comment author: DaFranker 20 September 2012 07:14:03PM 1 point

You're right, this does answer most of my questions. I had made incorrect assumptions about what you would consider optimal.

After updating on this, it now appears much more likely to me that the terminal valuation of your freedom node is triggered by genuinely rational algorithms, ones that really do attempt to detect restrictions and constraints, rather than by a mere feeling of control. Is this closer to how you would describe your value?

I'm still having trouble with the idea of a universe optimized for one's own personal freedom being the best possible thing (by default I tend to think about optimizing for the collective summed utilities of sets of minds, rather than for a single one). It is not what I expected.

Comment author: [deleted] 20 September 2012 08:25:49PM 2 points

“freedom as a terminal value” != “freedom as the only terminal value”

Comment author: DaFranker 20 September 2012 08:35:40PM 1 point

True, and I don't quite see where I implied this. If you're referring to the optimal-universe question: it seems quite trivial that if the universe literally acts according to your every will, with no restrictions whatsoever, then any other terminal values will instantly be fulfilled to their absolute maximal states (including unbounded values that can increase to infinity), along with adjustment of their referents (if that's even relevant anymore).

No compromise is needed, since you're free from the laws of logic and physics and whatever else might prevent you from tiling the entire universe with paperclips AND tiling the entire universe with giant copies of Eliezer's mind.

So if that sort of freedom is a terminal value, this counterfactual universe trivially becomes the optimal target, since it's basically whatever you would find to be your optimal universe regardless of any restrictions.
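The "no compromise is needed" point can be sketched formally, under the illustrative assumption that "free from the laws of logic and physics" just means dropping the feasibility constraint: once constraints are gone, every value can sit at its own maximum simultaneously, so the unconstrained world dominates every constrained trade-off no matter how the values are weighted. A toy sketch, with made-up value names, bounds, and weights:

```python
# Toy sketch of "no compromise is needed": with a feasibility constraint,
# values trade off against each other; without it, each value can sit at its
# own maximum at once. Names, bounds, and weights are illustrative assumptions.

from itertools import product

levels = [0.0, 0.5, 1.0]                         # coarse grid of attainment
w = {"paperclips": 1.0, "eliezer_copies": 1.0}   # hypothetical equal weights

def feasible(p: float, e: float) -> bool:
    """Stand-in for physics: the two projects share limited resources."""
    return p + e <= 1.0

def u(p: float, e: float) -> float:
    return w["paperclips"] * p + w["eliezer_copies"] * e

best_constrained = max(
    (pair for pair in product(levels, repeat=2) if feasible(*pair)),
    key=lambda pair: u(*pair),
)
best_unconstrained = (max(levels), max(levels))  # constraint dropped: both maximal

print(best_constrained, u(*best_constrained))      # a trade-off point, utility 1.0
print(best_unconstrained, u(*best_unconstrained))  # (1.0, 1.0), utility 2.0
```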