"If you perform experiments to determine the physical laws of our universe, you will learn how to make powerful weapons."
It's all about incentives.
Thirty years from now, a well-meaning team of scientists in a basement creates a superintelligent AI with a carefully hand-coded utility function. Two days later, every human being on earth is seamlessly scanned, uploaded, and placed into a realistic simulation of their old life, such that no one is aware that anything has changed. Further, the AI has so much memory and processing power to spare that it gives every single living human being their own separate simulation.
Each person lives an extremely long and happy life in their simulation, making what they perceive to be meaningful accomplishments. For those who are interested in acquiring scientific knowledge and learning the nature of the universe, the simulation is accurate enough that everything they learn and discover is true of the real world. Every other pursuit, occupation, and pastime is equally fulfilling. People create great art, find love that lasts for centuries, and create worlds without want. Every single human being lives a genuinely excellent life, awesome in every way. (Unless you mind being simulated, in which case at least you'll never know.)
I offer this particular scenario because it seems conceivable that, with no possible competition between people, the AI could avoid doing interpersonal utility comparisons entirely, which might make Mostly Friendly AI (MFAI) easier. I don't think this is likely, or even worthy of serious consideration, but it might make some of the discussion questions easier to swallow.
1. Value is fragile. But is Eliezer right in thinking that if we get just one piece wrong the whole endeavor is worthless? (Edit: Thanks to Lukeprog for pointing out that this question completely misrepresents EY's position. Error deliberately preserved for educational purposes.)
2. Is the above scenario better or worse than the destruction of all earth-originating intelligence? (This is the same as question 1.)
3. Are there other values (besides affecting-the-real-world) that you would be willing to trade off?
4. Are there other values that, if we traded them off, might make MFAI much easier?
5. If the answers to 3 and 4 overlap, how do we decide which direction to pursue?
"If you perform experiments to determine the physical laws of our universe, you will learn how to make powerful weapons."
It's all about incentives.
Humans need fantasy to be human.
"Tooth fairies? Hogfathers? Little—"
Yes. As practice. You have to start out learning to believe the little lies.
"So we can believe the big ones?"
Yes. Justice. Mercy. Duty. That sort of thing.
"They're not the same at all!"
You think so? Then take the universe and grind it down to the finest powder and sieve it through the finest sieve and then show me one atom of justice, one molecule of mercy.
- Susan and Death, in Hogfather by Terry Pratchett
So far we've talked about two kinds of meaningfulness and two ways that sentences can refer: physical reference, by comparison to physical things found by following pinned-down causal links, and logical reference, by comparison to models pinned down by axioms. Is there anything else that can be meaningfully talked about? Where would you find justice, or mercy?
In people's brains, and in papers written by philosophy students.
Could you elaborate, please?
Sorry for the very belated reply, but I was struggling to find the words to describe exactly what I meant. Luckily, Eliezer has already done most of it for me in his latest post.
Thing A exists with respect to Thing B iff Thing A and Thing B are both part of the same causal network. So ArisKatsaris was half-right, but things outside our past and future light cones can be said to exist with respect to us if they have a causal relationship with anything that is inside our past and future light cones.
Other: Existence is a two-place function, not one-place.
EDIT: OK, on reflection I'm less confident in all this. Feel free to read my original comment below.
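Read that way, the definition above treats existence as a relation over a causal network: Thing A exists with respect to Thing B iff some chain of causal links connects them, i.e. they sit in the same connected component. Here's a minimal toy sketch of that reading in Python; the names `CausalNetwork` and `exists_with_respect_to` are made up for illustration, not from the original comment or any existing library.

```python
# Toy model of "exists with respect to" as connectivity in a causal network.
# A causal link in either direction puts both events in the same network,
# so links are stored symmetrically and checked with a breadth-first search.

from collections import defaultdict, deque

class CausalNetwork:
    def __init__(self):
        self.links = defaultdict(set)

    def add_link(self, a, b):
        # Store the link symmetrically: causal influence either way
        # places both events in the same network.
        self.links[a].add(b)
        self.links[b].add(a)

    def exists_with_respect_to(self, a, b):
        # A exists with respect to B iff a chain of causal links
        # connects them (they share a connected component).
        if a == b:
            return True
        seen, frontier = {a}, deque([a])
        while frontier:
            node = frontier.popleft()
            for neighbor in self.links[node]:
                if neighbor == b:
                    return True
                if neighbor not in seen:
                    seen.add(neighbor)
                    frontier.append(neighbor)
        return False

# An event outside our light cones still "exists with respect to us"
# if it causally touches something inside them.
net = CausalNetwork()
net.add_link("us", "distant supernova")
net.add_link("distant supernova", "event outside our light cones")
print(net.exists_with_respect_to("us", "event outside our light cones"))  # True
```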
I have a theory that a high male-to-female ratio actually triggers creepy behavior in men. Why?
Creepy behavior has an evolutionary purpose, just like all human behavior. The optimal mating strategy changes depending on my tribe's gender ratio. As nasty as it sounds, from the perspective of my genes it may make sense to try to have sex by force, if it's not going to happen any other way.
I suspect evolution has programmed men to be more bitter, resentful, and belligerent when they find themselves somewhere with few women around. Hence the sexual assault problems in the military, the societal unrest in countries with surplus young males, etc.
In other words, maybe it's not that individuals are creepy so much as men "naturally" act more rapey if there are only a few women around. Of course, we're all adults and we can suppress unwanted internal drives, but it may also be a good idea to attack the root problem.
So in light of this, some possible solutions for male creepiness:
* When men feel desperate, they act creepy. That doesn't necessarily mean we should treat these men like bad people. Yes, these are antisocial behaviors. But they're a manifestation of internal suffering. So, try to feel compassion and respect for people who are suffering, in addition to letting them know that their behavior is antisocial.
* If you're a man and you notice yourself acting creepy, one idea is to try to get interested in something that's got a decent number of women involved with it. (Possible examples: acting, dancing, book clubs. Maybe other commenters have more ideas?) Hopefully, this will program your subconscious to believe you're no longer in a desperate situation. In the best case, maybe you'll find a girlfriend.
Creepy behavior has an evolutionary purpose, just like all human behavior.
Humans are adaptation-executors, not fitness-maximizers. Evolution may have crafted me into a person who wants to sit at home alone all day and play video games, but sitting at home alone all day and playing video games doesn't offer me a fitness advantage.
(I don't actually want to sit at home alone all day and play video games. At least, not every day.)
I work in video games, so my experience isn't at all typical of programming more generally. The big issues are that:
Many of these issues are specific to the games industry and my employer particularly, and shouldn't be considered representative of programming in general. Quality of life in the industry varies widely.
To clarify, the linked post by Eliezer actually says the following:
Thank you for pointing this out; I've apparently lost the ability to read. Post edited.