Comment author: lukeprog 24 February 2013 02:27:58AM *  7 points

is Eliezer right in thinking that if we get just one piece wrong the whole endeavor is worthless?

To clarify, the linked post by Eliezer actually says the following:

Value isn't just complicated, it's fragile. There is more than one dimension of human value, where if just that one thing is lost, the Future becomes null. A single blow and all value shatters. Not every single blow will shatter all value - but more than one possible "single blow" will do so.

Comment author: WrongBot 24 February 2013 02:42:06AM 3 points

Thank you for pointing this out; I've apparently lost the ability to read. Post edited.

Discussion: Which futures are good enough?

5 WrongBot 24 February 2013 12:06AM
Thirty years from now, a well-meaning team of scientists in a basement creates a superintelligent AI with a carefully hand-coded utility function. Two days later, every human being on earth is seamlessly scanned, uploaded and placed into a realistic simulation of their old life, such that no one is aware that anything has changed. Further, the AI had so much memory and processing power to spare that it gave every single living human being their own separate simulation.

Each person lives an extremely long and happy life in their simulation, making what they perceive to be meaningful accomplishments. For those who are interested in acquiring scientific knowledge and learning the nature of the universe, the simulation is accurate enough that everything they learn and discover is true of the real world. Every other pursuit, occupation, and pastime is equally fulfilling. People create great art, find love that lasts for centuries, and create worlds without want. Every single human being lives a genuinely excellent life, awesome in every way. (Unless you mind being simulated, in which case at least you'll never know.)

I offer this particular scenario because it seems conceivable that, with no competition between people, interpersonal utility comparisons could be avoided entirely, which could make Mostly Friendly AI (MFAI) easier. I don't think this is likely or even worthy of serious consideration, but it might make some of the discussion questions easier to swallow.


1. Value is fragile. But is Eliezer right in thinking that if we get just one piece wrong the whole endeavor is worthless? (Edit: Thanks to Lukeprog for pointing out that this question completely misrepresents EY's position. Error deliberately preserved for educational purposes.)

2. Is the above scenario better or worse than the destruction of all earth-originating intelligence? (This is the same as question 1.)

3. Are there other values (besides affecting-the-real-world) that you would be willing to trade off?

4. Are there other values that, if we traded them off, might make MFAI much easier?

5. If the answers to 3 and 4 overlap, how do we decide which direction to pursue?

In response to Just One Sentence
Comment author: WrongBot 05 January 2013 02:11:56AM 36 points

"If you perform experiments to determine the physical laws of our universe, you will learn how to make powerful weapons."

It's all about incentives.

Comment author: WrongBot 04 December 2012 10:12:38PM -2 points

The Keeley book you linked has been discredited. See here, e.g.

Comment author: WrongBot 03 November 2012 11:41:54PM 32 points

Took it and laughed several times.

In response to Logical Pinpointing
Comment author: Eliezer_Yudkowsky 25 October 2012 03:11:59AM 3 points

Meditation:

Humans need fantasy to be human.

"Tooth fairies? Hogfathers? Little—"

Yes. As practice. You have to start out learning to believe the little lies.

"So we can believe the big ones?"

Yes. Justice. Mercy. Duty. That sort of thing.

"They're not the same at all!"

You think so? Then take the universe and grind it down to the finest powder and sieve it through the finest sieve and then show me one atom of justice, one molecule of mercy.

  • Susan and Death, in Hogfather by Terry Pratchett

So far we've talked about two kinds of meaningfulness and two ways that sentences can refer: comparison to physical things found by following pinned-down causal links, and logical reference by comparison to models pinned down by axioms. Is there anything else that can be meaningfully talked about? Where would you find justice, or mercy?

Comment author: WrongBot 01 November 2012 09:17:08PM 0 points

In people's brains, and in papers written by philosophy students.

Comment author: pragmatist 26 September 2012 08:08:30PM 1 point

Could you elaborate, please?

Comment author: WrongBot 18 October 2012 06:53:08AM 0 points

Sorry for the very belated reply, but I was struggling to find the words to describe exactly what I meant. Luckily, Eliezer has already done most of it for me in his latest post.

Thing A exists with respect to Thing B iff Thing A and Thing B are both part of the same causal network. So ArisKatsaris was half-right, but things outside our past and future light cones can be said to exist with respect to us if they have a causal relationship with anything that is inside our past and future light cones.

Comment author: Jayson_Virissimo 26 September 2012 02:00:16PM 2 points

Abstract objects: nominalism or Platonism?


Comment author: WrongBot 26 September 2012 07:59:07PM 0 points

Other: Existence is a two-valued function, not one-valued.

Comment author: hg00 07 September 2012 10:15:03PM *  3 points

EDIT: OK, on reflection I'm less confident in all this. Feel free to read my original comment below.


I have a theory that a high male-to-female ratio actually triggers creepy behavior in men. Why?

Creepy behavior has an evolutionary purpose, just like all human behavior. The optimal mating strategy changes depending on my tribe's gender ratio. As nasty as it sounds, from the perspective of my genes it may make sense to try to have sex by force, if it's not going to happen any other way.

I suspect evolution has programmed men to be more bitter, resentful, and belligerent if they seem to be in an area where there aren't many women. Hence you get sexual assault problems in the military, various forms of societal unrest in countries with a surplus of young males, etc.

In other words, maybe it's not that individuals are creepy so much as that men "naturally" act more rapey if there are only a few women around. Of course, we're all adults and we can suppress unwanted internal drives, but it may also be a good idea to attack the root problem.

So in light of this, some possible solutions for male creepiness:
* When men feel desperate, they act creepy. That doesn't necessarily mean we should treat these men like bad people. Yes, these are antisocial behaviors. But they're a manifestation of internal suffering. So, try to feel compassion and respect for people who are suffering, in addition to letting them know that their behavior is antisocial.
* If you're a man and you notice yourself acting creepy, one idea is to try to get interested in something that's got a decent number of women involved with it. (Possible examples: acting, dancing, book clubs. Maybe other commenters have more ideas?) Hopefully, this will program your subconscious to believe you're no longer in a desperate situation. In the best case, maybe you'll find a girlfriend.

Comment author: WrongBot 07 September 2012 10:38:52PM 6 points

Creepy behavior has an evolutionary purpose, just like all human behavior.

Humans are adaptation-executors, not fitness-maximizers. Evolution may have crafted me into a person who wants to sit at home alone all day and play video games, but sitting at home alone all day and playing video games doesn't offer me a fitness advantage.

(I don't actually want to sit at home alone all day and play video games. At least, not every day.)

Comment author: komponisto 05 June 2012 03:41:29AM *  8 points

Can you explain in more detail? I'm interested in learning about the downsides of programming jobs (which have been strongly promoted around here).

Comment author: WrongBot 05 June 2012 11:06:02PM 1 point

I work in video games, so my experience isn't at all typical of programming more generally. The big issues are:

  • Development priorities and design are driven by marketing.
  • Lots of time is spent doing throwaway work for particular demos. I (and many others) wasted a couple weeks hacking together a scripted demo for E3 that will never be seen again.
  • The design for my portion of the project has changed directions numerous times, and each new version of the feature has been implemented in a rush, so we still have bits of code from five iterations ago hanging around, causing bugs.
  • Willingness to work extremely long hours (70+/week) is a badge of pride. I'm largely exempt from this because I'm a contractor and paid hourly, but my salaried coworkers frequently complain about not seeing enough of their families. On the other hand, some of them seem grateful to have an excuse to get away from their families.
  • The downside of being a contractor is that I don't get benefits like health insurance, sick days, paid time off, etc.

Many of these issues are specific to the games industry and my employer particularly, and shouldn't be considered representative of programming in general. Quality of life in the industry varies widely.
