Comment author: player_03 08 April 2013 06:25:20AM *  16 points [-]

When I was a Christian, and when I began this intense period of study which eventually led to my atheism, my goal, my one and only goal, was to find the best evidence and argument I could find that would lead people to the truth of Jesus Christ. That was a huge mistake. As a skeptic now, my goal is very similar - it just stops short. My goal is to find the best evidence and argument, period. Not the best evidence and argument that leads to a preconceived conclusion. The best evidence and argument, period, and go wherever the evidence leads.

--Matt Dillahunty

Comment author: MichaelHoward 18 February 2009 12:06:00AM 5 points [-]

Sorry I'm late... is no one curious, given the age of the universe, why the 3 races are so close technologically? (Great filter? Super-advanced races have a prime directive? Simulated experiment? Novas only happen if 3 connecting stars recently had their first gates be gates out? ...)

Eliezer, if you're reading this: amazing story. I'm worried, though, about your responses to so many commenters (generally smarter and more rational than most humans) whose preferences and values differ widely from what you see as right in this story and the fun sequence. I'm not saying your values are wrong; I'm saying you seem to have very optimistic models/estimates of where many human value systems go when fed lots of knowledge/rationality/good arguments. If so, I hope CEV doesn't depend on it.

If your model is causing you to be constantly surprised, then...

Comment author: player_03 22 February 2013 11:45:55PM 2 points [-]

Sorry I'm late... is no one curious, given the age of the universe, why the 3 races are so close technologically?

Sorry I'm late in replying to this, but I'd guess the answer is that this is "the past's future." He would not have been able to tell this story with one species being that advanced, so he postulated a universe in which such a species doesn't exist (or at least isn't nearby).

Your in-universe explanations work as well, of course.

Comment author: Hul-Gil 30 March 2012 04:59:07AM 9 points [-]

Agreed. I was very surprised that Mr. Yudkowsky went with the very ending I, myself, thought would be the "traditional" and irrational ending - where suffering and death are allowed to go on, and even caused, because... um... because humans are special, and pain is good because it's part of our identity!

Yes, and the appendix is useful because it's part of our body.

Comment author: player_03 22 February 2013 11:33:18PM 0 points [-]

Perhaps the fact that it's the "traditional and irrational" ending is the reason Eliezer went with it as the "real" one. (Note that he didn't actually label them as "good" and "bad" endings.)

Comment author: Anja 03 November 2012 11:45:03PM 37 points [-]

Took the survey. Does the god question include simulators? I answered under the assumption that it did not.

Comment author: player_03 04 November 2012 12:38:32AM 9 points [-]

I assumed the same, based on the definition of "god" as "supernatural" and the definition of "supernatural" as "involving ontologically basic mental entities."

(Oh, and for anyone who hasn't read the relevant post, the survey is quoting this.)

Comment author: NancyLebovitz 03 March 2012 01:01:43PM 7 points [-]

This has 6 karma points, so I'm left curious about whether people have anything in mind about what real intellectuals shouldn't know.

Comment author: player_03 04 March 2012 12:46:17AM *  2 points [-]

I could be interpreting it entirely wrong, but I'd guess this is the list Cochran had in mind:

Comment author: Nominull 01 November 2011 09:00:23PM 2 points [-]

As always when we hear the word "worse", we need to ask ourselves, "worse on what metric?"

Comment author: player_03 01 November 2011 11:36:02PM 5 points [-]

This reminds me of Lojban, in which the constructs meaning "good" and "bad" encourage you to specify a metric. It is still possible to say that something is "worse" without providing any detail, but I suspect most Lojban speakers would remember to provide detail if there was a chance of confusion.

Comment author: Eliezer_Yudkowsky 06 February 2009 07:27:02PM 13 points [-]

Nominull, neither Akon, the Lord Programmer, nor the Xenopsychologist seems to be appearing in this section.

Billy Brown:

Give the furries, vampire-lovers and other assorted xenophiles a few generations to chase their dreams, and you're going to start seeing groups with distinctly non-human psychology.

WHY HAVEN'T I READ THIS STORY?

Comment author: player_03 28 October 2011 05:10:56AM *  1 point [-]

Give the furries, vampire-lovers and other assorted xenophiles a few generations to chase their dreams, and you're going to start seeing groups with distinctly non-human psychology.

WHY HAVEN'T I READ THIS STORY?

Because you haven't had time to read all the Orion's Arm stories, probably. (Details)

Comment author: Tiiba2 06 February 2009 02:58:52PM 8 points [-]

1) Who the hell is Master of Fandom? A guy who maintains the climate control system, or the crew's pet Gundam nerd?

2) Do you really think the aliens' deal is so horrifying? Or are you just overdramatizing?

Comment author: player_03 28 October 2011 04:50:10AM *  9 points [-]

2) Honestly, I would have been happy with the aliens' deal (even before it was implemented), and I think there is a ~60% chance that Eliezer agrees.

I'm of the opinion that pain is a bad thing, except insofar as it prevents you from damaging yourself. People argue that pain is necessary to provide contrast to happiness, and that pleasure wouldn't be meaningful without pain, but I would say that boredom and slight discomfort provide more than enough contrast.

However, this future society disagrees. The idea that "pain is important" is ingrained in these people's minds, in much the same way that "rape is bad" is ingrained in ours. I think one of the main points Eliezer is trying to make is that we would disagree with future humans almost as much as we would disagree with the baby-eaters or superhappies.

(Edit 1.5 years later: I was exaggerating in that second paragraph. I suspect I was trying too hard to sound insightful. The claims may or may not have merit, but I would no longer word them as forcefully.)

Comment author: NancyLebovitz 04 August 2011 03:13:33PM 1 point [-]

Following the link might help, but I believe the general idea is that, if you're trying to present information in a graphic, you should sort out what is important about it and what presentation will make it clear.

Comment author: player_03 05 August 2011 12:57:56AM *  1 point [-]

If you're trying to present any kind of information at all, you should figure out what is important about it and what presentation will make it clear.

Unfortunately, the quote above isn't at all clear, even in context. I suspect this is because Jacques Bertin isn't as good at expressing himself in English as in French, but even so I'm unable to understand the sample data he presents or how it relates to the point he was trying to make.

Comment author: ata 07 May 2010 11:31:56PM *  2 points [-]

That's going to be my new quick argument for transhumanism. "Listen to this depressing European synthpop! Do you really want the future to be like that??"

(Incidentally, a recent comment on the video states: "Ronan explained at the San Antonio show and San Diego show that this song was about living forever. That living forever was more of a torment than a gift." Sounded like the opposite to me — a song about how much human extinction would suck. But no, everything's gotta be about how the purposeless evolutionary status quo is coincidentally exactly what we should want...)

Comment author: player_03 04 August 2011 10:06:56PM *  0 points [-]

I agree with your interpretation of the song, and to back it up, here's the chorus of "The Farthest Star" (another song by the same band).

We possess the power, if this should start to fall apart,
To mend divides, to change the world, to reach the farthest star.
If we should stay silent, if fear should win our hearts,
Our light will have long diminished before it reaches the farthest star.

This time, the message is pretty clear: we should aspire to overcome both our differences and our limitations, to avoid extinction and to expand throughout the universe. I suppose it doesn't say anything about immortality, but otherwise it seems to match transhumanist philosophy.
