
It's premature optimization; we won't reach heaven. Anyway, do you test those ideas in practice? Theoretical falsifiability isn't enough.

Eliezer is attacking human augmentation for the same reason he attacked the subsumption architecture: to rationalize his own work on from-scratch AI. I have yet to see quantifiable arguments for why from-scratch AI is easier.

Richard Hollerith, thanks for your interest, but you'll be disappointed: I have no religion to offer. The highlights of every person's ethical system depend on the specific wrongs they have perceived in life. My own life has taught me to bear fruit into tomorrow, but also to never manipulate others with normative/religious cheap talk.

Also, Occam's Razor can only apply to terminal beliefs that are held more weakly than the razor itself. Fortunately, most people's values aren't that weak, even if yours are. :-)

Anytime! If you want exploration, you'll see the next frontier of escape after the Singularity. If you want family life, artistic achievement or wireheading, you can have it now.

You're all wrong. We can't run out of real-world goals. When we find ourselves boxed in, the next frontier will be to get out, ad infinitum. Is there a logical mistake in my reasoning?

V.G., see my exchange with Eliezer about this in November: http://lesswrong.com/lw/vg/building_something_smarter/ , search for "religion". I believe he has registered our opinion. Maybe it will prompt an overflow at some point, maybe not.

The discussion reminds me of Master of Orion. Anyone remember that game? I usually played as the Psilons, a research-focused race, and by the endgame my research tree was maxed out. There was nothing more to do with all those ultra-terraformed planets allocated to 100% research. The opponents were still sitting around, but I could wipe out the whole galaxy with a single ship at any moment. Wait for them to catch up a little, stage some nice space battles... close the game window at some point. What if our universe is like that?

V.G., good theory, but I think it's ethnic rather than religious: Ayn Rand fell prey to the same failure mode with an agnostic upbringing. Anyway, this is a kind of ad hominem known as Bulverism ("ah, I know why you'd say that"), not a substantive critique of Eliezer's views.

Substantively: Eliezer, I've seen indications that you want to change the utility function that guides your everyday actions (the "self-help" post). If you had the power to instantly and effortlessly modify your utility function, what kind of Eliezer would you converge to? (Remember that each change is evaluated under the utility function that results from the previous change.) I believe (but can't prove) that you would either self-destruct or evolve into a creature the current you would hate. This is a condensed version of the FAI problem, without the AI part :-)
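For intuition, here is a toy sketch of what I mean (entirely my own construction, with made-up outcome names and numbers, not anything from Eliezer's posts): represent a utility function as a weight vector over a few outcomes, and at each step pick the candidate edit that the post-edit self would rate highest. Because every change is judged by the resulting utility function rather than the current one, the process has no fixed anchor, and the self-reinforcing weight just keeps growing.

```python
# Toy sketch (purely illustrative): iterated self-modification of a utility
# function, where each candidate change is scored by the utility function
# that would result AFTER the change, not by the current one.
# All outcome names and numbers are invented for the example.

import random

OUTCOMES = ["research", "art", "family", "wireheading"]


def candidate_modifications(weights, step=0.2):
    """Small perturbations of the current weights (the 'self-edits' on offer)."""
    for outcome in OUTCOMES:
        for delta in (-step, +step):
            new = dict(weights)
            new[outcome] = max(0.0, new[outcome] + delta)
            yield new


def pick_modification(weights):
    """Choose the edit that the post-edit self would rate highest.

    This is the twist: the change is judged by the resultant utility
    function, so there is no fixed anchor and the dynamics can drift
    toward whatever weights are self-reinforcing.
    """
    best = weights
    best_score = max(weights.values())
    for new in candidate_modifications(weights):
        score = max(new.values())  # judged by the new self, not the old one
        if score > best_score:
            best, best_score = new, score
    return best


def run(steps=50):
    weights = {o: random.random() for o in OUTCOMES}
    for _ in range(steps):
        weights = pick_modification(weights)
    return weights


if __name__ == "__main__":
    random.seed(0)
    # The initially largest weight grows by one step per iteration:
    # the post-edit self always approves of more of itself.
    print(run())
```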

Daniel, I knew it :-)

Phil, you can look at it another way: the commonality is that to win you have to make yourself believe a demonstrably false statement.

Immediate association: pick-up artists know well that when a girl rejects you, she often doesn't know the true reason and has to deceive herself. You could recruit some rationalists among PUAs. They wholeheartedly share your sentiment that "rational agents must WIN", and have accumulated many cynical but useful insights about human mating behaviour.

I have a saying/hypothesis that a human trying to write code is like someone without a visual cortex trying to paint a picture - we can do it eventually, but we have to go pixel by pixel because we lack a sensory modality for that medium; it's not our native environment.

Eliezer, this sounds wrong to me. Acquired skills matter more than having a sensory modality. Computers are quite good at painting; see, e.g., the game Crysis. Painting with a brush isn't much easier than painting pixel by pixel, and it's not a natural skill. Neither is the artist's eye for colour and shape, or the analytical ear for music (do you know the harmonies of your favourite tunes?). You can instantly like or dislike a computer program, the same as a painting or a piece of music: the inscrutable inner workings get revealed in the interface.
