Comment author: Tom_McCabe2 25 October 2008 04:14:45AM 0 points

I will not be there due to a screwup by Continental Airlines, my apologies.

Comment author: Tom_McCabe2 24 October 2008 12:14:07AM 0 points

See everyone there.

Comment author: Tom_McCabe2 23 October 2008 02:41:12AM 2 points

"As far as my childhood goes I created a lot of problems for myself by trying to force myself into a mold which conflicted strongly with the way my brain was setup."

"It's interesting that others have shared this experience, trying to distance ourselves from, control, or delete too much of ourselves - then having to undo it. I hadn't read of anyone else having this experience, until people started posting here."

For some mysterious reason, my younger self was so oblivious to the world that I never experienced (to my recollection) a massive belief-system rewrite. I assume that what you're referring to is learning a whole bunch of stuff, finding out later on that it's all wrong, and then going back and undoing it all. I don't think I ever learned the whole bunch of stuff in the first place; e.g., when I discovered atheism, I didn't really have an existing Christian belief structure that had to be torn down. I knew about Jesus and God and the resurrection and so forth, but I hadn't really integrated it into my head, so when I discovered atheists, I just accepted their arguments as true and moved on.

In response to Ethical Injunctions
Comment author: Tom_McCabe2 20 October 2008 11:31:52PM 7 points

"Would you kill babies if it was the right thing to do? If no, under what circumstances would you not do the right thing to do? If yes, how right would it have to be, for how many babies?"

I would have answered "yes"; e.g., I would have set off a bomb in Hitler's car in 1942, even if Hitler was surrounded by babies. This doesn't seem to be a case of corruption by unethical hardware; the benefit to *me* from setting off such a bomb is quite negative, as it greatly increases my chance of being tortured to death by the SS.

Comment author: Tom_McCabe2 19 October 2008 02:48:13AM 1 point

"But what if you were "optimistic" and only presented one side of the story, the better to fulfill that all-important goal of persuading people to your cause? Then you'll have a much harder time persuading them away from that idea you sold them originally - you've nailed their feet to the floor, which makes it difficult for them to follow if you yourself take another step forward."

Hmmm... if you don't need people following you, could it help you (from a rationality standpoint) to lie? Suppose that you read about AI technique X. Technique X looks really impressive, but you're still skeptical of it. If you talk about how great technique X looks, people will start to associate you with technique X, and if you try to change your mind about it, they'll demand an explanation. But if you lie (either by omission, or directly if someone asks you about X), you can change your mind about X later on and nobody will call you on it.

NOTE: This does require telling the same lie to everyone; telling different lies to different groups of people is, as noted, too messy.

Comment author: Tom_McCabe2 16 October 2008 03:07:29AM 0 points

"Human beings, who are not gods, often fail to imagine all the facts they would need to distort to tell a truly plausible lie."

One of my pet hobbies is constructing metaphors for reality which are blatantly, factually wrong, but which share enough of the deep structure of reality to be internally consistent. Suppose that you have good evidence for facts A, B, and C. If you think about A, B, and C, you can deduce facts D, E, F, and so forth. But given how tangled reality is, it's effectively impossible to come up with a complete list of humanly-deducible facts in advance; there's always going to be some fact, Q, which you just didn't think of. Hence, if you map A, B, and C to A', B', and C', use A', B', and C' to deduce Q', and map Q' back to Q, the accuracy of Q is a good check of how well you understand A, B, and C.
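
To make that concrete, here is a toy Python sketch of the check (my own illustration, not from the original comment), assuming the familiar circuit/water-flow analogy: Ohm's-law facts play the role of A, B, and C, the deduced fact Q' (series resistances add) is derived inside the metaphor, mapped back, and compared against the answer computed directly in the real domain.

    # Toy sketch of the metaphor-checking idea; the circuit <-> water-flow
    # analogy and all numbers are hypothetical illustrations, not the author's.

    def current_from_ohms_law(voltage, resistance):
        # Real domain: I = V / R.
        return voltage / resistance

    def flow_from_pressure(pressure, pipe_resistance):
        # Metaphor domain: flow = pressure / pipe resistance (same deep structure).
        return pressure / pipe_resistance

    # Facts A, B, C: a 12 V source driving 3-ohm and 6-ohm resistors in series.
    voltage, r1, r2 = 12.0, 3.0, 6.0

    # Map A, B, C to A', B', C': a pressure source and two narrow pipes in series.
    pressure, pipe1, pipe2 = voltage, r1, r2

    # Deduce Q' inside the metaphor: series pipes add their resistance to flow.
    q_prime = flow_from_pressure(pressure, pipe1 + pipe2)

    # Map Q' back to Q and compare it with the real-domain answer.
    q = current_from_ohms_law(voltage, r1 + r2)
    print(f"metaphor prediction: {q_prime:.2f} A   real-domain answer: {q:.2f} A")
    # Agreement is (weak) evidence that the metaphor preserves the deep structure.
    assert abs(q - q_prime) < 1e-9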

Comment author: Tom_McCabe2 14 October 2008 02:32:18AM 8 points

"I am willing to admit of the theoretical possibility that someone could beat the temptation of power and then end up with no ethical choice left, except to grab the crown. But there would be a large burden of skepticism to overcome."

If all people, including yourself, become corrupt when given power, then why shouldn't you seize power for yourself? On average, you'd be no worse than anyone else, and probably at least somewhat better; there should be *some* correlation between knowing that power corrupts and not being corrupted.

Comment author: Tom_McCabe2 09 October 2008 05:24:45PM 2 points

I volunteer to be the Gatekeeper party. I'm reasonably confident that no human could convince me to release them; if anyone can convince me to let them out of the box, I'll send them $20. It's *possible* that I couldn't be convinced by a transhuman AI, but I wouldn't bet $20 on it, let alone the fate of the world.

Comment author: Tom_McCabe2 08 October 2008 11:55:59PM 1 point

"To accept this demand creates an awful tension in your mind, between the impossibility and the requirement to do it anyway. People will try to flee that awful tension."

More importantly, at least for me, that awful tension causes my brain to seize up and start panicking; do you have any suggestions on how to calm down, so that one can think clearly?

Comment author: Tom_McCabe2 23 September 2008 05:57:05PM 2 points

"Eliezer2000 lives by the rule that you should always be ready to have your thoughts broadcast to the whole world at any time, without embarrassment."

I can understand most of the paths you followed during your youth, but I don't really get this. Even if it's a good idea for Eliezer_2000 to broadcast everything, wouldn't it be stupid for Eliezer_1200, who just discovered scientific materialism, to broadcast everything?

"If everyone were to live for others all the time, life would be like a procession of ants following each other around in a circle."

For a more mathematical version of this, see http://www.acceleratingfuture.com/tom/?p=99.

"It does not seem a very intuitive belief (except for very religious types and Eliezer1997 was not one of those), so what was its justification?"

WARNING: Eliezer-1999 content.

http://yudkowsky.net/tmol-faq/tmol-faq.html

"Even so, if you don't try, or don't try hard enough, you don't get a chance to sit down at the high-stakes table - never mind the ability ante."

Are you referring to external exclusion of people who don't try, or self-exclusion?
