poiuyt

This post convinced me to make a physical backup of a bunch of short stories I've been working on. At first I was going to read through the rest of the comment thread and then go do the backup, but further consideration made me realize how silly that was: burning them to a DVD and writing "Short Story Drafts" on it with a Sharpie took less than five minutes and made the odds of me forever losing that part of my personal history tremendously smaller. Go go gadget Taking Ideas Seriously!

poiuyt

I feel that I am being misunderstood: I do not suggest that people sign up for cryonics out of spite. I imagine that almost everyone signed up for cryonics does so because they actually believe it will work. That is as it should be.

I am only pointing out that being told I am stupid for signing up for cryonics is disheartening. Even if it is not a rational argument against cryonics, the disapproval of others still affects me. I know this because my friends and family make a point of regularly informing me that cryonics is "a cult", that I am being "scammed out of my money" by Alcor, and that even if it did work, I am "evil and wrong" for wanting it. Being told those things fills me with doubts and saps my willpower. Hearing someone on the pro-cryonics side remind me of my reasons for signing up is reassuring; it restores the willpower I lose when those around me insult my beliefs. Hearing that cryonics is good and that I am good for signing up isn't evidence that cryonics will work. Hearing that non-cryonicists will "regret" their choice certainly isn't evidence that cryonics is the most effective way to save lives. But it is what I need to hear in order not to cave in to peer pressure and cancel my policy.

I get my beliefs from the evidence, but I'll take my motivation from wherever I can find it.

poiuyt

Just asking, were you trying to make that sound awful and smug?

Yep.

While genuine compassion is probably the ideal emotion for a post-cryonic counselor to actually show, it's the anticipation of their currently ridiculed beliefs being validated, with a side order of justified smugness, that gets people going in the here and now. There's nothing wrong with that: "Everyone who said I was stupid is wrong and gets forced to admit it" is probably one of the top ten most common fantasies, and there's nothing wrong with spending your leisure budget on indulging a fantasy. Especially if it has real-world benefits too.

poiuyt

The other standard argument is that cryonics doesn't need to come out of my world-saving budget; it can come out of my leisure budget. Which is also true, but it requires that I be interested enough in cryonics to get enough fuzzy points from buying it to make up for whatever I lose in exchange. And it feels like once you take the leisure-budget route, you're implicitly admitting that this is about purchasing fuzzies, not utilons, which makes it a little odd to apply to all those elaborate calculations that are often made with a strong tone of moral obligation. If one is going to be a utilitarian and use the strong tone of moral obligation, one doesn't get to use it to argue that one should invest a lot of money in saving just a single person, and with highly uncertain odds at that.

I imagine that a lot of people on Less Wrong get off on having someone tell them "with a strong tone of moral obligation" that death can be defeated and that they simply must invest their money in securing their own immortality. Even if it isn't a valid moral argument, per se, phrasing it as one makes cryonics buyers feel better about their choice and increases the number of warm fuzzies they get from the thought that some day they'll wake up in the future, alive and healthy, with everyone congratulating them on being so very brave and clever and daring to escape death like that.

poiuyt

The apparent paradox is resolved as long as you note that P(Daisy thinks Dark exists | Dark exists) > P(Daisy thinks Dark exists | Dark does not exist).

That is, even if Dark does exist and does want to hide his existence, his less-than-100%-effective attempts to hide will produce non-zero evidence for his existence and make the probability that Daisy will believe in Dark go up by a non-zero amount.
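Spelling out the Bayes step behind this (a minimal sketch; E for "Dark exists" and B for "Daisy believes in Dark" are my labels, not the post's):

\[
P(E \mid B) = \frac{P(B \mid E)\,P(E)}{P(B \mid E)\,P(E) + P(B \mid \neg E)\,P(\neg E)}
\]

Whenever \(P(B \mid E) > P(B \mid \neg E)\), the denominator \(P(B)\) is strictly less than \(P(B \mid E)\), so \(P(E \mid B) > P(E)\): Daisy coming to believe in Dark is nonzero evidence that Dark exists, however well he hides.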

poiuyt

It seems to me like an AI enclosed in a cloud of chaotic antimatter would not be very useful. Any changes small enough to be screened out by the existence of the antimatter cloud would also be small enough to be destroyed by that cloud when we go to actually use them, right? If we want the AI to make one paperclip, presumably we want to be able to access that paperclip once it's built, and the antimatter cloud would prevent us from getting at it. And that's completely ignoring the antimatter bomb rigged to detonate the contents of the box. There needs to be a better way of defining "reduced impact" for this to be a practical idea.

poiuyt

Pretty sure you've got some adware. Especially if the links are green and in a funny font.

poiuyt

I think there are about two good answers here: "Don't make an intelligence that just wants to make paperclips, or it will work towards creating paperclips in a way that humans would think is unreasonable. In order to have your intelligence act reasonably, it needs to have a notion of reasonableness that mirrors that of humanity. And that means having a utility function that matches that of humanity in general." or "Be sure that your AI has a boredom function so that it won't keep doing the same things over and over again. After a sufficient degree of certainty, the AI should get tired of checking and re-checking its work and move on to something else instead of plotting to take over the world so it can devote ever greater resources to a single project."

Maybe these are even the same answer. I know that humans get bored of checking and re-checking themselves, and would find someone who fails to get bored of doing the same calculations over and over again to be unreasonable and/or crazy.
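As a toy illustration of the boredom-function idea (entirely my own sketch, nothing from the post: recheck_until_bored, p_catch, and confidence_target are made-up names, and the error model is an assumption), an agent might re-verify its work only until the marginal value of another pass is negligible, then stop:

```python
def recheck_until_bored(check, p_catch=0.5, confidence_target=0.999, max_checks=50):
    """Toy 'boredom function': re-run an error-prone check until the chance
    that a remaining error would have been caught passes a threshold, then
    stop instead of verifying forever.

    Assumes each independent pass catches an existing error with probability
    p_catch, so n clean passes all miss it with probability (1 - p_catch) ** n.
    """
    for n in range(1, max_checks + 1):
        if not check():
            return False, n  # found a mistake: stop and report it
        # Probability that an error, had there been one, would be caught by now.
        confidence = 1 - (1 - p_catch) ** n
        if confidence >= confidence_target:
            return True, n  # "bored": another pass adds almost nothing
    return True, max_checks


if __name__ == "__main__":
    import random
    # A noisy check that passes 99% of the time; with p_catch = 0.5 the agent
    # is satisfied after about ten clean passes and moves on.
    print(recheck_until_bored(lambda: random.random() > 0.01))
```

The cap is the whole point: past confidence_target, extra re-checking buys essentially nothing, so a "reasonable" agent reallocates its resources instead of escalating them.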