Anonymous_Coward6

"Should your parents have the right to kill you now, if they do so painlessly?"

Yes, according to that logic. Also, from a negative utilitarian standpoint, it was actually the act of creating me which they had no right to do since that makes them responsible for all pain I have ever suffered.

I'm not saying I live my life by utilitarian ethics; I'm just saying I haven't found any way to refute it.

That said, non-existence doesn't frighten me. I'm not sure non-existence is even an option, though, if the universe is eternal or infinite. That might be a very good thing or a very bad thing.

Why must destroying a conscious model be considered cruel if it wouldn't have even been created otherwise, and it died painlessly? I mean, I understand the visceral revulsion to this idea, but that sort of utilitarian ethos is the only one that makes sense to me rationally.

Furthermore, given our current knowledge of the universe, I don't think we can possibly know whether a computational model is even capable of producing consciousness, so it is really only a guess. The whole idea seems near-metaphysical, much like the multiverse hypothesis. Granted, the nonzero probability of these models being conscious is still significant considering the massive future utility at stake, but given the enormity of our ignorance you might as well start talking about the nonzero probability of rocks being conscious.

I don't think anyone answered Doug's question yet. "Would a human, trying to solve the same problem, also run the risk of simulating a person?"

I have heard of carbon chauvinism, but perhaps there is a bit of binary chauvinism going on?

Some people with synaesthesia can "feel" numbers and thus perform amazing calculations. Wouldn't it make sense for something similar to exist for other tasks, like computer programming?

I have a severe inability to make big choices such as this, and I have cryocrastinated for quite some time. This year I became a vegetarian after a lot of difficult reflection, and I am doing the same with cryonics.

I feel that there just isn't that much to lose by not signing up, since non-existence does not scare me. Signing up, at that point, becomes a choice between the Precautionary Principle and the Proactionary Principle. Even a small chance that the world I wake up in will be horrible is enough to make me not want to sign up at all, despite the potential gain.

Your pleas were very heartfelt. I am leading a healthy lifestyle because I do wish to be around to experience the future and help make the world a better place. I find the odds of cryonics actually working to be low, but if the expected utility is very high, then even a small investment is worth it. I am quite risk-averse, however, even though this is illogical. I will re-read that article about Crisis of Faith and hopefully come to a rational conclusion.