

Comment author: ArisKatsaris 14 December 2017 01:02:43AM 1 point

Makes sense; I'm betting many members of the wider rationalist community have seen their assets increase because of the significant rise of Bitcoin, Ethereum, and other cryptocurrencies this year.

Comment author: entirelyuseless 15 December 2017 02:04:35PM 0 points

Right. Utilitarianism is false, but Eliezer was still right about torture and dust specks.

Comment author: Dacyn 15 December 2017 01:49:28AM 0 points

I'm not a utilitarian. The argument that 50 years of torture is preferable to 3^^^3 people suffering dust specks presumes only that preferences are transitive, and that there exists a spectrum of gradations between torture and dust specks with the properties that (A) N people suffering at one level of the spectrum is always preferable to N times a googol people suffering at the next (milder) level, and (B) the spectrum has at most a googol levels. I think it's pretty hard to deny these assumptions consistently, and I'm not aware of any serious argument for denying them.
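The chain argument above also needs one magnitude fact: following the spectrum from 1 tortured person through at most a googol levels, multiplying the population by a googol at each step, ends with at most googol^googol dust-speck sufferers, and 3^^^3 must exceed that. A rough sketch (my own illustration, not from the comment) checks this with iterated logarithms, since the numbers themselves are far too large to materialize:

```python
import math

# log10(googol**googol) = googol * log10(googol) = 10**100 * 100 = 1e102
log_chain = 100.0 * 1e100

# 3^^4 = 3**(3**27); its log10 is about 3.64e12.
log_t4 = (3 ** 27) * math.log10(3)

# 3^^5 = 3**(3^^4), so log10(3^^5) = 3^^4 * log10(3). Compare at the
# log-log level: log10(log10(3^^5)) vs log10(log10(googol**googol)).
loglog_t5 = log_t4 + math.log10(math.log10(3))
loglog_chain = math.log10(log_chain)  # = 102.0

# Even a tower of height 5 (let alone 3^^^3, a tower of height
# 3^^3 = 7625597484987) dwarfs googol**googol, so the chain of
# googol-fold trades never runs out of the 3^^^3 people.
assert loglog_t5 > loglog_chain
```

So assumption (B) is comfortably satisfiable: the spectrum could have far more than a googol levels and 3^^^3 would still dominate.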

It's true that a deontologist might refrain from torturing someone even if he believed it would result in the better outcome. But I was assuming a scenario where either way you are not torturing anyone yourself; you are merely refraining from preventing someone else from torturing them.

Comment author: g_pepper 12 December 2017 02:39:34PM 0 points

> On the other hand, maybe you should force them to endure the guilt, because maybe then they will be motivated to research why the agent who made the decision chose TORTURE, and so the end result will be some people learning some decision theory / critical thinking...

The argument that 50 years of torture of one person is preferable to 3^^^3 people suffering dust specks presumes utilitarianism. A non-utilitarian will not necessarily prefer torture to dust specks even if his/her critical thinking skills are up to par.

Comment author: Dacyn 10 December 2017 10:26:13PM 0 points

The only people who would consent to the dust speck are people who would choose SPECKS over TORTURE in the first place. Are you really saying that you "do not value the comfort of" Eliezer, Robin, and others?

However, your argument raises another interesting point: the existence of people who would prefer that SPECKS was chosen over TORTURE, even if their preference is irrational, might change the outcome of the computation, because it means that a choice of TORTURE amounts to violating their preferences. If TORTURE violates ~3^^^3 people's preferences, then perhaps it is after all a harm comparable to SPECKS. This would certainly be true if everyone found out whether SPECKS or TORTURE was chosen, in which case TORTURE would make it harder for a lot of people to sleep at night.

On the other hand, maybe you should force them to endure the guilt, because maybe then they will be motivated to research why the agent who made the decision chose TORTURE, and so the end result will be some people learning some decision theory / critical thinking...

Also, if SPECKS vs TORTURE decisions come up a lot in this hypothetical universe, then realistically people will only feel guilty over the first one.

Comment author: arundelo 09 December 2017 05:58:09PM * 0 points

Spam!

This account has been posting spam since April 2017 (though all of their old comments have been deleted and are visible only on their overview and comments pages).