Comment author: khafra 10 January 2012 01:32:06PM 3 points [-]

That would create an interesting schelling point for torts of assault.

In response to comment by khafra on Dead Child Currency
Comment author: AlexanderRM 02 September 2015 11:01:33PM 0 points [-]

Worth noting that the dead baby value is very different from the actual amount most Westerners regard the lives of white, middle-class people from their own country as being worth. In fact, pretty much the whole point of the statistic is that it's SHOCKINGLY low. I suppose we could hope that Dead Baby currency would reduce that discrepancy... although in the case of the actual example given, I think the Malthusians* have a point: it would dramatically increase access to life-prolonging things without much increasing access to birth control, resulting in a larger population and thus more people to save.

*To clarify: I actually agree with Malthusian ecology- it's just a basic fact of ecology, and I'm amazed that anyone seriously disagrees with it- but not with the objection to charitable donations on that basis; anyone who actually believed that would instead say "you should give money to provide birth control".

Comment author: shminux 09 January 2012 03:14:37AM -1 points [-]

the right thing to do is press the button.

Why? Do we really need more people on this planet? I would be more likely to press the button in a net-neutral case (one saved, one dies, more money for me), provided your other conditions (not research, not a joke, full anonymity, etc.) hold.

Comment author: AlexanderRM 02 September 2015 10:43:05PM 0 points [-]

Alternative rephrasing: $4000 is given to your choice of either one of the top-rated charities for saving lives, or one of the top-rated charities for distributing birth control (or something else that reduces population growth).

That would mean a pure reduction on both sides in the number of people on the planet, and- assuming there are currently too many people on the planet- a net reduction in suffering in the long run, since there would be fewer people competing with each other. Add to that the good it does in the short run for women who don't have to go through unwanted pregnancies and raise the resulting children, with all the benefits that follow (being able to devote more resources to their other children, pursuing careers further, and the like).

Comment author: Vladimir_Nesov 08 January 2012 11:17:47AM *  16 points [-]

Reversal test: If this miracle of people dying and corresponding sums of money magically appearing in charity funds was commonplace, what debate would follow a hypothetical technology that terminates the miracle?

Comment author: AlexanderRM 02 September 2015 10:33:00PM 0 points [-]

Note that the Reversal Test is written with the assumption of consequentialism- that there's an ideal value for some trait of the universe- whereas the whole point of the trolley problem is that the only objection is deontological, assuming the hypothetical pure case where there are no unintended consequences.

However, the Reversal Test of things like "prevent people from pulling the lever" is still useful if you want to make deontologists question the action/inaction distinction.

Comment author: abigailgem 15 March 2009 08:18:01PM 7 points [-]

I am not sure I can be rational about this at all, because I find suicide repulsive. Yet my society admires the bravery of a soldier who, say, throws himself on a grenade so that it will not kill the others in his dugout. I might see a tincture of dishonesty in the man's actions, and yet he enters a contract, with a free contracting party, and performs his part of the contract.

So. Something to practice Rationality on. To consider the value of an emotional response. Thank you. I am afraid I still have the emotional response: shameful. I cannot, now, see it as admirable.

Comment author: AlexanderRM 02 September 2015 08:00:02PM *  0 points [-]

I was about to give the exact same example of the soldier throwing himself on a grenade. I don't know where the idea of his actions being "shameful" even comes up.

The one thing your comment makes me realize is the dishonesty of his actions: if lots of people did this, insurance companies would start catching on and it would stop working, and it would make life insurance that much harder to run. But it didn't sound like the original post meant that by "shameful"; it sounded like they were suggesting (or assuming people would think) that there was something inherently wrong with the man's altruism. At least that's what's implied by the title, "really extreme altruism".

Edit: I didn't catch the "Two years after the policy is purchased, it will pay out in the event of suicide." bit until reading others' comments- so, indeed, he's not being dishonest; he made a bet with the insurance company (over whether he would still intend suicide two years later) and the insurance company lost. I don't know how many insurance companies have clauses like that, though.

In response to False Laughter
Comment author: AlexanderRM 07 August 2015 12:43:40AM 0 points [-]

I know I'm 8 years late on this (only started reading LessWrong a year ago)- does anyone have a good, snappy term for the quality of humor being funny regardless of the politics? There have been times when I was amused by a joke despite disagreeing with the political point, and wanted to make some comment along the lines of "I'm a [group attacked by the joke] and this passes the Yudkowsky Test of being funny regardless of the politics", but I think "Yudkowsky test" isn't a good term (for one thing, I have no idea if Yudkowsky actually came up with this originally).

(Actually, a more generalized term covering the same principle for art in general, not just humor, would be useful. Although the only time I can think of when I might have wanted to apply it was when I first listened to the ISIL theme, and my attitude there was somewhat different from my reaction to people who don't murder their ideological opponents coming up with a funny joke.)

Comment author: wedrifid 26 May 2012 02:59:08AM *  0 points [-]

If religious people were not hypocrites, we would all be burned at the stake.

This depends on how the counterfactual is constructed- i.e. when they stopped being hypocrites, and whether the non-hypocrisy is prevented from causing the no-longer-hypocritical people to lose their religion. I mean- we might win and kill all the religious people!

Comment author: AlexanderRM 04 August 2015 04:31:54AM 0 points [-]

The assumption is that people start doing things that match with their stated beliefs- so, for instance, people who claim to oppose genocide would actually oppose genocide in all cases, which is the whole point of thinking hypocrisy is bad. Causing people to no longer be hypocrites by making them instead give up their stated beliefs would just make for a world which was more honest but otherwise not dramatically improved.

Incidentally, on the joking side: If atheists did win the religious war, they could then use this statement in a completely serious and logical context: https://www.youtube.com/watch?v=FmmQxXPOMMY

Comment author: Ferro 24 May 2012 09:48:07AM 1 point [-]

Most religions do not dictate that heretics be burned at the stake. And if all religious people were non-hypocritical to the basic tenets of a religion (see Ten Commandments, Five Pillars of Islam, etcetera) rather than to specific instructions that are open to interpretation, the world would probably be a much better place.

Comment author: AlexanderRM 04 August 2015 04:09:34AM 0 points [-]

Worth elaborating: If all religious people were non-hypocritical and did exactly what the religion they claim to follow commands, there would probably be an enormous initial drop in violence, followed by the wiping out of any religion that follows commandments like "thou shalt not kill" without exception, with religions advocating holy war and the persecution of heretics gaining the eventual upper hand. (Imperfectly adapted religions might be able to hold off better-adapted ones through strength of numbers- for instance, a large area controlled by a religion that burns heretics and fights defensive, cooperative religious wars could hold off smaller nations whose religions advocate offensive wars.)

One good thing about hypocrisy is that it provides a massive buffer against certain types of virulent memes. On the other hand, a world where everyone took a burn-the-heretics interpretation of Christianity or Islam 100% seriously would certainly have some advantages over ours, and especially over our middle ages- most notably no unsanctioned killing, no wars against others of the same religion, and so on. Probably lots of things that would be decent ideas if you could get everyone to follow them, at the cost of an occasional burnt heretic (and possibly constant holy wars, until one religion gains the upper hand and overwhelms the others).

Comment author: ThePrussian 22 July 2015 01:42:02PM 5 points [-]

Could be two different uses of the word rationality. There are certainly those who call themselves "reality based" or whatever and therefore assume that everything they assert is rational and scientific. But if you invest yourself in "doing rationality" rather than "being rational" you might do better.

Comment author: AlexanderRM 04 August 2015 03:53:16AM 0 points [-]

I think that might help somewhat- thinking of rationality as something you do rather than something you are is definitely good regardless- but there's still the basic problem that your self-esteem is invested in rationality. Rationality requires you to continually be willing to doubt your core values and consider that they might be wrong, and if your core values are wrong, then you haven't gotten any use out of your rationality up to that point. I don't think it's just a matter of realizing you were wrong once and recovering self-esteem from the fact that you were rational enough to see it- ideally you ought to constantly consider the possibility that everything you believe might be wrong.

Now, if you can get up to the level of thinking I just described, that's probably still a lot better than basing your self-esteem on specific political views. It just doesn't totally solve the problem, and you need to be aware that it doesn't totally solve the problem.

In response to Growing Up is Hard
Comment author: AlexanderRM 04 August 2015 12:38:16AM 0 points [-]

I just want to mention that a human trying to self-modify their brain in the manner described, with all the dangers listed, could make an interesting science fiction story. I couldn't possibly write it myself and am not even sure what the best method of telling it would be- probably it would at least partially include something like journal entries, or narration from inside the protagonist's head, to illustrate what exactly was going on.

Especially if the human knew the dangers perfectly well, but had some reason they had to try anyway, and also a good reason to think it might work. Presumably this would require the attempt to be at some modification other than "runaway intelligence" (and a context where a modified self would have very little chance of thereafter achieving runaway superintelligence): if things went wrong they might spend the rest of their life doing very weird things, or die for one reason or another, or at the very worst go on a killing spree and kill a couple dozen people before being caught- but they wouldn't convert the entire world into smiley faces. That way they would be a sympathetic viewpoint character taking perfectly reasonable actions, and the reader/viewer watches as their sanity teeters on the edge, genuinely left wondering whether they'll last long enough to accomplish their goal.

Comment author: RH156 07 July 2015 03:32:38PM -1 points [-]

The reality is that robots will subvert human society not because they are more intelligent or intent on doing humans harm; rather, they will scupper at least industrialised societies by simply being capable of most of the work now done by humans. It will be simple economics which does the trick, with employers caught between the Scylla of needing to compete with other employers who use robots to replace men (which will mean a catastrophic and irreparable loss of demand) and the Charybdis of having to throw away the idea of laissez-faire economics and engage in a command economy, something which political elites raised on worship of the great god Market will have immense difficulty doing.

Read more at https://livinginamadhouse.wordpress.com/2011/07/01/robotics-and-the-real-sorry-karl-you-got-it-wrong-final-crisis-of-capitalism/

Comment author: AlexanderRM 10 July 2015 05:41:49AM 0 points [-]

Why would a command economy be necessary to avoid that? Welfare capitalism- running the economy with laissez-faire except that you tax some of the proceeds and give them to poor people, who can then spend the money as they wish, as if they'd earned it under laissez-faire- would work just fine. As mechanization increases, you gradually increase the welfare.

It won't be entirely easy to implement politically, mainly because of our ridiculous political dichotomy where you can either understand basic economics or care about poor people, but not both.

Since we're citing sources I'll admit Scott expressed this better than I can: http://slatestarcodex.com/2013/12/08/a-something-sort-of-like-left-libertarianism-ist-manifesto/#comment-23688
