Comment author: DanArmak 12 October 2016 02:02:14PM *  1 point [-]

I've been told that people use the word "morals" to mean different things. Please answer this poll or add comments to help me understand better.

When you see the word "morals" used without further clarification, do you take it to mean something different from "values" or "terminal goals"?


Comment author: username2 14 October 2016 12:04:43AM *  1 point [-]

Survey assumed a consequentialist utilitarian moral framework. My moral philosophy is neither, so there was no adequate answer.

Comment author: SithLord13 13 October 2016 03:37:16PM 0 points [-]

There are a lot of conflicting aspects to consider here outside of a vacuum. Discounting the unknown unknowns, which could factor heavily since it's an emotionally biasing topic, you've got the fact that the baby is going to be raised by a presumably attentive mother, as opposed to the five who already wound up in that situation once, showing at least some increased risk of falling victim to it again. Then you have the psychological damage to the mother, which will be even greater because she had to do the act herself. Then there's the fact that a child raised by a mother willing to do it has a greater chance of being raised in a way that has a net positive impact on society. And there's the greater potential for preventing the situation in the future, thanks to the increased visibility of the higher death toll. I'm certain there are more aspects I'm failing to note.

But, if we cut to what I believe is the heart of your point, then yes, she absolutely should. Let's scale the problem up for a moment. Say instead of 5 it's 500. Or 5 million. Or the entire rest of humanity aside from the mother and her baby. At what point does sacrificing her child become the right decision? Really, this boils down to the idea of shut up and multiply.

Comment author: username2 13 October 2016 11:42:29PM *  1 point [-]

> But, if we cut to what I believe is the heart of your point, then yes, she absolutely should. Let's scale the problem up for a moment. Say instead of 5 it's 500. Or 5 million. Or the entire rest of humanity aside from the mother and her baby. At what point does sacrificing her child become the right decision? Really, this boils down to the idea of shut up and multiply.

Never, in my opinion. Put every other human being on the tracks (excluding other close family members to keep this from being a Sophie's choice "would you rather..." game). The mother should still act to protect her child. I'm not joking.

You can rationalize this post facto by comparing the kind of society where mothers are ready to sacrifice their kids, and are indeed encouraged to do so to save another life, against one where mothers simply always protect their kids no matter what, and valuing the latter.

But I don't think this is necessary -- you don't need to validate it on utilitarian grounds. Rather, it is perfectly okay for one person to value some lives more than others. We shouldn't want to change this, IMHO. And I think the OP's question about donating 100% to charity, to their own detriment, is symptomatic of the problems that arise from utilitarian thinking. After all, if the OP weren't having an internal conflict between his own morals and supposedly rational utilitarian thinking, he wouldn't have asked the question...

Comment author: 9eB1 10 October 2016 05:01:32PM 3 points [-]

I would be very interested in this as well. In the meantime, there is a subreddit for the site that has a thread with best posts for a new reader, and a thread on people's favorite things from TLP.

Comment author: username2 12 October 2016 08:18:51PM 2 points [-]

Hey, thanks for this. I had some time, so I compiled this chronologically ordered list of links from those threads for personal use: https://my.mixtape.moe/nrbmyr.html

Comment author: Lumifer 11 October 2016 08:54:08PM 0 points [-]

...and did you read my comments in the thread?

Comment author: username2 11 October 2016 09:15:52PM 0 points [-]

Ah I did (at the time), but forgot it was you that made those comments. So I should direct my question to Jacobian, not you.

In any case I'm certainly not a "save the world" type of person, and find myself thoroughly confused by those who profess to be and enter into self-destructive behavior as a result.

Comment author: Lumifer 11 October 2016 07:47:10PM 0 points [-]

Hey, look here, you totally should. All that emotional empathy just gets in the way.

Comment author: username2 11 October 2016 08:48:50PM *  0 points [-]

Read it already. Let's be clear: you think the mother should push her baby in front of a trolley to save five random strangers? If so, why? If not, why not? I don't consider this a loaded question -- it falls directly out of the utilitarian calculus and assumed values that lead to "donate 100% to charity."

[Let's assume the strangers are also same-age babies, so there's no weasel ways out ("baby has more life ahead of it", etc.)]

Comment author: siIver 11 October 2016 07:40:04PM *  0 points [-]

Er... no. Utilitarianism prohibits that exact thing by design. That's one of its most important aspects.

Read the definition. This is unambiguous.

Comment author: username2 11 October 2016 08:45:03PM 3 points [-]

"Utilitarianism is a theory in normative ethics holding that the best moral action is the one that maximizes utility." -Wikipedia

The very next sentence starts with "Utility is defined in various ways..." It is entirely possible for there to be utility functions that treat sentient beings differently. John Stuart Mill may have phrased it as "the greatest good for the greatest number," but the crux is in the word "good," which is left undefined. This is as opposed to, say, virtue ethics, which doesn't care per se about the consequences of actions.
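The point that utilitarianism fixes only the maximizing rule, not the utility function, can be made concrete. Here is a minimal sketch (the labels and the weight of 1000 are purely hypothetical illustrations, not anyone's actual values) of a utility function that weights lives unequally while still being utilitarian in form, i.e. it still picks whichever action maximizes total utility:

```python
def total_utility(lives_saved, weights):
    """Sum the weighted value of each life an action saves.

    Anyone not listed in `weights` gets a default weight of 1.0.
    """
    return sum(weights.get(person, 1.0) for person in lives_saved)

# The mother weights her own child far above a stranger.
weights = {"own_child": 1000.0}

push = total_utility(["stranger"] * 5, weights)   # sacrifice the child: 5.0
dont_push = total_utility(["own_child"], weights)  # protect the child: 1000.0

# The maximizing rule says: protect the child.
best_action = "push" if push > dont_push else "don't push"
```

With equal weights (an empty `weights` dict), the same maximizing rule flips to sacrificing the one for the five; the disagreement between the commenters is entirely about the weights, not about the rule.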

Comment author: MrMind 11 October 2016 01:06:33PM 1 point [-]

Is there a good rebuttal to why we don't donate 100% of our income to charity? I mean, as an explanation, tribalism / near-far thinking is OK, but is there a good post hoc justification?

Comment author: username2 11 October 2016 07:29:31PM 0 points [-]

A mother that followed that logic would push her own baby in front of a trolley to save five random strangers. Ask yourself if that is the moral framework you really want to follow.

Comment author: siIver 11 October 2016 03:50:10PM *  0 points [-]

100% doesn't work because then you starve. If I reformulate your question as "is there any rebuttal to why we don't donate way more to charity than we currently do," then the answer depends on your belief system. If you are a utilitarian, the answer is a definitive no: you should spend way more on charity.

Comment author: username2 11 October 2016 07:28:09PM *  1 point [-]

Nonsense. I believe my life and the lives of people close to me are more important than someone starving in a place whose name I can't pronounce. I just don't assign the same weight to all people. That is perfectly consistent with utilitarianism.

Comment author: James_Miller 11 October 2016 04:10:40AM 1 point [-]

Get a job at Google or seek to influence the people developing the AI. If, say, you were a beautiful woman you could, probably successfully, start a relationship with one of Google's AI developers.

Comment author: username2 11 October 2016 07:24:07PM -1 points [-]

I am confused as to whether I should upvote for "get a job at Google" or downvote for "prostitute yourself".

Comment author: turchin 10 October 2016 11:13:53AM 5 points [-]

If we knew that AI would be created by Google, and that it would happen in the next 5 years, what should we do?

Comment author: username2 11 October 2016 07:20:11PM 0 points [-]

Rejoice because the end is near.

Maybe buy Google stock?
