In response to 2014 Survey Results
Comment author: Gunnar_Zarncke 04 January 2015 10:25:46AM 3 points

Among the correlations, the first ones I found surprising are:

  • Minimum Wage/Feminism .378 (1286)

  • Immigration/Feminism .365 (1296)

I wouldn't have guessed these from the corresponding memeplexes and can't see a plausible relationship. Anybody volunteer?

Comment author: blacktrance 04 January 2015 10:30:34AM 24 points

Support for a higher minimum wage, increased immigration, and feminism are all typically left-wing positions, so it's not surprising that they're found together.

In response to 2014 Survey Results
Comment author: blacktrance 04 January 2015 07:43:56AM *  1 point

Thank you for doing this survey.

I would be interested to see the correlations between political identification and moral views, and between moral views and meta-ethics.

(Also, looking at my responses to the survey, I think I unintentionally marked "Please do not use my data for formal research".)

Comment author: blacktrance 10 December 2014 06:06:07AM 0 points

Utilitarianism is a normative ethical theory. Normative ethical theories tell you what to do (or, in the case of virtue ethics, tell you what kind of person to be). In the specific case of utilitarianism, it holds that the right thing to do (i.e. what you ought to do) is maximize world utility. In the current world, there are many people who could sacrifice a lot to generate even more world utility. Utilitarianism holds that they should do so, therefore it is demanding.

Comment author: [deleted] 02 December 2014 05:52:24AM *  3 points

I'd like to recommend a fun little piece called The Schizophrenia of Modern Ethical Theories (PDF), which points out that popular moral theories look very strange when actually applied as grounds for action in real-life situations. Minimally, the author argues that certain reasons for action are incompatible with certain motives, and that this becomes incoherent if we suppose that these motives were (at least partially) the motivation we had to adopt that set of reasons in the first place.

For example, if you tend to your sick friend, but explain to them that you are (really only) doing so on utilitarian grounds, or on egoistic grounds, or because you are obligated to do so, etc., well... doesn't that seem off? And don't those reasons for action, presumably a generalization from a great many specific situations of this sort, seem incompatible with the original motivation that we felt was morally good?

In response to comment by [deleted] on Open thread, Dec. 1 - Dec. 7, 2014
Comment author: blacktrance 03 December 2014 05:41:56AM *  1 point

If I tell my friend that I am visiting him on egoistic grounds, it suggests that being around him and/or promoting his well-being gives me pleasure or something like that, which doesn't sound off - it sounds correct. I should hope that my friends enjoy spending time around me and take pleasure in my well-being.

Comment author: Transfuturist 01 December 2014 03:30:45AM 0 points

And you think that "desirability" in that statement refers to the utility-maximizing path?

Comment author: blacktrance 01 December 2014 06:36:51AM 0 points

I mean that pleasure, by its nature, feels utility-satisfying. I don't know what you mean by "path" in "utility-maximizing path".

Comment author: Transfuturist 28 November 2014 11:11:41PM 0 points

If I have utility in the state of the world, as opposed to the transitions between A, B, and C, I don't see how it's possible for me to have cyclic preferences, unless you're claiming that my utility doesn't have ordinality for some reason. If that's the sort of inconsistency in preferences you're referring to, then yes, it's bad, but I don't see how ordinal utility necessitates wireheading.

Comment author: blacktrance 29 November 2014 10:50:41AM 0 points

Regarding inconsistent preferences, yes, that is what I'm referring to.

Ordinal utility doesn't by itself necessitate wireheading (for instance, if you are incapable of experiencing pleasure), but if you can experience pleasure, then you should wirehead, because pleasure has the quale of desirability (pleasure feels desirable).

Comment author: Transfuturist 26 November 2014 11:12:45PM 0 points

And 3 utilons. I see no cost there.

Comment author: blacktrance 27 November 2014 01:09:35AM *  0 points

But presumably you don't get utility from switching as such, you get utility from having A, B, or C, so if you complete a cycle for free (without me charging you), you have exactly the same utility as when you started, and if I charge you, then when you're back to A, you have lower utility.

Comment author: DefectiveAlgorithm 26 November 2014 11:24:11AM *  0 points

I know what terminal values are and I apologize if the intent behind my question was unclear. To clarify, my request was specifically for a definition in the context of human beings - that is, entities with cognitive architectures with no explicitly defined utility functions and with multiple interacting subsystems which may value different things (i.e., emotional vs. deliberative systems). I'm well aware of the huge impact my emotional subsystem has on my decision making. However, I don't consider it 'me' - rather, I consider it an external black box which interacts very closely with that which I do identify as me (mostly my deliberative system). I can acknowledge the strong influence it has on my motivations whilst explicitly holding a desire that this not be so, a desire which would in certain contexts lead me to knowingly make decisions that would irreversibly sacrifice a significant portion of my expected future pleasure.

To follow up on my initial question, it had been intended to lay the groundwork for this followup: What empirical claims do you consider yourself to be making about the jumble of interacting systems that is the human cognitive architecture when you say that the sole 'actual' terminal value of a human is pleasure?

Comment author: blacktrance 26 November 2014 07:56:19PM 0 points

What empirical claims do you consider yourself to be making about the jumble of interacting systems that is the human cognitive architecture when you say that the sole 'actual' terminal value of a human is pleasure?

That upon ideal rational deliberation and when having all the relevant information, a person will choose to pursue pleasure as a terminal value.

Comment author: Transfuturist 26 November 2014 06:32:29AM *  1 point

And what if I'm unaware that such a strategy is taking place? Even if I were aware, I am a dynamic system evolving in time, and I might be perfectly happy with the expenditure per utility shift.

Unless I was opposed to that sort of arrangement, I find nothing wrong with that. It is my prerogative to spend resources to satisfy my preferences.

Comment author: blacktrance 26 November 2014 07:53:27PM 0 points

I might be perfectly happy with the expenditure per utility shift.

That's exactly the problem - you'd be happy with the expenditure per shift, but every time a full cycle was made, you'd be worse off. If you start out with A and $10, pay me a dollar to switch to B, another dollar to switch to C, and a third dollar to switch back to A, you end up with A and $7, worse off than when you started, despite being satisfied with each transaction. That's the cost of inconsistency.
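The money pump described above can be sketched in a few lines of Python (a minimal illustration using the dollar figures from this comment; the `run_cycle` helper and its names are purely illustrative):

```python
# Money pump against cyclic preferences A > B > C > A:
# each switch looks locally acceptable, yet a full cycle
# returns to option A while paying a fee at every step.
def run_cycle(start_state, start_cash, fee=1):
    state, cash = start_state, start_cash
    for next_state in ["B", "C", "A"]:  # A -> B -> C -> back to A
        state, cash = next_state, cash - fee
    return state, cash

state, cash = run_cycle("A", 10)
print(state, cash)  # A 7: same option as before, three dollars poorer
```

Nothing about any single trade looks bad to the agent; the loss only shows up at the level of the whole cycle, which is the point being made.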

Comment author: Transfuturist 26 November 2014 05:09:49AM *  0 points

Retracted: Dutch booking has nothing to do with preferences; it refers entirely to doxastic probabilities.

As far as preferences and motivation are concerned, however things should be must appeal to them as they are, or at least as they would be if they were internally consistent.

I very much disagree. I think you're couching this deontological moral stance as something more than the subjective position that it is. I find your morals abhorrent, and your normative statements regarding others' preferences to be alarming and dangerous.

Comment author: blacktrance 26 November 2014 05:43:08AM 1 point

Dutch booking has nothing to do with preferences; it refers entirely to doxastic probabilities.

You can be Dutch booked with preferences too. If you prefer A to B, B to C, and C to A, I can make money off of you by offering a circular trade to you.
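Whether a preference relation admits such a circular trade amounts to whether the "preferred to" relation contains a cycle, which a depth-first search can detect (a minimal sketch; the `has_cycle` name is illustrative, while the cyclic example is the A over B over C over A case from the comment):

```python
# Detect whether a strict-preference relation is Dutch-bookable,
# i.e. whether it contains a cycle, via depth-first search.
def has_cycle(prefers):
    """prefers[x] = set of options that x is strictly preferred to."""
    visiting, done = set(), set()

    def dfs(x):
        if x in done:
            return False
        if x in visiting:
            return True  # found a cycle: x > ... > x
        visiting.add(x)
        found = any(dfs(y) for y in prefers.get(x, ()))
        visiting.discard(x)
        done.add(x)
        return found

    return any(dfs(x) for x in prefers)

cyclic = {"A": {"B"}, "B": {"C"}, "C": {"A"}}  # A > B > C > A
transitive = {"A": {"B", "C"}, "B": {"C"}}     # A > B > C
print(has_cycle(cyclic), has_cycle(transitive))  # True False
```

A transitive (acyclic) ordering admits no circular trade, which is exactly the Dutch-book argument for keeping preferences consistent.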
