steven0461 comments on Survey Results - Less Wrong

48 Post author: Yvain 12 May 2009 10:09PM


Comment author: SoullessAutomaton 13 May 2009 10:21:49PM *  3 points [-]

But I think that taw is poisoning the discourse, making it worse than it already is. It's a pretty common tactic to paint anyone outside the mainstream as an ideologue.

In what way is he "poisoning" the discourse? He didn't even use the term ideologue, and he explained in a later post why he thinks libertarianism is essentially deontological in nature. Accusing him of "making the discourse worse" only serves to itself worsen the discussion.

Quite frankly, in my experience with people arguing for libertarianism, it tends to be precisely what he describes--a lot of bottom-line faux-consequentialist arguments about why free market principles necessarily produce better results, combined with question-begging arguments that assume individual economic freedom as the value to be maximized.

As a concrete example, by almost any metric European-style socialized health care systems work empirically, objectively better. Given the high cost of trying untested systems and the general lack of predictive power demonstrated by current macroeconomics, I can't conceive of any coherent, consequentialist argument against the immediate utility of adopting such a system in the USA, yet most libertarians will argue until they're blue in the face that socialized health care is a terrible idea, in apparent defiance of reality.

EDIT: This comment was pretty promptly voted down to -2 for reasons not apparent to me. Any reasons other than disagreement?

Comment author: steven0461 14 May 2009 12:02:32AM *  3 points [-]

Deontological principles often help maximize utility indirectly, as I'm sure most utilitarians agree in contexts like war and criminal justice. Still, I agree deontology can bias people in the direction of libertarian politics. On the other hand, folk economics can bias people away from libertarian politics.

Since utilitarianism values the sum of all future generations far more than it values the current generation, it seems like (if we ignore that existential risks are even more important) utilitarianism recommends whatever policies grow the economy the fastest in the long run. That might be an argument for libertarianism but it might also be an argument for governments spending lots of money subsidizing research and development.

Comment author: SoullessAutomaton 14 May 2009 12:39:54AM 5 points [-]

Deontological principles often help maximize utility indirectly, as I'm sure most utilitarians agree in contexts like war and criminal justice. Still, I agree deontology can bias people in the direction of libertarian politics.

It seems more the other way around to me--die-hard libertarians tend toward deontological positions, typically by gradual reification of consequentialist instrumental values into deontological terminal values ("free markets usually produce the best results" becomes "free markets are Good", &c.).

On the other hand, folk economics can bias people away from libertarian politics.

This is true, of course, and it's worth noting that I agree with a substantial majority of libertarian positions, which is part of why I find some aspects of libertarianism so irritating--they help marginalize a political outlook that could be doing some good.

Since utilitarianism values the sum of all future generations far more than it values the current generation, it seems like (if we ignore that existential risks are even more important) utilitarianism recommends whatever policies grow the economy the fastest in the long run. That might be an argument for libertarianism but it might also be an argument for governments spending lots of money subsidizing research and development.

I'd think it's more likely an argument for both--subsidized research combined with lowered barriers to entry for innovative businesses. Tile the country with alternating universities and Silicon Valley-style startup hotbeds, essentially (see also: Paul Graham's wet dream).

Anyway, I don't think it's the case that all forms of utilitarianism assign value to future generations that may or may not ever exist. Assigning value to potential entities seems fraught with peril.

Comment author: Nick_Tarleton 14 May 2009 12:55:55AM 0 points [-]

Assigning value to potential entities seems fraught with peril.

Such as?

Comment author: mattnewport 14 May 2009 01:00:19AM 1 point [-]

It would seem to support the biblical condemnation of onanism.

Comment author: Nick_Tarleton 19 May 2009 08:04:50PM 2 points [-]

"Potential entities" here doesn't mean "currently existing non-morally-significant entities that might give rise to morally significant entities", just "entities that don't exist yet". A much clearer phrasing would be something like "Does my utility function aggregate over all entities existing in spacetime, or only those existing now?" IMO, the latter is obviously wrong, either being dynamically inconsistent if "now" is defined indexically, or, if "now" is some specific time, implying that we should bind ourselves not to care about people born after that time even once they do exist.
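The distinction being drawn can be sketched in symbols, assuming (as a simplifying illustration, not anything the commenter commits to) a simple additive utility function with per-entity utilities $u(e)$:

```latex
% Timeless aggregation: sum over every morally significant
% entity e anywhere in spacetime E
U_{\text{timeless}} = \sum_{e \in E} u(e)

% Presentist aggregation: sum only over entities that exist at time t
U_{\text{now}}(t) = \sum_{\{e \,:\, e \text{ exists at } t\}} u(e)
```

If $t$ is defined indexically ("whenever I happen to be evaluating"), then $U_{\text{now}}(t)$ is a different function at different times, so an agent maximizing it will make plans at one time that it predictably overturns at another, which is the dynamic inconsistency referred to above.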

Comment author: SoullessAutomaton 14 May 2009 01:05:29AM *  0 points [-]

Combinatorial explosion, for starters. There's a very large set of potential entities that may or may not exist, and most won't. Assigning value to these entities seems likely to lead to absurdity. If nothing else, it seems to quickly lead to some manner of obligation to see as many entities created as possible.

Comment author: MichaelBishop 14 May 2009 03:35:28AM 0 points [-]

But not assigning value to potential entities also implies that we should make a lot of changes: ignoring global warming, for one. Perhaps enslaving future generations?

Comment author: SoullessAutomaton 14 May 2009 10:37:24AM 0 points [-]

I think it's arguable that global warming could impact plenty of people already alive today, and I'm not sure what you mean by enslaving future generations.

But yes, assigning no value at all to potential entities may also be problematic; I'm not sure what a reasonable balance is.