In response to comment by Oligopsony on White Lies
Comment author: TheOtherDave 08 February 2014 04:38:48PM 2 points [-]

Can we taboo "intellectual discourse"? As I think about your question I realize that I'm not sure I understand what that phrase is being used to refer to in this context.

In response to comment by TheOtherDave on White Lies
Comment author: Oligopsony 09 February 2014 10:44:10PM -1 points [-]

For present purposes, I suppose it includes any domain including the defense of lying itself.

In response to White Lies
Comment author: Oligopsony 08 February 2014 02:36:21PM 2 points [-]

All this needs the disclaimer that some domains should be lie-free zones. I value the truth and despise those who would corrupt intellectual discourse with lies.

Can anyone point me to a defense of corrupting intellectual discourse with lies (one that doesn't resolve into a two-tier model of elites or insiders for whom truth is required and masses/outsiders for whom it is not)? Obviously there is at least one really good reason why espousing such a viewpoint would be rare, but I assume that, by the law of large numbers, there's probably an extant example somewhere.

In response to White Lies
Comment author: Oligopsony 08 February 2014 02:18:07PM *  4 points [-]

At LessWrong there've been discussions of several different views all described as "radical honesty." No one I know of, though, has advocated Radical Honesty as defined by psychotherapist Brad Blanton, which (among other things) demands that people share every negative thought they have about other people. (If you haven't, I recommend reading A. J. Jacobs on Blanton's movement.) While I'm glad no one here thinks Blanton's version of radical honesty is a good idea, a strict no-lies policy can sometimes have effects that are just as disastrous.

To point out the obvious, speaking from personal experience, this is indeed a terrible idea.

A couple of months ago I told a lie to someone I cared about. This wasn't a justified lie; it was a pretty lousy lie (both in its justifiability and the skill with which I executed it) and I was immediately exposed by facial cues. I felt pretty awful, because a lot of my self-concept up to that point had been based around being a very honest person. From that point on, I decided to treat my "you shouldn't tell her _" intuitions as direct orders from my conscience to reveal exactly that thing, and to pay close attention to whether the meaning of what I've said deviates from the truth in a direction favorable to me. As a consequence, I now feel rising anxiety whenever I have some embarrassing thought, followed by the need to confess it. I also resolved to search my conscience for any bad deeds I may have forgotten, which actually led to compulsive, fantastical searching for terrible things I might have done and repressed, no matter how absurd (I've gotten mostly-successful help about this part). She's long since forgiven me for the original lie and what I lied about, but continues to find this compulsive confessional behavior extremely annoying, and I doubt I could really function if I experienced it around people in general rather than her specifically.

Comment author: Mestroyer 23 January 2014 10:34:31PM 12 points [-]

Probably written in the sense: "If you were really strong of mind, you'd will yourself into believing because I just threw an infinity into your expected value calculations", and upvoted in the sense: "Atheism is evidence of strength of mind, but it's become too common to serve as a really good test." (I know I've heard this idea on LessWrong before, though I can't remember where.)

Comment author: Oligopsony 24 January 2014 02:34:16AM 4 points [-]

This, but in a more general sense for the first: Pascal thought there were a bunch of sophisticated philosophical reasons that you should be a Catholic; the Wager was just the one he's famous for.

Comment author: notsonewuser 22 January 2014 01:04:15AM 7 points [-]

Atheism shows strength of mind, but only to a certain degree.

-- Blaise Pascal

Comment author: Oligopsony 22 January 2014 08:12:53PM *  14 points [-]

I suspect this was written and is being upvoted in very different senses.

In response to The Onrushing Wave
Comment author: Oligopsony 19 January 2014 09:10:25PM 7 points [-]

See also Hanson's less than enthusiastic review.

Comment author: Viliam_Bur 16 January 2014 11:44:58AM *  34 points [-]

If you find yourself on a playing field where everyone else is a TrollBot (players who cooperate with you if and only if you cooperate with DefectBot) then you should cooperate with DefectBots and defect against TrollBots.

An example from real life: DefectBot = God, TrollBots = your religious neighbors. God does not reward you for your prayers, but your neighbors may punish you socially for lack of trying. You defect against your neighbors by secretly being a member of an atheist community, and generally by not punishing other nonbelievers.
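The payoff logic of the quoted rule can be sketched with a toy calculation (not from the original post; the payoff numbers are the standard illustrative Prisoner's Dilemma values, an assumption on my part):

```python
# Toy model: one DefectBot plus n TrollBots, where each TrollBot
# cooperates with me iff I cooperated with the DefectBot.
# Payoff values are the conventional PD numbers (illustrative only).

PAYOFF = {  # (my_move, their_move) -> my score
    ("C", "C"): 3,
    ("C", "D"): 0,
    ("D", "C"): 5,
    ("D", "D"): 1,
}

def total_score(move_vs_defectbot, move_vs_trollbots, n_trollbots):
    """Score one round against a single DefectBot plus n TrollBots."""
    score = PAYOFF[(move_vs_defectbot, "D")]  # DefectBot always plays D
    # TrollBots watch only my play against the DefectBot:
    trollbot_move = "C" if move_vs_defectbot == "C" else "D"
    score += n_trollbots * PAYOFF[(move_vs_trollbots, trollbot_move)]
    return score

# With ten TrollBots on the field:
print(total_score("C", "D", 10))  # cooperate with DefectBot, defect vs TrollBots -> 50
print(total_score("C", "C", 10))  # cooperate with everyone -> 30
print(total_score("D", "D", 10))  # defect against everyone -> 11
```

Under these (assumed) payoffs, cooperating with the DefectBot and defecting against the TrollBots dominates whenever there is at least one TrollBot watching.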

I wonder what techniques we could use to make the compartmentalization stronger and easy to turn off when it's no longer needed. Clear boundaries. A possible solution would be to use the different set of beliefs only while wearing a silly hat. Not a literally silly one, because I might want to use it in public without handicapping myself. But some environmental reminder. An amulet, perhaps?

Comment author: Oligopsony 16 January 2014 11:49:01PM 4 points [-]

Amusingly enough, the example of TrollBot that came to mind was the God expounded on in many parts of the New Testament, who will punish you iff you do not unconditionally cooperate with others, including your oppressors.

Comment author: Chrysophylax 13 January 2014 03:38:17PM -1 points [-]

you should prefer the lesser evil to be more beholden to its base

How would you go about achieving this? The only interpretation that occurs to me is to minimise the number of votes for the less-dispreferred main party subject to the constraint that it wins, thereby making it maximally indebted to, and maximally (apparently) dependent upon, its strongest supporters (though indebtedness seems an unlikely way for politicians to think).

To provide a concrete example, this seems to suggest that a person who favours the Republicans over the Democrats and expects the Republicans to do well in the midterms should vote for a Libertarian, thereby making the Republicans more dependent on the Tea Party. This is counterintuitive, to say the least.

I disagree with the initial claim. While moving away from centre for an electoral term might lead to short-term gains (e.g. passing something that is mainly favoured by more extreme voters), it might also lead to short-term losses (by causing stalemate and gridlock). In the longer term, taking a wingward stance seems likely to polarise views of the party, strengthening support from diehards but weakening appeal to centrists.

Comment author: Oligopsony 13 January 2014 04:28:08PM 1 point [-]

To provide a concrete example, this seems to suggest that a person who favours the Republicans over the Democrats and expects the Republicans to do well in the midterms should vote for a Libertarian, thereby making the Republicans more dependent on the Tea Party. This is counterintuitive, to say the least.

Is it? Again, I haven't done the math, but look at the behavior of minor parties in parliamentary systems. They typically demand a price for their support. If the Republicans will get your vote regardless, why should they care about you?

Comment author: Oligopsony 13 January 2014 04:42:51AM 4 points [-]

Taking arguments more seriously than you possibly should. I feel like I constantly see people in rationalist communities say stuff like "this argument by A sort of makes sense, you just need to frame it in objective, consequentialist terms like blah blah blah blah blah" and then follow with what looks to me like a completely original thought that I've never seen before.

Rather than - or at least in addition to - being a bug, this strikes me as one of charity's features. Most arguments are, indeed, neither original nor very good. Inasmuch as you can substitute more original and/or coherent claims for them, so much the better, I say.

Comment author: Oligopsony 13 January 2014 04:10:11AM 2 points [-]

Another consideration is the effects of your decision criteria on the lesser evil itself. All else being equal, and assuming your politics aren't so unbelievably unimaginative that you see yourself somewhere between the two mainstream alternatives, you should prefer the lesser evil to be more beholden to its base. The logic of this should be most evident in parliamentary systems, where third party voters can explicitly coordinate and sometimes back and sometimes withdraw support from their nearest mainstream parties, depending on policy concessions.
