Virge

Undramatic for me too.

If you've got a talent that keeps you very popular within a group, it's very easy to get sucked into being what those admiring people want you to be. Being bright, clear-thinking, eloquent, confident (and a musician) moves you very easily into a leadership position, and builds the feeling of responsibility for the welfare of the group.

It took me too long to commit to acknowledging my accumulated doubts and misgivings and to examining them in anything other than a pro-Christian light. I had enough religious cached thoughts in an interconnected, self-supporting web that doubting any one of them was discouraged by the support of the others. However, I was spending more and more of my time aware of the dissonance between what I knew and what I believed (or, as I later realised, what I was telling myself I believed).

I ended up deciding to spend a few months of my non-work time examining my faith in detail -- clearing the cache, and trying to understand what it was that made me hold on to what I thought I believed. During that time I gradually dropped out of church activities.

I look back on the time and see it as a process of becoming more honest with myself. Had I tried to determine what I really believed by looking at what I anticipated and how that influenced my behaviour, I'd have realised a lot earlier that my true beliefs were non-supernatural. I'd just been playing an expected role in a supportive family and social group, and I'd adjusted my thinking to blend into that role.

Virge

Thanks Patrick. As it's turning out, I think my 3rd is going to be completely taken up anyway. Maybe next time.

Virge

Apologies from me. My October 2nd is already booked for another party. (Not that I attend a lot of parties.)

Virge

Hi. I was an occasional contributor on OB and have posted a few comments on LW. I've dropped back to lurking for about a year now. I find most of the posts stimulating -- some stimulating enough to make me want to comment -- but my recent habit of catching up in bursts means that the conversations are often several weeks old and a lot of what needs to be argued about them has already been said.

The last post that almost prompted me to comment was ata's mathematical universe / map=territory post. It forced me to think for some time about the reification of mathematical subjunctives and how similar that was to common confusions about 'couldness'. I decided I didn't have the time and energy to revive the discussion and to refine my ideas with sufficient rigor to make it worth everyone's attention.

Over the past week I've worked through my backlog of LW reading, so I've removed my "old conversation" excuse for not commenting. I'll still be mostly a lurker.

Virge

"Morality is then the problem of developing a framework for resolving conflicts of interest in such a way that all the agents can accept the conflict resolution process as optimal."

I think we're in agreement here.

For me the difficult questions arise when we take one universalizable moral principle and try to apply it at every level of organization, from the personal "what should I be doing with my time and energy at this moment?" to the public "what should person A be permitted/obliged to do?"

I was thinking about raising the question of utilitarianism being difficult to institute as an ESS when writing my previous post. To a certain extent, we (in democratic cultures with an independent judiciary) train our intuitions to accept the idea of fairness as we grow up. Our parents, kindergarten and school teachers do their best to instill certain values. The fact that racism and sexism can become entrenched during formative years suggests to me that the equality and fairness principles I've grown up with can also be trained. We share a psychological architecture, but there is enough flexibility that we can train our moral intuitions (to some extent).

Utilitarianism is in principle universalizable, but is it practically universalizable at all decision levels? What training (or brainwashing) and threats of defector punishment would we need to implement to completely override our natural nepotism? To me this seems like an impractical goal.

I've been somewhat confused by the idea of anyone wanting to make all their decisions on utilitarian principles (even at the expense of familial obligations), so I wondered whether I'd been erecting an extreme utilitarian strawman. I think I have, and I'm seeing a glimmer of a solution to the confusion.

Given that we all have relationships we value, and that forcing ourselves to ignore those relationships in our daily activities represents negative utility, we cannot maximize utility with a moral system that requires everyone to treat everyone else as equal at all times and in all decisions. Any genuine utilitarian calculation must account for everyone's emotional satisfaction from relationship activities.

(I feel less confused now. I'll have to think about this some more.)

Virge

"It's fairly clear that most people do in fact put greater weight on the utility of their family and friends than on that of strangers. I believe that is perfectly ethical and moral but it conflicts with a conception of utilitarianism that requires equal weights for all humans. If weights are not equal then utility is not universal and so utilitarianism does not provide a unique 'right' answer in the face of any ethical dilemma and so seems to me to be of limited value."

If you choose to reject any system that doesn't provide a "unique 'right' answer", then you're going to reject every system so far devised. Have you read Greene's The Terrible, Horrible, No Good, Very Bad Truth about Morality and What to Do About it?

However, I agree with you that any form of utilitarianism that has to have different weights when applied by different people is highly problematic. So we're left with:

  • Pure selfless utilitarianism conflicts with our natural intuitions about morality when our friends and relatives are involved.

  • Untrained intuitive morality results in favoring humans unequally based on relationships and will appear unfair from a 3rd party viewpoint.

You can train yourself to some extent to find a utilitarian position more intuitive. If you work with just about any consistent system for long enough, it'll start to feel more natural. I doubt that anyone who has any social or familial connections can be a perfect utilitarian all the time: there are always times when family or friends take priority over the rest of the world.

Virge

Your intuitions will be biased toward favoring a sibling over a stranger. Evolution has seen to that through kin selection.

Utilitarianism tries to maximize utility for all, regardless of relatedness. Even if you adjust the weightings for individuals based on the likelihood that particular individuals will have a greater impact on overall utility, you don't (in general) get weightings that match your intuitions.

I think it is unreasonable to expect your moral intuitions to ever approximate utilitarianism (or vice versa) unless you are making moral decisions about people you don't know at all.

In reality, the money I spend on my two cats could be spent improving the happiness of many humans - humans that I don't know at all who are living a long way away from me. Clearly I don't apply utilitarianism to my moral decision to keep pets. I am still confused about how much I should let utilitarianism shift my emotionally-based lifestyle decisions.
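A minimal sketch of the mismatch, using made-up utilities and closeness weights purely for illustration (none of these numbers come from the discussion above):

```python
# Toy comparison of an impartial utilitarian sum with a closeness-weighted
# "intuitive" sum. Values are illustrative assumptions only.

# Utility produced for each party by the same discretionary spending.
utilities = {"my_two_cats": 5, "distant_strangers": 50}

# Closeness weights of the kind kin selection plausibly builds into our intuitions.
closeness = {"my_two_cats": 1.0, "distant_strangers": 0.05}

def impartial_score(party):
    """Classic utilitarian aggregation: everyone's utility counts equally."""
    return utilities[party]

def intuitive_score(party):
    """What the gut seems to maximize: utility discounted by social distance."""
    return closeness[party] * utilities[party]

print(max(utilities, key=impartial_score))  # distant_strangers
print(max(utilities, key=intuitive_score))  # my_two_cats
```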

Virge

I've noticed strong female representation (where I least expected to find it) in The Skeptic Zone, an Australian skeptics group. The feeling I get from that community (even just as a podcast lurker) is that it's much more lighthearted than LW/OB. Whether that makes any difference to sex ratios, I don't know.

For most of the time I've listened to the podcast, there have been regular, strong contributions from women. My gut feeling would have been that having good female role models would encourage more female participation; however, I just did a quick eyeballing of the Skeptic Zone's Facebook fans and it looks to be about 5:1 biased towards males.

Virge

Melbourne, Australia

Virge

"Or what thoughts do you have regarding Michael Vassar's suggestion to practice lying?"

(Reusing an old joke) Q: What's the difference between a creationist preacher and a rationalist? A: The rationalist knows when he's lying.

I'm having trouble resolving 2a and 3b.

2a. Hyper-vigilant honesty. Take care never to say anything but what is best supported by the evidence, aloud or to yourself, lest you come to believe it.

3b. Build emotional comfort with lying, so you won’t be tempted to rationalize your last week’s false claim, or your next week’s political convenience. Perhaps follow Michael Vassar’s suggestion to lie on purpose in some unimportant contexts.

I find myself rejecting 3b as a useful practice because:

  • What I think will be an unimportant and undetectable lie has a finite probability of being detected and considered important by someone whose confidence I value. See Entangled Truths, Contagious Lies.

  • This post points out the dangers of self-delusion from motivated small lies e.g. "if I hang out with a bunch of Green Sky-ers, and I make small remarks that accord with the Green Sky position so that they’ll like me, I’m liable to end up a Green Sky-er myself." Is there any evidence to show that I'll be safer from my own lies if I deliberately tag them at the time I tell them?

  • Building rationalism as a movement to improve humanity doesn't need to be encumbered by accusations that the movement encourages dishonesty. Even though one might justify the practice of telling unimportant lies as a means to prevent a larger more problematic bias, advocating lies at any level is begging to be quote-mined and portrayed as fundamentally immoral.

  • The justification for 3b ("so you won’t be tempted to rationalize your last week’s false claim, or your next week’s political convenience.") doesn't work for me. I don't know if I'm different, but I find that I have far more respect for people (particularly politicians) who admit they were wrong.

Rather than practising being emotionally comfortable lying, I'd rather practise being comfortable with acknowledging fallibility.
