Comment author: Alicorn 30 May 2009 12:18:16AM 5 points [-]

It seems that in this article, Robin is co-defining "opinion" with "belief". This isn't, exactly, incorrect, but I don't think it maps completely onto the common use, which may be causing misunderstanding. If I say "it's my opinion that [insert factual proposition here]", then Robin's remarks certainly apply. But if it's my opinion that chocolate chip cookie dough ice cream is delicious - which is certainly a way people often use the word "opinion" - then in what way might I not be entitled to that? Unless I turn out to be mistaken in my use of the term "chocolate chip cookie dough ice cream", or something, but assume I'm not.

Comment author: alvarojabril 30 May 2009 12:26:31AM 1 point [-]

Thank you. An opinion is a thought. What does it mean to say that you are not entitled to a thought?

Comment author: Eliezer_Yudkowsky 29 May 2009 08:42:13PM 5 points [-]

I think this is a good distinction, and anyone somehow trying to shift social norms (perhaps within a subcommunity) might be well-advised to shift the norms in order: First, teach people that others have a right to criticize their opinion; then, teach them that they have no right to an opinion.

Comment author: alvarojabril 29 May 2009 11:46:24PM 3 points [-]

"teach them that they have no right to an opinion."

I know people throw the term around (I try not to), but this is maybe the most fascist thing I've seen on this board. They have no right to an opinion? You might want to rephrase this, as many of my opinions are somewhat involuntary.

Comment author: Lightwave 29 May 2009 10:36:03PM *  0 points [-]

Another fun read: Civilization-Level Quantum Suicide

Quantum Suicide Reality Editing

If you accept the idea of quantum suicide then you should be open to the idea of using it for editing reality. You could construct some system that monitored events for you and would immediately cause you to cease to exist if events did not happen as you wanted them. The idea would be that you would continue to exist only in those future worlds in which events happened as desired, so that from your point of view, events would always happen as you wanted. You would be using quantum suicide to control your reality.

Quantum Suicide Computing

Suppose you had some computing problem which would take a long time to solve, but you have some way of checking possible answers. You could set up some system which uses quantum events to generate a random answer to the computation and then automatically causes you to cease to exist if the answer is not the correct answer, or if it is not better, in some sense, than the previous answer that you obtained. The idea would be that future worlds would exist in which all possible answers were generated and you would only exist in those worlds where the answer was correct or better than previously generated answers, thereby giving you the perception of having enormous computing power.
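The scheme amounts to guess-and-filter: generate a random candidate in every branch, then discard the branches where the check fails. A toy classical simulation (purely illustrative; the function name, branch count, and example problem are all made up for this sketch) might look like:

```python
import random

def quantum_suicide_search(candidates, is_correct, branches=10_000):
    """Toy simulation of 'quantum suicide computing'.

    Each branch guesses one candidate at random; branches whose guess
    fails the check 'cease to exist'.  Every surviving branch holds a
    correct answer, so from the inside the search appears instantaneous.
    """
    survivors = []
    for _ in range(branches):
        guess = random.choice(candidates)  # stand-in for the quantum-random guess
        if is_correct(guess):              # the automatic survival check
            survivors.append(guess)        # this branch continues to exist
    return survivors

# Example: find a nontrivial factor of 91 by pure guessing.
candidates = list(range(2, 91))
survivors = quantum_suicide_search(candidates, lambda d: 91 % d == 0)
print(set(survivors))  # every surviving branch holds a correct factor (7 or 13)
```

Of course, a classical loop pays the full cost of simulating the dead branches; the anthropic trick only "works" if you grant that the observer's measure is confined to the surviving worlds.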

Comment author: alvarojabril 29 May 2009 10:51:24PM *  1 point [-]

Could be a pretty wild dystopia for the people who aren't hooked up - elites constantly disappearing and the clocks are all wrong. Come to think of it, did I say DYStopia?

Comment author: AdeleneDawner 29 May 2009 10:24:08PM 0 points [-]

Quote, please?

Most of what he said condenses to "people who are not practicing rationality are irrational", which is only an insult if you consider 'irrational' to be an insult, which I didn't see any evidence of. I saw frustration at the difficulty in dealing with them without social awkwardness, but that's not the same.

Have I missed something?

Comment author: alvarojabril 29 May 2009 10:38:13PM *  0 points [-]

Yes, and most of what I said reduces to "Annoyance is not practicing rationality with statements like 'social cohesion is one of the enemies of rationality.'" You said you had a "problem" with my contention and then I pointed out that Annoyance had made a qualitatively similar claim that hadn't bothered you. Aside from our apparent disagreement on the point I don't get how my claim could be a problem for you.

I think I've made myself clear and this is getting tiresome so I'll invite you to have the last word.

Comment author: AdeleneDawner 29 May 2009 10:11:34PM 0 points [-]

I don't at all disagree that the skills are good to learn, especially if you're going to be focusing on tasks that involve dealing with non-rationalists. I think it may be a bit of an overgeneralization to say that they should be a high priority for everyone, but probably not much of one.

I do have a problem with judging people for not having already mastered those skills, or for having higher priorities than tackling those skills immediately with all their energy, though, which seems to be what you're doing. Am I inferring too much when I come to that conclusion?

Comment author: alvarojabril 29 May 2009 10:17:16PM 0 points [-]

Look, this whole thread started because of Annoyance's judgment of people who have higher priorities than rationality, right? Did you have a problem with that?

All I'm saying is that this community in general gives way too short shrift to the utility of social cohesion. Sorry if that bothers you.

Comment author: Annoyance 29 May 2009 09:51:51PM -2 points [-]

Social cohesion is one of the enemies of rationality.

It's not necessarily so in that it's not always opposed to rationality, but it is incompatible with the mechanisms that bring rationality about and permit it to error-correct. It tends to reinforce error. When it happens to reinforce correctness, it's not needed, and when it doesn't, it makes it significantly harder to correct the errors.

Comment author: alvarojabril 29 May 2009 10:00:17PM *  0 points [-]

"When it happens to reinforce correctness, it's not needed"

Can you elaborate?

I'll note that rationality isn't an end. My ideal world state would involve a healthy serving of both rationality and social cohesion. There are many situations in which these forces work in tandem and many where they're at odds.

A perfect example is this site. There are rules the community follows to maintain a certain level of social cohesion, which in turn aids us in the pursuit of rationality. Or are the rules not needed?

Comment author: AdeleneDawner 29 May 2009 09:17:31PM 2 points [-]

I thought I would get away with that simplification. Heh.

Those skills do come naturally to some people, but not everyone. They certainly don't come naturally to me. Even if I'm in a social group with rules that allow me to notice that a faux pas has occurred (not all do; some groups consider it normal to obscure such things to the point where I'll find out weeks or months later, if at all), it's still not usually obvious what I did wrong or what else I could do instead, and I have to intentionally sit down and come up with theories that I may or may not even have a chance to test.

Comment author: alvarojabril 29 May 2009 09:58:42PM 1 point [-]

Right, I get that people fare differently when it comes to this stuff, but I do think it's a matter of practice and attention more than innate ability (for most people). And this is really my point, that the sort of monastic rationality frequently espoused on these boards can have politically antirational effects. It's way easier to influence others if you first establish a decent rapport with them.

Comment author: AdeleneDawner 29 May 2009 08:10:43PM 5 points [-]

I think you're underestimating the degree of social intelligence required. To pull that off while still keeping the rationalistic habits that such people find offensive, you'd have to:

  • Recognize the problem, which is nontrivial,
  • Find a way of figuring out who falls on which side of the line, without tipping people off,
  • Determine all of the rationalistic habits that are likely to offend people who are not trying to become more rational,
  • Find non-offensive ways of achieving those goals, or find ways of avoiding those situations entirely,
  • Find a way not to slip up in conversation and apply the habits anyway - again, nontrivial. Keeping this degree of focus in realtime is hard.

You'd also probably have to integrate, at least to some degree, the idea that it's 'okay' (not correct, just acceptable) to be irrational into your general thought process, to avoid unintentionally signaling that you think poorly of them. If anything, irrational people are more likely to notice such subtle signals, since so much of their communication is based on them.

Comment author: alvarojabril 29 May 2009 09:08:09PM 0 points [-]

The problems you cite in bullets are only nontrivial if you don't sufficiently value social cohesion. My biggest faux pas have sufficiently conditioned me to make them less often because I put a high premium on that cohesion. So I think it's less a question of social intelligence and more one of priorities. I don't have to keep "constant focus" - after a few faux pas it becomes plainly apparent which subjects are controversial and which aren't, and when we do come around to touchy ones I watch myself a little more.

Comment author: JoeShipley 29 May 2009 08:20:22PM *  1 point [-]

I agree completely. If the solutions cannot outpace the intelligence-generated problems, total destruction awaits.

I apologize if the stupid pill characterization feels wrong, I just was trying to think of a viable alternative to increasing intelligence.

Comment author: alvarojabril 29 May 2009 08:32:59PM 0 points [-]

I'm glad we've hashed this out. I think that bias about the messianic/apocalyptic role of technology has largely been overlooked on this site, so I was glad to see this entry of Eliezer's.

Regardless of whether or not they're true I tend to think that arguments about the arc of history etc are profoundly counterproductive. People won't vote if they think it's a landslide, either for their guy or against. And I suspect I differ from others on this site in this respect, but I find it hard to get ginned up about cosmic endeavors, simply because they seem so remote from my experience.

And I don't think we need an alternative! What I was trying to point out from the start was that increasing our predictive ability is necessary but not sufficient to save the world. Entirely selfish, entirely rational actors will doom the planet if we let them.

Comment author: JoeShipley 29 May 2009 08:00:58PM 2 points [-]

I think those problems weren't caused by too much intelligence, but by too little. I know, intelligence enables these problems to form in the first place -- These entities wouldn't be making the problems if they weren't volitional agents with intelligence, but that seems like a kind of cop-out complaint -- Without intelligence there wouldn't be any problems, sure, but there also wouldn't be anything positive either, no concepts whatsoever.

Pollution is a great example: It's intelligent thought that allowed us to start making machines that polluted. Intelligence allowed us to realize we could trade the well-being of the environment for money by trashing it.

More intelligence realizes that this is still a value trade off, that you aren't getting something for nothing -- Depending on the rate at which you do this, you could seriously damage yourself and the people around you for the trade-off. You have to weigh the costs with the benefits, and if the benefit is 'some money' and the cost is 'destroying the world', the intelligent choice becomes clear. To continue to act for the money isn't intelligence, it's just insanity, overpowering greed.

The cuban missile crisis may have been caused by intelligence building the structures that led up to it, but the solution wasn't making everyone dumber so they couldn't build that kind of thing -- that just reduces overall utility. The solution is to act intelligently in ways that don't destroy the world.

I see your point about moral intelligence being considered separately though, I hadn't thought of that in the context. It's a more elegant package to wrap everything up together, but not always the right thing to do... Thanks for the reply.

Comment author: alvarojabril 29 May 2009 08:16:41PM *  0 points [-]

So you agree that yes, intelligence is continually generating "extra problems" for us to deal with. As you point out, many of the most pressing problems in the modern world are unforeseen consequences of useful technologies. You just believe that increases in human intelligence will invariably outpace the destructive power of the problems, whereas I don't.

The premise of this diary was many earths, so I'd submit that certainly there are many earths for which the problem of nuclear warfare outpaced humanity's capacity to intelligently deal with it, and that in the end we could very well share their fate.

I'll also note that I fail to see how anyone could conclude from what I've written above that my prescription for humanity is stupid pills.
