This premise sounds interesting, but I feel like concrete examples would really help me be sure I understand.
Oh, I've thought of another example:
Less Wrongers and other rationalists frequently get told that "rationality is nice but emotion is important too". Less Wrongers typically react to this by:
1) Mocking it as a fallacy because "rationality is defined as winning so it is not opposed to emotion", before eagerly taking it up as a strawman and posting the erroneous argument all over the place to show everyone how poor the enemies of reason are at reasoning.
Instead of:
2) Actually considering for five minutes whether or not there might be a correlation or even an inverse causal relationship between rationality and emotional control/ability to read emotions, which causes this observation in the first place.
Needless to say, I blame Yudkowsky.
This premise sounds interesting, but I feel like concrete examples would really help me be sure I understand.
Hm, okay, let me try to make it more concrete.
My main example is one where people (more than once, in fact) told me that "I might have my own truth, but other people have their truth as well". This was incredibly easy to dismiss as people being unable to tell map from territory, but after the third time I started to wonder why people were telling me this. So I asked them what made them bring it up in the first place, and they replied that they felt uncomfortable when I stated facts with the confidence they warranted. I was reminded of something Richard Dawkins said: "clarity is often seen as offensive."

I asked some other people if they felt the same way, and a helpful people-person told me that those people felt threatened by my intelligence (they were HR) and that my stating things with confidence reminded them of this. So I got the advice to phrase my statements of belief in a more friendly way. I hated this because it felt dishonest, having to use weasel words to hide the fact that I felt confident, but I could no longer deny that my current method wasn't working.
The meta-level lesson I learned was the one presented in the OP: when people give you advice or objections, they almost never say what they mean or what the actual problem is. They substitute something that sounds nice and inoffensive, which makes it easy to dismiss their advice as nonsense. So what you are supposed to do is figure out what they originally meant and draw the lesson from that instead.
Another example: My father often tells me not to be cynical, but this doesn't make much sense to me because he is very cynical himself. It turns out that what he actually means is that I should be more upbeat, or as Scott Adams would put it: "Be a huge phony." The reason my father does not state this outright is because he is following his own rule even while giving the advice: he is rephrasing "be a huge phony" as "don't be cynical", because "be a huge phony" sounds cynical.
How do you present insecurity so it ends up being read as arrogance?
This surprised me as well when I first heard it, but it's apparently a really common problem for shy people. I tend to hang back and do my own thing, and apparently some people took that to mean I felt I was too good to talk to them.
Now that I've trained myself to be more arrogant, it's become much less of an issue.
This is an extremely important lesson and I am grateful that you are trying to teach it.
In my experience it is almost impossible to actually succeed in teaching it, because you are fighting against human nature, but I appreciate it nonetheless.
(A few objections based on personal taste: Too flowery, does not get to the point fast enough, last paragraph teaches false lesson on cleverness)
Btw, I am curious whether a post like this one could be put in Main. I put it in Discussion for now because I wrote it down hastily, but I think the lesson taught is important enough for Main. Could someone tell me what I would need to change to make this Main-worthy?
Hey, where are you guys? I am terrible at finding people and I see no number I can call.
My own experience in the Netherlands showed not one specific bias, but rather multiple groups within the same university with different convictions. There was a group of people/professors who insisted that people were rational and markets efficient, and then there was the 'people are crazy and the world is mad' crowd. I actually really liked that people held these discussions; it made things much more interesting and, I think, reduced bias overall.
In terms of social issues, I never noticed much discussion. People were usually pretty open and tolerant of any ideas, as long as they weren't too extreme. The exception was the debating club, where any and all rhetorical tricks were considered fair game.
I do remember some instances where professors were fired/persecuted for professing the "wrong" beliefs, but that was a while ago now. For example, my uncle was not allowed to say that Jewish people were more likely to have diabetes and that medical students should take this into account. Also, there was a scientist who was hounded in the media for 40 years because he said that crime had a large genetic component, until recently when people suddenly went "oops looks like he was right after all, how about that".
Ooh, debiasing techniques, sounds cool. My brother and I will be attending this one. Is there any pre-reading we should do?
I'd liken it to a chemical reaction. Many reactions are multistep, and chemical processes in general take place over an extremely wide range of rates, spanning many orders of magnitude (from less than a billionth of a second to years). So in an overall reaction there are usually several steps, and the slowest one is usually orders of magnitude slower than any of the others. That one is called the rate-determining step, for obvious reasons: it is so much slower than the others that speeding up or slowing down the others, even by a couple of orders of magnitude, has a negligible effect on the overall rate of reaction. It's pretty rare for more than one step to proceed at nearly the same rate, since the range of possible orders of magnitude is so large.
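A toy numerical sketch of the rate-determining-step point (my own illustration, with made-up rates, not anyone's actual kinetics data): when step rates span many orders of magnitude, the slowest step accounts for essentially all of the total time, and speeding up every other step barely matters.

```python
# Hypothetical per-second rates for four sequential steps; note the
# rates span roughly nine orders of magnitude.
step_rates = [1e9, 1e4, 2.0, 5e6]

# Mean time per step is the reciprocal of its rate; a sequential
# process takes the sum of the step times.
step_times = [1.0 / r for r in step_rates]
total_time = sum(step_times)

slowest = max(step_times)  # the rate-determining step
print(f"total time:          {total_time:.6f} s")
print(f"slowest step alone:  {slowest:.6f} s")
print(f"slowest step's share of total: {slowest / total_time:.4%}")

# Speeding up every *other* step by 100x changes the total negligibly:
faster = [t if t == slowest else t / 100 for t in step_times]
print(f"total after 100x speedup of the fast steps: {sum(faster):.6f} s")
```

The same arithmetic is what makes "two filters of comparable size" unlikely under this analogy: it requires two step times to land within the same narrow band out of a huge range.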
I think the evolution of intelligence is a stochastic process that's pretty similar to molecular kinetics in a lot of ways, and in particular that all of the above applies to it as well. Thus it's more likely that there's one rate-determining step, one Great Filter, for the same reasons.
However (and I made another post about this here too), I do think the filters are interdependent: there are multiple pathways, and it's not a linear process but progress along a multidimensional surface. That's not really all that different from molecular kinetics either, though.
Interesting. However, I still don't see why the filter would work similarly to a chemical reaction. Unless it's a general law of statistics that any event is always far more likely to have a single primary cause, it seems like a strange assumption since they are such dissimilar things.
"Observations" are not always caused by people observing things.
The most well-known example of rationality associated with emotional control is Spock from Star Trek. And Spock is fictional. And fiction affects how people think about reality.
The point is that you don't ignore countless people saying the same thing just because you can think of a reason to dismiss them. Even if you are right and that's all it is, you'll still have sinned for not considering it.
Otherwise clever people would always find excuses to justify their existing beliefs, and then where would we be?