How's that related?
I actually wrote it
Oh, very good! I wonder why I thought it was Eliezer. I see that he endorsed the idea, anyway. But I think my objection to it still stands (and is closely related to the one I expressed two comments upthread here).
Isn't that what rationality is supposed to reduce?
Inter alia, yes. But the step from "rationality is supposed to reduce X" to "I will act as if X has been reduced to negligibility" is not a valid one.
Well, isn't that a good technique to reduce X? Obviously not in all cases, but I think it's a valid technique in the cases we're talking about.
Isn't that what rationality is supposed to reduce?
No, rationality is about winning. Having certain values isn't irrational.
If you value your belief that there are no ghosts, then it's irrational to be scared by ghosts.
The relationship of most of us to democracy is different. We generally do value it and think the rituals of democracy are valuable for our society.
If you value your belief that there are no ghosts, then it's irrational to be scared by ghosts.
Are you talking about "real" ghosts? You shouldn't be afraid of real ghosts because they don't exist, not because you value your belief that there are no ghosts. Why should beliefs have any value for you beyond their accuracy?
I think pwno is proposing that we do it precisely because it doesn't align with our convictions. (He might advise Trump supporters to vote for Clinton.)
I'm sure I remember reading, but can't now find, an anecdote from Eliezer back in the OB days: he was with a group of people at the Western Wall in Jerusalem, where there's this tradition of writing prayers on pieces of paper and sticking them in cracks in the wall, so as a test of the sincerity of his unbelief he wrote "I pray for my parents to die" and stuck that in the wall. Same principle.
(Personally I think it's a silly principle. Human brains aren't very good at detaching themselves from their actions, and I would only cast a vote if I were happy for my preferences to get shifted a little bit towards the candidate I was voting for.)
Funny you mention that anecdote because I actually wrote it http://lesswrong.com/lw/1l/the_mystery_of_the_haunted_rationalist/w9
Human brains aren't very good at detaching themselves from their actions
Isn't that what rationality is supposed to reduce?
The government picks arbitrary ages for when an individual has the mental capacity to make certain decisions, like drinking alcohol or having sex. But not everyone mentally matures at the same rate. It'd be nice to have an institution that allows minors with good backgrounds and who pass certain intelligence/rationality tests to be exempt from these laws.
It's really hard, that's why almost nobody knows how to do it :P.
Roughly speaking, the solution for me was to develop deep intuition in a lot of different domains, observe the features common to the intuitions in different domains, and abstract the common features out.
Finding the common features was very difficult, as there are a huge number of confounding factors that mask the underlying commonalities. But it makes sense in hindsight: we wouldn't be able to develop deep intuitions in so many different domains if there weren't subtle underlying commonalities. There were no evolutionary selective pressures specifically for the ability to develop general relativity and quantum field theory, so the fact that we can develop them means the relevant pattern-recognition abilities are closely related to the ones used in social contexts, etc.
observe the features common to the intuitions in different domains, and abstract the common features out.
Have you explicitly factored these out? If so, what are some examples?
I have no idea what Thiel is thinking of, but I'll volunteer to get a brainstorm started:
Male to female love is 70% physical attraction. Yes, love.
Edit: I guess this is related to race and gender, but I don't want to hold back one of my edgiest beliefs.
I agree
I think it's because system 1 and system 2 update differently. System 1 often needs experiential evidence in order to update, while system 2 can update using logical deduction alone. Doing a bunch of research is effective in updating system 2, but less so for system 1. I'd guess that if you continue being positive and don't experience any downside to it, then eventually your system 1 will update.
I think interviewers rely more on their intuition to evaluate candidates for managerial positions. For purely engineering positions, a longer, more systematic evaluation is needed.
Certainly, as you say, not in all cases. I don't see any very good reason to think it would be effective in this case. Apparently you do; what's that reason?
In the case of voting for Trump or writing the note in the Wailing Wall, I think there's little to no risk of having it change your prior beliefs or weaken your self-deception defense mechanisms. Both require you to be dishonest about something that clashes with so many other strong beliefs that it's highly unlikely to contaminate your belief system. The more dangerous lies are the ones that don't clash as much with your other beliefs.