I would prefer a variation of bullet point number 3:
I am here for e-rationality discussion. It's "cool" to know that deodorant is most effective when applied at night, before I go to bed, but that doesn't do anything to fundamentally change the way I think.
Epistemic rationality alone might well be enough for those of us who simply love truth (who love truth-seeking, I mean; the truth itself is usually an abomination).
What motivation is there to seek out an abomination? I read the linked comment and I disagree strongly... The curious, persistent rationalist should find the truth-seeking process rewarding, but shouldn't it be rewarding because you're working toward something wonderful? Worded another way - of what value is truth-seeking if you hold the very object you seek in contempt?
If you take the strictly c...
Eliezer, I don't know if you're familiar with the CIA's Intellipedia, but you seem to have hit the nail on the head.
The CIA has had huge success doing exactly what you describe here. You can read more about it in the paper here. The basic idea is that the intelligence community should harness the synergy of the blog/wiki combo.
From the paper:
...The Wiki and the Blog are complementary companion technologies that together form the core workspace that will allow intelligence officers to share, innovate, adapt, respond, and be—on occasion—brilliant. Blogs will
Thanks for this reference. This concept is what I was getting at during the IRC meetup. The main disagreement with Eliezer's model seems to be that he thinks the blog posts still have to hold the majority of the content, with the wiki only referencing them with very short introductions, whereas I think the wiki should grow into a thing in itself over time, converting the content of blog posts into wiki articles. Naturally, articles should be organized in a zoom-in manner, with a few-sentence summary, then a couple-paragraph introduction, and only then full-lengt...
The karma system is an integral part of the Reddit codebase that this site is built on top of. It's designed to do one thing - increase the visibility of good content - and it does that one thing very well.
I agree, though, that there is untapped potential in the karma system. Personally, I would love to see - if not by whom - at least when my comments are up- or down-voted.
Also, for it to be an unbiased comparison, the two statements - "smart cars for all" and "cryopreservation for only the people who actually died that year" - should be limited to the same domain.
If you compare different sets, one substantially larger than the other, then of course cryo is going to be cheaper!
A more balanced statement would be: "buying smart cars to save the lives of only the people who would otherwise have died in a car accident in any given year would probably cost less than cryo-surance for the same set of people."
Plus you don't die. Which, for me, is preferable.
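To make the set-matching point concrete, here's a quick sketch. Every number in it is a made-up placeholder (population, deaths, unit costs), so only the structure of the comparison matters:

```python
# Biased comparison vs. matched comparison. All figures below are
# hypothetical placeholders, not real statistics.
POPULATION = 300_000_000    # hypothetical population
ANNUAL_CAR_DEATHS = 40_000  # hypothetical car-accident deaths per year
SMART_CAR_COST = 25_000     # hypothetical cost per smart car
CRYO_COST = 30_000          # hypothetical cost per cryopreservation

# Biased: cars for everyone vs. cryo for only that year's dead.
cars_for_all = POPULATION * SMART_CAR_COST
cryo_for_dead = ANNUAL_CAR_DEATHS * CRYO_COST

# Matched: both interventions priced over the same set of people.
cars_for_dead = ANNUAL_CAR_DEATHS * SMART_CAR_COST

print(f"cars for all:      ${cars_for_all:>16,}")
print(f"cryo, matched set: ${cryo_for_dead:>16,}")
print(f"cars, matched set: ${cars_for_dead:>16,}")
```

The biased version multiplies one intervention across an enormous set and the other across a tiny one; the matched version prices both over the same people, which is the only apples-to-apples question.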
Great post.
Here's some additional reading that supports your argument:
Distract yourself. You're more honest about your actions when you can't exert the mental energy necessary to rationalize them.
And the (subconscious) desire to avoid appearing hypocritical is a huge motivator.
I've noticed this in myself often. I faithfully watched LOST through the third season, explaining to my friends who had lost interest around the first season that it was, in fact, an awesome show. And then I realized it kind of sucked.
A picture of Eliezer in monk's robes (that is you, right?), stories about Freemason-esque rituals, a specific vocabulary with terms like "the Bayesian conspiracy".
It's all tongue-in-cheek, and I enjoy it. But if you're trying not to look like a cult, then you're doing it wrong.
In the modern world, karate is unlikely to save your life. But rationality can.
The term "bayesian black-belt" has been thrown around a number of times on OB and LW... this, in my mind, seems misleading. As far as I can tell there are two ways in which bayesian reasoning can be applied directly: introspection and academia. Within those domains, sure, the metaphor makes sense... in meatspace life-and-death situations? Not so much.
"Being rational" doesn't develop your quick-twitch muscle fibers or give you a sixth sense.
Perhaps, where ...
That's not the situation in question. The scenario laid out by Vladimir_Nesov does not allow for an equal probability of getting $10000 and paying $100. Omega has already flipped the coin, and it's already been decided that I'm on the "losing" side. Combine that with the fact that my giving $100 now does not increase the chance of my getting $10000 in the future, because there is no repetition.
Perhaps there's something fundamental I'm missing here, but the linearity of events seems pretty clear. If Omega really did calculate that I would give him the...
I feel like a man in an Escher painting, with all these recursive hypothetical mes, hypothetical kuriges, and hypothetical omegas.
I'm saying, go ahead and start by imagining a situation like the one in the problem, except it's all happening in the future -- you don't yet know how the coin will land.
You would want to decide in advance that if the coin came up against you, you would cough up $100.
The ability to precommit in this way gives you an advantage. It gives you half a chance at $10000 you would not otherwise have had.
So it's a shame that in the prob...
I work on AI. In particular, on decision systems stable under self-modification. Any agent who does not give the $100 in situations like this will self-modify to give $100 in situations like this. I don't spend a whole lot of time thinking about decision theories that are unstable under reflection. QED.
There are various intuition pumps to explain the answer.
The simplest is to imagine that a moment from now, Omega walks up to you and says "I'm sorry, I would have given you $10000, except I simulated what would happen if I asked you for $100 and you refused". In that case, you would certainly wish you had been the sort of person to give up the $100.
Which means that right now, with both scenarios equally probable, you should want to be the sort of person who will give up the $100, since if you are that sort of person, there's half a chance you'll get $10000.
If you want to be the sort of person who'll do X given Y, then when Y turns up, you'd better bloody well do X.
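The "half a chance" arithmetic, as a sketch under the standard assumptions of the problem (fair coin, perfect predictor, one-shot game):

```python
# Expected value of each policy, evaluated before the coin is flipped.
# Assumes a fair coin and a perfect predictor, per the problem statement.
P_WIN = 0.5
PAYOUT = 10_000  # received if the coin favors you and you would have paid
COST = 100       # paid if the coin goes against you

ev_precommit = P_WIN * PAYOUT - (1 - P_WIN) * COST  # 4950.0
ev_refuse = 0.0  # a perfect predictor never pays a refuser

print(f"precommit: {ev_precommit}, refuse: {ev_refuse}")
```

Once the coin has already landed against you, paying is a loss in isolation; the policy only wins on average, which is exactly why it has to be chosen in advance.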
This post goes hand in hand with Crisis of Faith. Eliezer's post is all about creating an internal crisis, and your post is all about applying that to a real-world debate. Like peanut butter and jelly.
If you want to correct, and not just refute, then you cannot bring to the table evidence that can only be seen as evidence from your perspective. I.e., you cannot directly use evolution as evidence when the opposing party has no working knowledge of evolution. Likewise, a Christian cannot convince an atheist of the existence of God by talking about the wonders of ...
There is an excellent example of "priming" the mind here.
The idea is that specific prior knowledge drastically changes the way we process new information. You listen to a sine-wave modulated recording that is initially unintelligible. You then listen to the original recording. You are now primed. Listen again to the modulated recording and suddenly the previously unintelligible recording is clear as day.
I first listened to all of the samples on December 8th, when the link was posted on kottke.org. If I'm not mistaken, that means it's been exactly ...
Just did a quick search of this page and it didn't turn up... so, by far, the most memorable and referred-to post I've read on OB is Crisis of Faith.
His attribution of Orwellian doublethink to himself is far more confusing. I have no idea what to make of that. Maybe your advice in this post is on point there. But the "absolutely zero effect" quote seems unobjectionable.
From the original comment:
One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.
I don't have the origina...
I don't mean to seem like I'm picking on Kurige, but I think you have to expect a certain amount of questioning if you show up on Less Wrong and say:
One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.
I realize that my views do not agree with the large majority of those who frequent LW and OB - but I'd just like to take a moment to...
No, my experience with alone/together situations is quite different.
I usually don't laugh when I'm watching a funny movie by myself and, although I might flinch during jump scenes, I don't normally find scary movies to be all that scary when I watch them by myself.
There are hotels that tout themselves as "haunted hotels" and even bring in teams of "ghost experts" to get an official certificate proudly declaring the amount and type of "haunting" taking place at that location.
If it's known to be a joke, then sure, it's all fun a...
You have to be careful when dismissing subconscious fears as irrational. They were put there for a reason, and they may still be relevant. If I were staying in a "haunted house" in a city, where it was not isolated or abandoned or anything, I don't think it'd scare me one bit. A secluded/abandoned haunted house might be scary, and for good reasons. It would be unwise to assume that your fear is entirely irrational.
I went to a local park with some friends one night to hang out. Both I and another friend were uneasy about it, but dismissed our fears ...
If you're reading this, Kurige, you should very quickly say the above out loud, so you can notice that it seems at least slightly harder to swallow - notice the subjective difference - before you go to the trouble of rerationalizing.
There seems to be some confusion here concerning authority. I have the authority to say "I like the color green." It would not make sense for me to say "I believe I like the color green" because I have first-hand knowledge concerning my own likes and dislikes and I'm sufficiently confident in my own menta...
"I chose to believe in the existance of God - deliberately and conciously."
I cannot conceive of how it is possible to deliberately and consciously choose to believe in something.
I grew up in a religious family. I served as a missionary for my church. I married a woman of the same religion. For most of my first 28 years I believed not only that there was a God but that he had established the church of which I and my family were all members.
But once I started examining my beliefs more closely, I realized that I was engaging in the most dishonest sort of special pleading in their favor. And then it was no longer possible to continue believing.
Correct me if I'm wrong, but from a Bayesian perspective the only difference between first-hand knowledge and N-th-hand knowledge (where N>1) is the numbers. There is nothing special about first-hand.
Suppose you see a dog in the street, and formulate this knowledge to yourself. What just happened? Photons from the sun (or other light sources) hit the dog, bounced, hit your eye, initiated a chemical reaction, etc. Your knowledge is neither special nor trivial, but is a chain of physical events.
Now, what happens when your friend tells you he sees a dog? ...
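A sketch of what I mean by "only the numbers" - both cases are the same Bayesian update, just with different likelihoods. All the probabilities below are invented for illustration:

```python
def posterior(prior, p_report_if_dog, p_report_if_no_dog):
    """Bayes' rule: P(dog | report) from a prior and the two likelihoods."""
    joint_dog = prior * p_report_if_dog
    joint_no_dog = (1 - prior) * p_report_if_no_dog
    return joint_dog / (joint_dog + joint_no_dog)

PRIOR = 0.1  # made-up prior probability that a dog is in the street

# First-hand: photons -> your eye -> your brain. A fairly reliable channel.
p_seen = posterior(PRIOR, p_report_if_dog=0.99, p_report_if_no_dog=0.02)

# Second-hand: photons -> friend's eye -> friend's honesty -> your ear.
# A longer, noisier channel, but the update rule is identical.
p_told = posterior(PRIOR, p_report_if_dog=0.95, p_report_if_no_dog=0.05)

print(f"P(dog | I see it):        {p_seen:.3f}")
print(f"P(dog | friend tells me): {p_told:.3f}")
```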
Does anyone have a good model of what people in fact do, when they talk about "choosing" a particular belief? At least two possibilities come to mind:
(1) Choosing to act and speak in accordance with a particular belief.
(2) Choosing to "lie" to other parts of one's mind -- to act and speak in accordance with a particular belief internally, so that one's emotional centers, etc., get at least some of their inputs "as though" one held that belief.
Is "choosing to trust someone" any more compatible with lack of self-dec...
The scientist who says "according to our model M, the Higgs boson should exist" has, as his actual beliefs, a wider distribution of hypotheses than model M. He thinks model M could be right, but he is not sure -- his actual beliefs are that there's a certain probability of {M and Higgs bosons}, and another probability of {not M}.
Is something analogous true for your belief in God? I mean, are you saying "There's this framework I believe in, and, if it's true, then God is true... but that framework may or may not be true?"
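In symbols, the scientist's position is just the law of total probability over the model (writing M for "model M is true"):

```latex
P(\text{higgs}) = P(M)\,P(\text{higgs} \mid M) + P(\lnot M)\,P(\text{higgs} \mid \lnot M)
```

and the question is whether your belief in God has the same two-level structure, with your framework playing the role of M.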
On this we agree. If we have 60% confidence that a statement is correct, we would be misleading others if we asserted that it was true in a way that signalled a much higher confidence. Our own beliefs are evidence for others, and we should be careful not to communicate false evidence.
Stripped down to essentials, Eliezer is asking you to assert that God exists with more confidence than it sounds like you have. You are not willing to say it without weasel words because to do so would be to express more certainty than you actually have. Is that right?
I'm afraid I must disagree, kurige, for two reasons. The first is that such terms smack of false modesty, a way of insuring yourself against the social consequences of failure without actually taking care not to fail. The second is that they don't really convey any new information, and they require the use of the passive voice, which is bad style.
"Evidence indicates an increase in ice cream sales" really isn't good science writing, because the immediate question is "What evidence?". It's much better to say "ice cream sales have increased by 15%" and point to the relevant statistics.
I was once told that half of Nobel laureates were the students of other Nobel laureates. ... Even after discounting for cherry-picking of students and political pull, this suggests to me that you can learn things by apprenticeship - close supervision, free-form discussion, ongoing error correction over a long period of time - that no Nobel laureate has yet succeeded in putting into any of their many books.
What is it that the students of Nobel laureates learn, but can't put into words?
You can't put mentorship in a book. When I face a problem that may...
This I can understand.
I am a Protestant Christian, and your friend's experience with "belief" is similar to mine. Or seems to be, from what I gather in your post.
One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.
The double-think comes into play when you're faced with non-axiomatic concepts such as morality. I believe that there i...
1) You can summarize arguments voiced by EY.
2) You cannot write a book that will be published under EY's name.
3) Writing a book takes a great deal of time and effort.
You're reading into connotation a bit too much.