Comment author: Dagon 25 July 2016 04:08:52PM 2 points [-]

I think you're focusing too much on the label "rational", and not enough on the actual effect of beliefs.

I'll admit I'm closer to logical positivism than is Eliezer, but even if you make the argument (which you haven't) that the model of the universe is simpler (in the Kolmogorov complexity sense) by believing Adam killed Abel, it's still not important. Unless you're making predictions and taking actions based on a belief (or on beliefs influenced by that belief), it's neither rational nor irrational, it's irrelevant.

Now, a somewhat more complicated example, where Eve has to judge Cain's likelihood of murdering her, and thinks the circumstances of the locked room in the past are relevant to her future, there are definite predictions she should be making. Her confidence in Adam's innocence implies Cain's guilt, and she should be concerned.

It's still the case that she cannot possibly have enough evidence for her confidence to be 1.00.
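Dagon's point about confidence of 1.00 has a crisp Bayesian form: a prior of exactly 1 (or 0) can never be moved by any evidence, however strong. A minimal sketch (the likelihood numbers here are invented purely for illustration):

```python
def update(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: posterior P(H | E) from a prior and two likelihoods."""
    numerator = prior * p_e_given_h
    return numerator / (numerator + (1 - prior) * p_e_given_not_h)

# Evidence 100x more likely if Adam is guilty than if innocent...
# ...moves a 0.99 prior in his innocence substantially:
print(update(0.99, 0.01, 1.0))  # ~0.497
# ...but cannot budge a prior of exactly 1.0:
print(update(1.0, 0.01, 1.0))   # 1.0
```

This is why assigning probability 1.00 (or 0.0) forfeits the ability to update at all.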

Comment author: Arielgenesis 25 July 2016 05:22:56PM 1 point [-]

Thank you, that was a very nice extension to the story. I should have included such a scenario to make her belief relevant. I agree with you: assigning 100% probability is irrational in her case. But if she is not rationally literate enough to express herself in a fuzzy, non-binary way, I think she could maintain rationality by saying, "Ceteris paribus, I prefer not to be locked in the same room with Cain, because I believe he is a murderer, because I believe Adam was innocent" (ignoring the ad hominem).

I was under the impression that the gold standard for rationality is falsifiability. However, I now understand that Eve is rational despite unfalsifiability, because she remained Bayesian.

Comment author: Dagon 25 July 2016 03:11:36AM 1 point [-]

This belief pays no rent. It's unfalsifiable precisely because it's irrelevant - there is no prediction that Eve can make which would give different outcomes based on Adam's past behavior. The belief just doesn't matter.

Separately, if she assigns 0.0 probability to anything, she's probably not actually as rational as she claims.

Comment author: Arielgenesis 25 July 2016 04:01:22AM 1 point [-]

What if we take one step back, and Adam didn't die? Eve claims that her belief pays rent because it could be falsified if Adam's character changed. In this scenario, I suppose you would agree that Eve is still rational.

Now, I cannot formulate my argument properly at the moment, but I think it is weird that Adam's death makes Eve's belief irrational, as per:

So I do not believe a spaceship blips out of existence when it crosses the cosmological horizon of our expanding universe, even though the spaceship's existence has no further experimental consequences for me.

http://lesswrong.com/lw/ss/no_logical_positivist_i/

Comment author: ike 25 July 2016 02:42:10AM 1 point [-]

Can believing an unfalsifiable belief be rational?

Sure, see http://lesswrong.com/lw/ss/no_logical_positivist_i/

Comment author: Arielgenesis 25 July 2016 03:57:41AM 0 points [-]

Thank you for the link. I just read the article. It is exactly what I had in mind, but my mind works better with narrative.

What I am wondering is if a theist could use this as a foundation of their arguments and remain rational.

A rational unfalsifiable belief

1 Arielgenesis 25 July 2016 02:15AM

I'm trying to argue that it is possible for someone rational to hold an unfalsifiable belief and remain rational.

There were three people in a room: Adam, Cain, and Abel. Abel was murdered. Adam and Cain were taken into police custody. The investigation was thorough but remained inconclusive; the technology was not advanced enough to produce conclusive evidence. The arguments were basically "you did it" versus "no, you did it."

Adam has a wife; her name is Eve. Eve believed that Adam was innocent. She believed so because she had known Adam very well, and the Adam she knew would not commit murder. She used Adam's character and her personal relationship with him as evidence.

Cain, trying to defend himself, asked Eve, "What would it take for you to change your belief?" She replied, "Show me a video recording; then I would believe." But there was no video recording. Then she said, "Show me any other evidence that is as strong as a video recording." But there was no such evidence either.

Cain pointed out, "The evidence you use for your belief is your personal relationship with him and his character. So if there were evidence against his character, would you change your mind?"

After some thinking and reflection, she finally said, "Yes, if it could be proven that I have been deceived all these years, then I would believe otherwise."

All of Adam's artifacts were gathered and analysed. The search was so thorough that no new evidence about what Adam had done before his custody could ever be presented in the future. Everything pointed to Adam's good character.

Eve was happy. Cain was not. Then he took it one step further. He proposed, "Eve, people can change. If Adam changes in the future into a man of bad character, would you be convinced that he could have been the murderer?"

"Yes, if Adam changed, then I would believe that it is possible for Adam to be the murderer," Eve said.

Unfortunately, Adam died the next day. Cain said to Eve, "How do you propose that your belief about Adam's innocence be falsified now?"

"It cannot be falsified now." Eve replied. 

"Then you must be irrational."

  • Is Eve irrational?
  • Can believing an unfalsifiable belief be rational?
  • Can this argument be extended to belief in God?
 
Comment author: ignoranceprior 24 July 2016 07:27:10PM *  2 points [-]

You need at least 10 karma points to vote (you currently have 2 points, according to your profile). Once you have 10 points you should be able to see the voting buttons. Incidentally, after a troll downvoted me from 12 to 4, I lost the ability to vote, and now I can no longer see the buttons.

Comment author: Arielgenesis 25 July 2016 02:14:15AM 1 point [-]

Thank you, that is very helpful. I wish this were stated in the FAQ, though I may have missed it. I would have upvoted you if I could.

Comment author: Arielgenesis 24 July 2016 07:05:16PM 0 points [-]

Hi, I have a silly question. How do I vote? It seems obvious, but I cannot see any upvote or downvote button anywhere on this page. I have tried:

  1. looking at the top of the comment. Next to OP/TS is date, and then time, and then the points. At the far right is the 'minimize'
  2. looking at the bottom of the comment. I see Parent, Edit, Permalink, get notification
  3. Reading the FAQ, which says "you can vote submissions and comments up or down just like you can on Reddit", but I cannot find the vote buttons anywhere near comments or posts.
Comment author: ChristianKl 18 July 2016 03:29:14PM 0 points [-]

What do you actually want to do with your life? There are careers, like politics, where personal connections gathered during university years are very important.

There are other careers such as starting a startup where personal connections with high status people might not be central and a lot of the YC founders don't have them.

Either there's some sort of self-selection, or do graduates from there have better prospects than graduates of 'University of X, YZ'?

Why "either or"?

Comment author: Arielgenesis 24 July 2016 06:52:04PM 1 point [-]

Post-high education LWers, do you think the place you studied at had a significant effect on your future prospects?

I went to Melbourne University and did an exchange program to UCSD, so I have a comparison. I think the distribution of teaching quality is sufficiently narrow that it should not be a major factor.

There are careers like politics where personal connection that are gathered during university years are very important.

Depending on the job and your part of the world, personal connections might be a very important factor in career success. And you are more likely to gain more, and better, personal connections at a better university.

Comment author: Arielgenesis 24 July 2016 06:33:41PM 1 point [-]

I bought a $1400 mattress in my quest for sleep, over the Internet hence much cheaper than the mattress I tried in the store, but non-returnable. When the new mattress didn’t seem to work too well once I actually tried sleeping nights on it, this was making me reluctant to spend even more money trying another mattress. I reminded myself that the $1400 was a sunk cost rather than a future consequence, and didn’t change the importance and scope of future better sleep at stake (occurring once per day and a large effect size each day).

from http://rationality.org/checklist/

Is it rational for someone to choose NOT to buy another mattress, not because of the sunk cost, but in order to "punish" oneself (carrot-and-stick style) into never again buying non-returnable, expensive things, or at least into being more careful when buying expensive things?

Comment author: benjaminhaley 10 April 2015 10:53:32PM *  29 points [-]

I've established a habit of putting my money where my mouth is to encourage myself to make more firm predictions. When I am talking with someone and we disagree, I ask if they want to bet a dollar on it. For example, I say, "Roger Ebert directed Beyond the Valley of the Dolls". My wife says, "No, he wrote it.". Then I offer to bet. We look up the answer, and I give her a dollar.

This is a good habit for many reasons.

  1. It is fun to bet. Fun to win. And (kinda) fun to lose.
  2. It forces people to evaluate honestly. The same people that say "I'm sure..." will back off their point when asked to bet a dollar on the outcome.
  3. It forces people to negotiate to concrete terms. For example, I told a friend that a 747 burns 30,000 lbs of fuel an hour. He said no way. We finally settled on the bet "Ben thinks that a fully loaded 747 will burn more than 10,000 lbs of fuel per hour under standard cruising conditions". (I won that bet, it burns ~25,000 lbs of fuel/hour under these conditions).
  4. A dollar feels more important than it actually is, so people treat the bets seriously even though they are not very serious. For this reason, I think it is important to actually exchange a dollar bill at the end, rather than treating it as just an abstract dollar.

I've learned a lot from this habit.

  1. I'm right more often than not (~75%). But I'm wrong a lot too (~25%). This is more wrong than I feel. I feel 95% confident. I shouldn't be so confident.
  2. The person proposing the bet is usually right. My wife has gotten in the habit too. If I propose we bet, I'm usually right. If she proposes we bet I've learned to usually back down.
  3. People frequently overstate their confidence. I mentioned this above, but it bears repeating. Some people regularly will use phrases like "I am sure" or say something emphatically. People are calibrated to express their beliefs differently. But when you ask them to bet a dollar you get a more consistent calibration. People that are over-confident often back away from their positions. Really interesting considering that its only a dollar on the line.
  4. Over time people learn to calibrate better. At first my wife would agree to nearly every bet I proposed. Now she usually doesn't want to. When she agrees to a bet now, I get worried.
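The gap between the ~75% hit rate and a felt 95% confidence is straightforward to track over time. A minimal sketch of such a bet log (the bet records below are invented for illustration, loosely echoing the examples above):

```python
# Hypothetical log of dollar bets: (claim, won?)
bets = [
    ("Roger Ebert directed Beyond the Valley of the Dolls", False),
    ("A loaded 747 burns >10,000 lbs of fuel/hour at cruise", True),
    ("The capital of Australia is Canberra", True),
    ("Our flight leaves before noon", True),
]

felt_confidence = 0.95
hit_rate = sum(won for _, won in bets) / len(bets)  # booleans sum as 0/1
print(f"hit rate {hit_rate:.0%} vs. felt confidence {felt_confidence:.0%}")
```

With enough logged bets, a persistent gap between the two numbers is direct evidence of overconfidence, which is exactly what the dollar bets surface informally.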
Comment author: Arielgenesis 24 July 2016 05:51:28PM 1 point [-]

A dollar feels more important than it actually is, so people treat the bets seriously even though they are not very serious.

Although the dollar carries some weight, I think there is another reason why people take it more seriously. People adjust their beliefs according to what other people believe and how confident those people are. Therefore, when you propose a bet, even for only a dollar, you are signalling high confidence, and this decreases their confidence. As a result, system 2 kicks in and they will be "forced to evaluate honestly".

Comment author: Arielgenesis 24 July 2016 05:45:37PM 0 points [-]

To the best of my knowledge, the human brain is a simulation machine. It unconsciously makes predictions about what sensory input to expect. This includes higher-level input, like language and even concepts. This is the basic mechanism underlying surprise and similar emotions. Moreover, it only simulates the things it cares about and filters out the rest.

Given this, I would think that most of your predictions are redundant, because we are already making them unconsciously. Examples:

  1. You predict you will finish the task one week early, but you end up finishing one day early. You are not surprised. If you had finished one day late, however, you would be surprised. When people are surprised by the same trigger often enough, most normal people, I presume, will update their beliefs. I know this is related to the planning fallacy, but I think my argument still holds water.

  2. You post something on Facebook without making any conscious prediction about the audience's reaction. You get one million likes. I bet you will be surprised and left scratching your head about why and how you got such a reaction.

Otherwise, I still see some value in what you are doing, but not because of prediction per se; rather, because it effectively mitigates bias. For example, "Predict how well you understand someone's position by trying to paraphrase it back to him" addresses the illusion of transparency. But I think there is not much more value in making the prediction than in simply making a habit of paraphrasing more often.

Making conscious predictions, on top of the unconscious ones, is cognitively costly. I do think doing so might improve one's calibration and accuracy beyond the improvement provided by the surprise mechanism alone. The question, however, is whether that improvement is worth the extra cognitive cost.
