John_Maxwell_IV comments on Welcome to Less Wrong! (July 2012) - Less Wrong

Post author: ciphergoth 18 July 2012 05:24PM


Comment author: findis 26 December 2012 06:20:13AM * 9 points

Hi, I'm Liz.

I'm a senior at a college in the US, soon to graduate with a double major in physics and economics, and then (hopefully) pursue a PhD in economics. I like computer science and math too. I'm hoping to do research in economic development, but of more relevance to LW, I'm pretty interested in behavioral economics and in econometrics (statistics). Of the uncommon beliefs I hold, the one that most affects my life is that since I can greatly help others at a small cost to myself, I should; I donate whatever extra money I have to charity, though it isn't much. (See givingwhatwecan.org.)

I think I started behaving as a rationalist (without that word) when I became an atheist near the end of high school. But to rewind...

I was raised Christian, but Christianity was always more of a miserable duty than a comfort to me. I disliked the music and the long services and the awkward social interactions. I became an atheist for no good reason at the beginning of high school, but being an atheist was terrible. There was no one to forgive me when I screwed up, or to pray to when the world was unbearably awful. My lack of faith made my father sad. Then, lying in bed one night and angsting about free will, I had some philosophical revelation, and it seemed that God must exist. I couldn't re-explain the revelation to myself, but I clung to the result and became seriously religious for the next year or so. But objections to the major strands of theism began to creep up on me. I wanted to believe in God, and I wanted to know the truth, and I found out that (surprise) wanting to hold an ideal set of beliefs isn't compatible with seeking truth. I did lots of reading (mostly old-school philosophy), slowly changed my mind, then came out as an atheist (to close friends only) once the Bible Quiz season was over. (awk.)

At that point I decided never to lie to myself again: not just to avoid comforting half-truths, but to actively question every belief I held, and to act on whatever conclusions I came to. After hard practice, unrelenting honesty toward myself is a habit I can't break, but I'm not sure it's actually a good policy. For example, a few white lies would've helped me move past a situation of extreme guilt last year.

Anyway, more recently, I read HPMOR and I'm now reading Kahneman's Thinking, Fast and Slow. I'm slowly working through the Sequences too. I always appreciate new reading recommendations.


I have some thoughts on Newcomb's Paradox. (Of course I am new to this, probably way off base, etc.) I think taking two boxes is the right way to go, and it seems that the intuition towards one-boxing often comes from the idea that your decision somehow changes the contents of the boxes. (No reverse causality is supposed to be assumed, right?) Say that instead of an infallible superintelligence, the story changes to:

"You go to visit your friend Ann, and her mom pulls you into the kitchen, where two boxes are sitting on a table. She tells you that box A has either $1 billion or $0, and box B has $1,000. She says you can take both boxes or just A, and that if she predicted you take box B she didn't put anything in A. She has done this to 100 of Anne's friends and has only been wrong for one of them. She is a great predictor because she has been spying on your philosophy class and reading your essays."

Terribly small sample size, but a friend told me this version changes his answer from one box to two. As far as I can tell, the changes are purely aesthetic; they make the story clearer without changing the philosophy.
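
For concreteness, here is the arithmetic in the Ann's-mom version: a minimal sketch, assuming her 99/100 track record generalizes to your own choice, with the dollar amounts taken from the story above.

    # Expected payoffs in the Ann's-mom version of Newcomb's problem.
    # Assumes the 99/100 track record generalizes to your own choice;
    # the dollar amounts come from the story above.

    BIG = 1_000_000_000  # possible contents of box A
    SMALL = 1_000        # guaranteed contents of box B
    ACCURACY = 0.99      # P(her prediction matches your actual choice)

    # Evidential expected value: treat your choice as evidence about
    # what she predicted, and hence about what is in box A.
    ev_one_box = ACCURACY * BIG                # A is full 99% of the time
    ev_two_box = (1 - ACCURACY) * BIG + SMALL  # A is full only when she erred

    # Causal expected value: the boxes are already filled, and your choice
    # cannot change them, so for any fixed probability p that A is full,
    # two-boxing comes out ahead by exactly SMALL.
    p = 0.5  # arbitrary belief about the already-fixed contents of A
    cev_one_box = p * BIG
    cev_two_box = p * BIG + SMALL

    print(f"evidential: one-box {ev_one_box:>13,.0f}  two-box {ev_two_box:>13,.0f}")
    print(f"causal:     one-box {cev_one_box:>13,.0f}  two-box {cev_two_box:>13,.0f}")

The arithmetic itself isn't in dispute; the disagreement between one-boxers and two-boxers is over which of those two expectations a rational agent should maximize.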


And, a question. Why is Bayes so central to this site? I use Bayesian reasoning regularly, but I learned Bayes' Theorem around the time I started thinking seriously about anything, so I'm not clear on what the alternative is. Why do y'all celebrate Bayes, rather than algebra or well-designed experiments?

Edit: Read further in Thinking, Fast and Slow; question answered.
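
For other newcomers with the same question, here is a minimal sketch of what Bayes' Theorem buys you in practice; the medical-test numbers are purely illustrative.

    # Bayes' Theorem: P(H|E) = P(E|H) * P(H) / P(E).
    # Illustrative numbers: a disease with 1% prevalence, and a test that
    # is 99% sensitive (detects disease) and 95% specific (rules it out).

    prior = 0.01        # P(disease)
    sensitivity = 0.99  # P(positive | disease)
    specificity = 0.95  # P(negative | no disease)

    # Law of total probability: P(positive) summed over both hypotheses.
    p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)

    # Posterior: P(disease | positive).
    posterior = sensitivity * prior / p_positive
    print(f"P(disease | positive) = {posterior:.3f}")  # ~0.167, not 0.99

System 1, in Kahneman's terms, wants to answer 99%; conditioning on the 1% base rate drags the answer down to about 17%. That gap between intuition and the theorem is arguably a large part of why Bayes gets celebrated here.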

Comment author: John_Maxwell_IV 12 January 2013 08:48:19AM * 2 points

Welcome to LW.

Also not an expert on Newcomb's Problem, but I'm a one-boxer because I choose to have part of my brain say that I'm a one-boxer, and to have that part of my brain influence my behavior if I get into a Newcomb-like situation. Does that make any sense? Basically, I'm choosing to modify my decision algorithm so that I no longer maximize expected value, because I think having this other algorithm will get me better results.
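
One toy way to picture that self-modification, a sketch of my own rather than a claim about any particular decision theory: if the predictor reads your policy before filling the boxes, then the policy you precommit to, not your last-minute deliberation, determines the payoff.

    # Toy model of "modifying your decision algorithm" (illustrative only).
    # The predictor inspects the agent's policy before filling box A, so
    # the precommitted policy, not the moment of choice, fixes the payoff.

    BIG, SMALL = 1_000_000_000, 1_000

    def payoff(policy: str) -> int:
        """Predictor reads the policy, fills box A accordingly, and the
        agent then acts on that same policy."""
        box_a = BIG if policy == "one-box" else 0  # her prediction == the policy
        box_b = SMALL if policy == "two-box" else 0
        return box_a + box_b

    # Keeping the naive take-everything algorithm means being predicted
    # to two-box, and so walking away with only the small box.
    print(payoff("two-box"))  # 1000
    # Precommitting to one-boxing means being predicted to one-box.
    print(payoff("one-box"))  # 1000000000

Under that assumption, the agent that swaps in the one-boxing policy ends up richer, even though at the moment of choice two-boxing still looks dominant.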