I'm reading Dan Ariely's book Predictably Irrational. The story of what got him interested in rationality and human biases goes something like this.
He was the victim of a really bad accident and suffered terrible burns covering ~70% of his body. The experience was incredibly painful, and so was the treatment: he'd have to bathe in some sort of disinfectant and then have the bandages ripped off his exposed flesh, which was excruciating.
The nurses believed that ripping the bandages off quickly would produce the least amount of pain for the patient: short, intense bursts of pain would add up to less suffering than the longer, gentler pain of a slower removal. Dan disagreed about what would minimize patients' pain; he thought a slower removal would be better. Eventually, he found scientific research that supported his theory.
But he was confused. These nurses were smart people with a ton of experience bathing burn victims - shouldn't they have figured out by now which approach best minimizes patient pain? He knew their failure wasn't due to a lack of intelligence or a lack of sympathy. He ultimately concluded that it was due to inherent human biases. He became incredibly interested in this and went on to do a bunch of fantastic research in the area.
In my experience, the overwhelming majority of people are uninterested in rationality, and a lot of them are even put off by it. So I'm curious about how members of this incredibly small minority of the population became who they are.
Part of me thinks that extreme outputs are the result of extreme inputs - like how Dan's extreme passion for his work seems to have originated from his extreme experiences with pain. With this rule of thumb in mind, when I see someone who possesses some extreme character trait, I expect there to be some sort of extreme story or experience behind it.
But another part of me thinks that this doesn't really apply to rationality. I don't have much data, but from the limited experience I've had getting to know people in this community, "I've just always thought this way" seems common, and "extreme experiences that motivated rational thinking" seems rare.
Anyway, I'm interested in hearing people's "rationalist backstories". Personally, I'm interested in reading really long and detailed backstories, but am also interested in reading "just a few paragraphs". I'm also eager to hear people's thoughts on my "extreme input/output" theory.
I have always loved intelligence and creativity. When I was about 12 years old, I discovered 3D computer graphics and got addicted to it - learning, understanding, and creating things was the most fun I had ever experienced.
As I got older, I spent a lot of time trying to figure out what I want out of life and what my values are. After thinking for a long time and reading books like "Atlas Shrugged" and "Surely You're Joking, Mr. Feynman!", I identified "being clever" as my main drive in life, my main value. I realized that whatever "being clever" means, this is what I want to live for, this is what I want as my end goal: intelligence (and creativity) for its own sake.
Once I realized that, I started looking for ways to learn things and become more intelligent. I stumbled upon Paul Graham's essays and decided that startups, programming, and writing were the best paths for me: mastering these things would make me the kind of person I want to be, teach me things, and improve my brain.
I have never explicitly pursued "rationality"; I was just trying to read books, learn from smart people, and do what made sense.
Later I happened upon HPMOR, found out about LessWrong, and really enjoyed EY's essays. So here I am now.
"Being clever" is not a goal. It's just the state where you are (or you look) smarter than people around you. That doesn't seem to be a worthwhile aim in life.