My parents always used to downplay the value of intelligence. And play up the value of—effort, as recommended by the latest research? No, not effort. Experience. A nicely unattainable hammer with which to smack down a bright young child, to be sure. That was what my parents told me when I questioned the Jewish religion, for example. I tried laying out an argument, and I was told something along the lines of: "Logic has limits, you'll understand when you're older that experience is the important thing, and then you'll see the truth of Judaism." I didn't try again. I made one attempt to question Judaism in school, got slapped down, didn't try again. I've never been a slow learner.
Whenever my parents were doing something ill-advised, it was always, "We know better because we have more experience. You'll understand when you're older: maturity and wisdom are more important than intelligence."
If this was an attempt to focus the young Eliezer on intelligence über alles, it was the most wildly successful example of reverse psychology I've ever heard of.
But my parents aren't that cunning, and the results weren't exactly positive.
For a long time, I thought that the moral of this story was that experience was no match for sheer raw native intelligence. It wasn't until a lot later, in my twenties, that I looked back and realized that I couldn't possibly have been more intelligent than my parents before puberty, with my brain not even fully developed. At age eleven, when I was already nearly a full-blown atheist, I could not have defeated my parents in any fair contest of mind. My SAT scores were high for an 11-year-old, but they wouldn't have beaten my parents' SAT scores in full adulthood. In a fair fight, my parents' intelligence and experience could have stomped any prepubescent child flat. It was dysrationalia that did them in; they used their intelligence only to defeat itself.
But that understanding came much later, when my intelligence had processed and distilled many more years of experience.
The moral I derived when I was young was that anyone who downplayed the value of intelligence didn't understand intelligence at all. My own intelligence had affected every aspect of my life and mind and personality; that was massively obvious, seen at a backward glance. "Intelligence has nothing to do with wisdom or being a good person"—oh, and does self-awareness have nothing to do with wisdom, or being a good person? Modeling yourself takes intelligence. For one thing, it takes enough intelligence to learn evolutionary psychology.
We are the cards we are dealt, and intelligence is the unfairest of all those cards. More unfair than wealth or health or home country, unfairer than your happiness set-point. People have difficulty accepting that life can be that unfair; it's not a happy thought. "Intelligence isn't as important as X" is one way of turning away from the unfairness, refusing to deal with it, thinking a happier thought instead. It's a temptation, both to those dealt poor cards, and to those dealt good ones. Just as downplaying the importance of money is a temptation both to the poor and to the rich.
But the young Eliezer was a transhumanist. Giving away IQ points was going to take more work than if I'd just been born with extra money. But it was a fixable problem, to be faced up to squarely, and fixed. Even if it took my whole life. "The strong exist to serve the weak," wrote the young Eliezer, "and can only discharge that duty by making others equally strong." I was annoyed with the Randian and Nietzschean trends in SF, and as you may have grasped, the young Eliezer had a tendency to take things too far in the other direction. No one exists only to serve. But I tried, and I don't regret that. If you call that teenage folly, it's rare to see adult wisdom doing better.
Everyone needed more intelligence. Including me, I was careful to pronounce. Far be it from me to declare a new world order with myself on top—that was what a stereotyped science fiction villain would do, or worse, a typical teenager, and I would never have allowed myself to be so clichéd. No, everyone needed to be smarter. We were all in the same boat: a fine, uplifting thought.
Eliezer1995 had read his science fiction. He had morals, and ethics, and could see the more obvious traps. No screeds on Homo novis for him. No line drawn between himself and others. No elaborate philosophy to put himself at the top of the heap. It was too obvious a failure mode. Yes, he was very careful to call himself stupid too, and never claim moral superiority. Well, and I don't see it so differently now, though I no longer make such a dramatic production out of my ethics. (Or maybe it would be more accurate to say that I'm tougher about when I allow myself a moment of self-congratulation.)
I say all this to emphasize that Eliezer1995 wasn't so undignified as to fail in any obvious way.
And then Eliezer1996 encountered the concept of the Singularity. Was it a thunderbolt of revelation? Did I jump out of my chair and shout "Eurisko!"? Nah. I wasn't that much of a drama queen. It was just massively obvious in retrospect that smarter-than-human intelligence was going to change the future more fundamentally than any mere material science. And I knew at once that this was what I would be doing with the rest of my life, creating the Singularity. Not nanotechnology like I'd thought when I was eleven years old; nanotech would only be a tool brought forth of intelligence. Why, intelligence was even more powerful, an even greater blessing, than I'd realized before.
Was this a happy death spiral? As it turned out later, yes: that is, it led to the adoption even of false happy beliefs about intelligence. Perhaps you could draw the line at the point where I started believing that surely the lightspeed limit would be no barrier to superintelligence. (It's not unthinkable, but I wouldn't bet on it.)
But the real wrong turn came later, at the point where someone said, "Hey, how do you know that superintelligence will be moral? Intelligence has nothing to do with being a good person, you know—that's what we call wisdom, young prodigy."
And lo, it seemed obvious to the young Eliezer, that this was mere denial. Certainly, his own painstakingly constructed code of ethics had been put together using his intelligence and resting on his intelligence as a base. Any fool could see that intelligence had a great deal to do with ethics, morality, and wisdom; just try explaining the Prisoner's Dilemma to a chimpanzee, right?
Surely, then, superintelligence would necessarily imply supermorality.
Thus is it said: "Parents do all the things they tell their children not to do, which is how they know not to do them." To be continued, hopefully tomorrow.
Post Scriptum: How my views on intelligence have changed since then... let's see: When I think of poor hands dealt to humans, these days, I think first of death and old age. Everyone's got to have some intelligence level or other, and the important thing from a fun-theoretic perspective is that it ought to increase over time, not decrease like now. Isn't that a clever way of feeling better? But I don't work so hard now at downplaying my own intelligence, because that's just another way of calling attention to it. I'm smart for a human, if the topic should arise, and how I feel about that is my own business. The part about intelligence being the lever that lifts worlds is the same. Except that intelligence has become less mysterious unto me, so that I now more clearly see intelligence as something embedded within physics. Superintelligences may go FTL if it happens to be permitted by the true physical laws, and if not, then not.
By the Rule of Comparative Advantage, on a planet of billions of people, both can be worked on at once. Since Eliezer, the person originally addressed, is not a biologist, there's nothing he's likely to be able to do about senescence, beyond convincing other people that curing death would be great and hoping they come up with something. Fixing akrasia, though, is something about which there is as yet no specialised knowledge, so he has about as much chance as anyone of similar smarts.
Ok, that's my New Year Resolution: don't die. Sorted!
I'd take all of it right now. And a pony. (Yes, I'm rejecting the hypothetical. I do that.)
Well done, you missed the point.