The opposite of a Great Truth is unpretentiousness.
Mr. Tyler:
I admire your persistence; however, you should be reminded that preaching to the deaf is not a particularly worthwhile activity.
My own complaints regarding Brave New World consist mainly of noting that Huxley's dystopia specialized in making people fit the needs of society. And if that meant whittling down a square peg so it would fit into a round hole, so be it.
Embryos were intentionally damaged (primarily through exposure to alcohol) so that they would be unlikely to develop capabilities beyond what society needed them to have.
This is completely incompatible with my beliefs about the necessity of self-regulating feedback loops, and developing order from the bottom upwards.
Or, to put it another way:
"Fixing" the future, in a way that renders human beings completely redundant and unnecessary even to themselves, isn't fixing anything. It's creating a problem of unlimited scope.
If that's the ultimate outcome of, say, producing superhuman minds - whether they're somehow enslaved to human preferences or not - then we're trying very hard to create a world in which the only rational treatment of humanity is extinction. Whether that extinction is imposed from without or chosen voluntarily from within is irrelevant.
Based on the comments here, it would seem that it's the people who reject ultimately-meaningless forms of play - that is, 'play' that doesn't develop skills useful to perpetuation - and concentrate on the "real world" who will end up surviving.
And the Luddites will inherit the Earth...
The mere fact that he has put so much time and energy into working on this issue over many years is strong evidence that he sincerely believes it is a real possibility.
Only if there are no other consequences of his actions that he desires. People working to forward an ideology don't necessarily believe the ideology they're selling - they only need to value some of the consequences of spreading it.
I hope you are both willing at least to say that the other's contrary stance tells you that there is a good likelihood that you are wrong.
If Robin knows that Eliezer believes there is a good likelihood that Eliezer's position is wrong, why would Robin then conclude that his own position is likely to be wrong? And vice versa?
The fact that Eliezer and Robin disagree indicates one of two things: either one possesses crucial information that the other does not, or at least one of the two has made a fatal error.
The disagreement stems from the fact that each believes the other to have made the fatal error, and that their own position is fundamentally sound.
Eric, it's more amusing that both often cite a theorem stating that agreeing to disagree is impossible.
It's only impossible for rational Bayesians, which neither Hanson nor Yudkowsky is. Nor is any other human being, for that matter.
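To spell out the theorem - presumably Aumann's agreement theorem - in a minimal form: assume two agents share a common prior $P$ and receive private information given by partitions $\Pi_1$ and $\Pi_2$. Then for any event $A$,

$$ \text{if the posteriors } q_1 = P(A \mid \Pi_1) \text{ and } q_2 = P(A \mid \Pi_2) \text{ are common knowledge, then } q_1 = q_2. $$

Drop the common prior or the ideal Bayesian updating, and the result evaporates - which is the point of the caveat above.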
Maybe neurons are just what brains happen to be made out of, because the blind idiot god is too stupid to sit down and invent transistors.
Maybe transistors are just what computers happen to be made out of, because the idiotic short-sighted humans are too stupid to sit down and build neurons.
I would suggest that this book, and the two books immediately preceding it, are an examination of the difference between what people believe they want the world to be and what they actually want and need it to be. When people gain enough power to create their vision of the perfect world, they do - and then find they've constructed an elaborate prison at best and a slow and terrible death at worst.
An actual "perfect world" can't be safe, controlled, or certain - and the inevitable consequence of that is pain. But so is delight.