mwengler comments on Cult impressions of Less Wrong/Singularity Institute - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
The c-word is too strong for what LW actually is. But "rational" is not a complete descriptor either.
It is neither rational nor irrational to embrace cryonics. It may be rational to conclude that someone who wants to live forever and believes body death is the end of his life will embrace cryonics and life extension technologies.
It is neither rational nor irrational to vaunt current human values over any other. Current human values are most likely a snapshot in the evolution of humans, and as such an approximate optimum, in a natural-selection sense, for an environment that existed 10,000 years ago. The idea that "we" lose if we change our values seems more rooted in who "we" decide "we" are. Presumably in the past a human's definition of "we" was narrower, including only a few hundred or a few thousand culture-mates. As time has gone on, "we" has grown for most people to cover nationalities, races, even pan-national and pan-racial groups. Most Americans don't identify being American with a particular race or national background, and many of us don't even require being born within the US or of US parents to be part of "we." Why wouldn't we extend our concept of "we" to include mammals, or all life that evolved on earth, or even all intelligences that evolved or were created on earth? Why would we necessarily identify a non-earth intelligence as "they" rather than "we," as in "we intelligences can stick together and do a better job exploiting the inanimate universe"?
Rationality is a tool, not an answer. Vaunting certain value decisions over others restricts LessWrong to being a community that uses rationality, rather than a community of rationalists or a community serving all who use rationality. It is what Buffett calls "an unforced error."
Let the downvotes begin! To be clear, I don't WANT to be downvoted, but my history on this site suggests to me that I might be.
Dunno bout you, but I value my values.
I think I have the same emotional response to "wrong" things as most people. The knowledge that this is bred into me by natural selection sort of takes the wind out of my rationalizations of these feelings in two ways. 1) Although they "feel" like right and wrong, I realize they are just hacks done by evolution. 2) If evolution has seen fit to hack our values in the past to keep us outsurviving others, then it stands to reason that the "extrapolated" values of humanity are DIFFERENT from the "evolved" values of humanity. So no matter how Coherent our Extrapolation of Values will be, it will actually subvert whatever evolution might do to our race. So once we have an FAI with great power and a sense of CEV, we stop evolving. Then we spend the rest of eternity relatively poorly adapted to the environment we are in, with the FAI scrambling to make it all right for us. Sounds like the cluster version of wireheading, in a way.
On the other hand, I suppose I value the modifications that occur to us through evolution and natural selection. Presumably an attempt at CEV would build that in, and perhaps the FAI would decide to leave us alone. Don't we keep reading sci-fi where that happens?