wedrifid comments on Welcome to Less Wrong! (July 2012) - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Hello everyone, Like many people, I come to this site via an interest in transhumanism, although it seems unlikely to me that FAI implementing CEV can actually be designed before the singularity (I can explain why, and possibly even what could be done instead, but it suddenly occurred to me that it seems presumptuous of me to criticize a theory put forward by very smart people when I only have 1 karma...).
Oddly enough, I am not interested in improving epistemic rationality right now, partially because I am already quite good at it. But more than that, I am trying to switch it off when talking to other people, for the simple reason (and I'm sure this has already been pointed out before) that if you compare three people, one who estimates the probability of an event at 110%, one who estimates it at 90%, and one who compensates for overconfidence bias and estimates it at 65%, the first two will win friends and influence people, while the third will seem indecisive (unless they are talking to other rationalists). I think I have borderline Asperger's (again, like many people here), and optimizing social skills probably takes precedence over most other things.
I am currently doing a PhD in "absurdly simplistic computational modeling of the blatantly obvious", which had better damn well have some signaling value. In my spare time, to stop my brain turning to mush, among other things I am writing a story which is sort of rationalist, in that some of the characters keep using science effectively even when the world is going crazy and the laws of physics seem to change depending on whether you believe in them. On the other hand, some of the characters are (a) heroes/heroines, (b) awesomely successful, (c) hippies on acid who do not believe in objective reality (not that I am implying that all hippies/people who use LSD are irrational). Maybe the point of the story is that you need more than just rationality? Or that some people are powerful because of rationality, while others have imagination, and that friendship combines their powers in a My Little Pony-like fashion? Or maybe it's all just an excuse for pretentious philosophy and psychic battles?
Many here would agree with you. (And, for instance, consider a ~10% chance of success better than near certain extinction.)
I agree that a 10% chance of success is better than near zero, and furthermore I agree that expected utility maximization means that putting in a great deal of effort to achieve a positive outcome is wiser than saying "oh well, we're doomed anyway, might as well party hard and make the most of the time we have left". However, the question is whether, if FAI has a low probability of success, other approaches, e.g. tool AI, would be a better option to pursue.
Would you say that many people here (and yourself?) believe that the probable end of our species is within the next century or two?
The last survey reported that Less Wrongers on average believe that humanity has about a 68% chance of surviving the century without a disaster killing >90% of the species. (Median 80%, though, which might be a better measure of the community feeling than the mean in this case.) That's a lower bar than actual extinction, but also a shorter timescale, so I expect the answer to your question would be in the same ballpark.
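The mean-vs-median distinction above can be illustrated with a quick sketch. The response values below are hypothetical, chosen only to show how a few very pessimistic answers can drag the mean well below the median:

```python
# Hypothetical survey responses: each person's estimated probability
# (as a percentage) that humanity survives the century without a
# >90% die-off. These numbers are made up to illustrate the skew,
# not taken from the actual Less Wrong survey.
responses = [90, 85, 80, 80, 80, 75, 70, 60, 30, 10]

# Mean: every response contributes in proportion to its magnitude,
# so a handful of extreme pessimists pull it down.
mean = sum(responses) / len(responses)

# Median: only the rank order matters, so it ignores how extreme
# the outliers are (even-length list: average the two middle values).
ordered = sorted(responses)
n = len(ordered)
median = (ordered[n // 2 - 1] + ordered[n // 2]) / 2

print(mean)    # 66.0
print(median)  # 77.5
```

With these made-up numbers the mean (66.0) sits noticeably below the median (77.5), which is why the median can be the better summary of "what the typical community member believes" when the distribution has a pessimistic tail.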
For myself: Yes! p(extinct within 200 years) > 0.5