Lumifer comments on Welcome to Less Wrong! (7th thread, December 2014) - Less Wrong
Hello, everyone!
LW came to my attention not so long ago, and I've been committed to reading it since that moment about a month ago. I am a 20-year-old linguist from Moscow, finishing my bachelor's. Given my age, I've been pondering the usual questions of life for the past few years, searching for my path, my philosophy — essentially, the best way for me to live.
I studied a lot of religions and philosophies, and they all seemed really flat, essentially for the reasons stated in some articles here. I came close to something resembling a good way to live after I read "Atlas Shrugged", but something about it bothered me, and after a thorough analysis of that philosophy I decided to take a few good things from it and move on, as I have done many times before.
I found this gem of a site through Reddit and Roko's basilisk (is it okay to mention it here? I heard discussion was banned). I am deeply into the whole idea of rationality and nearly all the ideas presented on this site, but something really bothers me here, too.
The thing is, it seems implied here that altruism and rationality go hand in hand. Maybe I missed some important articles that explain why?
Let's imagine a hypothetical scenario: there is a guy, Steve, who really does not feel anything when he helps other people or does other "good" things generally; he does them only because his philosophy or religion tells him to. Say this guy is introduced to the ideas of rationality and is thus no longer bound by his philosophy/religion. And what if Steve also does not feel bad about other people suffering (or even takes pleasure in it)?
What I wanted to say is that rationality is a gun that can point both ways, and it is a good thing that LessWrong "sells" this gun with a safety mechanism — if it really is such a "safety mechanism". (Once again, maybe I missed something really critical that explains why altruism and "being good" is the most rational strategy.)
In other words, Steve does not really care about humanity; he cares about his own well-being and will use all the knowledge he gains just to meet his ends (people are different, aren't they? And their ends are different, too).
Or take another case: an average rationalist, Jack, estimates that his own net gain will be significantly bigger if he hurts or kills someone (taking into account his emotions, his feelings about humanity's overall net gain, and all other possible factors). Does that mean he must carry on? Or is this a taboo here? Or maybe it is a quirk of this site's demographics and nobody has even considered this scenario (which I really doubt).
I feel that I am diving too deep into metaphors, but I am not yet a good writer. I hope you understood my thought and can make me less wrong. :)
edit: fixed formatting
That is not so. There is a certain overlap between the population of rationalists and the population of altruists, and people from that intersection are unusually well represented on LW. But there is no "ought" here — it's perfectly possible to be a non-altruist rationalist or a non-rational altruist.