PhDre comments on Welcome to Less Wrong! (July 2012) - Less Wrong
Those are not at all at odds. Read e.g. Why Spock is Not Rational, or Feeling Rational.
Relevant excerpts from both:
Your purely emotional / empathetic desire for altruism governs the setting of your goals; your pure rational thinking governs how you go about reaching them. You're allowed to be emotionally suckered, eh, influenced into doing your best (instrumental rationality) to do good in the world (for your values of 'good')!
Thank you for the reading suggestions! Perhaps my mind had already packaged a Spock-like lack of emotion into my understanding of the concept of 'Rationality.'
To respond directly -
Though if pure emotion / altruism sets my goals, doesn't the possibility of irrational / insignificant goals remain? If, for example, I only follow pure emotion's path to... say... becoming an advocate for a community through politics, there is no 'check' on whether pursuing a political career actually achieves the most good (which, again, is a question that requires rational analysis)?
In HPMoR, characters are accused of being 'ambitious with no ambition' - setting my goals with empathetic desire for altruism would seem to put me in this camp.
Perhaps my goal, as I work my way through the sequences and the site, is to approach rationality as a tool / learning process of its own, and see how I can apply it to my life as I go. Halfway through typing this response, I found this quote from the Twelve Virtues of Rationality:
There is no "correct" way whatsoever to set your terminal values, your "ultimate goals" (other agents may prefer you to pursue values similar to their own, whatever those may be). Your ultimate goals can include anything from "maximize the number of paperclips" to "paint everything blue" to "always keep yourself in a state of being nourished (for its own sake!)" or "always keep yourself in a state of emotional fulfillment through short-term altruistic deeds".
Based on those ultimate goals, you define other, derivative goals, such as the intermediate goal "I want to buy blue paint" in service of "so I can paint everything blue". Those "stepping stones" can be irrational / insignificant (in relation to pursuing your terminal values), i.e. you can be "wrong" about them. Maybe you shouldn't buy blue paint, but rather produce it yourself. Or rather invest in nanotechnology to paint everything blue using nanomagic.
Only you can try to elucidate what your ultimate goals are (or can't; humans are notoriously bad at accurately articulating their actual utility functions), but once you have decided on them, they are supra-rational / beyond rational / 'rationality not applicable' by definition.
There is no fault in choosing "I want to live a life that maximizes fuzzy feelings through charitable acts" over "I'm dedicating my life to decreasing the Gini index, whatever the personal cost."