Kawoomba comments on On Straw Vulcan Rationality - Less Wrong Discussion

Post author: BrienneYudkowsky 02 February 2014 08:11AM

Comment author: Kawoomba 02 February 2014 09:35:28AM

Apart from the mostly confusing qualia aspect*, emotions can be thought of as heuristics that predispose an agent towards a certain set of actions: a processing filter that puts the agent in a certain mode of reasoning. Like an Instagram filter. (Long-term) decisions made under the influence of strong emotions aren't necessarily worse, although they typically are.

The explanation for evolution selecting for such filters is that when doing "fast reasoning" under relevant resource constraints (there are always bounds on available resources, just not always relevant ones), your cognition takes cues (e.g. hormonal ones) to 'gently' (or not so gently) steer your decision-making process (or your consciously accessible illusion thereof) in the 'right' direction (Fight! Flight! Cry! Submit!).
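To make the filter picture concrete, here's a minimal toy sketch in Python. This is entirely my own illustration, not anything from canon or the literature; the emotion names, priors, and payoff numbers are made up. The "emotion" reweights the agent's action priors; under a tight time budget the agent samples straight from that predisposition, while with more time a slow evaluation can override it:

```python
import random

ACTIONS = ["fight", "flee", "cry", "submit"]

# Hypothetical hard-coded filters: each emotion biases the action distribution.
EMOTION_PRIORS = {
    "anger":   {"fight": 0.7, "flee": 0.1, "cry": 0.1, "submit": 0.1},
    "fear":    {"fight": 0.1, "flee": 0.7, "cry": 0.1, "submit": 0.1},
    "grief":   {"fight": 0.05, "flee": 0.15, "cry": 0.7, "submit": 0.1},
    "neutral": {a: 0.25 for a in ACTIONS},
}

def slow_evaluate(action, situation):
    """Stand-in for costly deliberation: returns an estimated payoff."""
    return situation.get(action, 0.0)

def decide(emotion, situation, time_budget):
    """Pick an action; with little time, lean on the emotional prior."""
    prior = EMOTION_PRIORS[emotion]
    if time_budget < 1.0:
        # Fast path: sample straight from the filter's predisposition.
        actions, weights = zip(*prior.items())
        return random.choices(actions, weights=weights)[0]
    # Slow path: deliberate, with the prior as a small residual bias.
    return max(ACTIONS, key=lambda a: slow_evaluate(a, situation) + 0.1 * prior[a])

# A predator appears; fleeing has the highest true payoff here (made-up numbers).
situation = {"fight": -1.0, "flee": 2.0, "cry": -0.5, "submit": 0.5}
print(decide("fear", situation, time_budget=0.1))   # usually "flee", cheaply
print(decide("anger", situation, time_budget=5.0))  # "flee", despite the anger prior
```

In this toy version, the next paragraph's hypothetical corresponds to an agent whose priors are always well-calibrated to the situation: it would get the fast path's speed and the slow path's accuracy at once.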

A rationalist who could hypothetically tune their emotions such that they always experienced the correct filter, given the time constraints, would experience a full range of emotions and still make optimal decisions. This is, of course, out of reach.

Vulcans "master" (scare quotes) these heuristics by suppressing them, since in canon those heuristics are stronger and as such less useful (which always made the genesis of the whole scenario somewhat contrived). "Not being at the mercy of their emotions" is akin to "the actions of a Vulcan who experiences emotions are indistinguishable from a Vulcan who does not". Which reduces their emotions to their qualia dimension, which, through probably inferable via StarTrek!MRIs, does equate "mastery" with "ignoring":

So Surak prefers to experience emotions; he just refuses to let those emotions have any impact on the world whatsoever. The relevant functional difference from going through with "kolinahr"? I don't see it.

* "What (some) algorithms feel like from the inside" is just kicking the can down the definitional road, now "what does 'feel like from the inside' mean?" and "which algorithms?" need to do the exact same work (any physical process instantiates an algorithm, kind of a corollary from the Church-Turing thesis).