Comment author: Robin 07 December 2014 11:55:40PM *  11 points [-]

"As a human being, you have no choice about the fact that you need a philosophy. Your only choice is whether you define your philosophy by a conscious, rational, disciplined process of thought and scrupulously logical deliberation—or let your subconscious accumulate a junk heap of unwarranted conclusions, false generalizations, undefined contradictions, undigested slogans, unidentified wishes, doubts and fears, thrown together by chance, but integrated by your subconscious into a kind of mongrel philosophy and fused into a single, solid weight: self-doubt, like a ball and chain in the place where your mind’s wings should have grown."

Ayn Rand

Comment author: anandjeyahar 08 July 2014 06:21:29AM 1 point [-]

The only part I object to in what you wrote is *emotions shouldn't interfere with cognition*. I think they are already a part of cognition, so the claim is a bit like saying "quantum physics is weird". Perhaps you meant "emotions shouldn't interfere with rationality", in which case I'll observe that this doesn't seem to be a popular view around LessWrong. I'll also note that I used to believe emotions should be ignored, but later concluded that this is far too heavy-handed a strategy for the modern world of complex systems. To conjecture further: cognitive psychologists tend to classify emotions, moods, and affect differently, AFAIK based on their temporal duration, from shortest-lived to longest-lived: emotion, mood, affect. My conjecture is that during rational decision-making, emotions can and should be ignored, moods can (but not necessarily should) be ignored, and affect should not be ignored.

Comment author: Robin 08 July 2014 06:12:46PM 1 point [-]

The only part I object to in what you wrote is *emotions shouldn't interfere with cognition*.

This is an ideal which Objectivists believe in, but it is difficult/impossible to actually achieve. I've noticed that as I've gotten older, emotions interfere with my cognition less and less and I am happy about that. You can define cognition how you wish, but given the number of people who see it as separate from emotion it's probably worth having a backup definition in case you want to talk to those people.

RE: emotions, affect, moods. I do think that emotions should be considered when making rational decisions, but they are not the tools by which we come to decisions, here's an example.

If you want to build a house to shelter your family, your emotional connection to your family is not a tool you will use to build the house. It's important to have a strong motivation to do something, but that motivation is not a tool. You'll still need hammers, drills, etc to build the house.

I believe we can and should use drugs (I include naturally occurring hormones) to modify our emotions in order to better achieve our goals.

Comment author: DanielLC 08 July 2014 07:58:16AM 4 points [-]

but to say "I have a good feeling about this" is not a rational statement, it's an emotional statement.

If your hunches have a bad track record, then you should learn to ignore them, but if they do work, then ignoring them is irrational.

Even if emotions are suboptimal tools in virtually all cases (which I find unlikely), that doesn't mean that ignoring them is a good idea. It's like how getting rid of overconfidence bias and risk aversion is good, but getting rid of overconfidence bias OR risk aversion is a terrible idea. Everything we've added since emotion was built around emotion. If emotion will give you an irrational bias, then you'll evolve a counter bias elsewhere.

Comment author: Robin 08 July 2014 06:03:21PM 1 point [-]

If your hunches have a bad track record, then you should learn to ignore them, but if they do work, then ignoring them is irrational.

If your hunches have a good track record, I think you should explore that, come up with a rational explanation, and make sure it's not just a coincidence. Additionally, while following your hunches isn't inherently bad, rational people shouldn't be convinced of an argument merely on the basis of somebody else's hunch.

Even if emotions are suboptimal tools in virtually all cases (which I find unlikely), that doesn't mean that ignoring them is a good idea.

Nobody is suggesting we ignore emotions, merely that we don't let them interfere with rational thought (in practice this is very difficult).

It's like how getting rid of overconfidence bias and risk aversion is good, but getting rid of overconfidence bias OR risk aversion is a terrible idea.

I don't follow this argument. Your biases can be evaluated absolutely, or relative to the general population. If everybody is biased toward underconfidence, then being biased toward overconfidence can be an advantage. There's a similar argument for risk aversion.

Everything we've added since emotion was built around emotion. If emotion will give you an irrational bias, then you'll evolve a counter bias elsewhere

I'm not sure I agree with this. Do you think that the Big Bang theory is based on emotion? You can draw a path from emotion to the people who came up with the Big Bang theory, but you can do that with things other than emotion as well.

My issue with emotions is only partly that they cause biases; it's also that you can't rely on other people having the same emotions as you. So you can use emotions to better understand your own goals, but you won't be able to convince people who don't share your emotions that your goals are worth achieving.

Comment author: Jayson_Virissimo 06 July 2014 01:07:39AM 5 points [-]

This seems to be in tension with what she has stated elsewhere. For instance:

emotions...are lightning-like estimates of the things around you, calculated according to your values.

-- Ayn Rand, Philosophy: Who Needs It?

Wouldn't immediately available estimates be a good tool of cognition?

Comment author: Robin 08 July 2014 04:44:16AM 1 point [-]

Very interesting... it would seem that Rand doesn't actually define emotion consistently, that was not the definition I was using. But the Ayn Rand Lexicon has 11 different passages related to emotions.

http://aynrandlexicon.com/lexicon/emotions.html

Comment author: DanielLC 06 July 2014 07:19:33AM 2 points [-]

The article I linked to wasn't just saying that emotions exist. It was saying that they're part of rationality.

If emotions didn't make people behave rationally, then people wouldn't evolve to have emotions.

Comment author: Robin 08 July 2014 04:13:29AM 1 point [-]

Rand doesn't deny that emotions are part of rationality, she denies that they are tools of rationality. It is rational to try to make yourself experience positive emotions, but to say "I have a good feeling about this" is not a rational statement, it's an emotional statement. It isn't something that should interfere with cognition.

As for emotions affecting human behavior: I think all mammals have emotions, so it's not easy for humans to discard them over a few generations of technological evolution. Emotions were useful in the ancestral environment, but they are no longer as useful as they once were.

Comment author: DanielLC 27 June 2014 07:41:08PM 6 points [-]

I beg to differ. Or are you saying that, if Ayn Rand says it, it must be wrong? In which case, I still disagree.

Comment author: Robin 06 July 2014 12:29:37AM 1 point [-]

How does the definition you link to contradict Rand's statement? You can acknowledge emotions as real while denying their usefulness in your cognitive process.

Comment author: Robin 21 June 2014 11:17:38PM -1 points [-]

"Emotions are not tools of cognition"

Ayn Rand

Comment author: Vika 20 June 2014 06:41:10PM 1 point [-]

What kind of questions would you expect the organizations to disagree about?

Comment author: Robin 21 June 2014 12:29:13AM 3 points [-]

I don't know, but if you ask intelligent people what they think about x-risk related to AI, it's unlikely they'll come to the exact same conclusions that MIRI et al. have.

If you present the ideas of MIRI to intelligent people, some of them will be excited and want to help with donations or volunteering. Others will dismiss you and think you are wrong/crazy.

So to expand on my question... if you find intelligent people who disagree with MIRI on significant things, will you work with them?

Comment author: Vika 16 June 2014 04:11:24AM 14 points [-]

MIRI is focusing on technical research into Friendly AI, and their recent mid-2014 strategic plan explicitly announced that they are leaving the public outreach and strategic research to FHI, CSER and FLI. Compared to FHI and CSER, we are less focused on research and more on outreach, which we are well-placed to do given our strong volunteer base and academic connections. Our location allows us to directly engage Harvard and MIT researchers in our brainstorming and decision-making.

Comment author: Robin 20 June 2014 03:24:43AM 5 points [-]

OK, so it seems like FLI promotes the conclusions of other x-risk organizations, but doesn't do any actual research itself.

Do you think it's not worth questioning the conclusions that other organizations have come to? It seems to me that if there are four x-risk organizations (each with reasonably strong connections to the others), there should be some debate between them.

Comment author: Lethalmud 13 June 2014 12:38:57PM 3 points [-]

I cringe at the term x-risk.

Comment author: Robin 14 June 2014 08:16:49PM 5 points [-]

I cringe at the term x-risk.

Can you think of another five-letter description? The shorter the term, the easier a time people will have remembering it, and thus the meme will spread faster than a longer term would.
