All of Robin's Comments + Replies

Interesting article. But I do not see how the article supports the claim its title makes.

I think there's a connection between bucket errors and Obsessive Compulsive Disorder.

0PaulinePi
Well, it applies to the article... but also to cases in which one variable is actually related to the theory, not falsely related this time. You do reject the new information to protect your theory. To the second point: what makes you think that? And at what point do you think it comes into play? Do you think OCD prevents people from incorporating new information in general, or does it increase the chance of two variables ending up in "one bucket" that are not actually related (probably not in general, but in one aspect, like cleanliness or such)?

Is this an admission that CFAR cannot effectively help people with problems other than AI safety?

1[anonymous]
Or an admission that this was their endgame all along now that they have built a base of people who like them... I've been expecting that for quite some time. It fits the modus operandi.

I'm not sure what you mean and I'm not sure that I'd let a LWer falsify my hypothesis. There are clear systemic biases LWers have which are relatively apparent to outsiders. Ultimately I am not willing to pay CFAR to validate my claims and there are biases which emerge from people who are involved in CFAR whether as employees or people who take the courses (sunk cost as well as others).

2Duncan Sabien (Deactivated)
I can imagine that you might have hesitated to list specifics to avoid controversy or mud-slinging, but I personally would appreciate concrete examples, as it's basically my job to find the holes you're talking about and try to start patching them.

I'd take your bet if it were for the general population, not LWers...

My issue with CFAR is it seems to be more focused on teaching a subset of people (LWers or people nearby in mindspace) how to communicate with each other than in teaching them how to communicate with people they are different from.

0Duncan Sabien (Deactivated)
That's an entirely defensible impression, but it's also actually false in practice (demonstrably so when you see us at workshops or larger events). Correcting the impression (which again you're justified in having) is a separate issue, but I consider the core complaint to be long-since solved.

I think the Less Wrong website diminished in popularity because of the local meetups. Face to face conversation beats online conversation for most practical purposes. But many Less Wrongers have transitioned to being parents, or have found more professional success so I'm not sure how well the meetups are going now. Plus some of the meetups ban members rather than rationally explaining why they are not welcome in the group. This is a horrible tactic and causes members to limit how they express themselves... which goes against the whole purpose of rationality meetups.

How much will you bet that there aren't better strategies for resolving disagreement?

Given the complexity of this strategy it seems to me like in most cases it is more effective to do some combination of the following:

1) Agree to disagree
2) Change the subject of disagreement
3) Find new friends who agree with you
4) Change your beliefs, not because you believe they are wrong but because other people believe they are wrong.
5) Violence (I don't advocate this in general, but in practice it's what humans have done throughout history when they disagreed)

6Duncan Sabien (Deactivated)
The word "better" is doing a lot of work (more successful? Lower cost?), but in my personal experience and the experience of CFAR as a rationality org, double crux looks like the best all-around bet. (1) is a social move that sacrifices progress for happiness, and double crux is at least promising insofar as it lets us make that tradeoff go away. (2) sort of ... is? ... what double crux is doing—moving the disagreement from something unresolvable to something where progress can be made. (3) is absolutely a good move if you're prioritizing social smoothness or happiness or whatever, but a death knell for anyone with reasons to care about the actual truth (such as those working on thorny, high-impact problems). (4) is anathema for the same reason as (3). And (presumably like you), we're holding (5) as a valuable-but-costly tool in the toolkit and resorting to it as rarely as possible. I would bet $100 of my own money that nothing "rated as better than double crux for navigating disagreement by 30 randomly selected active LWers" comes along in the next five years, and CFAR as an org is betting on it with both our street cred and with our actual allotment of time and resources (so, value in the high five figures in US dollars?).

The short answer (from somebody who went to college with Scott and took Calc II in the same class with him) is yes. But that's an answer relative to the students of an elite college, and only based on the fact that he asked me to work on math homework with him.

I hope they've managed to advance past "if somebody criticizes your idea, ban them from the group!" because that's what happened to me after I criticized Comfort Zone Expansion.

At that time, though, I think much of hypnosis can be explained by the placebo effect.

0DanielLC
That doesn't make it bunk.
1alienist
This isn't really an explanation.

Maybe Rand is referring to a specific situation where she knows Branden's thought processes and her statements are correct.

It was about arguing with collectivists (AKA people who were sympathetic to the USSR). Whether she was correct about communism being inferior to capitalism isn't easy to analyze objectively but in a sense history has validated her.

In that case, I wouldn't know. But if it's meant generally enough to be a rationality quote - if it's meant to explain why we get angry at dishonest people - then it's just an unsupported claim.

It's s... (read more)

2Vaniver
At the time, or now? Because hypnosis is a demonstrably effective treatment for some conditions, and clearly something is going on, but people vary in susceptibility, and most people are familiar with the variety of hypnosis that stage magicians do rather than the type that hypnotherapists do.

I am an intransigent atheist, but not a militant one. This means that I am an uncompromising advocate of reason and that I am fighting for reason, not against religion. I must also mention that I do respect religion in its philosophical aspects, in the sense that it represents an early form of philosophy.

Ayn Rand, to a Catholic Priest.

4advancedatheist
Philosophers have played a game going way back where they believe that popular religion comes in handy as a fiction for keeping the mob in line, but they view themselves as god-optional. The philosophes in the Enlightenment started the experiment of letting the mob in on the truth, and the experiment has apparently gone so far in parts of Europe like Estonia that some populations have lost familiarity with Christian beliefs, or even with how to pronounce Jesus' name in their own language. Or so Phil Zuckerman claims: https://books.google.com/books?id=C-glNscSpiUC&lpg=PP1&dq=phil%20zuckerman&pg=PA96#v=onepage&q=estonia&f=false

I'm not entrenched enough in this community to know what's worthy of upvotes and what's not, so I'm selecting quotes that I personally like and seeing how they fare.

Do you remember what you liked about Ayn Rand? I've found that people like her for very different reasons.

0[anonymous]
More things I liked: Fred Kinnan. I'd really like to see his offscreen conversation with John Galt. What do you think: does Fred Kinnan "want to live", in the sense that the book tells us Jim Taggart doesn't, or not?
3MarkusRamikin
I remember I liked the characters who understood that a technical understanding of an issue screens off vaguer impressions (like with whether Rearden Metal was safe or not), I liked the individualism, and the idea that you don't have to feel guilty about every obligation which others would like to saddle you with just by their expectations... there were other things, but hard to list right now. As to the quote, well, I can't speak for the whole community, but here's why I didn't like it. Maybe Rand is referring to a specific situation where she knows Branden's thought processes and her statements are correct. In that case, I wouldn't know. But if it's meant generally enough to be a rationality quote - if it's meant to explain why we get angry at dishonest people - then it's just an unsupported claim. I don't see anything showing that Rand had a model-with-moving-parts understanding of the psychology of anger response, and didn't just make up an answer that fit her preferred moral categories. And equating dishonesty with both evil AND irrationality rubs me wrong. Rand believed that she had basically solved morality, and that rationality only allowed one kind of morality, namely hers. Not just metamorality, but specific values. I believe this is part of what locked her into an inescapable worldview, beyond correction and updating (like what Branden wrote about how, once she decided that Reason's verdict on hypnosis was that it was bunk and had no foundation in reality, nothing could reach her on the subject), because once she decided something was incorrect, it was not just incorrect, but Evil. I think it more useful to consider rationality (correct reading of reality and decision making) separately from values held.
0[anonymous]
Methinks you should upvote what you find worthy, not what you think the community would find worthy.

You lost me at "junk heap."

Sorry you're so averse to negative descriptions of the average person's philosophy.

There is no conscious choice available to a layperson ignorant of philosophy and logic

Yes there is, they can choose what music, TV, movies, videos etc to buy/view/play.

and such ways of life are perfectly copacetic with small-enough communities

Do you mean communities where the leader knows about philosophy and can order people around?

If anything, it is the careful thinker who is more shackled by self-doubt

It's reasonable to ... (read more)

8MarkusRamikin
I would like to see some quotes from Rand that would be worthy of upvotes here. But after seeing your efforts lately, I am starting to wonder if my remembered fondness of Rand's writing persists only because I haven't actually re-read anything by her in a decade...

Would you consider having Less Wrong members record the sequences or do you already have people you've promised to give the job to?

7Rick_from_Castify
Thanks for the offer, Robin, but we've decided to go with professionals. Early on we auditioned people from the LessWrong community, and we decided that having a well-produced reading with a professional voice actor makes a big difference to the listening experience.

Trying to overcome biases takes effort. Wasted effort is bad. It's better to pursue mixed strategies that aim at instrumental rationality

I think you are assuming hyperbolic discounting/short time preference. It requires a lot of effort to overcome bias, perhaps years. But there are times when it is worth it.

than to aim at the perfection described in the Rand quotation

What perfection? Choosing philosophy? You can always update your philosophy.

327chaos
There are also times when it's not worth it, in my opinion. Rand contrasts "a conscious, rational, disciplined process of thought and scrupulously logical deliberation" with "a junk heap of unwarranted conclusions, false generalizations, undefined contradictions, undigested slogans, unidentified wishes, doubts and fears, thrown together by chance, but integrated by your subconscious into a kind of mongrel philosophy and fused into a single, solid weight: self-doubt, like a ball and chain". I think it's possible to avoid becoming such a disgrace without scrupulously logical deliberation. Most people are severely biased but are not as unhappy or helpless as Rand's argument would imply. Trimming the excesses of our biases seems more reasonable than eliminating them, to me.

This quote was from a speech given to West Point cadets. By no means are they identical, but it would be relatively hard to find a group of people more similar (in terms of being of the same gender, same age (within a few years), same nationality, and same general ideology).

A. How would you implement that choice?

B. We is a loaded term, speak for yourself. There's benefit to realizing that as a human you have bias. There's no benefit to declaring that you can't overcome some of this bias.

C Wouldn't that depend on your philosophy?

127chaos
C. Yes. B. Agreed that there's benefit to realizing we have bias, disagree that there's no benefit to declaring some biases aren't overcomeable. Trying to overcome biases takes effort. Wasted effort is bad. It's better to pursue mixed strategies that aim at instrumental rationality than to aim at the perfection described in the Rand quotation. Thoughts that seem complex or messy should not be something we shy away from, reality is complicated and our brains are imperfect. A. I don't know how to describe how to do it, but I do it all the time. It's something humans have to fight against to avoid doing, as it's essentially automatic under normal conditions.

This argument has gone far away from the original quote. I'm not going to argue about the details. If you want to try to disprove your ability to become successful by using your intelligence, go ahead.

It's very difficult to make economic comparisons between countries while simultaneously acknowledging all of the cultural differences between countries. You can do it, but the results aren't necessarily meaningful.

Thanks for the information. My point is that money is a poor predictor of happiness and success.

You just spent half this thread claiming that success is subjective

Really? I'm pretty sure I didn't. Success is hard to define, but that doesn't mean it's subjective.

Bill Gates and James Harrison are going by their own ideas of altruistic success, not yours.

Oh really? Can you read their minds? I've read about Bill Gates's motivations and I didn't see the word altruism once. It's all well and good to claim Bill Gates is part of your movement, but for all you know he's never heard of it.

Why don't you call Jesus an altruist? Or some other religious figure?

Malnutrition and starvation are different things. It's much better to be malnourished than to starve. And it's much harder to feed people the optimal food than to just feed them some food...

But you're missing the point. There are successful people in Somalia, if you manage to not be malnourished in Somalia then you are successful (unless you value eating bad food for religious reasons...).

Please tell us more about your inside information on the psychology of Bill and Melinda Gates

I have none. Just an opinion that, given my posts' downvote counts, I probably shouldn't share.

Ayn Rand did not invent the term "altruism"?

Neither did the Effective Altruism people. But Ayn Rand's books have sold a lot and are read by influential people, so I'll use her definition until I have a reason not to.

0gjm
Please tell us more about your inside information on the psychology of Bill and Melinda Gates. You do understand, don't you, that Ayn Rand did not invent the term "altruism"?
2Wes_W
Who cares? You just spent half this thread claiming that success is subjective. Bill Gates and James Harrison are going by their own ideas of altruistic success, not yours. (For what it's worth, I personally do consider James Harrison successful at helping people. It explicitly was his goal, he made a pledge and everything.)

The annual US GDP per capita is $55,036. For Somalia, it's $145

This is availability bias. There are clearly other factors differentiating Somalia and the US. If there weren't, there would be massive starvation in Somalia because you can't get by on $145 a year in the US.

I can assure you that successful people are not born in the US by chance.

Really? Do you think successful people don't have children? And that they don't try to make these children US citizens by 'immigrating' (often illegally) to the USA? I can assure you this happens frequently... (read more)

3DanielLC
The GDP statistics I cited were nominal. The $2 a day thing was not. They don't make $2 a day. They make enough to go as far as $2 would in the US. Only 13% of the US population is immigrants. 20% of the world's immigrant population is in the US, so it works out to about two million immigrants. Less than a thirtieth of a percent of the world population. It does not explain the discrepancy in income. It's not up to you to define it either. It seems highly unlikely that living on a fifteenth of what the US would call poor is successful. There are certainly people who value living on next to nothing, but I don't think there are billions of them. It would take powerful evidence to show that they consider themselves more successful than a US citizen. How much evidence do you have of this?
2Nornagest
There's a couple of things going on there. One is that Somalia is in fact a very malnourished country. Another is that the GDP figures DanielLC cites are nominal, not based on purchasing power parity, and therefore can be skewed by exchange rates. The currencies of poor third-world nations tend to be very weak, so going by nominal GDP will end up making them look even poorer than they actually are. PPP estimates for Somalia seem uncommon for some reason, but the CIA estimated a per-capita annual value of around $600 USD in 2010.
2Wes_W
There is, in fact, massive starvation in Somalia, price differences notwithstanding. The first sentence of the first link from a Google search for "malnutrition statistics somalia" says that "Somalia has some of the highest malnutrition rates in the world".

Is that supposed to be funny? The fact that you have a computer means you have won something. I'd be willing to guess that more technologies will emerge and you'll use them. That's like winning a lottery. But you don't get more successful unless you make intelligent decisions. Stupid decisions are punished; there are exceptions to this...

But seriously, lottery is a loaded term. It's often used as a metaphor for 'capitalist trick' (which smart people avoid).

0ChristianKl
The idea that things average out depends on the assumption that success is due to a lot of independent events. Computer simulations of markets with traders of equal skill have no problem producing the kind of differences in financial results that the traders we observe in reality produce. The fact that some authors write books that are more popular than the books of other authors is explainable without differences in skill or book quality.

The counterexample is the many people who have succeeded through luck

That's not an example, it's a claim with no evidence to support it. Give me an example of a person who has succeeded with only luck. There are about seven billion candidates so it shouldn't be hard to select one.

Everybody gets lucky sometimes, but they might not get lucky on the really important things

What is really important is subjective.

If you're born to a poor family in Africa, the law of large numbers is not going to make up for this setback.

Time will tell. African peop... (read more)

1DanielLC
The annual US GDP per capita is $55,036. For Somalia, it's $145. I cannot give you a specific example of someone who succeeded by luck, but I can assure you that successful people are not born in the US by chance. As of 2005, there were 2.6 billion people who lived on the equivalent of under $2 per day [source]. What possible values could they have where that could be considered success?
3Wes_W
James Harrison is the first example that leaps to my mind. His blood plasma contains a unique antibody which can be used to treat Rhesus disease, which seems like a near-perfect example of pure luck: neither he nor his parents nor anyone earned those antibodies in any useful sense of the word. He just coincidentally discovered that he had them. His lifetime blood donations are estimated to have treated two million children. Now, James Harrison surely gets some credit for this. He has, after all, donated blood a thousand times, which is far better than most of us. And he made a pledge to start donating blood before he learned about his antibodies! But a thousand blood donations, if you don't happen to have unique biology, will be multiple orders of magnitude less effective at helping people than James Harrison was, for the same effort. To find people as successful in their goal of helping others as James Harrison, you have to look far beyond "people who donate blood regularly". Perhaps Bill Gates, having become one of the richest men alive and then dedicating his life to charity, can claim to have accomplished more? When blind luck can put some random guy in the same league as the world's top altruist, it seems unreasonable to claim that literally nobody succeeds primarily through luck or by accident. So? Whatever you subjectively consider really important, you can get unlucky on those things. Also, some things like "not starving to death" or "not constantly being in pain" are subjectively important to basically everyone, and some people get unlucky on these too.

Did you read what you linked to?

"No true Scotsman is an informal fallacy, an ad hoc attempt to retain an unreasoned assertion.[1] When faced with a counterexample to a universal claim ("no Scotsman would do such a thing"), rather than denying the counterexample or rejecting the original universal claim, this fallacy modifies the subject of the assertion to exclude the specific case or others like it by rhetoric, without reference to any specific objective rule ("no true Scotsman would do such a thing")"

Where is the counter... (read more)

1ChristianKl
But not everybody wins sometimes the lottery.
2DanielLC
The counterexample is the many people who have succeeded through luck. Everybody gets lucky sometimes, but they might not get lucky on the really important things. If you're born to a poor family in Africa, the law of large numbers is not going to make up for this setback. Given what I know of Ayn Rand, I'm inclined to think that the quote is suggesting that successful people deserve to be successful, so you shouldn't take their money and give it to unsuccessful people.

Primarily, by pretending that a "usually" is an "always". "Real success is never accidental" is, empirically, definitely false. "Real success is almost never accidental" would be the less strong, but more correct, version.

That would depend on what you mean by success now, wouldn't it? If you believe people who take calculated risks and get unlucky aren't successful, then perhaps you're right. But you can't claim you can make a statement more correct by assuming you know what every word means. Parsing ambiguity is part of rationality. (Though my downvotes would indicate it's not...)

7Wes_W
Primarily, by pretending that a "usually" is an "always". "Real success is never accidental" is, empirically, definitely false. "Real success is almost never accidental" would be the less strong, but more correct, version. On the other hand, this objection can be applied to a very large fraction of rationality quotes. I'm not sure it matters much, when we're essentially just collecting proverbs, and including all the necessary caveats for perfect technical accuracy tends to take away the punchiness that makes proverbs worth collecting.

"Don’t let anybody discourage you or tell you that intelligence doesn’t pay or that success in life has to be achieved through dishonesty or through sheer blind luck. That is not true. Real success is never accidental and real happiness cannot be found except by the honest use of your intelligence."

Ayn Rand

Too strong.

Nobody EVER got successful from luck? Not even people born billionaires or royalty?

Nobody can EVER be happy without using intelligence? Only if you're using some definition of happiness that includes a term like "Philosophical fulfillment" or some such, which makes the issue tautological.

"As a human being, you have no choice about the fact that you need a philosophy. Your only choice is whether you define your philosophy by a conscious, rational, disciplined process of thought and scrupulously logical deliberation—or let your subconscious accumulate a junk heap of unwarranted conclusions, false generalizations, undefined contradictions, undigested slogans, unidentified wishes, doubts and fears, thrown together by chance, but integrated by your subconscious into a kind of mongrel philosophy and fused into a single, solid weight: self-doubt, like a ball and chain in the place where your mind’s wings should have grown."

Ayn Rand

0Grif
You lost me at "junk heap." There is no conscious choice available to a layperson ignorant of philosophy and logic, and such ways of life are perfectly copacetic with small-enough communities. If anything, it is the careful thinker who is more shackled by self-doubt, better understood as the Dunning-Kruger effect, but Ayn Rand has made it obvious she never picked up any primary literature on cognitive science so it's not surprising to see her confusion here. Quote from 1971's The Romantic Manifesto.
6Luke_A_Somers
If the bolded pair of words were struck, I'd agree completely. Different people will have different balls and chains.
327chaos
A. False dichotomy - there are other choices. We might choose to compartmentalize our rationality, for example. B. False dichotomy in a different sense - we actually don't have access to this choice. No matter how hard we work, our brains are going to be biased and our philosophies are going to be sloppy. It's a question of making one's brain marginally more organized or less disorganized, not of jumping from insanity onto reason. I'm suspicious that working with the insanity and trying to guide its flow is a better strategy than trying to destroy it. C. Although not having a philosophy leaves us open to bias, having a philosophy can sometimes expose us to bias even further. It's about comparative advantage. Agnosticism has wiggle room that sometimes can be a place for bias to hide, but conversely ideology without self-doubt often serves to crush the truth.
0[anonymous]
1. False dichotomy - there are other choices than those. We might choose to compartmentalize our rationality, for example. 2. False dichotomy in the other direction - we don't have access to this choice. No matter how hard we work, our brains are going to be biased and our philosophies are going to be sloppy. It's a question of making one's brain marginally more organized or less disorganized, not of jumping from insanity onto reason.
2Gondolinian
Yes, I believe it is.

The only part of what you wrote that I object to is that emotions shouldn't interfere with cognition.

This is an ideal which Objectivists believe in, but it is difficult/impossible to actually achieve. I've noticed that as I've gotten older, emotions interfere with my cognition less and less and I am happy about that. You can define cognition how you wish, but given the number of people who see it as separate from emotion it's probably worth having a backup definition in case you want to talk to those people.

RE: emotions, affect, moods. I do think that emotions s... (read more)

If your hunches have a bad track record, then you should learn to ignore them, but if they do work, then ignoring them is irrational.

If your hunches have a good track record, I think you should explore that and come up with a rational explanation, and make sure it's not just a coincidence. Additionally, while following your hunches isn't inherently bad, rational people shouldn't be convinced of an argument merely based on somebody else's hunch.

Even if emotions are suboptimal tools in virtually all cases (which I find unlikely), that doesn't mean tha

... (read more)
4DanielLC
My explanation is that hunches are based on aggregate data that you are not capable of tracking explicitly. Hunches aren't scientific. They're not good for social things. Anyone can claim to have a hunch. That being said, if you trust someone to be honest, and you know the track record of their hunches, there's no less reason to trust their hunches than your own. I mean ignore the emotion for the purposes of coming up with a solution. Overconfidence bias causes you to take too many risks. Risk aversion causes you to take too few risks. I doubt they cancel each other out that well. It's probably for the best to get rid of both. But I'd bet that getting rid of just one of them, causing you to either consistently take too many risks or consistently take too few, would be worse than keeping both of them. Emotions are more about considering theories than finding them. That being said, you don't come up with theories all at once. Your emotions will be part of how you refine the theories, and they will be involved in training whatever heuristics you use. I'm certainly not arguing that rationality is entirely about emotion. Anything with a significant effect on your cognition should be strongly considered for rationality before you reject it. This looks like you're talking about terminal values. The utility function is not up for grabs. You can't convince a rational agent that your goals are worth achieving regardless of the method you use. Am I misunderstanding this comment?

Very interesting... it would seem that Rand doesn't actually define emotion consistently, that was not the definition I was using. But the Ayn Rand Lexicon has 11 different passages related to emotions.

http://aynrandlexicon.com/lexicon/emotions.html

2Jayson_Virissimo
More charitably we could say her conception of emotions evolved over time. Thanks for the link, I actually found some of that insightful. Also, I had forgotten how blank slatey her theory of mind was.

Rand doesn't deny that emotions are part of rationality, she denies that they are tools of rationality. It is rational to try to make yourself experience positive emotions, but to say "I have a good feeling about this" is not a rational statement, it's an emotional statement. It isn't something that should interfere with cognition.

As for emotions affecting human behavior, I think all mammals have emotions, so it's not easy for humans to discard them over a few generations of technological evolution. Emotions were useful in the ancestral environment; they are no longer as useful as they once were.

5DanielLC
If your hunches have a bad track record, then you should learn to ignore them, but if they do work, then ignoring them is irrational. Even if emotions are suboptimal tools in virtually all cases (which I find unlikely), that doesn't mean that ignoring them is a good idea. It's like how getting rid of overconfidence bias and risk aversion is good, but getting rid of overconfidence bias OR risk aversion is a terrible idea. Everything we've added since emotion was built around emotion. If emotion will give you an irrational bias, then you'll evolve a counter bias elsewhere.
1anandjeyahar
The only part of what you wrote that I object to is that emotions shouldn't interfere with cognition. I think they already are a part of cognition, and it's a bit like saying "quantum physics is weird". Perhaps you meant "emotions shouldn't interfere with rationality", in which case I'll observe that it doesn't seem to be a popular view around LessWrong. Also observe, I used to believe that emotions should be ignored, but later came to the conclusion that it's a way too heavy-handed strategy for the modern world of complex systems. I'll try to conjecture further by saying that cognitive psychologists tend to classify emotion, affect, and mood differently. AFAIK, it's based on the temporal duration each exists for, short to long in the order of emotion, mood, affect. My conjecture is that emotions can and should be ignored, mood can be ignored (but not necessarily should be), and affect should not be ignored, during rational decision-making.

How does the definition you link to contradict Rand's statement? You can acknowledge emotions as real while denying their usefulness in your cognitive process.

1DanielLC
The article I linked to wasn't just saying that emotions exist. It was saying that they're part of rationality. If emotions didn't make people behave rationally, people wouldn't have evolved to have them.
5Jayson_Virissimo
This seems to be in tension with what she has stated elsewhere. For instance: -- Ayn Rand, Philosophy: Who Needs It? Wouldn't immediately available estimates be a good tool of cognition?
8DanielLC
I beg to differ. Or are you saying that, if Ayn Rand says it, it must be wrong? In which case, I still disagree.

I don't know, but if you ask intelligent people what they think about x-risk related to AI it's unlikely they'll come to the exact same conclusions that MIRI etc have.

If you present the ideas of MIRI to intelligent people, some of them will be excited and want to help with donations or volunteering. Others will dismiss you and think you are wrong/crazy.

So to expand on my question... if you find intelligent people who disagree with MIRI on significant things, will you work with them?

OK, so it seems like FLI promotes the conclusions of other x-risk organizations, but doesn't do any actual research itself.

Do you think it's not worth questioning the conclusions that other organizations have come to? It seems to me that if there are four x-risk organizations (each with reasonably strong connections to the others), there should be some debate between them.

1Vika
What kind of questions would you expect the organizations to disagree about?

I cringe at the term x-risk.

Can you think of another five-letter description? The shorter the term, the easier it is for people to remember, and thus the faster the meme will spread compared to a longer term.

7A1987dM
What matters is not how many people will remember it, it's how many people will remember it and take it seriously.
8soreff
Can one use the backwards-E existence symbol as one of the letters?
2Lumifer
Well... Is x-risk what happens when x-men do x-rated x-treme stuff?

We consider ourselves a sister organization to FHI, CSER and MIRI, and touch base with them often

How would you differentiate yourself from those organizations?

MIRI is focusing on technical research into Friendly AI, and their recent mid-2014 strategic plan explicitly announced that they are leaving the public outreach and strategic research to FHI, CSER and FLI. Compared to FHI and CSER, we are less focused on research and more on outreach, which we are well-placed to do given our strong volunteer base and academic connections. Our location allows us to directly engage Harvard and MIT researchers in our brainstorming and decision-making.

What makes facts surprising, and is it the same for you and your conversation partner? If they take your question as an insult, you probably have different knowledge of the field in question (they could be completely wrong, or you could be ignorant of something that is well known within the field).

I think it's a better idea to explain why you're surprised.

Does CFAR feel developed enough that it would prefer money to feedback?

I.e., I presume there are many people out there who could help CFAR either by dedicating a few hours of their time to thinking about how to improve CFAR, or by earning money to donate to it.

I helped create CFAR, and work every day in the same office as they do, and I still need to talk with the co-founders for several hours before I understand enough detail about CFAR's challenges and opportunities to have advice that I'm decently confident will be useful rather than something they've already tried, or something they have a good reason for not doing, etc.

Having spent a fair amount of time around CFAR staff, in the office and out, I can testify to their almost unbelievable level of self-reflection and creativity. (I recall, several months ago, Julia joking about how much time in meetings was spent discussing the meetings themselves at various levels of meta.) For what it's worth, I can't think of an organization I'd trust to have a greater grasp on its own needs and resources. If they're pushing fundraising, I'd estimate with high confidence that it's because that's where the bottleneck is.

I think donating ... (read more)

I think CFAR feels poor enough to prefer money to feedback.

Also they've tried a lot of the obvious things - I had a conversation with Anna where I suggested about 10 things for CFAR to try, they'd already tried about 9, and the 10th wasn't obviously better than the stuff already on their list. Maybe you're smarter than me, though :)

The problem is that (according to Kahneman and Tversky) losses are felt more strongly than gains. So it requires a good deal of effort not to be offended.
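The asymmetry Kahneman and Tversky describe can be made concrete with their prospect-theory value function. A minimal sketch, using their published median parameter estimates (α = β = 0.88, λ = 2.25) purely for illustration:

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function (Tversky & Kahneman, 1992).

    Gains are evaluated as x**alpha; losses as -lam * (-x)**beta,
    where lam is the loss-aversion coefficient. With lam = 2.25,
    a loss "hurts" more than twice as much as an equal gain
    feels good.
    """
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** beta)

gain = prospect_value(100)    # ~57.6
loss = prospect_value(-100)   # ~-129.5
print(abs(loss) / gain)       # 2.25: the loss looms larger
```

With these parameters, being offended (a perceived loss of status) registers more strongly than an equivalent compliment, which is one way to cash out why staying un-offended takes effort.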
