Lumifer comments on Why CFAR? The view from 2015 - Less Wrong

46 Post author: PeteMichaud 23 December 2015 10:46PM


Comment author: Lumifer 18 December 2015 03:49:11PM *  6 points [-]

Couple of notes...

We created a metric for strategic usefulness

What is that metric?

But it seems to many of us that there is a kind of “deep epistemic rationality” that doesn’t change one’s goals, but does help one make actual contact with the deep caring that already exists within a person.

I think this is a dangerous path to take. If you stay on it, I suspect that soon enough you'll come to the conclusion that absence of appropriate "caring" is irrational and should be fixed. And from there it's only a short jump and a hop to declaring that just those people who share your value system are rational. That would be an... unfortunate position for you to find yourselves in.

Comment author: taygetea 20 December 2015 07:40:44PM 0 points [-]

I could very well be in the grip of the same problem (and I'd think the same if I was), but it looks like CFAR's methods are antifragile to this sort of failure. Especially considering the metaethical generality and well-executed distancing from LW in CFAR's content.

Comment author: Lumifer 20 December 2015 10:05:47PM 1 point [-]

CFAR's methods are antifragile

What does that mean?

Comment author: FeepingCreature 25 December 2015 12:55:13AM 0 points [-]

(The term is from Taleb's book Antifragile.)

Basically, systems that can improve from damage.

Comment author: ChristianKl 25 December 2015 07:39:07PM 0 points [-]

Basically, systems that can improve from damage.

The question isn't about what the word means in general but in what way CFAR's methods are supposedly antifragile.

Comment author: Lumifer 28 December 2015 04:01:01PM 1 point [-]

I know what the word means, kinda (Taleb isn't particularly coherent). I don't understand how CFAR's methods can improve from damage.

Comment author: ChristianKl 20 December 2015 09:02:40PM 0 points [-]

If I understand the position correctly, it's that people who don't care about what they are working on won't work effectively and will procrastinate.

In HPMOR, Harry beats Voldemort because he has access to the superpower of caring and having something to protect.

I don't think there is a push to declare people who feel that they have the "wrong" things as something to protect irrational. Even if there were such a push, the goal of research in rationality isn't to label people as rational or irrational.

Comment author: Lumifer 20 December 2015 10:07:53PM 2 points [-]

If I understand the position correctly, it's that people who don't care about what they are working on won't work effectively and will procrastinate.

No, I don't think so. I think the "deep caring" CFAR talks about is only a particular kind of caring.

Let's get some people who very deeply care about money and large diamonds and arriving at parties in Monaco on their own private jet (or a superyacht, at least). They do care. Just not about the right thing.

Comment author: ChristianKl 21 December 2015 11:01:06AM 1 point [-]

Let's get some people who very deeply care about money and large diamonds and arriving at parties in Monaco on their own private jet (or a superyacht, at least). They do care.

Recently I had a conversation with a person from the LW sphere who felt a bit empty. At the end of that conversation, the solution was that the person committed to making a plan to increase their professional skills and earn more money in the future.

I think that kind of caring is completely fine and I wouldn't expect anybody in CFAR to object to that solution.

As for the people who go to Monaco with their private jet, they do care. With the "rationality is winning" frame, you would also call people who earn enough money to have a private jet rational. A person who doesn't care deeply about money won't work 80 hours per week at an investment bank.

there is a kind of “deep epistemic rationality” that doesn’t change one’s goals, but does help one make actual contact with the deep caring that already exists within a person

That's not about saying that people's goals are wrong, but about getting them to actually work towards their goals instead of suffering from akrasia.

Comment author: malcolmocean 21 December 2015 11:15:25AM 1 point [-]

We created a metric for strategic usefulness

What is that metric?

It wouldn't surprise me if they didn't want to publish it because some aspects of the measure might be gameable, allowing people to pretend to be super useful by guessing the teacher's password.

Comment author: ChristianKl 21 December 2015 12:18:48PM 1 point [-]

Given that they use it to justify the claim that CFAR made progress in the last year, it seems that the relevant people already know the metric.