Comment author: Robin 20 December 2016 01:44:31PM 0 points [-]

Interesting article. But I do not see how the article supports the claim its title makes.

I think there's a connection between bucket errors and Obsessive Compulsive Disorder.

Comment author: Robin 10 December 2016 04:47:26AM 1 point [-]

Is this an admission that CFAR cannot effectively help people with problems other than AI safety?

Comment author: Duncan_Sabien 09 December 2016 11:11:01PM 0 points [-]

That's an entirely defensible impression, but it's also actually false in practice (demonstrably so when you see us at workshops or larger events). Correcting the impression (which, again, you're justified in having) is a separate issue, but I consider the core complaint to be long since solved.

Comment author: Robin 10 December 2016 04:44:35AM 0 points [-]

I'm not sure what you mean and I'm not sure that I'd let a LWer falsify my hypothesis. There are clear systemic biases LWers have which are relatively apparent to outsiders. Ultimately I am not willing to pay CFAR to validate my claims and there are biases which emerge from people who are involved in CFAR whether as employees or people who take the courses (sunk cost as well as others).

Comment author: Duncan_Sabien 30 November 2016 09:45:03PM 4 points [-]

The word "better" is doing a lot of work (more successful? Lower cost?), but in my personal experience and the experience of CFAR as a rationality org, double crux looks like the best all-around bet. (1) is a social move that sacrifices progress for happiness, and double crux is at least promising insofar as it lets us make that tradeoff go away. (2) sort of ... is? ... what double crux is doing—moving the disagreement from something unresolvable to something where progress can be made. (3) is absolutely a good move if you're prioritizing social smoothness or happiness or whatever, but a death knell for anyone with reasons to care about the actual truth (such as those working on thorny, high-impact problems). (4) is anathema for the same reason as (3). And (presumably like you), we're holding (5) as a valuable-but-costly tool in the toolkit and resorting to it as rarely as possible.

I would bet $100 of my own money that nothing "rated as better than double crux for navigating disagreement by 30 randomly selected active LWers" comes along in the next five years, and CFAR as an org is betting on it with both our street cred and with our actual allotment of time and resources (so, value in the high five figures in US dollars?).

Comment author: Robin 09 December 2016 07:30:27PM 0 points [-]

I'd take your bet if it were for the general population, not LWers...

My issue with CFAR is it seems to be more focused on teaching a subset of people (LWers or people nearby in mindspace) how to communicate with each other than in teaching them how to communicate with people they are different from.

Comment author: Robin 30 November 2016 08:11:14PM *  0 points [-]

I think the Less Wrong website diminished in popularity because of the local meetups. Face to face conversation beats online conversation for most practical purposes. But many Less Wrongers have transitioned to being parents, or have found more professional success so I'm not sure how well the meetups are going now. Plus some of the meetups ban members rather than rationally explaining why they are not welcome in the group. This is a horrible tactic and causes members to limit how they express themselves... which goes against the whole purpose of rationality meetups.

Comment author: Robin 30 November 2016 08:05:30PM 2 points [-]

How much will you bet that there aren't better strategies for resolving disagreement?

Given the complexity of this strategy it seems to me like in most cases it is more effective to do some combination of the following:

1) Agree to disagree

2) Change the subject of disagreement

3) Find new friends who agree with you

4) Change your beliefs, not because you believe they are wrong but because other people believe they are wrong.

5) Violence (I don't advocate this in general, but in practice it's what humans have done throughout history when they disagree)

Comment author: Lumifer 07 July 2015 06:19:49PM 2 points [-]

I don't know (don't know of, even) any high-profile transgender people.

Bruce/Caitlyn Jenner.

Comment author: Robin 30 July 2015 10:43:45PM 0 points [-]

Comment author: Robin 26 May 2015 12:14:12AM 1 point [-]

The short answer (from somebody who went to college with Scott and took Calc II in the same class with him) is yes. But that's an answer relative to the students of an elite college, and only based on the fact that he asked me to work on math homework with him.

Comment author: Robin 26 May 2015 12:12:25AM 0 points [-]

I hope they've managed to advance past "if somebody criticizes your idea, ban them from the group!" because that's what happened to me after I criticized Comfort Zone Expansion.

Comment author: Robin 20 December 2014 06:02:40PM 7 points [-]

I am an intransigent atheist, but not a militant one. This means that I am an uncompromising advocate of reason and that I am fighting for reason, not against religion. I must also mention that I do respect religion in its philosophical aspects, in the sense that it represents an early form of philosophy.

Ayn Rand, to a Catholic Priest.
