Kawoomba comments on CFAR’s new focus, and AI Safety - LessWrong

Post author: AnnaSalamon, 03 December 2016 06:09PM (30 points)

Comments (88)

Comment author: Kawoomba 09 December 2016 05:05:40PM 0 points

The catch-22 I would expect with CFAR's efforts is that anyone buying their services is already demonstrating a willingness to actually improve their rationality and epistemology, and is looking for effective tools to do so.

The bottleneck, however, is probably not the unavailability of such tools, but rather the capacity for introspection (or lack thereof) that produces a desire to actually pursue change, rather than merely to virtue-signal the typical "I always try to learn from my mistakes and improve my thinking".

The latter mindset is the one most urgently in need of actual improvement, but its bearers won't flock to CFAR unless it has gained acceptance as an institution one can virtue-signal with (which can confer status). While some universities manage to walk that line (providing status affirmation while actually conferring knowledge), CFAR's mode of operation would optimally entail "virtue-signalling ML students in on one side, rationality-improved ML students out on the other". That is a hard sell, since signalling an improvement in rationality will always be cheaper than the real thing, the difference being quite non-obvious to the uninitiated.

What remains is helping those who have already taken that most important step of effective self-reflection and are looking for further improvement. A laudable service to the community, but probably far from changing general attitudes in the field.

Taking off the black hat, I don't have a solution to this perceived conundrum.

Comment author: Lumifer 09 December 2016 06:49:58PM 0 points

The self-help industry (as well as, say, gyms or fat farms) mostly sells what I'd call "willpower assists" -- motivation and/or structure that will push you to do what you want to do but lack sufficient willpower for.

Comment author: MrMind 12 December 2016 08:20:22AM 0 points

To the extent that this is true, I would say that they are failing abysmally.

Comment author: Lumifer 12 December 2016 03:45:03PM 2 points

You're a bit confused: they are selling willpower assists, but what they want is money in exchange for them. They are not failing at collecting the money; as for the willpower assists turning out to be not quite as advertised, well, that's a standard caveat emptor issue.

Comment author: MrMind 13 December 2016 04:22:02PM 0 points

Ha, you're absolutely right!