Vaniver comments on CFAR in 2014: Continuing to climb out of the startup pit, heading toward a full prototype - Less Wrong

Post author: AnnaSalamon, 26 December 2014 03:33PM (61 points)




Comment author: [deleted] 28 December 2014 01:05:46PM, 9 points

CFAR seems to many of us to be among the efforts most worth investing in. This isn’t because our present workshops are all that great. Rather, it is because, in terms of “saving throws” one can buy for a humanity that may be navigating tricky situations in an unknown future, improvements to thinking skill seem to be one of the strongest and most robust.

Why? You tend to be marketing your workshops to people who've already got significant training in much of Traditional Rationality. In my view, much of the world's irrationality comes from people who have not even heard of the basics or people whose resource constraints do not allow them to apply what they know, or both. In this model, broad improvements in very fundamental, schoolchild-level rationality education and the alleviation of poverty and time poverty are much stronger prospects for improving the world through prevention of Dumb Moves than giving semi-advanced cognitive self-improvement workshops to the Silicon Valley elite.

Mind, if what you're really trying to do is propagandize the kind of worldview that leads to taking MIRI seriously, you rather ought to come out and say that.

Comment author: Vaniver 28 December 2014 06:00:17PM, 11 points

In this model, broad improvements in very fundamental, schoolchild-level rationality education and the alleviation of poverty and time poverty are much stronger prospects for improving the world through prevention of Dumb Moves than giving semi-advanced cognitive self-improvement workshops to the Silicon Valley elite.

So, I recently started training in the Alexander Technique, which is a well-developed school of thought and practice on how to use bodies well. It's been taught for about a century, and during the 1940s there was a brief attempt to teach it in schools to children.

My impression is that the children didn't get all that much out of it: yes, they had better posture, and the students who might otherwise have been klutzier were more coordinated. But the people who keep Alexander alive are mostly the performers and musicians and people with painful movement problems; that is, the sort of people who get enough value out of it that it makes sense for them to take special lessons and think about it in their off time and so on.

Similarly, it might be true that while there is a great mass of irrationality out there, cognitive labor, like any other labor, can be specialized; and so focusing your rationality training on people who specialize in thinking makes sense, just as focusing your movement training on people who specialize in movement makes sense. (Here I'm including speaking as movement for reasons that are anatomically obvious.)

But supposing your model is correct--that a broad rationality education would do the most good--I seem to recall hearing about an undergraduate-level rationality curriculum being developed by Keith Stanovich, a CFAR advisor, and I suspect Anna or others may know more details. Once we've got an undergraduate curriculum being taught, that should teach us enough to develop a high-school-level curriculum, and so on down to songs that can be sung in kindergarten.

Mind, if what you're really trying to do is propagandize the kind of worldview that leads to taking MIRI seriously, you rather ought to come out and say that.

Why? It seems to me that training people to think well is better, because if they end up disagreeing that gives you valuable information to update on.

Comment author: [deleted] 28 December 2014 09:00:33PM, 6 points

Similarly, it might be true that while there is a great mass of irrationality out there, cognitive labor, like any other labor, can be specialized- and so focusing your rationality training on people who specialize in thinking makes sense just as focusing your movement training on people who specialize in movement makes sense. (Here I'm including speaking as movement for reasons that are anatomically obvious.)

This would imply that CFAR should be pitching its workshops to academics and government policymakers. Not to be a dick, but the latest local-mobile-social app-kerjigger is not intensive cognitive labor with a high impact on the world. Actual scientific research and public policy-making are (or, at least, scientific research is fairly intensive cognitive labor... I wouldn't necessarily say it has a high mean impact on any per-unit basis).

Why? It seems to me that training people to think well is better, because if they end up disagreeing that gives you valuable information to update on.

I would hope so! But what information indicates CFAR does this?

But supposing your model is correct--that a broad rationality education would do the most good--I seem to recall hearing about an undergraduate-level rationality curriculum being developed by Keith Stanovich, a CFAR advisor, and I suspect Anna or others may know more details. Once we've got an undergraduate curriculum being taught, that should teach us enough to develop a high-school-level curriculum, and so on down to songs that can be sung in kindergarten.

That's good, but I worry that it doesn't go far enough. The issue is not that we're failing to teach probability theory to kindergartners -- they don't need it and don't want it. The issue is that our society allows people to walk around thinking that there isn't actually an external world to which their actions will be held accountable at all, and that subjective feeling both governs reality and normatively dictates correct actions.

To make an offensive political quip: there is the assertion-based community, and the reality-based community; too many people belong to the former and not nearly enough to the latter. The biggest impact we can have on "raising the sanity waterline" is to move people from the group who believe in a Fideist Theory of Truth ("Things are true by virtue of how I feel about them") to the group who believe in the Correspondence Theory of Truth ("Things are true when they match the world outside my head!"), which in turn inspires people to listen to educated domain experts at all.

To give a flagrantly stupid example, we really really really don't want society's way of dealing with the Friendly AI problem determined by people who believe that AIs have souls and would never harm anyone because they don't have original sin. Giving Silicon Valley executives effectiveness workshops will not avert this problem, while teaching the broad public the very basic worldview that the universe is lawful, rather than consciously optimizing for recognizably humanoid goals, is likely to affect this problem.

Comment author: Vaniver 29 December 2014 08:35:42AM, 8 points

This would imply that CFAR should be pitching its workshops to academics and government policymakers.

My understanding is that CFAR is attended by both present and likely future academics; I don't know about government policymakers. (I've met people on national advisory boards from at least two countries at CFAR workshops, but I don't pretend to know how much influence they have on those boards, or how much influence those boards have on actual policy.)

Not to be a dick, but the latest local-mobile-social app-kerjigger is not intensive cognitive labor with a high impact on the world.

At the time of writing this comment, there are 14 startups listed in the post. How many of them would you consider local-mobile-social apps? (This seems to be an example of "not to be X" signifying "I am aware this is being an X but would like to avoid paying the relevant penalty.")

I would hope so! But what information indicates CFAR does this?

I have always gotten the impression from them that they want to be as cause agnostic as is reasonable, but I can't speak to their probability estimates over time and thus how they've updated.

The biggest impact we can have on "raising the sanity waterline" is to move people from the group who believe in a Fideist Theory of Truth ("Things are true by virtue of how I feel about them") to people who believe in the Correspondence Theory of Truth ("Things are true when they match the world outside my head!"), which also thus inspires people to listen to educated domain experts at all.

Are there people working on a reproducible system to help people make this move? It's not at all obvious to me that this would be the comparative advantage of the people at CFAR. (Though it seems to me that much of the CFAR material is helping people finish making that transition, or, at least, get further along it.)