No One Knows Stuff

7 talisman 12 May 2009 05:11AM

Take a second to go upvote You Are A Brain if you haven't already...

Back?  OK.

Liron's post reminded me of something that I meant to say a while ago.  Over the last five years I've given literally hundreds of job interviews to extremely high-powered technical undergraduates, and one thing has become painfully clear to me:  even very smart and accomplished and mathy people know nothing about rationality.

For instance, reasoning by expected utility, which you probably consider too basic to mention, is something they absolutely fall flat on.  Ask them why they choose as they do in simple gambles involving risk, and they stutter and mutter and fail.  Even the Econ majors.  Even--perhaps especially--the Putnam winners.
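To make concrete what "reasoning by expected utility in simple gambles" looks like, here's a minimal sketch in Python.  The numbers are illustrative, not from any actual interview question:

```python
def expected_utility(outcomes):
    """Expected value of a gamble given as (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

# A hypothetical choice: a sure $50, or a 50/50 shot at $120 or nothing.
sure_thing = [(1.0, 50)]
gamble = [(0.5, 120), (0.5, 0)]

print(expected_utility(sure_thing))  # 50.0
print(expected_utility(gamble))      # 60.0 -- higher expected value,
                                     # yet many people take the sure $50
```

The interesting interview question isn't the arithmetic, which anyone can do; it's whether the candidate can articulate *why* preferring the sure thing is or isn't rational (risk aversion as curvature of the utility function, not as a free-floating excuse).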

Of those who have learned about heuristics and biases, a nontrivial minority have gotten confused to the point that they offer Kahneman and Tversky's research as justifying their exhibition of a bias!

So foundational explanatory work like Liron's is really pivotal.  As I've touched on before, I think there's a huge amount to be done in organizing this material and making it approachable for people that don't have the basics.  Who's going to write the Intuitive Explanation of Utility Theory?

Meanwhile, I need to brush up on my Python and find a way to upvote Liron more than once.  If only...

Update: Tweaked language per suggestion, added Kahneman and Tversky link.

On the Fence? Major in CS

18 talisman 07 May 2009 04:26AM

I talk to many ABDs in math, physics, engineering, economics, and various other technical fields.

I work with exceptional people from all those backgrounds.

I would like to unreservedly say to any collegians out there, whether choosing an undergrad major or considering fields of study for grad school: if you know you want a technical major but you're not sure which, choose Computer Science.

Unless you're extremely talented and motivated, relative to your extremely talented and motivated peers, you probably aren't going to make a career in academia even if you want to.  And if you want a technical major but you're not sure which, you shouldn't want to!  Academia is a huge drag in many ways.  When a math ABD starts telling me about how she really likes her work but is sick of the slow pace and the fact that only six people in the world understand her work, I get to take a nice minute alone with my thoughts: I've heard it over and over again, in the same words and the same weary, beaten-down tone.  You shouldn't be considering a career in academia unless you're passionately in love with your field, unless you think about it in the shower and over lunch and as you drift off to sleep, unless the problem sets are a weekly joy.  A lesser love will abandon you and leave you stranded and heartbroken, four years into grad school.

What's so great about CS, then?  Isn't it just a bunch of glorified not-real-math and hundreds of hours of grimy debugging?

Let's start with several significant, but peripheral, reasons:

  • CS majors learn to really program.  There's an ocean of difference between the power of a decent, desultory programmer and that of a real programmer.  If you're not a programmer, the power of real programmers to create good stuff borders on magic. 
  • Not least among the good stuff is time.  It's disgraceful, the amount of human effort that goes into work that could be done by a Perl one-liner.
  • CS majors learn to be at home with the guts of computers.  This seems to come in handy in a hundred little ways.
  • CS majors are significantly more likely than other technical majors to get involved in startups, which are one of the best ways around to create wealth.
  • While having abstract and intellectual sides, the good kind of CS is strongly tied to the practical.
  • CS people can do fun side projects.   I've never heard of an engineer doing a bit of engineering on the side while working a management consulting job; with CS people it's rare that they don't have a little something cooking.
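
As an illustration of the "Perl one-liner" point above, here's the kind of few-line script (in Python, to match the post's other language) that replaces an afternoon of tallying numbers by hand.  The column names and data are hypothetical:

```python
import csv
import io

def total_column(csv_text, column):
    """Sum a numeric column from CSV text -- the kind of tally that
    otherwise eats hours of human effort."""
    return sum(float(row[column]) for row in csv.DictReader(io.StringIO(csv_text)))

data = "name,hours\nalice,3\nbob,4.5\n"
print(total_column(data, "hours"))  # 7.5
```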

None of that gets to my real point, which is the modes of thought that CS majors build.  Working with intransigent computer code for years upon years, the smart ones learn a deeply careful, modular, and reductionist mindset that transfers shockingly well to all kinds of systems-oriented thinking--

And most significantly to building and understanding human systems.  The questions they learn to ask about a codebase--"What invariants must this process satisfy?  What's the cleanest way to organize this structure?  How should these subsystems work together?"--are incredibly powerful when applied to a complex human process.  If I needed a CEO for my enterprise, not just my software company but my airline, my automaker, my restaurant chain, I would start by looking for candidates with a CS background.

You can see some of this relevance in the multitude of analogies CS people are able to apply to non-CS areas.  When's the last time you heard a math person refer to some real-world situation as "a real elliptic curve"?  The CS people I know have a rich vocabulary of cached concepts that address real-world situations: race conditions, interrupts, stacks, queues, bandwidth, latency, and many more that go over my head, because...

I didn't major in CS.  I saw it as too "applied," and went for more "elevated" areas.  I grew intellectually through that study, but I've upped my practical effectiveness enormously in the last few years by working with great CS people and absorbing all I can of their mindset.


Rational Groups Kick Ass

27 talisman 25 April 2009 02:37AM

Reply to: Extreme Rationality: It's Not That Great
Belaboring of: Rational Me Or We?
Related to: A Sense That More Is Possible

The success of Yvain's post threw me off completely.  My experience has been opposite to what he describes: x-rationality, which I've been working on since the mid-to-late nineties, has been centrally important to successes I've had in business and family life.  Yet the LessWrong community, which I greatly respect, broadly endorsed Yvain's argument that:

There seems to me to be approximately zero empirical evidence that x-rationality has a large effect on your practical success, and some anecdotal empirical evidence against it.

So that left me pondering what's different in my experience.  I've been working on these things longer than most, and am more skilled than many, but that seemed unlikely to be the key.

The difference, I now think, is that I've been lucky enough to spend huge amounts of time in deeply rationalist organizations and groups--the companies I've worked at, my marriage, my circle of friends.

And rational groups kick ass.

An individual can unpack free will or figure out that the Copenhagen interpretation is nonsense.  But I agree with Yvain that in a lonely rationalist's individual life, the extra oomph of x-rationality may well be drowned in the noise of all the other factors of success and failure.

But groups!  Groups magnify the importance of rational thinking tremendously:

  • Whereas a rational individual is still limited by her individual intelligence, creativity, and charisma, a rational group can promote the single best idea, leader, or method out of hundreds or thousands or millions. 
  • Groups have powerful feedback loops; small dysfunctions can grow into disaster by repeated reflection, and small positives can cascade into massive success.
  • In a particularly powerful feedback process, groups can select for and promote exceptional members.
  • Groups can establish rules/norms/patterns that 1) directly improve members and 2) counteract members' weaknesses.
  • Groups often operate in spaces where small differences are crucial.  Companies with slightly better risk management are currently preparing to dominate the financial space.  Countries with slightly more rational systems have generated the 0.5% of extra annual growth that leads, over centuries, to dramatically improved ways of life.  Even in family life, a bit more rationality can easily be the difference between gradual divergence and gradual convergence.
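
The compounding claim in the last bullet is easy to check directly (the 0.5% figure is the post's; the time horizon here is illustrative):

```python
# 0.5% of extra annual growth, sustained for two centuries,
# multiplies a country's output per person by roughly e.
edge = 1.005 ** 200
print(edge)  # ~2.71 -- nearly triple, from the growth edge alone
```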

And we're not even talking about the extra power of x-rationality.  Imagine a couple that truly understood Aumann, a company that grokked the Planning Fallacy, a polity that consistently tried Pulling the Rope Sideways.

When it comes to groups--sized from two to a billion--Yvain couldn't be more wrong.


Two Blegs

4 talisman 26 March 2009 04:42AM
I'm not sure where to post this, so, using this comment thread as cover, I will hereby bleg for the following:
  • A good OB-level proof or explication of the innards of Aumann's theorem, much more precise than Hanson and Cowen's but less painful than Aumann's original or this other one.
  • Stories of how people have busted open questions or controversies using rationalist tools. (I think this in particular will be useful to learners.)

Playing Video Games In Shuffle Mode

17 talisman 23 March 2009 11:59AM

One of the missions of OB/LW is to attract new learners, and it's clear that they are succeeding.  But the format is a difficult one for those new to these ideas: beginner-level material is interspersed with advanced or unsettled theory and meta-level discussion.  You wouldn't play <insert cool-sounding, anime-ish video game here> with the levels on shuffle mode, but that's what reading Less Wrong must feel like to a newcomer.

How do we make the site better for learners?  Provide a "syllabus" that shows a series of OB and LW posts which should be read in order?  Have a separate beginner site or feed or header?  Put labels on posts that designate them with a level?