My Way

31 Eliezer_Yudkowsky 17 April 2009 01:25AM

Previously in series: Bayesians vs. Barbarians
Followup to: Of Gender and Rationality, Beware of Other-Optimizing

There is no such thing as masculine probability theory or feminine decision theory.  In their pure form, the maths probably aren't even human.  But the human practice of rationality—the arts associated with, for example, motivating yourself, or compensating factors applied to overcome your own biases—these things can in principle differ from gender to gender, or from person to person.

My attention was first drawn to this possibility of individual differences in optimization (in general) by thinking about rationality and gender (in particular).  I've written rather more fiction than I've ever finished and published, including a story in which the main character, who happens to be the most rational person around, happens to be female.  I experienced no particular difficulty in writing a female character who happened to be a rationalist.  But she was not an obtrusive, explicit rationalist.  She was not Jeffreyssai.

And it occurred to me that I could not imagine how to write Jeffreyssai as a woman; his way of teaching is paternal, not maternal.  Even more, it occurred to me that in my writing there are women who are highly rational (on their way to other goals) but not women who are rationalists (as their primary, explicit role in the story).

It was at this point that I realized how much of my own take on rationality was specifically male, which hinted in turn that even more of it might be specifically Eliezer Yudkowsky.

Mechanics without wrenches

33 PhilGoetz 15 April 2009 08:09PM

Say you're taking your car to an auto mechanic for repairs.  You've been told he's the best mechanic in town.  The mechanic rolls up the steel garage door before driving the car into the garage, and you look inside and notice something funny.  There are no tools.  The garage is bare - just an empty concrete space with four bay doors and three other cars.

You point this out to the mechanic.  He shrugs it off, saying, "This is how I've always worked.  I'm just that good.  You were lucky I had an opening; I'm usually booked."  And you believe him, having seen the parking lot full of cars waiting to be repaired.

You take your car to another mechanic in the same town.  He, too, has no tools in his garage.  You visit all the mechanics in town, and find a few that have some wrenches, and others with a jack or an air compressor, but no one with a full set of tools.

You notice the streets are nearly empty besides your car.  Most of the cars in town seem to be in for repairs.  You talk to the townsfolk, and they tell you how they take their cars from one shop to another, hoping to someday find the mechanic who is brilliant and gifted enough to fix their car.

I sometimes tell people how I believe that governments should not be documents, but semi-autonomous computer programs.  I have a story that I'm not going to tell now, about incorporating inequalities into laws, then incorporating functions into them, then feedback loops, then statistical measures, then learning mechanisms, on up to the point where voters and/or legislatures set only the values that control the system, and the system produces the low-level laws and policy decisions (in a way that balances exploration and exploitation).  (Robin's futarchy in which you "vote on values, bet on beliefs" describes a similar, though less-automated system of government.)
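The control-loop idea can be caricatured in a few lines of Python. Everything below is invented for illustration (the target value, the measurement function, the gains); it is a toy sketch of "voters set only the values, the system tunes the policy," with a crude exploration/exploitation split, not a proposal.

```python
import random

random.seed(0)

# Voters set only the target value; the system tunes the policy level.
TARGET_OUTCOME = 50.0          # the "value" voters choose
EPSILON = 0.1                  # fraction of steps spent exploring

def measured_outcome(policy_level):
    """Hypothetical noisy measurement of how a policy performs."""
    return policy_level * 0.8 + random.gauss(0, 1)

policy = 10.0
for _ in range(200):
    if random.random() < EPSILON:
        trial = policy + random.uniform(-5, 5)   # explore nearby policies
    else:
        trial = policy                           # exploit the current policy
    error = TARGET_OUTCOME - measured_outcome(trial)
    policy = trial + 0.1 * error                 # feedback correction

print(round(policy, 1))  # typically settles near 62.5, where 0.8 * policy = 50
```

The fixed point is where the measured outcome matches the target, not where the policy number equals it; that gap between the value voters set and the low-level policy the system finds is exactly the division of labor the paragraph above describes.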

And one reaction - actually, one of the most intelligent reactions - is, "But then... legislators would have to understand something about math."  As if that were a bug, and not a feature.

Awful Austrians

34 Swimmy 12 April 2009 06:06AM

Response to: The uniquely awful example of theism

Why is theism such an ever-present example of irrationality in this community? I think ciphergoth overstates the case. Even theism is not completely immune to evidence, as the acceptance of, say, evolution by so many denominations over time will testify. Theism is a useful whipping boy because it needs no introduction.

But I think the case is overstated for another reason. There are terrible epistemologies out there that are just as bad as theism's. Allow me to tell you a tale, of how I gave up my religion and my association with a school of economics at the same time.

I grew up in a southern Presbyterian church in the U.S. While I was taught standard pseudo-evidential defenses for belief, such as "creation science" and standard critiques of evolution, my church was stringently anti-evidentialist. Their preferred apologetic was something called presuppositionalism. It's certainly a minority apologetic among major defenders of Christianity today, especially compared to the cosmological or morality arguments. But it's a particularly rigorous attempt to defend beliefs against evidence nonetheless.

Presuppositionalism (in some forms) hangs on the problem of induction. We cannot ultimately justify any of our beliefs without first making some assumptions, otherwise we end in solipsism. Christianity, then, justifies itself not on evidence, but on internal consistency. It is ok for an argument to be ultimately circular, because all arguments are ultimately circular. Christianity alone maintains perfect worldview consistency when examined through this lens, and is therefore correct.

Since I've spent a lot of time thinking about this--it can take a considerable effort to change one's mind, after all--I can imagine innumerable things wrong with it, but they're not the focus of this entry. First, I just want to note how close it is to a kind of intro-level Bayesian understanding. Bayesians admit that we must have priors, that it's indeed nonsense to think we can even have an argument with one who doesn't. We must ultimately admit that certain justifications are going to be either recursive or based on priors. We believe that we should update our priors based on evidence, but there's nothing in the math that tells us we can't start with a prior for some position of 0% or 100%. (There is something in the math that tells us such probability assignments are very bad ideas, and we have more than enough cognitive bias literature that tells us we shouldn't be so damn overconfident. But then, what if you have a prior that keeps you from accepting such evidence?) It doesn't have any of the mathematical rigor, but it comes very close on a few major points.
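The point about extreme priors is easy to make concrete. A minimal sketch (the likelihood numbers are invented for the example): under Bayes' rule, a moderate prior moves with the evidence, but a prior of exactly 0 or 1 never moves at all.

```python
def update(prior, likelihood_h, likelihood_not_h):
    """Posterior P(H|E) via Bayes' rule."""
    numerator = prior * likelihood_h
    denominator = numerator + (1 - prior) * likelihood_not_h
    return numerator / denominator if denominator else prior

# A moderate prior moves with the evidence:
p = 0.5
for _ in range(3):
    p = update(p, 0.9, 0.1)   # evidence 9x more likely under H
print(round(p, 4))  # -> 0.9986

# A prior of exactly 1.0 (or 0.0) is immune to any evidence:
q = 1.0
for _ in range(3):
    q = update(q, 0.1, 0.9)   # strong evidence against H
print(q)  # -> 1.0
```

This is the mathematical version of a worldview that "keeps you from accepting such evidence": a probability assignment of 0% or 100% multiplies every likelihood ratio into itself.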

Mandatory Secret Identities

28 Eliezer_Yudkowsky 08 April 2009 06:10PM

Previously in series: Whining-Based Communities

"But there is a reason why many of my students have achieved great things; and by that I do not mean high rank in the Bayesian Conspiracy.  I expected much of them, and they came to expect much of themselves." —Jeffreyssai

Among the failure modes of martial arts dojos, I suspect, is that a sufficiently dedicated martial arts student will dream of...

...becoming a teacher and having their own martial arts dojo someday.

To see what's wrong with this, imagine going to a class on literary criticism, falling in love with it, and dreaming of someday becoming a famous literary critic just like your professor, but never actually writing anything.  Writers tend to look down on literary critics' understanding of the art form itself, for just this reason.  (Orson Scott Card uses the analogy of a wine critic who listens to a wine-taster saying "This wine has a great bouquet", and goes off to tell their students "You've got to make sure your wine has a great bouquet".  When the student asks, "How?  Does it have anything to do with grapes?" the critic replies disdainfully, "That's for grape-growers!  I teach wine.")

Similarly, I propose, no student of rationality should study with the purpose of becoming a rationality instructor in turn.  You do that on Sundays, or full-time after you retire.

And to place a go stone blocking this failure mode, I propose a requirement that all rationality instructors must have secret identities.  They must have a life outside the Bayesian Conspiracy, which would be worthy of respect even if they were not rationality instructors.  And to enforce this, I suggest the rule:

  Rationality_Respect1(Instructor) = min(Rationality_Respect0(Instructor), Non_Rationality_Respect0(Instructor))

That is, you can't respect someone as a rationality instructor more than you would respect them if they were not a rationality instructor.

Purchase Fuzzies and Utilons Separately

75 Eliezer_Yudkowsky 01 April 2009 09:51AM

Previously in series: Money: The Unit of Caring

Yesterday:

There is this very, very old puzzle/observation in economics about the lawyer who spends an hour volunteering at the soup kitchen, instead of working an extra hour and donating the money to hire someone...

If the lawyer needs to work an hour at the soup kitchen to keep himself motivated and remind himself why he's doing what he's doing, that's fine.  But he should also be donating some of the hours he worked at the office, because that is the power of professional specialization and it is how grownups really get things done.  One might consider the check as buying the right to volunteer at the soup kitchen, or validating the time spent at the soup kitchen.

I hold open doors for little old ladies.  I can't actually remember the last time this happened literally (though I'm sure it has, sometime in the last year or so).  But within the last month, say, I was out on a walk and discovered a station wagon parked in a driveway with its trunk completely open, giving full access to the car's interior.  I looked in to see if there were packages being taken out, but this was not so.  I looked around to see if anyone was doing anything with the car.  And finally I went up to the house and knocked, then rang the bell.  And yes, the trunk had been accidentally left open.

Under other circumstances, this would be a simple act of altruism, which might signify true concern for another's welfare, or fear of guilt for inaction, or a desire to signal trustworthiness to oneself or others, or finding altruism pleasurable.  I think that these are all perfectly legitimate motives, by the way; I might give bonus points for the first, but I wouldn't deduct any penalty points for the others.  Just so long as people get helped.

But in my own case, since I already work in the nonprofit sector, the further question arises as to whether I could have better employed the same sixty seconds in a more specialized way, to bring greater benefit to others.  That is: can I really defend this as the best use of my time, given the other things I claim to believe?

Money: The Unit of Caring

95 Eliezer_Yudkowsky 31 March 2009 12:35PM

Previously in series: Helpless Individuals

Steve Omohundro has suggested a folk theorem to the effect that, within the interior of any approximately rational, self-modifying agent, the marginal benefit of investing additional resources in anything ought to be about equal.  Or, to put it a bit more exactly, shifting a unit of resource between any two tasks should produce no increase in expected utility, relative to the agent's utility function and its probabilistic expectations about its own algorithms.

This resource balance principle implies that—over a very wide range of approximately rational systems, including even the interior of a self-modifying mind—there will exist some common currency of expected utilons, by which everything worth doing can be measured.
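A toy illustration of the balance principle (the two utility functions here are invented, chosen only for their diminishing returns): a greedy optimizer that always feeds the task with the higher marginal utility ends up with the margins approximately equal, which is what lets a single "currency of expected utilons" price both tasks.

```python
import math

# Two tasks with diminishing returns (invented utility functions):
utilities = [lambda x: math.log(1 + x), lambda x: 2 * math.log(1 + x)]

def marginal(u, x, eps=1e-6):
    """Numerical marginal utility of one more unit of resource."""
    return (u(x + eps) - u(x)) / eps

# Greedily allocate 100 units, 0.1 at a time, to whichever task
# currently offers the higher marginal utility:
alloc = [0.0, 0.0]
for _ in range(1000):
    margins = [marginal(u, x) for u, x in zip(utilities, alloc)]
    alloc[margins.index(max(margins))] += 0.1

# At the optimum the marginal utilities are (approximately) equal:
m0, m1 = (marginal(u, x) for u, x in zip(utilities, alloc))
print(round(m0, 3), round(m1, 3))
```

For these particular functions the optimum lands near a 33/67 split; the second task gets more resources, but at the margin one more unit is worth the same in either place.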

In our society, this common currency of expected utilons is called "money".  It is the measure of how much society cares about something.

This is a brutal yet obvious point, which many are motivated to deny.

With this audience, I hope, I can simply state it and move on.  It's not as if you thought "society" was intelligent, benevolent, and sane up until this point, right?

I say this to make a certain point held in common across many good causes.  Any charitable institution you've ever had a kind word for, certainly wishes you would appreciate this point, whether or not they've ever said anything out loud.  For I have listened to others in the nonprofit world, and I know that I am not speaking only for myself here...

Raising the Sanity Waterline

112 Eliezer_Yudkowsky 12 March 2009 04:28AM

To paraphrase the Black Belt Bayesian:  Behind every exciting, dramatic failure, there is a more important story about a larger and less dramatic failure that made the first failure possible.

If every trace of religion was magically eliminated from the world tomorrow, then—however much improved the lives of many people would be—we would not even have come close to solving the larger failures of sanity that made religion possible in the first place.

We have good cause to spend some of our efforts on trying to eliminate religion directly, because it is a direct problem.  But religion also serves the function of an asphyxiated canary in a coal mine—religion is a sign, a symptom, of larger problems that don't go away just because someone loses their religion.

Consider this thought experiment—what could you teach people that is not directly about religion, which is true and useful as a general method of rationality, which would cause them to lose their religions?  In fact—imagine that we're going to go and survey all your students five years later, and see how many of them have lost their religions compared to a control group; if you make the slightest move at fighting religion directly, you will invalidate the experiment.  You may not make a single mention of religion or any religious belief in your classroom, you may not even hint at it in any obvious way.  All your examples must center on real-world cases that have nothing to do with religion.

If you can't fight religion directly, what do you teach that raises the general waterline of sanity to the point that religion goes underwater?
