Hey everybody, let's post our self-perceptions regarding our leadership abilities so that Eliezer can get some feedback as to his expectations!
As for me, I couldn't lead a starving man to a buffet.
That people didn't perform an altruistic service to the organizers by clearing the building isn't very surprising. I find the question of which restaurant/venue/entertainment a group should go to much more interesting. Often everyone in a group will repeatedly defer, until deadweight costs in time eat away a hefty chunk of the benefit of the excursion. That could be a genuine public-goods problem, or just a market failure to capture a private good.
Whoever has the strongest frame, whoever has the most certainty of their reality, wins. If your voice had wavered, if you had looked back in doubt to see if the people were actually following you, they would have been less willing to comply. But because you assumed authority, they assumed you had it and felt pressure to conform. It's one of those beautiful and counterintuitive things in life, but not a surprising one.
Eliezer,
Your emphasis on leadership in this context seems strange: it was in no one's interest to leave, so the biased decision was following you, not the hesitation to step up and lead others outside.
By that time everyone knew it was time to leave, they had seen the lights repeatedly dimmed, but they were comfortable in the hall, and as long as no individual could be blamed for the antisocial act of staying, they would do so. Nevertheless their discomfort level was rising. Your action precipitated the decision, like seeding a supersaturated solution precipitates crystallisation. It's another example of an unstable group equilibrium just waiting to be disturbed, like the lonely dissenter in a group where the majority have private doubts. If the lights hadn't previously been repeatedly dimmed, the group might well not have followed you.
OC,
How would you define 'arbitrary belief'? The phrase brings to mind a 'belief' that the next toss of two fair dice will come up snake eyes.
I think everybody realizes that many of Eliezer's posts have little to do with overcoming bias. But they're very interesting anyway: compare the number of comments on his posts with the number on other posts. So it doesn't seem very fair to attack him for that reason.
Rationality for the sake of rationality ends up as an arbitrary morality. If you do not have something more important in your eyes than "being rational", then you cannot learn from your success or failure; you will simply keep whatever definition of "rationality" you started with, because it is more important to you to be "rational" than to succeed.
No one puts a desperate effort (isshoukenmei) into rationality unless more than their own life is at stake. Leaving the pack is scarier than risking your life; that is why there are more motorcycle riders than rationalists.
I therefore decline to entertain the notion that I should provide you with some kind of idealized pure rationality which does not mention the goals that drive me as a rationalist. A pure desire to be "rational" is not the source of the drive to throw away old conceptions of "rationality" and invent better ones.
OC: Eliezer, enough with your nonsense about cryonicism, life-extensionism, trans-humanism, and the singularity. These things have nothing to do with overcoming bias. They are just your arbitrary beliefs.
I guess it's the other way around: the point of most of the questions Eliezer raises is to take a debiased look at controversial issues such as those you list, in the hope of building a solid case for sensible versions of them. For example, existing articles can point out fallacies in your assertions: you assume cryonics, etc. to be separate magisteria outside the domain of rationality, and you argue from the apparent absurdity of these issues.
My neighbor's daughter believes Santa Claus delivered her presents last week.
Considering that everyone she knows will tell her that Santa Claus did indeed deliver those presents, that she can meet Santa Claus (or his representative) in person at a local mall, and that his travel is tracked in real time by NORAD, I'd say your neighbor's daughter's belief is quite reasonable.
Your average 6-year-old is presented with far more evidence in favor of the Santa Claus hypothesis than the average adult is given in favor of the hypothesis that Jesus rose from the dead or that Lee Harvey Oswald acted alone...
If you pursue rationality as a means to some end, you will reject it if it ever comes into conflict with your deeper priority.
Never happens unless you get your concept of "rationality" wrong. How exactly does knowingly being irrational actually help you to build a space shuttle? Or a steam shovel? Or cure a disease? Only people with mistaken ideas about "rationality" will find themselves believing that "irrationality" will help.
In like sense, you are far better off wanting to know whether the sky is blue or green, than wanting to know The Truth About The Sky. "Truth" and "rationality" only work as top-level goals if you phrase them exactly correctly on the first try; otherwise you're a lot better off as a rationalist if your real goal is to build a steam shovel, because then you can notice what works and what doesn't.
This is why progress since the ancient Greeks has been driven by empirical discovery rather than philosophy. Philosophy just compares the current conception of "rationality" to itself.
See Why truth?, Doublethink, and the nameless virtue.
Caledonian, I think you are confusing goals with truths. If the truth is that your goal consists of certain things, rationality doesn't oppose it in any way. It is merely a tool that optimizes performance, not an arbitrary moral constraint.
"Only people with mistaken ideas about "rationality" will find themselves believing that "irrationality" will help."
Conjecture: Take two arbitrary, identical optimization processes. Pick some verifiable piece of information A. Give the first process the belief "A", and the second process the belief "~A". The average end utility of the first process will always be equal to or greater than the average end utility of the second process.
Attempted disproof: Suppose that both processes were programmed by naive human rationalists, who give the processes the supergoal of correcting mistaken beliefs. Also suppose that the universe is finite, and has a finite complexity. Both processes will improve their models of the universe, in accordance with the laws of rationality, until they eventually reach an asymptote due to the finite amount of modelable information. Because the utility is determined by the amount of new information that had to be added to the model, the second process will have a higher end utility, as it had to add the information "A" to correct the false information "~A".
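A minimal sketch may make the attempted disproof concrete. Everything here is an illustrative assumption on my part, not anything from the conjecture itself: a tiny "universe" of three facts, and a supergoal that awards one utilon per mistaken (or missing) belief corrected.

```python
# Toy model: two identical "optimization processes" whose supergoal
# rewards them one utilon for each belief they have to correct.

TRUE_FACTS = {"A": True, "B": True, "C": False}  # the finite universe

def final_utility(initial_beliefs):
    beliefs = dict(initial_beliefs)
    utility = 0
    for fact, truth in TRUE_FACTS.items():
        if beliefs.get(fact) != truth:  # belief is mistaken or absent
            beliefs[fact] = truth       # observe the universe; correct it
            utility += 1                # reward for making a correction
    return utility

print(final_utility({"A": True}))   # 2: had to learn B and C
print(final_utility({"A": False}))  # 3: had to learn B and C *and* unlearn ~A
```

On these (assumed) terms, the process seeded with the false belief really does end with higher utility, precisely because it was rewarded for having been wrong.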
Caledonian,
Why not accompany your assertions of irrationality and unexamined views with some substantive supporting arguments and create your own anonymous blog?
Nobody followed my lead. I guess that means my belief about my leadership abilities is well-founded!
Tom: What actually happens under your scenario is that the naive human rationalists frantically try to undo their work when they realize that the optimization processes keep reprogramming themselves to adopt the mistaken beliefs that are easiest to correct. :D
"Only people with mistaken ideas about "rationality" will find themselves believing that "irrationality" will help."
Because of computational restrictions, humans can easily maintain two mutually contradictory beliefs, as long as those beliefs are never allowed to interact with each other or with their necessary consequences.
Imagine a person who has the goal of avoiding the necessity of recognizing that two of their beliefs are incompatible. This goal cannot be explicitly stated, because if the person recognizes that they're avoiding the realization, they have automatically failed. They MUST behave irrationally in order to meet their goal. The sub-module of their mind directly connected with this goal is behaving rationally in drawing the rest of the mind into delusion and madness, because that is the only way the goal can be met, but the whole mind can only be described as acting contrary to reason.
I can recognize this truth, that the goal in question can only be reached through irrational thinking. It's an insane, irrational goal, and only an irrational person would desire it.
Only a person with mistaken ideas about 'rationality' would make the claim you just did. Rationality is not compatible with all possible goals, and irrationality is a necessary precondition for some goals to be met. Your denial of this reality is... irrational.
Eliezer Yudkowsky, if you're going to enforce the comments policy, then you should also self-enforce the Overcoming Bias posting policy instead of using posts to blithely proselytize your cryonicism/life-extensionism/trans-humanism/singularity religion.
OC, I'm not clear on what stated Overcoming Bias policy you could possibly be referring to. However, I will schedule a post for tomorrow (Sunday) on this topic, which, I confess, seems entirely unrelated to the subject of this particular post.
Each individual conference guest had the opportunity to do something which was For The Greater Good [lead people out], with some personal risk [drawing attention to himself, embarrassment in the unlikely event that nobody follows him] and very little to personally gain [the chance to not be in a building he probably preferred to be in].
A better question is "why didn't the organiser do what you did?" And my guess is that he found the embarrassment factor high, high enough to override the gain [he was one of the few people who did stand to gain by doing what you did].
In which case, everyone acted perfectly rationally, assuming you treat aversion to embarrassment as rational. You're to be saluted for not being especially averse to embarrassment, but I don't think this is a question of rationality or of leadership.
Side point: +2 awesomeness for "FOLLOW ME... TO FREEDOM!"
Reading the comments, I saw the thread somehow turned into a discussion of whether or not Eliezer Yudkowsky shows biased favor toward cryonics, transhumanism, etc. I didn't read far enough to see anyone hurl accusations of Nazism or Hitler-likeness, but I'll weigh in and say that I'm new to LessWrong, enjoy a good number of Eliezer's articles, and find them to be good tools for learning clear thought; I also have almost no familiarity with any of his theories (or opinions, as it may be) that fall outside the scope of heuristics, fallacies, statistics, or decision theory. So far I've only managed to read a smattering of Bayesian statistics and Feynman (still struggling with both), but I would consider the whole thing a wasted effort if I elevated any human to a level beyond question. If I read Eliezer's articles on the Affect Heuristic and think "I'll just accept this as true because Mr. Yudkowsky says it's true. Phew! Thank goodness someone smarter than me did all that heavy thinking for me," then CLEARLY I need to reread it.
EY, would you believe that my social life has greatly benefited from my pulling exactly this kind of stunt on a regular basis, and that HPJEV's example was a huge influence towards that?
Also, from now on, I'll be posting under my own name on LW.
Counterpoint to your post: the nail that sticks out gets hammered down. Leading and standing out means painting oneself as a target, so one needs to make sure that one's followers have one's back when it really counts. Even then, it might not be enough.
It's regrettable that so little of the conversation in the comments was about the post itself, because (1) I have found such discussions under other posts to be very insightful, and (2) I was disappointed that so few of the comments here were useful.
As someone else mentioned, friends arguing about where to have dinner often eat up significant amounts of time. Based on my personal experience, people will often feel grateful if you take responsibility, because they care less about the meal itself than about not infringing on others' meal preferences. Thankfully, if they are respectful enough to defer like that, they will probably also be open to someone else settling the question.
More generally, to supplement Yudkowsky's argument that people should be more willing to stand up, here are a few ways to help you do it:
- Make a quick estimate of expected utility (see the sketch below): little to no ill consequence if you fail, and a big payoff if you succeed.
- The payoff is not only in actually achieving something or in acquiring social status, but also in knowing your reasoning was stronger than your instinct.
- It will benefit other people greatly. Do it for them!
- It will make it easier to make this kind of decision in the future.
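For the expected-utility bullet, a back-of-the-envelope version might look like the following. All of the numbers are made up purely to illustrate the shape of the estimate:

```python
# Hypothetical payoffs for standing up in front of the crowd.
p_follow = 0.8             # your guess that the crowd follows you
gain_if_followed = 10.0    # event ends on time, small status boost
loss_if_ignored = -2.0     # a moment of embarrassment
cost_of_waiting = -5.0     # everyone's time keeps draining away

ev_stand_up = p_follow * gain_if_followed + (1 - p_follow) * loss_if_ignored
print(ev_stand_up, "vs", cost_of_waiting)  # 7.6 vs -5.0: standing up wins
```

Even with a much lower chance of being followed, the asymmetry between a brief embarrassment and everyone's wasted time tends to favor acting.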
Followup to: Lonely Dissent
True story: In July, I attended a certain Silicon Valley event. I was not an organizer, or a speaker, or in any other wise involved on an official level; just an attendee. It was an evening event, and after the main presentations were done, much of the audience hung around talking... and talking... and talking... Finally the event organizer began dimming the lights and turning them back up again. And the crowd still stayed; no one left. So the organizer dimmed the lights and turned them up some more. And lo, the people continued talking.
I walked over to the event organizer, standing by the light switches, and said, "Are you hinting for people to leave?" And he said, "Yes. In fact [the host company] says we've got to get out of here now - the building needs to close down."
I nodded.
I walked over to the exit.
I shouted, "LISTEN UP, EVERYONE! WE'VE GOT TO GO! OUR TIME HERE HAS PASSED! YOU CAN TALK OUTSIDE IF YOU LIKE! NOW FOLLOW ME... TO FREEDOM!"
I turned.
I marched out the door.
And everyone followed.
I expect there were at least two or three CEOs in that Silicon Valley crowd. It didn't lack for potential leaders. Why was it left to me to lead the CEOs to freedom?
Well, what was in it for them to perform that service to the group? It wasn't their problem. I'm in the habit of doing work I see being left undone; but this doesn't appear to be a common habit.
So why didn't some aspiring would-be future-CEO take the opportunity to distinguish themselves by acting the part of the leader? I bet at least five people in that Silicon Valley crowd had recently read a business book on leadership...
But it's terribly embarrassing to stand up in front of a crowd. What if the crowd hadn't followed me? What if I'd turned and marched out the door, and been left looking like a complete fool? Oh nos! Oh horrors!
While I have sometimes pretended to wisdom, I have never pretended to solemnity. I wasn't worried about looking silly, because heck, I am silly. It runs in the Yudkowsky family. There is a difference between being serious and being solemn.
As for daring to stand out in the crowd, to have everyone staring at me - that was a feature of grade school. The first time I gave a presentation - the first time I ever climbed onto a stage in front of a couple of hundred people to talk about the Singularity - I briefly thought to myself: "I bet most people would be experiencing 'stage fright' about now. But that wouldn't be helpful, so I'm not going to go there."
I expect that a majority of my readers like to think of themselves as having strong leadership qualities. Well, maybe you do, and maybe you don't. But you'll never get a chance to express those leadership qualities if you're too embarrassed to call attention to yourself, to stand up in front of the crowd and have all eyes turn to you. To lead the pack, you must be willing to leave the pack.