All of talisman's Comments + Replies

I do not think your claim is what you think it is.

I think your claim is that some people mistake the model for the reality, the map for the territory. Of course models are simpler than reality! That's why they're called "models."

Physics seems to have gotten wiser about this. The Newtonians, and later the Copenhagenites, did fall quite hard for this trap (though the Newtonians can be forgiven to some degree!). More recently, however, the undisputed champion physical model, whose predictions hold to 987 digits of accuracy (not really), has the ... (read more)

Belated apologies for cranky tone on this comment.

Done, thanks for the feedback!

I made the mistake I'm talking about---assuming certain things were well-known.

I actually think Liron's slideshow needs a lot of work, but it seems very much like the kind of thing LWers should be trying to do out in the world.

the slideshow was completely useless to me

Yes, of course it was. It was created for teenagers who are utterly unfamiliar with this way of thinking.

its quality was poor

OK. Can you improve it or do better?

1MrHen
I would agree. Possibly, but I have little reason to do so since this sort of thing is not particularly applicable to my life. Of note, I am not trying to be a jerk or make this a big deal. My comment really has little to do with Liron's post. It has everything to do with you telling me to upvote something. I just, politely, want you to not do that again. I had typed up more details on why I downvoted but they are irrelevant for what I wanted to say to you.

Definitely worth reading up on. K & T are the intellectual fathers of the entire modern heuristics and biases program. There was some earlier work (e.g. Allais) but from what I hazily recall that work was fairly muddled conceptually.

Funny. I feel like on OB and LW utility theory is generally taken as the air we breathe.

0Eliezer Yudkowsky
It is - but that's OB and LW.

Upvoted for calling your own post "completely wrong"!

Vladimir - "concentrated confusion", "a thousand angry cats": that's exactly the kind of spice that your earlier post needed! :-)

Also fewer function words...

talisman100
  • Let me add to the chorus of "you rock!" This is a nice piece of work. I don't know how you got the chance to present to a group of young people about this stuff, but kudos also to whoever gave you that opportunity.
  • Some have pointed out potential improvements. This seems like a solid way for anyone interested to add a quantum of effort to the cause---improve the presentation a bit, and post your improved version somewhere. (Where?)

I don't at all disagree that for those who can do it, the CS/math parlay is excellent.

5MendelSchmiedekamp
But to be clear: if you're not already showing talent as a programmer and want to be skilled as one, then if you have to pick between mathematics and CS, pick mathematics and learn a programming language on the side (give yourself a challenging, sizable project which you care about, and get it done, even if it takes a year or two). The cognitive skills you will learn in mathematics will do more to cover for your gaps as a programmer than most CS programs will teach you. Bear in mind, I'm talking about effectiveness rather than credentials. Credentials are an entirely different matter, and like most status games they are a constantly evolving mess.

I am very successful in my secret-identity life, so no, this is not some kind of grass-is-greener observation; rather, it's an attempt to give practical advice to my younger selves out there. I majored in math and physics, did well, and am out in the world now, and I can concretely see the ways a CS education would have helped me: ways in which people less smart than I am nonetheless think better!

As soon as I graduated with a CS degree I realized I should have been in philosophy the whole time.

I'm comparing CS only to other technical majors.

CS is not something e

... (read more)
1MrHen
Fair enough. I just wanted to be sure. :)

True, but what I want to emphasize is that the CS way of thinking is extremely valuable outside of the software field.

I think the problem is a combination of:

  • length
  • density of ideas too low --- long section resummarizing old posts
  • prose hard to read, feels somehow flat --- try using shorter paragraphs, varying sentence lengths, using more tangible words and examples

Comparing to Robin's and Eliezer's stuff, the gold standards:

Robin's are generally very short, high-level, and high-density. Easy to read quickly for "what's this about? do I care?" and then reread several times to think carefully about.

Eliezer's are long and lower-density but meticulous and car... (read more)

0Vladimir_Nesov
Thank you, that was helpful. I'll write a shorter summary article in a few days (linking to a revised version of this article).

That's because you didn't specify the sequence ahead of time, right?

0Alicorn
Writing down a sequence ahead of time makes it more interesting when it turns up, not more unlikely. Given the possibility of cheating, it might make it more likely.

Groundless or not, if you propose to run two experiments X and Y, select outcomes x of experiment X and y of experiment Y before running the experiments, and assign x and y the same probabilities, then you have to be equally surprised by x occurring as by y occurring, or I'm missing something deep about what you're saying about probabilities. Are you using the word "probability" in a different sense than Jaynes?

1Alicorn
I haven't read Jaynes's work on the subject, so I couldn't say. However, if he thinks that equal probabilities mean equal obligation to be surprised, I disagree with him. It's easy to do things that are spectacularly unlikely - flip through a shuffled deck of cards to see a given sequence, for instance - that do not, and should not, surprise you at all.
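To put a number on the shuffled-deck example: any pre-specified ordering of 52 cards has probability 1/52!, yet every shuffle produces some ordering of exactly that probability. A quick sketch:

```python
import math

# Probability of any one pre-specified ordering of a shuffled 52-card deck.
p_specific = 1 / math.factorial(52)
print(f"P(specific ordering) = {p_specific:.3e}")  # on the order of 1e-68

# Every shuffle yields *some* ordering with exactly this probability,
# so observing an event of probability 1/52! cannot by itself warrant surprise.
```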

This post confused me enormously. I thought I must be missing something, but reading over the comments, this seems to be true for virtually all readers.

What exactly do you mean by "bead jar guess"? "Surprise"? "Actual probability"? Are you making a new point or explaining something existing? Are you purposely being obscure "to make us think"?

I propose replacing this entire post with the following text:

Hey everybody! Read E.T. Jaynes's Probability Theory: The Logic Of Science!

0talisman
Belated apologies for cranky tone on this comment.
2Alicorn
By "bead jar guess" I mean a wild, nearly-groundless assignment of a probability to a proposition. This is as opposed to a solidly backed up estimate based on something like well-controlled sample data, or a guess made with an appeal to an inelegant but often-effective hack like the availability heuristic.

Relatively rational people can form deeply irrational groups, and vice versa.

I would probably take a group with rational institutions but irrational members over a group with irrational institutions but rational members.

Of course, rational people will be better on average at building rational groups, so I would still predict a positive correlation in the experiment.

  • I was several years away from starting to learn about x-rationality when I met my partner.

  • Since there seems to be some interest, I'm going to try to collect my thoughts to describe the contribution of x-rationality to my personal life, but this may take considerable time; I've never tried to put it in words, and there's a strong dash of "dancing about architecture" to it.

0MBlume
Add me to the list who'd really enjoy reading about this.

I wanted to avoid the anecdotes-ain't-data writeoff and to avoid making the post too much about me specifically. Is that a mistake?

8AnnaSalamon
Anecdotes often are significant evidence; it depends how rare the anecdotal successes, how large a population of individuals the anecdotes are selected from (either by you as you choose anecdotes, or implicitly by the community if individuals who by chance have certain sorts of anecdotes are more likely to share), and on how high the prior is on "these tricks really do help" (if the tricks are a priori plausible, it takes less data to establish that they're likely to really work). But whether or not your anecdotes are significant evidence, do share. If nothing else, it'll give us a better idea of what kind of rationality you have found to be what kind of useful. "Rationality" is such an abstract-sounding term; we need to put flesh on it, from scenes in daily life. Being about you specifically is fine.
3Eliezer Yudkowsky
Probably. Specificity really matters for effective writing. Besides, this is, technically, a blog...

I love the word "aspiring." It feels...aspirational. Humble.

I don't like "Less Wronger" or other names that are about the affiliation rather than the thing itself.

talisman110

A week ago I would have thought this was a silly discussion. As I've thought more about LW's group nature, I've realized that this kind of cultural thing does matter.

It feels group-narcissistic to waste time on this, but the small difference this makes will be magnified over years and hundreds of thousands of repetitions. E.g.: at some point a major news outlet will do an article on OB/LW, and it will repeatedly use whatever the self-moniker is, and impressions of OB/LW will be slightly altered. (Parallel: I can't stop being marginally more negative o... (read more)

There are three specific examples linked to; I agree that I could/should have done more.

3hrishimittal
How have you used rationality in your marriage and family life? Did it help you choose the right partner? How do you 'imagine a couple that truly understood Aumann'?
3AnnaSalamon
How about examples from your own work, marriage, or circle of friends?

Judgement Day by Nathaniel Branden. (I think this is the same book as My Years With Ayn Rand.)

Hero becomes Ayn Rand's closest confidant, co-builder of Objectivism, lover; gets drummed out.

Message: every cause wants to be a cult; the enormous power of personal charisma; an excellent antidote for the recalcitrant Objectivist in your life.

I disagree strongly that "the danger of leverage" is the key message of LTCM. The key message for rationalists has to do with the subtle nature of correlation; the key message for risk managers has to do with the importance of liquidity.

"The danger of leverage" also isn't much of a message. That's like saying the message of the Iraq War is "the danger of foreign policy."

Does "popular belief" hold that LTCM wasn't hedged? Anyone who believes that is very far from learning anything from the LTCM story.

0ahaspel
Overconfidence leads one to seek leverage. Leverage in turn magnifies the consequences of errors. Leverage on LTCM's scale, which dwarfed that of its predecessors, turns errors into catastrophes. Another, probably more important lesson concerns the degree to which one should rely on historical data. If something never has happened, or at least not recently, one cannot safely bet the house that it never will happen. I probably shouldn't have speculated on what "popular belief" does or does not hold, but it has often been written, falsely, that LTCM was unhedged against Russia defaulting on its own bonds.

Not to focus exclusively on markets, and it's not a book, but this page of a Michael Lewis piece from Portfolio.com is a crucial read (and short!):

http://www.portfolio.com/news-markets/national-news/portfolio/2008/11/11/The-End-of-Wall-Streets-Boom?page=9#page=9

I was shocked to learn that when the big Wall Street firms all became public companies, which is arguably the most important precursor event to the current unpleasantness, many people knew this was a terrible idea!

There are at least two parts to the failure message: think carefully about the public ... (read more)

1Douglas_Knight
I'm concerned about hindsight. Lewis only says that the old guard objected, but he doesn't make it entirely clear why; "moral disapproval" doesn't suggest consequentialism. Lewis certainly does suffer from hindsight. The article claims that he was "waiting for the end of Wall Street," while his book says "I didn't think Wall Street would collapse."

What are the best books about the fall of the Roman Empire?

This seems like one of the most important failures in history, and most important to understand.

I just started Peter Heather's book, but haven't read enough to say anything except that the writing is a bit clunky.

A Demon Of Our Own Design by Richard Bookstaber

This book came out two years ago but reads like it was just written.

The lesson is similar to e.g. When Genius Failed, but it gets into the grit a bit more and is more insightful in its details about markets. Bookstaber has worked in markets rather than just writing about them, and it shows.

Dull in spots, but that's fairly standard in books like this, since you can't sell a 130-page book.

I don't buy this account of Enron, which has become the standard fable. There was a lot there that was more like straight-up fraud than smart people overcomplicating things and missing the down-home common sense.

Edit: I agree with the "hiding from reality = downward spiral" part strongly.

0Douglas_Knight
I thought that the standard fable focused on the straight-up fraud of the endgame. I focus on the incentives to the sales staff. As I understand it, their exit was well-timed, so the situation wasn't that hard to unravel. But this leaves the question of whether the guys at the top understood what was going on, let alone intended it.
1Eliezer Yudkowsky
The book does represent that in the endgame it was spectacularly complicated, charming fraud. But they do seem to have slid into fraud more than having started with that directly in mind - at least so the book alleges. Complication eases the path into fraud, and other downward spirals.

Thanks; I had forgotten about that post.

I'd like to understand the precise arguments so that I can understand the limits, so that I can think about Robin and Eliezer's disagreement, so I can get intuition for the Hanson/Cowen statement that "A more detailed analysis says not only that people must ultimately agree, but also that the discussion path of their alternating expressed opinions must follow a random walk." I'm guessing that past the terminology it's not actually that complicated, but I haven't been able to find the four hours to understand all the terminology and structure.
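The Hanson/Cowen "random walk" statement is the martingale property of Bayesian posteriors: under the agent's own predictive distribution, the expected next opinion equals the current one. A minimal sketch with two illustrative coin hypotheses (the 0.7/0.3 biases are made up for the example):

```python
def posterior_after(p_h1: float, heads: bool,
                    bias1: float = 0.7, bias2: float = 0.3) -> float:
    """Bayes update of P(H1) on one coin flip, where H1 says the coin
    lands heads with probability bias1 and H2 says bias2."""
    like1 = bias1 if heads else 1 - bias1
    like2 = bias2 if heads else 1 - bias2
    return p_h1 * like1 / (p_h1 * like1 + (1 - p_h1) * like2)

def expected_next_posterior(p_h1: float,
                            bias1: float = 0.7, bias2: float = 0.3) -> float:
    """Expectation of the next posterior, taken under the agent's own
    predictive distribution for the upcoming flip."""
    p_heads = p_h1 * bias1 + (1 - p_h1) * bias2
    return (p_heads * posterior_after(p_h1, True, bias1, bias2)
            + (1 - p_heads) * posterior_after(p_h1, False, bias1, bias2))

# Martingale property: whatever your current opinion, its expected
# value after the next observation is exactly that opinion.
for r in (0.2, 0.5, 0.9):
    assert abs(expected_next_posterior(r) - r) < 1e-12
```

This is why a sequence of honestly exchanged opinions can't drift predictably in one direction: any predictable drift would already be incorporated into the current opinion.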

This post feels out of accord with the Virtues. It feels like a debate brief against religion rather than a curious, light, humble, empirical exploration. "On the wrong side of every moral issue in American history"? "Denying the government billions in tax revenue"? This doesn't strike me as the talk of a truthseeker; rather, a polemicist.

Religion is true (vague, but you know what I mean I hope) with a very low odds ratio, perhaps 1-to-100k against? In any case way down in the hazy low probability region where the intuition has a h... (read more)

1Sideways
On reflection I agree with you that my language was ill-considered. I'm not going to edit the original post, because there's no point pretending I didn't make the mistake. Incidentally, since creating this post, my karma has gone from 45 to 4294967341. I would guess that this post was downvoted to a negative number, and it broke some sort of counter. I don't know of any better way than a comment to report the bug, so there you have it.

Hm. I'm far from an expert, and it could well be that there are ten times as many anonymous attacks, but off the top of my head I think of WTC '93, the Millennium plot, 9/11, London trains, Madrid trains, Israel suicide bombings, Munich massacre, Iraq beheadings, USS Cole, bombings of US embassies.

Not off the top of my head: Golden Mosque bombing, Tamil Tigers numerous bombings, IRA-related terrorism, etc. Scanning through this I find many more terrorist attacks that were done with a clear political or propaganda purpose.

1thomblake
A good heuristic I use when I'm tempted to write comments such as these: "The plural of anecdote is not data!" Note also that attacks for a reason may well be more memorable than anonymous attacks.
2gwern
No, it's not quite that bad! It's more like twice as many: Abrahms references his analysis of a RAND dataset, and also Bruce Hoffman's "Why Terrorists Don't Claim Credit" (in Terrorism and Political Violence, Vol 9 #1 1997). I haven't read the latter, but his analysis seems enough for me. I think there's definitely something of a mental bias here - it's vastly easier to remember the rare dramatic attack (which sooner or later someone will claim credit for) than the many anonymous ones.
talisman110

The fear and hatred of gambling. Contra Tyler Cowen, betting your beliefs is one of the best paths to both individual and group rationality. You should be doing it twice a day, like brushing your teeth. The beliefs that don't get bet get cavities and rot; the beliefs that are unbettable create unbreakable deadlocks that later require ophthalmological intervention. Bet!

3William
One warning though: Gambler's ruin is very possible with betting systems, even if your strategy has a positive expected value.
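William's warning can be made concrete. For unit bets with win probability p > 1/2 against an effectively unlimited opponent, a bettor starting with n units is eventually ruined with probability ((1-p)/p)^n, which is nonzero despite the positive expected value per bet. A minimal sketch (the 55% edge and 10-unit bankroll are illustrative):

```python
import random

def ruin_probability(p_win: float, bankroll: int) -> float:
    """Closed-form probability of eventual ruin for unit bets with
    per-bet win probability p_win > 0.5 against an unlimited opponent."""
    return ((1 - p_win) / p_win) ** bankroll

def simulate_ruin(p_win: float, bankroll: int,
                  trials: int = 5_000, max_steps: int = 2_000,
                  seed: int = 0) -> float:
    """Monte Carlo estimate of the same quantity (max_steps truncates
    each walk; with positive drift, late ruin is vanishingly rare)."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        b = bankroll
        for _ in range(max_steps):
            b += 1 if rng.random() < p_win else -1
            if b == 0:
                ruined += 1
                break
    return ruined / trials

# A 55% edge per bet is clearly positive expected value, yet...
print(ruin_probability(0.55, 10))  # ~0.134: a 13% chance of going broke
print(simulate_ruin(0.55, 10))     # should land close to the same value
```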

Modeling terrorists as trying to kill as many people as possible strikes me as insufficient. In Terror and Consent, Philip Bobbitt models their aims as propagandistic, which feels more like the right angle---hence the focus on inefficient but spectacular killing.

0gwern
Incidentally, I've expanded my above comment into an essay called 'Terrorism is not Effective' (http://www.gwern.net/Terrorism%20is%20not%20Effective).
4gwern
This would, I think, fall afoul of Abrahms's point, '5) terrorist organizations generally carry out anonymous attacks, precluding target countries from making policy concessions'. It's hard to be propagandistic if it's unclear what this deed is the propaganda of.

That assumes beginners know what they know, which strikes me as a poor assumption.

That link doesn't work due to the angle brackets.

Rereading some of those old posts it's fascinating to see how much Eliezer's writing has sharpened from then to now!

I can't get the html links to work; can someone help?

2Eliezer Yudkowsky
The editor for posts is not the same as the editor for comments; it's a WYSIWYG editor and the key is those little buttons at the top. Fixed it for you, anyway, although the "beginner" and "advanced" links seem to reference the same post.
0[anonymous]
That's weird. That looks like the correct syntax to me.

One of the missions of OB/LW is to attract new learners, and it's clear that they are succeeding. But the format feels like a very difficult one for those new to these ideas, with beginner-level ideas interspersed with advanced or unsettled theory and meta-level discussions. You wouldn't play a game with its levels on shuffle mode, but reading Less Wrong must feel like doing so for initiates. How do we make the site better for learners? Provide a "syllabus" showing a series of OB and LW posts to be read in order? Have a separate beginner site or feed or header? Put labels on posts that designate their level?

To be clear, my comment above isn't meant to be a "charge"! Among other things, Eliezer is exceptionally gifted at making ideas interesting and accessible in a way that Robin isn't at all. I'm looking forward to his book coming out and changing the world.

I personally love his stuff, and think it's great 1) for people that are completely new to these ideas; 2) for people that are fairly advanced and have the ideas deep in their bones.

For people in between, I sometimes feel like his writing presents too much of a glide path---answers too many q... (read more)

talisman100

It's much more than peer pressure. Eliezer, along with the other authors, uses a confident, rhythmic, almost biblical style, which is very entertaining and compelling. You don't just learn deep things with EY, you feel like you're learning deep things. Robin Hanson's thought is incredibly deep, but his style is much more open, and I would guess you find his writings not to have this property.

Robin and Eliezer have debated writing style over at OB, and I highly recommend you read that debate, Patrick.

You should also, in my opinion, be very cautious abou... (read more)

3Roko
Eliezer's writing is clearly not absolutely persuasive, because it didn't persuade me, even when it was correct!

I should note that if I'm teaching deep things, then I view it as important to make people feel like they're learning deep things, because otherwise, they will still have a hole in their mind for "deep truths" that needs filling, and they will go off and fill their heads with complete nonsense that has been written in a more satisfying style.

talisman460

Occasionally, well-respected community members could say things that are intentionally false, but persuasive and subtle, a la http://www.overcomingbias.com/2008/02/my-favorite-lia.html.

You get points for catching these mistakes. Perhaps you submit your busts privately to some arbiter so others have the same challenge.

Later, the error is revealed and discussed.

This would also have the benefit of causing everyone to read the most-respected members' writings ultra-critically, rather than sitting back and being spoon-fed.

One key thing this idea has is short t... (read more)

5Jiro
This doesn't work, because people here say controversial things. By definition, controversial means that many people think those things are wrong, but the authors do not think so themselves. Anyone who finds a mistake might have found one of the intentional mistakes, or might happen to disagree on a controversial issue and believe the community member made a mistake where the community member thinks otherwise. Unless you think that community members are perfectly correct 100% of the time on controversial issues, or at least always recognize their own mistakes when pointed out to them (and no human being is like that), the idea becomes unworkable. Everyone will have to think: "is this an intentional mistake, or an unintentional mistake that the community member won't recognize as such, earning me demerits for pointing it out?"
3MBlume
I can see the need for anonymity to avoid spoilers, but I think doing the thing publicly has benefits too -- that way there's the risk on the other side of having publicly denounced the Great Teacher when he was speaking truthfully.

The phraseology "raise them X" suggests to me inculcating deep, emotional, childhood-locked belief in X. The only X for which that seems supportable is rationality itself.

5Eliezer Yudkowsky
There's no way you can teach a child any particular definition of rationality that's true enough to be locked in that way. Hell, there's no way you can teach an adult that. You could teach them to respect the truth... but no matter what you teach them, sooner or later it's going to fail them, and the best you can do is try to deliver that warning.

Raise them many-worlds.

0drnickbone
An excellent idea... Yes, Virginia, there is a Santa Claus; he's just not in our world. Here's a little exercise for MWI fans: estimate how many worlds there have to be before we can be sure Santa exists in one of them (his sleigh flying around because of random molecular collisions, getting into and out of houses by quantum tunnelling...)? Stuff like this reveals just how strange MWI is, and the attractions of single-world interpretations.
4Eliezer Yudkowsky
Don't you mean "raise them to believe in a collapse postulate"? And isn't that a little too strong a test?

I do a lot of interviewing candidates for jobs, and it's essential to be aware of both those concepts. In working on our hiring process, we discuss both concepts, in words very similar to yours.

I've heard occasional complaints about certain things we do in our interviews, of the form "what does X have to do with being a good Y?!". These complaints invariably come from people who didn't get offers, and give me a warm glow at having made the correct decision.

No idea the extent to which EY's approval upped this, but what I can say is that I was less than half through the post before I jumped to the bottom, voted Up, and looked for any other way to indicate approval.

It's immediately surprising, interesting, obvious-in-retrospect-only, and most importantly, relevant to everyday life. Superb.