Histocracy: Open, Effective Group Decision-Making With Weighted Voting

14 HonoreDB 17 January 2012 10:35PM

The following is slightly edited from a pitch I wrote for a general audience. I've added blog-specific content afterwards.



Defecting by Accident - A Flaw Common to Analytical People

86 lionhearted 01 December 2010 08:25AM

Related to: Rationalists Should Win, Why Our Kind Can't Cooperate, Can Humanism Match Religion's Output?, Humans Are Not Automatically Strategic, Paul Graham's "Why Nerds Are Unpopular"

The "Prisoner's Dilemma" refers to a game theory problem developed in the 1950s. Two prisoners are taken and interrogated separately. If one of them confesses and betrays the other - "defecting" - he receives a reduced sentence, and his partner gets a harsher one. However, if both defect, then they both receive higher sentences than if neither had confessed.

This puts each prisoner in a strange position. The best choice individually is to defect. But if both take the individually best option, they end up worst off overall. This has wide-ranging implications for international relations, negotiation, politics, and many other fields.
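The payoff structure above can be sketched in a few lines of code. The specific sentence lengths are my own illustrative assumptions, not from the post; any numbers with the same ordering would do:

```python
# Illustrative Prisoner's Dilemma payoffs (years in prison, so lower is better).
# The exact numbers are assumptions chosen to satisfy the dilemma's ordering.
SENTENCES = {
    ("cooperate", "cooperate"): (1, 1),   # both stay silent: light sentences
    ("cooperate", "defect"):    (10, 0),  # the lone defector goes free
    ("defect",    "cooperate"): (0, 10),
    ("defect",    "defect"):    (5, 5),   # mutual betrayal: worse than mutual silence
}

def best_response(opponent_move):
    """Return the move that minimizes my own sentence against a fixed opponent move."""
    return min(("cooperate", "defect"),
               key=lambda my_move: SENTENCES[(my_move, opponent_move)][0])

# Defecting is dominant: it is the best response to either opponent move...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"
# ...yet mutual defection (5, 5) leaves both worse off than mutual cooperation (1, 1).
```

Defection minimizes your own sentence no matter what your partner does, which is exactly why two players each following that logic land together in the worst joint outcome.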

Members of LessWrong are incredibly smart people who tend to like game theory, and debate and explore and try to understand problems like this. But, does knowing game theory actually make you more effective in real life?

I think the answer is yes, with a caveat - you need the basic social skills to implement your game theory solution. The worst-case scenario in an interrogation would be to "defect by accident" - to blurt out something stupid because you didn't think it through before speaking. This might result in you and your partner both receiving higher sentences... a very bad situation. Game theory doesn't come into play until the basic skill conditions are met - until you can actually execute whatever plan you come up with.

The Purpose of This Post: I think many smart people "defect" by accident. I don't mean in serious situations like a police investigation. I mean in casual, everyday situations, where they tweak and upset people around them by accident, due to a lack of reflection on desired outcomes.

Rationalists should win. Defecting by accident frequently results in losing. Let's examine this phenomenon, and ideally work to improve it.

Contents Of This Post

  • I'll define "defecting by accident."
  • I'll explain a common outcome of defecting by accident.
  • I'll give some recent, mild examples of accidental defections.
  • I'll give examples of how to turn accidental defections into cooperation.
  • I'll give some examples of how this can make you more successful at your goals.
  • I'll list some books I recommend if you decide to learn more on the topic.

Philosophical Landmines

84 [deleted] 08 February 2013 09:22PM

Related: Cached Thoughts

Last summer I was talking to my sister about something. I don't remember the details, but I invoked the concept of "truth", or "reality" or some such. She immediately spat out a cached reply along the lines of "But how can you really say what's true?".

Of course I'd learned some great replies to that sort of question right here on LW, so I did my best to sort her out, but everything I said invoked more confused slogans and cached thoughts. I realized the battle was lost. Worse, I realized she'd stopped thinking. Later, I realized I'd stopped thinking too.

I went away and formulated the concept of a "Philosophical Landmine".

I used to occasionally remark that if you care about what happens, you should think about what will happen as a result of possible actions. This is basically a slam dunk in everyday practical rationality, except that I would sometimes describe it as "consequentialism".

The predictable consequence of this sort of statement is that someone starts going off about hospitals and terrorists and organs and moral philosophy and consent and rights and so on. This may be controversial, but I would say that causing this tangent constitutes a failure to communicate the point. Instead of prompting someone to think, I invoked some irrelevant philosophical cruft. The discussion is now about Consequentialism, the Capitalized Moral Theory, instead of the simple idea of thinking through consequences as an everyday heuristic.

It's not even that my statement relied on a misused term or something; it's that an unimportant choice of terminology dragged the whole conversation in an irrelevant and useless direction.

That is, "consequentialism" was a Philosophical Landmine.

In the course of normal conversation, you passed through an ordinary spot that happened to conceal the dangerous leftovers of past memetic wars. As a result, an intelligent and reasonable human was reduced to a mindless zombie chanting prerecorded slogans. If you're lucky, that's all. If not, you start chanting counter-slogans and the whole thing goes supercritical.

It's usually not so bad, and no one is literally "chanting slogans". There may even be some original phrasings involved. But the conversation has been derailed.

So how do these "philosophical landmine" things work?

It looks like when a lot has been said on a confusing topic, usually something in philosophy, there is a large complex of slogans and counter-slogans installed as cached thoughts around it. Certain words or concepts will trigger these cached thoughts, and any attempt to mitigate the damage will trigger more of them. Of course they will also trigger cached thoughts in other people, which in turn... The result being that the conversation rapidly diverges from the original point to some useless yet heavily discussed attractor.

Notice that whether a particular concept will cause trouble depends on the person as well as the concept. Notice further that this implies that the probability of hitting a landmine scales with the number of people involved and the topic-breadth of the conversation.
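As a toy quantification of that scaling claim (the independence assumption is mine, not the author's model): if each of k concepts in a conversation has an independent probability p of being a landmine for each of n participants, the chance of at least one detonation is 1 - (1 - p)^(n·k), which climbs quickly as either factor grows:

```python
# A toy model (my assumption, not from the post): each of k concepts has an
# independent probability p of being a landmine for each of n listeners.
def p_derailed(p, n, k):
    """Probability that at least one landmine is hit in the conversation."""
    return 1 - (1 - p) ** (n * k)

# Even a small per-concept risk compounds fast with more people and more topics:
# one listener, three concepts vs. five listeners, ten concepts.
assert p_derailed(0.05, n=1, k=3) < p_derailed(0.05, n=5, k=10)
```

With p = 0.05, one listener and three concepts gives roughly a 14% chance of derailment; five listeners and ten concepts pushes it above 90%.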

Anyone who hangs out on 4chan can confirm that this is the approximate shape of most thread derailments.

Most concepts in philosophy and metaphysics are landmines for many people. The phenomenon also occurs in politics and other tribal/ideological disputes. The ones I'm particularly interested in are the ones in philosophy, but it might be useful to divorce the concept of "conceptual landmines" from philosophy in particular.

Here are some common ones in philosophy:

  • Morality
  • Consequentialism
  • Truth
  • Reality
  • Consciousness
  • Rationality
  • Quantum

Landmines in a topic make it really hard to discuss ideas or do work in these fields, because chances are, someone is going to step on one, and then there will be a big noisy mess that interferes with the rather delicate business of thinking carefully about confusing ideas.

My purpose in bringing this up is mostly to precipitate some terminology and a concept around this phenomenon, so that we can talk about it and refer to it. It is important for concepts to have verbal handles, you see.

That said, I'll finish with a few words about what we can do about it. There are two major forks of the anti-landmine strategy: avoidance, and damage control.

Avoiding landmines is your job. If it is a predictable consequence that something you could say will put people in mindless slogan-playback-mode, don't say it. If something you say makes people go off on a spiral of bad philosophy, don't get annoyed with them, just fix what you say. This is just being a communications consequentialist. Figure out which concepts are landmines for which people, and step around them, or use alternate terminology with fewer problematic connotations.

If it happens, which it does, then as far as I can tell my only effective damage control strategy is to abort the conversation. I'll probably think that I can take on those stupid ideas here and now, but that's just the landmine trying to go supercritical. Just say no. Of course, letting on that you think you've stepped on a landmine is probably incredibly rude; keep it to yourself. Subtly change the subject, or rephrase your original point without the problematic concepts.

A third prong could be playing "philosophical bomb squad", which means permanently defusing landmines by supplying satisfactory nonconfusing explanations of things without causing too many explosions in the process. Needless to say, this is quite hard. I think we do a pretty good job of it here at LW, but for topics and people not yet defused, avoid and abort.

ADDENDUM: Since I didn't make it very obvious, it's worth noting that this happens with rationalists, too, even on this very forum. It is your responsibility not to contain landmines as well as not to step on them. But you're already trying to do that, so I don't emphasize it as much as not stepping on them.

How To Have Things Correctly

57 Alicorn 17 October 2012 06:10AM

I think people who are not made happier by having things either have the wrong things, or have them incorrectly.  Here is how I get the most out of my stuff.

Money doesn't buy happiness.  If you want to try throwing money at the problem anyway, you should buy experiences like vacations or services, rather than purchasing objects.  If you have to buy objects, they should be absolute and not positional goods; positional goods just put you on a treadmill and you're never going to catch up.

Supposedly.

I think getting value out of spending money, owning objects, and having positional goods are all three of them skills, that people often don't have naturally but can develop.  I'm going to focus mostly on the middle skill: how to have things correctly.


The curse of identity

121 Kaj_Sotala 17 November 2011 07:28PM

So what you probably mean is, "I intend to do school to improve my chances on the market". But this statement is still false, unless it is also true that "I intend to improve my chances on the market". Do you, in actual fact, intend to improve your chances on the market?

I expect not. Rather, I expect that your motivation is to appear to be the sort of person who you think you would be if you were ambitiously attempting to improve your chances on the market... which is not really motivating enough to actually DO the work. However, by persistently trying to do so, and presenting yourself with enough suffering at your failure to do it, you get to feel as if you are that sort of person without having to actually do the work. This is actually a pretty optimal solution to the problem, if you think about it. (Or rather, if you DON'T think about it!) -- PJ Eby

I have become convinced that problems of this kind are the number one problem humanity has. I'm also pretty sure that most people here, no matter how much they've been reading about signaling, still fail to appreciate the magnitude of the problem.

Here are two major screw-ups and one narrowly averted screw-up that I've been guilty of. See if you can find the pattern.

  • When I began my university studies back in 2006, I felt strongly motivated to do something about Singularity matters. I genuinely believed that this was the most important thing facing humanity, and that it needed to be urgently taken care of. So in order to become able to contribute, I tried to study as much as possible. I had had trouble with procrastination, and so, in what has to be one of the most idiotic and ill-thought-out acts of self-sabotage possible, I taught myself to feel guilty whenever I was relaxing and not working. Combine an inability to properly relax with an attempted course load that was twice the university's recommended pace, and you can guess the results: after a year or two, I had an extended burnout that I still haven't fully recovered from. I ended up completing my Bachelor's degree in five years, which is the official target time for doing both your Bachelor's and your Master's.
  • A few years later, I became one of the founding members of the Finnish Pirate Party, and on the basis of some writings the others thought were pretty good, got myself elected as the spokesman. Unfortunately – and as I should have known before taking up the post – I was a pretty bad choice for this job. I'm good at expressing myself in writing, and when I have the time to think. I hate talking with strangers on the phone, find it distracting to look people in the eyes when I'm talking with them, and have a tendency to start a sentence over two or three times before hitting on a formulation I like. I'm also bad at thinking quickly on my feet and coming up with snappy answers in live conversation. The spokesman task involved things like giving quick statements to reporters ten seconds after I'd been woken up by their phone call, and live interviews where I had to reply to criticisms so foreign to my thinking that they would never have occurred to me naturally. I was pretty terrible at the job, and finally delegated most of it to other people until my term ran out – though not before I'd already done noticeable damage to our cause.
  • Last year, I was a Visiting Fellow at the Singularity Institute. At one point, I ended up helping Eliezer in writing his book. Mostly this involved me just sitting next to him and making sure he did get writing done while I surfed the Internet or played a computer game. Occasionally I would offer some suggestion if asked. Although I did not actually do much, the multitasking required still made me unable to spend this time productively myself, and for some reason it always left me tired the next day. I felt somewhat unhappy with this, in that I felt I was doing something that anyone could do. Eventually Anna Salamon pointed out to me that maybe this was something that I was more capable of doing than others, exactly because so many people would feel that "anyone" could do this and thus would prefer to do something else.

It may not be immediately obvious, but all three examples have something in common. In each case, I thought I was working for a particular goal (become capable of doing useful Singularity work, advance the cause of a political party, do useful Singularity work). But as soon as I set that goal, my brain automatically and invisibly re-interpreted it as the goal of doing something that gave the impression of doing prestigious work for a cause (spending all my waking time working, being the spokesman of a political party, writing papers or doing something else few others could do). "Prestigious work" could also be translated as "work that really convinces others that you are doing something valuable for a cause".


What Intelligence Tests Miss: The psychology of rational thought

35 Kaj_Sotala 11 July 2010 11:01PM

This is the fourth and final part in a mini-sequence presenting Keith E. Stanovich's excellent book What Intelligence Tests Miss: The psychology of rational thought.

If you want a single book to introduce people to the themes and ideas discussed on Less Wrong, What Intelligence Tests Miss is probably the best currently existing book for doing so. It does take a somewhat different view on the study of bias than we do on LW: while Eliezer concentrated on the idea of the map and the territory and aspiring to the ideal of a perfect decision-maker, Stanovich's perspective treats bias as a thing that prevents people from taking full advantage of their intelligence. Regardless, for someone less easily persuaded by LW's somewhat abstract ideals, reading Stanovich's concrete examples first and then proceeding to the Sequences is likely to make the content presented in the Sequences much more interesting. Even some of our terminology, such as "carving reality at the joints" and the instrumental/epistemic rationality distinction, will be more familiar to somebody who has first read What Intelligence Tests Miss.

Below is a chapter-by-chapter summary of the book.

Inside George W. Bush's Mind: Hints at What IQ Tests Miss is a brief introductory chapter. It starts with the example of president George W. Bush, mentioning that the president's opponents frequently argued against his intelligence, and even his supporters implicitly conceded the point by arguing that even though he didn't have "school smarts" he did have "street smarts". Both groups were purportedly surprised when it was revealed that the president's IQ was around 120, roughly the same as that of his opponent in the 2004 presidential election, John Kerry. Stanovich then goes on to say that this should not be surprising, for IQ tests do not tap the tendency to actually think in an analytical manner, and that IQ has been overvalued as a concept. For instance, university admissions frequently depend on tests such as the SAT, which are pretty much pure IQ tests. The chapter ends with a disclaimer that the book is not an attempt to say that IQ tests measure nothing important, or that there are many kinds of intelligence. IQ does measure something real and important, but that doesn't change the fact that people overvalue it and are generally confused about what it actually measures.

Dysrationalia: Separating Rationality and Intelligence talks about the phenomenon informally described as "smart but acting stupid". Stanovich notes that if we used a broad definition of intelligence, where intelligence only meant acting in an optimal manner, then this expression wouldn't make any sense. Rather, it's a sign that people are intuitively aware of IQ and rationality as measuring two separate qualities. Stanovich then brings up the concept of dyslexia, which the DSM-IV defines as "reading achievement that falls substantially below that expected given the individual's chronological age, measured intelligence, and age-appropriate education". Similarly, the diagnostic criterion for mathematics disorder (dyscalculia) is "mathematical ability that falls substantially below that expected for the individual's chronological age, measured intelligence, and age-appropriate education". He argues that since we have a precedent for creating new disability categories when someone's ability in an important skill domain is below what would be expected for their intelligence, it would make sense to also have a category for "dysrationalia":

Dysrationalia is the inability to think and behave rationally despite adequate intelligence. It is a general term that refers to a heterogenous group of disorders manifested by significant difficulties in belief formation, in the assessment of belief consistency, and/or in the determination of action to achieve one's goals. Although dysrationalia may occur concomitantly with other handicapping conditions (e.g. sensory impairment), dysrationalia is not the result of those conditions. The key diagnostic criterion for dysrationalia is a level of rationality, as demonstrated in thinking and behavior, that is significantly below the level of the individual's intellectual capacity (as determined by an individually administered IQ test).


Lights, Camera, Action!

31 Alicorn 20 March 2010 05:29AM

Sequence index: Living Luminously
Previously in sequence: The ABC's of Luminosity
Next in sequence: The Spotlight

You should pay attention to key mental events, on a regular and frequent basis, because important thoughts can happen very briefly or very occasionally and you need to catch them.

You may find your understanding of this post significantly improved if you read the third story from Seven Shiny Stories.

Luminosity is hard and you are complicated.  You can't meditate on yourself for ten minutes over a smoothie and then announce your self-transparency.  You have to keep working at it over a long period of time, not least because some effects don't work over the short term.  If your affect varies with the seasons, or with major life events, then you'll need to keep up the first phase of work through a full year or a major life event, and it turns out those don't happen every alternate Thursday.  Additionally, you can't cobble together the best quality models from snippets of introspection that are each five seconds long; extended strings of cognition are important, too, and can take quite a long time to unravel fully.

Sadly, looking at what you are thinking inevitably changes it.  With enough introspection, this wouldn't influence your accuracy about your overall self - there's no reason in principle why you couldn't spend all your waking hours noting your own thoughts and forming meta-thoughts in real time - but practically speaking that's not going to happen.  Therefore, some of your data will have to come from memory.  To minimize the error introduction that comes of retrieving things from storage, it's best to arrange to reflect on very recent thoughts.  It may be worth your while to set up an external reminder system to periodically prompt you to look inward, both in the moment and retrospectively over the last brief segment of time.  This can be a specifically purposed system (e.g. set a timer to go off every half hour or so), or you can tie it to convenient promptings from the world as-is, like being asked "What's up?" or "Penny for your thoughts".
