After looking at the plan and the previous post, I've realized:
You are a parody of the SIAI, and I claim my five pounds.
Skeptic: The idea that a magic new theory of psychology will unlock human potential and create a new political majority of model citizens is a secular messianism with nothing to back it up.
Leverage Researcher: Have you done the necessary reading? Our ideas are based on years of disjunctive lines of reasoning (see blog posts #343, #562, and #617 on why you are wrong).
Skeptic: But you have never studied psychology, why would I trust your reasoning on the topic?
Leverage Researcher: That is magical thinking about prestige. Prestige is not a good indicator of quality. We have written a bunch of blog posts about rationality and cognitive biases.
Skeptic: That's great. But do you have any data that indicates that your ideas might actually be true?
Leverage Researcher: No. You're entitled to arguments, but not (that particular) proof (blog post #898).
Skeptic: Okay. But I asked experts and they disagree with your arguments.
Leverage Researcher: You will soon learn that your smart friends and experts are not remotely close to the rationality standards of Leverage Research, and you will no longer think it anywhere near as plausible that their differing opinion is because they know some incredible se...
No. You're entitled to arguments, but not (that particular) proof (blog post #898).
You would invoke this on someone asking for only specific evidence for your theory. It doesn't make sense to invoke it against someone asking for ANY evidence.
You have to take the outside view here. When an outsider asks whether you have evidence that AI will go FOOM, they are not asking for arguments, because in the opinion of a lot of people convincing arguments are not enough. That doesn't imply it is wrong to act on arguments, but it does mean you are so detached from how people actually think that you don't even notice how ridiculous it sounds to an outsider who hasn't read the Sequences, as your comment and the 11 upvotes it got plainly show.
The way outsiders see it is that a lot of things can sound very convincing and yet be completely wrong and that only empirical evidence or mathematical proofs can corroborate extraordinary predictions like those made by SI.
The wrong way to approach those people is with snide remarks about their lack of rationality.
I thought it was supposed to be funny.
It is supposed to be funny. Humor at the expense of Leverage Research (or, based on content, perhaps SIAI?). Humor is one of the most effective ways of undermining something's credibility. Making jokes based on false premises is rather insidious: if you object to the blatant falsehoods, you may be portrayed as unable to take a joke.
Leverage Research is not SingInst. I have some reservations about their ideas, but they don't overlap at all with the ones you've expressed here. Generalizing complaints you have against SingInst comes across as holding a grudge and misapplying it.
I wrote the comment so that SI can improve their public relations.
Really?
How confident were you that your comment would result in noticeable improvements to SI's public relations?
People here have pretty much stopped replying to objections with "you should read the Sequences". This suggests that pointing out socially clunky behaviour is worth at least trying, for all the outcries of the stung.
On a first pass, the Leverage Research website feels like Objectivism. I say this because it is full of dubious claims about morality and psychology that are presented as basic premises and facts. The explanations of "Connection Theory" are full of the same kind of opaque reasoning and fiat statements about human nature, which perhaps I am particularly sensitive to as a former Objectivist. Knowing nothing more than this first impression, I am going to predict that there are Objectivist influences present here. That seems at least somewhat testable.
I thought the point was that the comment showed how the arguments, which we've gotten used to and don't fully question anymore, would look ridiculous when applied in a different context. (It was a pretty effective demonstration for me - the same responses did look far less convincing when they were put in the mouth of Leverage Research people rather than LW users..)
Exactly right.
Some remarks:
I'm of two minds about this sort of skepticism.
On the one mind, successfully addressing the "meta topics" related to the real hard overwhelming problems of the world seems a far better way of improving the world than devoting one's life to addressing the object-level problems directly. "Give me six hours to chop down a tree and I will spend the first four sharpening the axe," and all that.
On the other mind, the odds of any given attempt to address those "meta topics" being successful are punishingly low.
Back on the first mind, the odds of a successful attempt ever being made become much lower if nobody ever makes or supports attempts.
On the second mind, the expected opportunity costs associated with failed attempts might very well outweigh the expected value of a successful one.
Back on the first mind, those costs probably won't outweigh those associated with, say, World of Warcraft.
That is not how the world works. Most positions of power are already occupied by people who have common sense, good will, and a sense of responsibility - or they have those traits, to the extent that human frailty manages to preserve them, amidst the unpredictability of life.
Huh? That is not how the world works.
OK, let's consider the opposite proposition: Most positions of power are occupied by people who are crazy, resentful, and irresponsible. It might sound cynically plausible.
That just sounds stupid to me. Craziness and irresponsibility are deleterious traits in those seeking power. Resentment is something the losers in the power game are more likely to feel, and dwelling in resentment is a failure mode when it comes to practical power-gaining. Don't get mad, don't get even, just take your next step to power.
The idea of smooth sociopaths who lie their way to power and riches via superior awareness of human gullibility is way overrated. In most cases, power is a reward for boring hard work, taking responsibility, and being effective.
These things complement each other. You need both if you are going to reach the higher echelons.
Unless you're born to power, you only get to have it and to hold onto it by working with some group of people who have a complementary power of their own, the power to depose you - your investors, your voters, your professional colleagues.
And here is where I rest my idealism and my optimism. I don't try to force people - par...
Okay, I'm about 6 boxes into the flowchart of their plan and already making "gaaah" noises. I'll add a few further updates, but I may not have the willpower to make it through the whole thing.
Okay, finished. Wasn't as bad as I'd expected given the beginning.
Short summary: spend decades developing a particularly powerful way to understand people, and then use that to do as much good as possible, e.g. by working like a grant-awards agency that really understands who deserves the money, or like a think tank that really understands what messages people will remember, etc.
If you sort of squint your eyes, it makes sense. On the other hand, I won't hold my breath. For example, their plan only works if they beat everyone else to this understanding by more than the time it takes to recruit all the donors and prestige they'll need (~5 years?). For another example, they're already somewhat hamstrung by "Connection Theory".
Those linked basic claims look well falsified already.
People always believe that all of their intrinsic goods will be achieved... This is, according to Connection Theory, an inviolable law of the mind.
Wishful thinking is not THAT ubiquitous and unbeatable. Lots of people expect to die without an afterlife and wish it wasn't so.
According to Connection Theory, the sole source of a person’s irrationality is that person’s need to believe that all of his or her intrinsic goods will be fulfilled. This need is a constraint; given this constraint, everyone forms the most reasonable beliefs that they can on the basis of the evidence they encounter.
Falsified all over the place, by most of the heuristics and biases literature for one, unless "that they can" is interpreted in a slippery fashion to describe whatever people in fact do.
According to Connection Theory, every action that every person takes is part of an implicit plan for achieving all of that person’s intrinsic goods. A person may pursue some intrinsic goods first and others later, but none can be permanently sacrificed
This looks like it denies that people ever make real tradeoffs, but they do.
Most positions of power are already occupied by people who have common sense, good will, and a sense of responsibility - or they have those traits, to the extent that human frailty manages to preserve them, amidst the unpredictability of life.
Wait, you're serious? I'm pretty sure a larger-than-normal fraction of the people who rule us are sociopaths.
The idea that a magic new theory of psychology will unlock human potential and create a new political majority of model citizens is a secular messianism with nothing to back it up.
I'm happy people here realize this.
Leverage Research seems, at first glance at least, to be similar to SIAI. Their plans have similar shapes:
1). Grow the organization (donations welcome).
2). Use the now-grown organization to grow even faster.
3). ???
4). Profit! Which is to say, solve all the world's problems / avoid global catastrophe / usher in a new age of peace and understanding / etc.
I think they need to be a bit more specific there in step 3.
Most positions of power are already occupied by people who have common sense, good will, and a sense of responsibility
Really? I'm sure powerful people have common sense, and I'm sure they have the ordinary good will towards their friends and responsibility for their allies. But I doubt they have the extraordinary good will and responsibility needed to do the right thing. Maybe you believe that such qualities are unrealistic because of "human frailty".
I think your second criticism is solid:
The idea that a magic new theory of psychology will unlock human potential and create a new political majority of model citizens is a secular messianism with nothing to back it up.
Well, Jesus (if he ever lived), or someone publishing under that pseudonym, tried to make a beneficial contagious ideology. Immediately thereafter, this ideology was adjusted to be more contagious at the expense of no longer being beneficial. The new version out-competed the old one without much struggle, and after it spread you got crusades, witch hunts, and the like.
This kind of stuff simply doesn't work. Contagious memes are subject to redesign. There is a very recent example for you: Godwin's law. I've seen Godwin himself try to compare some...
I think you just invented the anti-Godwin. "Know who else tried to make the world a better place? Jesus."
This is way too harsh. So they don't have a great plan. They have admirable goals, and they have said that Connection Theory (despite being all over the site) is not strictly necessary to accomplish those goals. Why bash them here? Why not bash the millions of groups that have both a bad plan and bad goals?
The idea that a magic new theory of psychology will unlock human potential and create a new political majority of model citizens is a secular messianism with nothing to back it up.
Why add the word "magic" or mention "messianism"? Smells unfair to me.
I think everybody is getting hung up on Connection Theory, which is not the only thing Leverage Research does. I'm not completely sure, but I'm fairly sure it's not even the main thing they do. EDIT: Why is this tagged politics? Does it have to do with the mind-killing comment thread about meta-trolling?
The basic proposition seems reasonable enough to me, though it might offend the sensibilities of those taken by the correspondence theory of truth.
People spend a lot of time trying to improve the world by imparting The Truth to their neighbors. There might be some mileage in spreading ideas for their predicted effect rather than for their Correspondence Truth value. Spread ideas to win, not to convert. Dennett hints at this with his program of domesticating religion.
The plan currently revolves around using Connection Theory, a new psychological theory, to design "beneficial contagious ideologies", the spread of which will lead to the existence of "an enormous number of actively and stably benevolent people", who will then "coordinate their activities", seek power, and then use their power to eliminate scarcity, disease, harmful governments, global catastrophic threats, etc.
That is not how the world works. Most positions of power are already occupied by people who have common sense, good will, and a sense of responsibility - or they have those traits, to the extent that human frailty manages to preserve them, amidst the unpredictability of life. The idea that a magic new theory of psychology will unlock human potential and create a new political majority of model citizens is a secular messianism with nothing to back it up.
I suggest that the people behind Leverage Research need to decide whether they are in the business of solving problems, or in the business of solving meta-problems. The real problems of the world are hard problems, they overwhelm even highly capable people who devote their lives to making a difference. Handwaving about meta topics like psychology and methodology can't be expected to offer more than marginal assistance in any specific concrete domain.