
Comment author: Lumifer 13 February 2017 09:28:31PM 0 points [-]

So if both you and I clearly understand the main point, and if the main point seems reasonably uncontroversial (everyone agrees that it's better to be smart than to be dumb, right?), then why do you describe this post as an epic fail? I'm sure that it makes some people's undergarments undergo interesting topological transformations, but that's hardly unusual, or cause for such a... forceful rejection.

Comment author: siIver 13 February 2017 10:09:02PM *  3 points [-]

I feel like I am repeating myself. Here is the chain of arguments:

1) A normal person seeing this article and its upvote count will walk away having a very negative view of LessWrong (reasons in my original reply)

2) Making the valid points of this article is in no way dependent on the negative consequences of 1). You could do the same (in fact, a better job at the same) without offending anyone.

3) LessWrong can be a gateway for people to care about existential risk and AI safety.

4) AI safety is arguably the biggest problem in the world right now and extremely low efforts go into solving it, globally speaking.

5) Due to 4), getting people to care about AI safety is extremely important. Due to that and 3), harming the reputation of LessWrong is really bad.

6) Therefore, this article is awful, harmful, and should be resented by everyone.

Comment author: Lumifer 13 February 2017 07:53:48PM 0 points [-]

Well, there are these words and expressions sprinkled throughout your comment:

... promoting elitism and entitlement ... and sexism ... value the thoughts of other people who are more knowledgeable about sexism over yours ... being offensive and harmful ...

All of this seems to go deeper than "mannerisms".

Your basic beef with the post seems to be that it is mean and insensitive, and I think such an approach misses the post's main point. It seems that you think the main point is to stigmatize stupid people, label them sub-human, and, possibly, subject them to mandatory treatments with drugs and such. I think the main point is to stress that stupidity is not an unchanging natural condition ("sky is blue, water is wet, some people are stupid") but something that could be changed.

Comment author: siIver 13 February 2017 09:02:00PM 2 points [-]

No, I fully acknowledge that the post tries to do those things – see the second half of my reply. I argue that it fails at doing so and is harmful to our reputation, etc.

Comment author: Lumifer 13 February 2017 03:55:05PM 1 point [-]

I am a bit confused by this comment.

Is it, basically, a rant about how LW is not woke enough?

Comment author: siIver 13 February 2017 07:26:51PM 2 points [-]

It's about a set of mannerisms, common among people on LW, that are really bad. I don't know what you mean by woke.

Comment author: siIver 12 February 2017 07:27:24PM *  5 points [-]

L. : While obviously being rational is good, LW as a community seems to be promoting elitism and entitlement.

s: Rationality can be scary that way. But it is about seeking truth, and the community does happen to consist of smart people. Denying that is false humility. Similarly, a lot of ideas many people support just happen to be false. It's not our fault that our society got it wrong on so many issues. We're just after truth.

L. : How does it serve truth to label people who aren't smart as mentally ill?

s: That's terrible, of course. But that's not a flaw of rationality; nothing about rationality dictates "you have to be cruel to other people". In fact, if you think about this really hard, you'll see that rationality usually dictates being nice.

L. : Then how come this post on LessWrong is the most upvoted thing of the last 20 submissions?

s: ...

s: I can't defend that.

L. : Oh, okay. So I'm right and Yudkowsky's site does promote entitlement and sexism.

s: wait, sexism?

L. : Yeah. The last thing I saw from LW was two men talking about what a woman needs to do to fit the role they want her to have in society.

s: Okay, but that's not Yudkowsky's fault! He is not responsible for everyone on LW! The sequences don't promote sexism-

L. : I heard HPMoR is sexist, too.

s: That's not remotely true. It actually promotes feminism. Hermione is-

L. : I'm sorry, but I think I value the thoughts of other people who are more knowledgeable about sexism over yours. At least you condemn this article, but you still hang out on this site.


Scott Alexander has said that turning away from intelligence is society's biggest mistake (can't find the article). Even minor increases in intelligence correlate meaningfully with all sorts of outcomes (a negative correlation with crime being one of them, afaik). Intelligence is the most powerful force in the universe. A few intelligence points on the people working on Friendly AI right now could determine the fate of our entire species. I want to make it extra clear that I think intelligence is ultra-important and almost universally good.

None of this excuses this article. None of it suggests that it's somehow okay to label stupid people as mentally ill. Rationality is about winning, and this article is losing in every sense of the word. It won't be good for the reputation of LW, it won't be good for our agenda, and it won't be good for the pursuit of truth. The only expected positive effect is making people who read it feel good. It essentially says, "Being intelligent is good. Being stupid is bad. Other people are stupid. They are the problem. We are better than them." Which is largely true, but about as helpful as taking an IQ test and emailing a friend to say, "Look, here I am, verifiably smarter than you, and being smart is the most important thing in our society!"

Okay, but that's not a content critique. I just said I think this is bad and went from there. If the article were actually making a strong case, it could still be bad for having an unnecessarily insulting and harmful framing that hurts our cause, but it might be defensible on other grounds. Maybe. We want to do both: to win and to pursue truth, and those aren't the same thing. But I strongly feel the article doesn't succeed on that front, either. Let's take a look.


It's great to make people more aware of bad mental habits and encourage better ones, as many people have done on LessWrong.

sure.

The way we deal with weak thinking is, however, like how people dealt with depression before the development of effective anti-depressants:

seems to be true.

"Stupidity," like "depression," is a sloppy "common-sense" word that we apply to different conditions, which may be caused by genetics (for instance, mutations in the M1 or M3 pathways, or two copies of Thr92Ala), deep subconscious conditioning (e.g., religion), general health issues (like not getting enough sleep), environment (ignorance, lack of reward for intelligent behavior), or bad habits of thought.

There is an implicit assumption here that being stupid requires some kind of explanation, but nothing in the article provides a reason why this would be the case. Stupidity is not depression. The reason it makes sense to label depression a mental illness is (I assume) that it corresponds to an anomaly in the territory. Suppose we had a function, depressedness(human, time), which showed how depressed each person on earth has been for, say, the past 10 years. I would expect to see weird behavior in that function: strange peaks over intervals of time in various people, many of whom don't have unusually high values most of the time. This would suggest that it is something to be treated.

If you did the same for intelligence, I'd expect relatively little change along the time axis (aside from an increase at a young age and a decrease in the case of actual mental illnesses) and some kind of mathematically typical distribution along the person axis, ranging from 60 to, dunno, 170 or something. I feel really strange about having to make this argument, but this is really the crux of the problem here. The article doesn't argue, "Here are stats suggesting that there are anomalies in this function, therefore there is a thing which we could sensibly describe as a mental illness"; it just says, "Some people are dumb, here are some dumb things they do, let's label that mental illness." To sum up the fallacy in one sentence: it talks about a thing without explaining why that thing should exist.
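A toy sketch of that intuition – every number, name, and distribution here is invented purely for illustration; nothing of the sort appears in the article:

```python
import random

# Hypothetical toy model of the two functions described above.
# All parameters are made up for illustration.

def depressedness_series(months=120):
    # Mostly low values, with occasional multi-month "episodes":
    # sharp anomalies localized in time, which is what makes
    # "illness" a natural label for depression.
    series, episode = [], 0
    for _ in range(months):
        if episode == 0 and random.random() < 0.02:
            episode = random.randint(3, 12)  # an episode begins
        series.append(5.0 if episode else random.gauss(0.5, 0.2))
        episode = max(episode - 1, 0)
    return series

def intelligence_series(months=120):
    # Roughly flat for any one person over time; the variation that
    # matters sits *across* people, on a smooth bell curve.
    baseline = random.gauss(100, 15)
    return [baseline + random.gauss(0, 2) for _ in range(months)]
```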

It is implied that people being ashamed of admitting to depression is a problem, and I infer that the intention is to make being stupid feel less bad by labeling the condition a "mental illness." But it clearly fails in this regard and is almost certainly more likely to do the opposite. It's sort of a lose-lose dynamic: it implies that there is some specific thing distorting a natural distribution of intelligence, some special condition covering "stupid" people that explains why they are stupid – which likely isn't the case; in that sense, having a low IQ is probably worse than the article was meant to imply, since there is no special condition, you just drew the short end of the stick – while also being framed in such a way that it will make unintelligent people feel worse than before, not better.

And where does the reverse causation – believing in religion causing stupidity – come from? Postulating an idea like this ought to require evidence.

The article goes on to say that we should do something to make people smarter. I totally, completely, whole-heartedly agree. But saying high IQ is better than low IQ is something that can be and has been done without all of the other stuff attached to it. And research in that direction is already being done. If you wanted to make a case for more of it, you could do so far more effectively without all the negativity attached.

Here are the accusations I am making. I accuse this article of not making a good case for anything that is both true and non-obvious, on top of being offensive and harmful to our reputation, and consequently to our agenda. (Even if it is correct and there is an irregularity in the intelligence function, it doesn't make a good case.) I believe that if arguments of the same quality were brought forth on any other topic, the article would be treated the way most articles with weak content are treated: with indifference, few upvotes, and perhaps one or two comments pointing out flaws (if Omega appeared before me, I would bet a lot of money on that theory, even at poor odds). I'll go as far as to accuse upvoting this of being a failure of rationality. I agree with Pimgd on everything they said, but I feel it is important to point out how awful this article is, rather than treating it as a simple point of disagreement. The fact that this has 12 upvotes is really, really, really bad, and a symptom of a much larger problem.

This is not how you are being nice. This is not how you promote rationality. This is not how you win.

[Link] Changes in AI Safety Funding

3 siIver 11 February 2017 08:36AM
Comment author: Dagon 06 February 2017 04:11:06PM 6 points [-]

It's solved for anyone who doesn't believe in magical "free will". If it's possible for Omega to correctly predict your action, then it's only sane to one-box. Only decision systems that deny this ability to predict will two-box.

Causal Decision Theory, because it assumes single-direction causality (a later event can't cause an earlier one), can be said to deny this prediction. But even that's easily solved by assuming an earlier common cause (the state of the universe that causes Omega's prediction also causes your choice), as long as you don't demand actual free will.
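A minimal expected-value sketch of that point, assuming the standard Newcomb payoffs ($1,000,000 / $1,000) and writing p for the predictor's accuracy – the numbers come from the thought experiment itself, not from this thread:

```python
# Expected value of each strategy against a predictor with accuracy p.
# Standard payoffs: the opaque box holds $1,000,000 iff Omega predicted
# one-boxing; the transparent box always holds $1,000.

def ev_one_box(p):
    # With probability p, Omega correctly predicted one-boxing.
    return p * 1_000_000

def ev_two_box(p):
    # With probability (1 - p), Omega wrongly predicted one-boxing,
    # so the opaque box is full anyway.
    return p * 1_000 + (1 - p) * (1_000_000 + 1_000)

for p in (0.5, 0.9, 0.99):
    print(p, ev_one_box(p), ev_two_box(p))
# One-boxing wins for any accuracy p above ~0.5005.
```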

Comment author: siIver 06 February 2017 05:39:03PM 1 point [-]

I agree that it's clear that you should one-box – I'm more interested in justifying why one-boxing is in fact correct when it can't logically influence whether there is money in the box. I found this unnerving at first, but maybe I was the only one.

Comment author: siIver 06 February 2017 02:25:45PM 0 points [-]

Reposting this from last week's open thread because it seemed to get buried

Is Newcomb's Paradox solved? I don't mean from a decision standpoint, but the logical knot of "it is clearly, obviously better to one-box, and it is clearly, logically proven better to two-box". I think I have a satisfying solution, but it might be old news.

Comment author: stephen_s 05 February 2017 09:51:53PM 1 point [-]

Interesting points, yeah, you're getting at the heart of what I'm trying to figure out. I think you're right that it's easy to see how the story possibilities that use the simplest story types (Hero's Journey, etc.) have possibly been ~90% completed.

But what makes you think that more complex story types allow many more possibilities? Along the lines of your point, Game of Thrones is a fantasy epic with a much darker tone that breaks storytelling conventions, but wouldn't any fantasy epic series with similar attributes in the future seem less groundbreaking than Game of Thrones? I agree that you could apply similar attributes to a Sci Fi epic series, or another type of series, but it seems like that type of story would begin to get old in the near future as well. On the television front, there are so many shows being created that it's hard to see how they can keep being groundbreaking.

With arty / more complex films like Being John Malkovich or Adaptation or Eternal Sunshine (I'm a fan of those Kaufman movies), does complexity lead to more possibilities for these types of movies, or fewer? There seems to have been a slowdown in arty / complex stories this decade (compared to the '90s, for example).

With film and television creation being more democratized than ever, I don't see a reason why the creation of these types of films would slow down, apart from the remaining stories requiring more complexity and skill to write than ever. I think we agree on the necessity of higher skill in writing currently. But it seems to me that a slowdown in the category of non-traditional or unique stories would mean that we are running out of those story possibilities as well.

Comment author: siIver 06 February 2017 02:33:16AM *  1 point [-]

But what makes you think that more complex story types allow many more possibilities?

Isn't that an inherent property of complexity? A larger set of elements -> a larger powerset of elements -> more possibilities. In fact, the size of the powerset grows as 2^x. I think a second Game of Thrones would be less groundbreaking, but it doesn't have to be worse... and the same goes for the 1000th GoT.
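Spelling out the counting fact behind that claim (standard set theory; the element counts are just examples):

```latex
% Each added story element doubles the number of possible combinations:
|\mathcal{P}(S)| = 2^{|S|}
% e.g. 10 elements give 2^{10} = 1024 subsets;
% 20 elements give 2^{20} = 1{,}048{,}576.
```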

There seems to be a slowdown in more arty / complex stories this decade (than compared to the 90's for example).

With film and television creation being more democratized than ever, I don't see a reason why the creation of these type of films would slow down apart from the remaining stories requiring more complexity and skill to write than ever.

I don't know as much as you about the industry. These sound worrisome.

I still think it is more likely that there is another reason (not that bold an assumption) than that we have really run out of complex things to write, because that just doesn't seem to be true, looking at how complexity works and how much seems to be doable just by tweaking the more complex pieces we have. Adaptation is another great example.

But, I might be suffering from bias here, because I much prefer the world where I'm right to the one where I'm wrong.

Comment author: stephen_s 05 February 2017 08:59:25PM 0 points [-]

The reason I bring up classical and jazz is that there has been a clear slowdown in meaningful additions to those genres over the past few decades. So, if music genres reach a limit of possibilities, then it seems likely to apply to other areas of art as well.

Yes, I agree that there are more intelligent (or less simple) stories that haven't been written yet. I'm not sure whether you are saying that you agree there is a limit to the number of possible stories, or that you think there is no limit. If there is a limit, what do you think would be the signs that we are reaching it?

I would agree that intuition suggests there are a lot of stories still left to be written, but what would explain the current slowdown in original properties being created or finding an audience (compared to previous decades)?

Comment author: siIver 05 February 2017 09:20:37PM *  2 points [-]

Well, there is a provably finite possibility space for stories: you only have so many ways to arrange letters in a script. The question is whether that limit is meaningful.
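For concreteness, the counting argument (the alphabet size and script length below are illustrative guesses, not figures from this discussion):

```latex
% Number of scripts over an alphabet of k characters, length at most L:
\sum_{n=0}^{L} k^{n} = \frac{k^{L+1} - 1}{k - 1}
% Finite, but astronomical: k = 100 printable characters and
% L = 200000 characters already give on the order of 10^{400000} strings.
```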

To use some completely made-up numbers, I think the current possibility space for movies produced by the bottom 80% of people with the current style may be 90% covered. The space for the top 2%, on the other hand, is probably less than 0.1% covered (and I resisted putting in more zeros there).

To get more concrete, I'll name some pieces (which I avoided doing in my initial post). Take Game of Thrones. It's a huge deal – why? Well, because there isn't really anything like it. But when you get rid of all the typical story tropes – main characters with invulnerability, predictable plot progressions, a heroic minority lucking out against an evil majority, typical villains, etc. – not only does the result get better, the possibility space actually widens. (I'm not saying scripts of this quality don't exist, but it seems to be the only show where a great script, a huge budget, and a competent team came together. There could be thousands of shows like this, and there is just one.)

Or take the movie Being John Malkovich. Basically, there is one supernatural element placed in an otherwise scientifically operating world, and you have a bunch of characters who act like normal humans – meaning largely selfish and emotionally driven – acting around that element. Just thinking about how much you could do following that formula opens up a large area that seems to be largely untouched.

I think we're shooting at Pluto over and over again while (for the most part) ignoring the rest of the universe. And it still works, because production quality and effects are still improving.

(edited)

Comment author: siIver 05 February 2017 07:28:15PM *  2 points [-]

I'd say no to both. I don't think any genre has come meaningfully close to completion, though I don't know classical or jazz very well.

Let's talk film. If I take a random movie that I didn't like, I find it very similar to others. If, however, I take one that I really like, I find that frustratingly few movies exist that are even similar.

I consider the possibility space to be a function of the creativity/intelligence/competence (let's call it skill) of the writing, and one that grows faster than linearly. The space of medium-skill writing may be nearing completion (though I think this is arguable, too), but the space for more intelligent writing is vast and largely unexplored.

Just think of how many similarities most movies have, starting with the Hero's Journey archetype. This need not be the case. My two favorite non-animated pieces of film both lack a main character.
