Less Wrong is a community blog devoted to refining the art of human rationality.

Comment author: Luke_A_Somers 16 January 2017 03:49:18PM 0 points

That isn't the distinction I get between suboptimal and irrational. They're focused on different things.

Irrational to me would mean that the process by which the strategy was chosen was not one that would reliably yield good strategies in varying circumstances.

Suboptimal is just an outcome measurement.

Comment author: hairyfigment 16 January 2017 06:53:26PM 0 points

Outcome? I was going to say that suboptimal could refer to a case where we don't know if you'll reach your goal, but we can show (by common assumptions, let's say) that the action has lower expected value than some other. "Irrational" does not have such a precise technical meaning, though we often use it for more extreme suboptimality.
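The definition sketched here — "suboptimal" meaning the action has lower expected value than some alternative, given shared assumptions — can be made concrete with a toy calculation. The probabilities and payoffs below are invented for illustration, not taken from the thread:

```python
# Toy illustration of "suboptimal" as an expected-value comparison.
# All probabilities and payoffs are made up for the example.

def expected_value(outcomes):
    """Sum of probability * payoff over all outcomes."""
    return sum(p * payoff for p, payoff in outcomes)

# Action A: 90% chance of 10 utility, 10% chance of 0.
action_a = [(0.9, 10), (0.1, 0)]
# Action B: 50% chance of 25 utility, 50% chance of -5.
action_b = [(0.5, 25), (0.5, -5)]

ev_a = expected_value(action_a)  # 9.0
ev_b = expected_value(action_b)  # 10.0

# Under these shared assumptions, A is "suboptimal" relative to B --
# a precise comparative claim -- even though A reaches a good outcome
# more often. Nothing analogous pins down "irrational".
print(ev_a, ev_b, ev_a < ev_b)
```

The point of the sketch is that "suboptimal" bottoms out in an arithmetic comparison once the assumptions are fixed, whereas "irrational" (as the parent comment notes) is a judgment about the choosing process and has no similarly precise reading.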

Comment author: CBHacking 15 January 2017 02:08:29PM 2 points

For what it's worth, I got relatively little* out of reading the Sequences solo, in any form (and RAZ is worse than LW in this regard, because the comments were worth something even on really old and inactive threads, and surprisingly many threads were still active when I first joined the site in 2014).

What really did the job for me was the reading group started by another then-Seattleite**. We started as a small group (I forget how many people the first meetings had, but it was a while before we broke 10 and longer before we did it regularly) that simply worked through the core sequences - Map & Territory, then How to Actually Change Your Mind - in order (as determined by posts on the sequences themselves at first, and later by the order of *Rationality: AI to Zombies* chapters). Each week, we'd read the next 4-6 posts (generally adjusted for length) and then meet for roughly 90 minutes to talk about them in groups of 4-8 (as more people started coming, we began splitting up for the discussions). Then we'd (mostly) all go to dinner together, at which we'd talk about anything - the reading topics, other Rationality-esque things, or anything else a group of smart mostly-20-somethings might chat about - and next week we'd do it again.

If there's such a group near you, go to it! If not, try to get it started. Starting one of these groups is non-trivial. I was already considering the idea before I met the person who actually made it happen (and I met her through OKCupid, not LessWrong or the local rationality/EA community), but I wouldn't have done it anywhere near as well as she did. On the other hand, maybe you have the skills and connections (she did) and just need the encouragement. Or maybe you know somebody else who has what it takes, and need to go encourage them.

  * Reading the Sequences by myself, the concepts were very "slippery"; I might have technically remembered them, but I didn't internalize them. If there was anything I disagreed with or that seemed unrealistic - and this wasn't so very uncommon - it made me discount the whole post to effectively nothing. Even when something seemed totally, brilliantly true, it also felt untested to me, because I hadn't talked about it with anybody. Going to the group fixed all of that. While it's not really what you're asking for, you may find it does the trick.

    ** She has since moved to (of course) the Bay Area. Nonetheless, the group continues (and is now roughly two years running, hitting nearly every Monday evening). We regularly break 20 attendees now, occasionally break 30, and the "get dinner together" follow-up has grown into a regularly-scheduled weekly event in its own right at one of the local rationalist houses.

In response to comment by CBHacking on LessWrong 2.0
Comment author: hairyfigment 15 January 2017 07:19:56PM 1 point

Upvoted, but this seems to vary from person to person. You also forgot how italics and lists work here.

Comment author: Gurkenglas 13 January 2017 09:43:10PM 1 point

To figure out the truth, we must not punish people for advocating a position, or we might end up in a situation where everyone sees a taboo truth and is afraid to speak it.

That someone advocates lying is evidence that they would lie and should be excluded. Now take that evidence and throw it out the window, because we need to figure out whether lying is actually the right thing to do, and for that we need to listen to all the sides. In fact, Gleb should be rewarded as compensation for the s̶u̶b̶c̶o̶n̶s̶c̶i̶o̶u̶s̶ trust of his peers that he sacrificed to help this discussion.

Comment author: hairyfigment 14 January 2017 07:39:04PM 1 point

This is wholly irrelevant, because we've already caught Gleb lying many times. His comment sacrifices nothing, and in fact he's likely posting it to excuse his crimes (the smart money says he's lying about something in the process).

Your point does apply to the OP trying to smear her first example for practicing radical honesty. This is one of the points I tried to make earlier.

Comment author: sarahconstantin 12 January 2017 07:01:10AM 2 points

Hi, I wrote the post.

I think that it's actually fine for me to use spooky/mystical language to describe human behavior. I'm trying to hint very broadly at subjective impressions, and provoke readers to see the same things I do. I have the rough sense of something spooky going on in the zeitgeist, and I want to evoke that spooky feeling in my readers, so that some of them might say "I see it too." That's exactly the right use case for magical thinking.

There are degrees of certainty in making accusations. If you have hard evidence that somebody did a seriously bad thing, then that's one kind of situation. I'm not making any of those kinds of claims. (There was hard evidence that Gleb did a seriously bad thing, but that's not original to me, and that was dealt with before.)

What I'm doing is more like the sort of thing that goes on when, say, a journalist/blogger accuses EA of being a movement full of "nerdy white males," insinuates that this is associated with certain stereotypical biases, and maybe pulls a quote or two to support the claim. It's a "hey, this smells funny" kind of deal. It's about pattern-matching and raising suspicion and smearing it around a bit.

Comment author: hairyfigment 12 January 2017 07:43:44AM 2 points

I do not think it's fine. I think you're poisoning the discourse and should stop doing it, as indeed should the blogger in your example if there isn't more to go on. Is your last sentence some kind of parody, or an actual defense of the reason our country is broken?

Comment author: Benquo 12 January 2017 01:53:18AM 5 points

Well, yes.

I think it's a bad motive and one that leads towards less openness and honesty, but Ben Todd personally is being very open and honest about it, which is right and virtuous and says good things about him as a human being and about his intentions. I think this gives things like EA a chance at avoiding various traps that we'd otherwise fall into for sure - but it's not a get-out-of-jail-free card.

Comment author: hairyfigment 12 January 2017 06:15:52AM 1 point

I'm talking here about the linked post. The author's first example shows the exact opposite of what she said she would show. She only gives one example of something that she called a pattern, so that's one person saying they should consider dishonesty and another person doing the opposite.

If you think there's a version of her argument that is not total crap, I suggest you write it or at least sketch it out.

Comment author: hairyfigment 12 January 2017 01:11:46AM 2 points

Another note I forgot to add: the first quote, about criticism, sounds like Ben Todd being extremely open and honest regarding his motives.

Comment author: hairyfigment 12 January 2017 01:04:45AM 1 point

She does eventually give an example of what she says she's talking about - one example from Facebook, when she claimed to be seeing a pattern in many statements. Before that she objects to the standard use of the English word "promise," in exactly the way we would expect from an autistic person who has no ability to understand normal humans. Of course this is also consistent with a dishonest writer trying to manipulate autistic readers for some reason. I assume she will welcome this criticism.

(Seriously, I objected to her Ra post because the last thing humanity needs is more demonology; but even I didn't expect her to urge "mistrusting Something that speaks through them," like they're actually the pawns of demons. "Something" is very wrong with this post.)

The presence of a charlatan like Gleb around EA is indeed disturbing. I seem to recall people suggesting they were slow to condemn him because EA people need data to believe anything, and lack any central authority who could declare him anathema.

Comment author: ESRogs 11 January 2017 07:24:32AM 0 points

What's wrong with green?

Comment author: hairyfigment 11 January 2017 11:43:52PM 0 points

Well, it's not easy.

In response to comment by Jade on Crisis of Faith
Comment author: Salemicus 07 January 2017 12:15:36PM 1 point

Neither sufficient nor necessary:

  • The origins of Christianity become more mysterious, not less, if there never was a Jesus.
  • We don't need to tie ourselves to a fringe hypothesis to posit non-supernatural origins for the Gospels.
In response to comment by Salemicus on Crisis of Faith
Comment author: hairyfigment 09 January 2017 09:43:45AM 1 point

Your second point is clearly true. The first seems false; Christianity makes much more sense from a Greco-Roman perspective if Jesus was supposed to be a celestial being, not an eternal unchanging principle that was executed for treason. And the sibling comment leaves out the part about first-century Israelites wanting a way to replace the 'corrupt,' Roman-controlled, Temple cult of sacrifice with something like a sacrifice that Rome could never control.

Josephus saw the destruction of that Temple coming. For others to believe it would happen if they 'restored the purity of the religion' only requires the existence of some sensible zealots.

Comment author: see 01 January 2017 07:11:27AM 0 points

The interactions of three people are more complex than the interactions of one person with himself. But the theory that my house contains three different residents still explains observations of my house much more simply than if you start with the assumption that there's only one resident. You accordingly cannot use Occam's Razor to disfavor the theory that my house has three residents simply because the interactions of three people with each other are more complex than the interactions of one person with himself. Similarly, adding a cat to the three-person hypothesis actually improves the explanatory power of the model, even though you now have three sets of human-cat interactions added to the model; rejecting the cat on the basis of Occam's Razor is also a failure.

Is a trinity more complex than a unitary godhead? In itself, sure. But if you're trying to do something as notoriously convoluted as, say, theodicy, the question is, does the trinity provide extra explanatory power that reduces the overall complications?

And I strongly doubt anyone is both knowledgeable enough about theodicy and sufficiently rational and unbiased on the unity/trinity question to give a trustworthy answer on the question of which is the actual lesser hypothesis there. Especially since the obvious least hypothesis in theodicy is that there is no God at all and thus nothing to explain.

If you're going to claim that a unitary godhead is favored by Occam's Razor over a trinity, you actually need, among other things, a whole unitary-godhead theodicy. But if you actually worked one out, in order to have a rational opinion on the relative viability of the unitary and trinity theories, I'm going to wonder about your underlying rationality, given you wasted so much time on theodicy.
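The trade-off this comment describes — Occam's Razor penalizes complexity, but a more complex hypothesis can still win if it buys enough explanatory power — can be sketched as a crude model-comparison score in the style of a penalized log-likelihood. Every number below is invented purely to illustrate the house/residents example; nothing here is a real inference:

```python
# Crude sketch of the trade-off in the comment above: a score that
# rewards explanatory power (fit) and charges a cost per moving part,
# in the spirit of penalized model comparison. All numbers are made up.

def score(log_likelihood, n_parts, cost_per_part=2.0):
    """Higher is better: fit to the observations minus a complexity penalty."""
    return log_likelihood - cost_per_part * n_parts

# "One resident" fits the observations of the house poorly;
# "three residents" fits them well despite having more parts.
one_resident = score(log_likelihood=-40.0, n_parts=1)     # -42.0
three_residents = score(log_likelihood=-10.0, n_parts=3)  # -16.0

# Adding a cat adds yet another part, but it explains data
# (scratched furniture, emptied bowls) the humans can't.
three_plus_cat = score(log_likelihood=-5.0, n_parts=4)    # -13.0

print(one_resident, three_residents, three_plus_cat)
```

The sketch shows why counting parts alone misuses the Razor: the complexity penalty only decides the comparison when the rival hypotheses fit the observations about equally well.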

Comment author: hairyfigment 02 January 2017 03:38:54AM 0 points

As defined in some places - for example, the Occam's Razor essay that Eliezer linked for you many comments ago - simplicity is not the same as fitting the evidence.

The official doctrine of the Trinity has probability zero because the Catholic Church has systematically ruled out any self-consistent interpretation (though if you ask, they'll probably tell you one or more of the heresies is right after all). So discussing its complexity does seem like a waste of time to me as well. But that's not true for all details of Catholicism or Christianity (if for some reason you want to talk religion). Perhaps some intelligent Christians could see that we reject the details of their beliefs for the same reason they reject the lyrics of "I Believe" from The Book of Mormon.
