Screwtape

I'm Screwtape, also known as Skyler. I'm an aspiring rationalist originally introduced to the community through HPMoR, and I stayed around because the writers here kept improving how I thought. I'm fond of the Rationality As A Martial Art metaphor, new mental tools to make my life better, and meeting people who are strange in ways I find familiar and comfortable. If you're ever in the Boston area, feel free to say hi.

Starting early in 2023, I'm the ACX Meetups Czar. You might also know me from the New York City Rationalist Megameetup, editing the Animorphs: The Reckoning podfic, or being that guy at meetups with a bright bandanna who gets really excited when people bring up indie tabletop roleplaying games. 

I recognize that last description might fit more than one person.

Sequences

Running A Basic ____ Meetup
Cohabitive Game Design
The LessWrong Community Census
Meetup Tips
Meetup in a box
Mixed Feelings on Social Munchkinry
Screwtape · 12h

Huh, yeah, I hadn't consciously noticed that clustering. Hrm, can I come up with counterexamples?

Positive example that's a single person ignoring social taboo: covert but responsible substance use, maybe? I know a couple of people who use phenibut or microdoses of LSD because they think it helps them participate better in social situations, and as far as I've noticed it does, but there is a social taboo against using drugs.

(Not unambiguously positive; I think other people might have a somewhat reasonable preference to know whether the folks around them are on some kind of mind-altering substance. And yet my intuition is that that's reasonable for normal doses of LSD but not for normal doses of antidepressants or antianxiety meds.)

Negative example that's a group of people enthusiastically deciding to change or disregard social norms: conspiracy to commit fraud. FTX, Enron, and Bernie Madoff's associates were each a small group of people deciding to change or disregard a social norm (don't lie) while optimizing for something (lots of money), and that's bad. Likewise, harmful cults like the Peoples Temple, Heaven's Gate, and the Branch Davidians (even before the siege) seem to be doing something like optimizing hard for some kind of spiritual lifestyle that winds up sacrificing some important protective elements. The fraudsters fit the selfish description[1], but I don't think that's quite what's going on with the cults?

At least in my head, both the positive and negative versions feel like similar mental motions. I can sort of poke my brain to step outside social scripts, look around, and optimize for something, and get interesting and creative answers back. It's sometimes a useful skill, and also, some of the interesting and creative ideas I come up with would make people worse off, blow up in my face if people found out, or otherwise be unethical, so I don't do them. In the other direction, there are some ideas I've encountered that I thought were morally fine but broke social norms in a way I didn't want to, and I'm glad I figured out the source of my objection.

  1. ^

    I think FTX/SBF claims it wasn't for selfish ends. I don't have a position on whether that's true or not, just noting it.

On Stance
Screwtape · 13h

Yeah! I think we're talking about similar things here. I vaguely remember reading that actually and it might have been buried in the back of my head as I was writing this. I edited in a link to your list of examples, they're great, and I hope you don't feel like this is too derivative of your piece.

On Stance
Screwtape · 13h

You're welcome! Thank you for your contributions, "this is a problem that can be solved by patience" in particular feels good and useful to me.

Everyone has a plan until they get lied to the face
Screwtape · 13h

Seems a reasonable set of preferences.

Since you're bringing it up, I do want to clarify it wasn't a pure social hangout. Some rationalist meetups are people standing around talking about whatever they're into, some have board games people are playing or readings people are supposed to have read and then talk about, some have specific activities. This was one of the latter, specifically the thumb-on-the-scale variation of Zener Science.

I actually put a lot of thought into signposting for meetups, trying to make sure that people who don't want to participate in some kind of meetup activity know about it before making the decision to go. E.g., if someone doesn't want to read Scott Alexander's writing or talk about it, that's pretty reasonable, but if that person shows up to an event calling itself an ACX Reading Group where the announcement says you're supposed to read one of Scott's essays, it's not the organizer's fault.

(The next bit of this is me working through the Zener Science example, because I'm actually not satisfied with how I signpost it.)

The thumb-on-the-scale Zener Science example is not one of my better-signposted meetups, and I'd like to solve that problem someday! I'd give it a C+. From my perspective, the problem stems from the fact that most people (especially people at rationalist meetups!) don't believe in psychic powers and aren't motivated to practice the stats/science skill the event is trying to work on. What happened without the thumb on the scale was that people showed up, guessed at two or three cards, and then concluded (because they were already confident of this, despite not having done enough tests to conclude it from the experiment) that psychic powers aren't real.

So I can't necessarily signpost this as an earnest investigation into ESP, because I don't think ESP works, my attendees mostly don't think ESP works, and unless I prompt it the experiment isn't going to be good enough to prove anything.[1] I want to signpost it as a place to practice some stats/science skills, because that's my goal, but that works better if I can get people to dig a little deeper than just looking at a couple cards. To achieve that, I want to make them a little suspicious and work through the stats of how many right answers would indicate something weird was going on. Since I'm not actually psychic, I sometimes cheat the deck somehow, but since I really don't want people to wind up not able to trust me I want to signpost that I'm doing something unusual. 
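The "work through the stats" step is just a binomial tail calculation: a standard Zener deck has 25 cards in 5 suits, so chance accuracy is 1 in 5 and about 5 hits are expected, and you can compute how surprising any higher count would be. A quick sketch (the helper name is mine, not from any library):

```python
from math import comb

def p_at_least(k, n=25, p=0.2):
    """Chance of getting at least k hits out of n guesses by pure luck."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# About 5 hits is the chance baseline; see how surprising higher counts are.
for k in (5, 8, 11):
    print(f"P(at least {k} hits by luck) = {p_at_least(k):.3f}")
```

By this math, 11 or more hits out of 25 happens by luck less than 1% of the time, which is roughly where I'd want attendees to start getting suspicious.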

Hence the Might Be Lying sign. Good glomarization means sometimes I use the Might Be Lying sign when I'm not actually lying: attendees shouldn't be able to look at the sign and go "okay, the answer is he's cheating the deck/psychic" without doing any tests. In theory I might be lying at any time, but ideally wearing the sign is a good signal that something different is going on; I claim people shouldn't update much about my propensity to lie in normal life based on my propensity to lie while wearing the sign. But that's getting a bit complicated for a meetup announcement.

Coming back to your assumptions:

You did not clearly state you were definitely going to lie about certain things, and in the context of a "social experiment".

I didn't clearly state I was definitely going to lie about certain things, because sometimes I run the Zener Science straight, without the thumb-on-the-scale variation, and I'll wear the sign there as part of a glomarization strategy. It's in the context of a pretty specific experiment, and it's parapsychology, not social science. (I can see myself wearing a Might Be Lying sign for similar reasons in other kinds of activities, though sometimes I don't need the sign, as in jimrandomh's example of social-deception games.)

To me it seems quite similar to people wearing signs saying "I might be rude" and then actually being rude.

Funnily enough, I've kinda run that meetup too. I'd give myself an A- on signposting there, and cheerfully endorse people deciding not to go to meetups in that style, safe in the knowledge that I'm not going to try and make them use Crocker's Rules at events announced as reading groups.

  1. ^

    I'm actually pretty excited about doing some variation of Zener Science with a mix of people who believe in ESP and people who don't, who were coming together in good faith to figure out what's going on. Wiseman & Schlitz’s Experimenter Effects And The Remote Detection Of Staring sounds like a good afternoon to me.

    And indeed, once or twice someone showed up to the Zener Science meetup who did believe in psychic powers. Whenever this happens I try to pivot to investigating how they think the psychic powers work and what we'd need to change about the test in order to provide evidence one way or another, without making them feel put on the spot or ~othered by being the one person out of a group to hold a contrary belief.

Lack of Social Grace is a Lack of Skill
Screwtape · 21h

I will agree the title is a stronger and more universal version of the statement I mean. Alternate considerations included "Lack of Social Grace Is Often A Lack Of Skill." The main reasons I didn't use that title were that the piece I'm responding to doesn't hedge its title either (lack of social grace is also not always epistemic virtue! Sometimes it's pure unforced error!) and that I do think there are more graceful ways to say even critical and harsh truths than many people use.

"Critical" is ambiguous; I assume you mean "having the potential to become disastrous; at a point of crisis" and not "expressing adverse or disapproving comments or judgments" I don't particularly think my argument changes in critical situations, because a lot of the time I think there's just strictly better ways to say things. "Hey moron there's a car coming" is worse than "There's a car coming" right? "I'm concerned you might not be happy or successful in medical school" is more graceful than "you're too stupid to take those classes," and also I think often actually more true.

Or take expressing disapproving comments or judgments; I've often seen people give feedback that only points out one or two flaws, with no further commentary, only to learn later that the person giving feedback loved most of the piece. The author or artist came away with the impression that the piece was terrible; after all, the only thing they heard about it was the flaws.

I love the HBO Chernobyl show. There's a great scene where a scientist is in a room full of people congratulating themselves that the Chernobyl disaster is not that big a deal and is well in hand, and he stands up to say things are not fine. Here is a (fictional) man in a critical situation who could have done the maximally graceful thing of keeping his mouth shut, and instead did something else. But watch how the meeting goes: twice he changes his approach to be a bit more graceful, because he needs these people to listen to him and not ignore him.

Learning information which is full of spiders
Screwtape · 21h

Yeah, that's basically fair. I do actually stand by rewriting old posts or ideas.

If I'm going to zoom in on the distinction, it's in this line.

Spiders, then, are the source of ugh fields.

The ugh field is this wide, ambient miasma around an area. The spider is the specific sharp prickly bit that is the source of the negative feedback. Some ugh fields are actually full of spiders everywhere. Some just have one or two spiders in a small area, but the spiders exert their influence over a wide zone.

But like, if you've got the one idea you plausibly don't need the other.

Just Another Five Minutes
Screwtape · 3d
  • Get better metrics on what works when teaching rationality. Talk to all the rationalist/adjacent training groups, ask what metric they think they're improving, check whether graduates of each group are improving on the metrics of all the groups, and control against the general population and the social rationalists.
  • Once we know what works (or if someone knows something I don't about what works), start getting it taught broadly. Teach more instructors, make inroads on the education systems, backchain from "X years from now, random people on the street know how to do this." The win condition is e.g. calibration being taught in grade school at least as often as we teach arithmetic.
  • A stable, user-friendly crossposter for events. It's pretty common for someone to want an event to exist on several of Facebook, LessWrong, Partiful, Meetup.com, a Google Group mailing list, a Discord event, a Google calendar, announced in Telegram, announced in WhatsApp, announced in Instagram, etc. "Here's the time, here's the place, here's the plaintext title and the plaintext description." Realistically this would be a constant pain to maintain as APIs changed, but I do think the math pencils out.
  • The go-wide version of a rationalist conference. EAGs get above a thousand people, and I think they're the biggest events in the adjacent space. Fandom or industry conferences in the mid four digits aren't hard to find, and there are events with five-digit attendance. Plausibly the juice isn't worth the squeeze here; you can't meaningfully talk to all the people at a Manifest or an EAG already. And yet DEF CON exists and afaik is considered pretty valuable by those who partake.
  • Translations of the Sequences or HPMOR into other languages. Maps of rationalist activity are mostly heatmaps of English speakers, with Germany a notable outlier. Harry Potter was pretty popular in China!
Everyone has a plan until they get lied to the face
Screwtape · 3d

Aww, that doesn't work anymore? Probably good for the world if sad for pranksters. I admit I last pulled some variant of this prank in the late aughts/early 2010s and haven't tried recently. I got an afternoon of enjoyment out of upsidedownternet.

My next best idea I'm sure I could pull off would be to make my own website that looked like the Wikipedia article, pull that up on my phone, and show it to the mark.

Learning information which is full of spiders
Screwtape · 9d

The second Merrin story (well, I don't know if they're chronological or what but it's the one I reference second in this post) is here.

Solstice Season 2025: Ritual Roundup & Megameetups
Screwtape · 9d

New York City

When: December 20th, 6:00pm

Where: HI NYC Hostel at 891 Amsterdam Avenue, NY

Tickets: https://rationalistmegameetup.com/

This combines with the East Coast Rationalist Megameetup, a weekend of neat talks and relaxed socialization. You can totally get a ticket to just Solstice or just Megameetup if that's more your preference.

Posts

Preference Weighting and the Abilene Paradox (26 points, 21h, 1 comment)
Mixed Feelings on Social Munchkinry (46 points, 2d, 4 comments)
The Control System Going Out of Control (26 points, 2d, 0 comments)
Process Crimes and Pedantic Rules (24 points, 2d, 4 comments)
Brand New Experience Salesman (17 points, 3d, 0 comments)
Just Another Five Minutes (28 points, 4d, 4 comments)
Everyone has a plan until they get lied to the face (153 points, 5d, 27 comments)
Meetup Tip: Food (28 points, 6d, 1 comment)
Better than Baseline (24 points, 7d, 1 comment)
On Stance (21 points, 8d, 5 comments)
Wikitag Contributions

Meetups (specific examples): 10 months ago (+52/-8)
Meetups (specific examples): 3 years ago (+813)