<cross-posted on Facebook>


In the spirit of Project Hufflepuff, I’m listing out some ideas for things I would like to see in the rationality community, which seem like they could be useful to have. I dunno if all of these are actually good ideas, but it seems better to throw some things out there and iterate.

 

Ideas:


Idea 1) A more coherent summary of all the different ideas that are happening across all the rationalist blogs. I know LessWrong is trying to become more of a Schelling point, but I think a central forum is still suboptimal for what I want. I’d like something that just takes the best ideas everyone’s been brewing and centralizes them in one place so I can quickly browse them all and dive deep if something looks interesting.


Suggestions:

A) A bi-weekly (or some other cadence) newsletter where rationalists can summarize their best insights from the past few weeks in 100 words or less, with links to their content.

B) An actual section of LessWrong that does the above, so people can comment / respond to the ideas.


Thoughts:

This seems straightforward and doable, conditional on commitment from 5-10 people in the community. If other people are also excited, I’m happy to reach out and get this thing started.



Idea 2) A general tool/app for coordination. I’d be happy to lend some fraction of my time/effort to help solve coordination problems, and it’s likely other people feel the same way. I’d like a way to both pledge my commitment and stay updated on things that I might plausibly be able to Cooperate on.


Suggestions:

A) An app that is managed by someone, which sends out broadcasts for action every so often. I’m aware that similar things / platforms already exist, so maybe we could just leverage an existing one for this purpose.


Thoughts:

In the abstract, this seems good. I’m wondering what others think / what sorts of coordination problems this would be good for. The main value here is being confident in *actually* getting coordination from the X people who’ve signed up.


Idea 3) More rationality materials that aren’t blogs. The rationality community seems fairly saturated with blogs. Maybe we could do with more webcomics, videos, or something else?


Suggestions:

A) Brainstorm good content in other media, weigh the benefits / drawbacks, and see why we might want content in other forms.

B) Convince people who already work in those media to touch on rationalist ideas, sort of like what SMBC does.


Thoughts:

I’d be willing to start up either a webcomic or a video series, conditional on funding. Anyone interested in sponsoring? Happy to have a discussion below.

 

EDIT:

Links to things I've done for additional evidence:

 


Idea 4) More systematic tools to master rationality techniques. To my knowledge, only a small handful of people have really tried to apply systematization to learning rationality, of whom Malcolm and Brienne are the most visible. I’d like to see some more attempts at Actually Trying to learn techniques.


Suggestions:

A) General meeting place to discuss the learning / practice.

B) Accountability partners + Skype check-ins.

C) List of examples of actually using the techniques + quantified self to get stats.


Thoughts:

I think finding more optimal ways to do this is very important. There is a big gap between knowing how techniques work and actually finding ways to practice them. I'd be excited to talk more about this idea.


Idea 5) More online tools that facilitate rationality-things. A lot of rationality techniques seem like they could be operationalized to plausibly provide value.


Suggestions:

A) An online site for Double Cruxing, where people can search for someone to DC with, look at other ongoing DCs, or propose topics to DC on.

B) Chatbots that integrate things like Murphyjitsu or ask debugging questions.


Thoughts:

I’m working on building a Murphyjitsu chatbot as a way to build up my coding skill. The Double Crux site sounds really cool, and I’d be happy to do some visual mockups if that would help people’s internal picture of how it might work. I am unsure of my ability to do the actual coding, though.
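To make the chatbot suggestion a bit more concrete, here’s a rough sketch of the simplest version I have in mind: a small console script that walks you through the basic Murphyjitsu premortem questions and then plays back a summary. The prompt wording, function names, and console format are just placeholders of mine, not CFAR’s actual script and not necessarily what I’ll end up building.

```python
# A minimal, hypothetical sketch of a console "Murphyjitsu" bot.
# It walks the user through the basic premortem loop: state a plan,
# imagine it has already failed, surface the most likely failure
# mode, and propose a mitigation. The prompt wording is illustrative.

PROMPTS = [
    "What is the plan, concretely? (What will you do, by when?)",
    "Imagine it's the deadline and the plan has already failed. "
    "How surprised are you, on a scale of 1-10?",
    "In that failure world, what went wrong? (Name the most likely cause.)",
    "What could you do now to make that failure less likely?",
]


def run_session() -> dict:
    """Run one premortem pass and return the answers keyed by prompt."""
    answers = {}
    print("Murphyjitsu bot (type 'quit' at any prompt to stop)\n")
    for prompt in PROMPTS:
        reply = input(prompt + "\n> ").strip()
        if reply.lower() == "quit":
            break
        answers[prompt] = reply
    if answers:
        print("\nSummary of this premortem:")
        for question, answer in answers.items():
            print(f"- {question}\n  {answer}")
    return answers


if __name__ == "__main__":
    run_session()
```

A real version would presumably live on a messaging platform rather than in a terminal, but the question flow would stay roughly the same.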

 

 

Conclusion:

Those are the ideas I currently have. Very excited to hear what other people think of them, and how we might be able to get the awesome ones into place. Also, feel free to comment on the FB post if you want to signal boost.

Comments (35):

Great list.

However, I object to this statement:

The rationality community seems fairly saturated with blogs.

Compare: "The library seems fairly saturated with books." "Google Scholar seems fairly saturated with papers."

[anonymous]

Haha, okay, I suppose this is true if we grant the claim that a large amount of rationality progress is predicated on writing things, which usually take the form of blog posts.

I think the general thing I was trying to point at was the lack of a good intellectual Schelling point, although Romeo Stevens helped me see why this might be unrealistic. (See my other comment.)

I’d be willing to start up either a webcomic or a video series, conditional on funding. Anyone interested in sponsoring?

I would expect that you would need to demonstrate skill at either to convince someone to fund you to do it. Do you have published webcomics?

[anonymous]

This is perfectly reasonable, and I can link to some things below.

To be clear, I don't have published webcomics, but I have done art for many years, and have what could generally be termed "skill".

Links to things below:

Happy to link to more things if people want more evidence.

Tastes obviously differ, but in contrast to xkcd comics, those examples don't give me the impulse of "this is share-worthy content, there should be more of it". It isn't that they're bad or objectionable, but they're still not in the class of webcomics I find valuable.

Given the success of xkcd despite its lack of artistic complexity, I don't think that being "good at art" is Randall's core skill.

[anonymous]

Given the success of xkcd despite its lack of artistic complexity, I don't think that being "good at art" is Randall's core skill.

I don't doubt this either.

To be honest, I'm unsure of my ability to approximate the sort of in-group humor that comics like SMBC/xkcd have, but I'd parsed your original statement as one of general "being not-terrible at artsiness".

Happy to spend some time trying some prototype ideas in the coming weeks, though.

The best way to convince people (I include myself here) to fund something like this would be to start doing it first. Does not seem like an undue burden.

Idea 2 seems very vague. Can you give an example of how I would use it?

[anonymous]

To be honest, I'm not too sure myself. I was thinking about times where, say, TIME writes a favorable piece on AI, and we could coordinate to get lots of people to upvote it on HN/reddit, or things like that, where having lots of people do a thing could be useful. Maybe it'll be more relevant for people in the same geographical areas?

Maybe it'll be more relevant for people in the same geographical areas?

As Zvi notes, NYC has a mailing list.

In Berlin we have a general LW mailing list and a Dojo mailing list. I used them to get feedback on a draft for http://lesswrong.com/r/discussion/lw/oe0/predictionbased_medicine_pbm/ and got good feedback from four people.

If we wanted to coordinate on such a matter, right now we would use email, email groups (NYC uses one of these), Facebook and Less Wrong in some combination. I would expect this to reach most of the people we would reach with an app?

[anonymous]

Sure. There's something about sending out emails, though, that doesn't seem to guarantee contribution. What I'd like is a way to easily figure out how much help you can actually expect from your in-group.

Right now, for both FB / mailing lists, my priors are something like "maybe 10% of people will actually comment?" (Which seems to be generally true for people who aren't Schelling points in the community, in my experience.)

I'd like a way to signal my willingness to expend some amount of effort to help with things that need add'l people, in a high-confidence environment, so people know that they can actually count on my support, if that makes sense?

I'd like a way to signal my willingness to expend some amount of effort to help with things that need add'l people, in a high-confidence environment, so people know that they can actually count on my support, if that makes sense?

I don't think signing up to help with projects without knowing anything about the project beforehand leads to quality engagement with the project.

You might have people relatively blindly commit to upvoting reddit posts, but the resulting impact of that kind of coordination is relatively low.

There used to be a MIRI volunteer project that seems to have run out of steam. http://lesswrong.com/lw/2g7/miri_call_for_volunteers/ It might be worthwhile to look into what happened with it.

[anonymous]

I don't think signing up to help with projects without knowing anything about the project beforehand leads to quality engagement with the project.

This may well be true. I guess my question, then, is something like "how do we set up a system such that people can easily broadcast information on projects/things to those who care?"

My qualm with Facebook for notices of this sort is that it's not necessarily easy for others to see what's happening, and posts are fairly transient, given the feed's temporal nature.

If you have a task that you care to get done, then posting it to the Slack (goals of LessWrong), Facebook, or LW Discussion works for reaching people. If it turns out that you have projects that you want to promote where the present system isn't enough, that would be the time to build a new system.

Using the existing channels is much easier than convincing people to adopt a new channel. If you want a new channel, then a Google Groups mailing list seems optimal for reaching people.

I love idea number 1, especially. Lots of 100-word insights to scan through.

Very yes. Specifically, a bi-weekly or monthly thread (similar to the current open threads) of, e.g., "Pitch your idea", with the hard constraint that topmost comments be 100 words at most at any given time, with optional links leading down the rabbit hole.

Edit: bonus points, but not a hard requirement, for describing your idea in "Up Goer Five" language, to avoid that thing where people compress by using technical words, as opposed to compressing comprehensibly. What we want to achieve here is a common onboarding point where new people get introduced to those ideas, as opposed to communicating the Theory of Everything in Greek symbols.

[anonymous]

Bi-weekly threads might be good; I was thinking of a specific subsection, like how we currently have headings for "main" and "discussion". Or just a way of giving it more attention, so it doesn't blend into the other things the way the open threads do.

Unsure if people would like that, or if it would be doable w/ LW's current code. But, still, a thought.

Also, yes, Up Goer Five language would be pretty cool. Although I think that spending time here developing a Theory of Everything could also be useful, and there's a place for both further development and serving as a common onboarding point.

IMO, as LW stands right now, it's not very friendly to beginners. The site felt poorly organized to me when I first started coming here, and the wiki was not super helpful. Part of it may be the choice of font / graphics, which were a turn-off for past me.

It took me weeks to find the sequences when I first got here...

It's now on the homepage.

Double Crux was largely a re-invention of Street Epistemology

You can find people to practice it with at the Street Epistemology Facebook group. They're having role-play sessions, making how-to videos, etc.

streetepistemology.com

(This is Dan from CFAR)

This is the first I've heard of Street Epistemology, or Boghossian's book A Manual for Creating Atheists where it was apparently introduced. A key difference between it and Double Crux:

From their guide, it looks like Street Epistemology is intended to be an asymmetric game. Only player A knows about Street Epistemology, player A chooses to start the conversation about a topic where player A is confident that they are right and player B is wrong, and the conversation is about the reasons for player B's beliefs. Player A attempts to change player B's mind by improving player B's epistemology. Player A needn't talk about their own beliefs; there is a short subsection in the guide which addresses this topic, beginning "If asked about your own beliefs you should be prepared to answer." The guide describes Street Epistemology as being "most useful for extraordinary claims, such as miracles and supernatural phenomena."

Double Crux is intended to be a symmetric game, where both players know what kind of conversation they're getting into and both players put their beliefs (and the reasons for their beliefs) on the table in an attempt to improve their models. The object of the game (as its name suggests) is to find a crux that is shared by both players, where either of them would change their mind about the original disagreement if they changed their mind about the crucial point. I previously described Double Crux as being most useful for tricky, important-to-you questions where "digging into your own thinking and the other person's thinking on the topic is one of the more promising options available for making progress towards figuring out something that you care about."

Hi, Dan

The transition from using SE one-way to using it both ways is meaningful, of course.

I think if you watch some of the SE example videos (see link), though, you'll get a fuller idea of what I've seen. Anthony Magnabosco's personal flavor of SE in particular has developed a lot of DC specifics.

I posted this to the SE FB group in December, and it lists some of the similarities I thought they'd find interesting:

"The Center For Applied Rationality (CFAR) has basically just re-invented a universally applicable form of Street Epistemology in what they call "Double Crux."

A few excerpts from their description:

"the primary strategy is to embody the question "What do you see that I don't?" In other words, approach from a place of explicit humility and good faith, drawing out their belief structure for its own sake, to see and appreciate it rather than to undermine or attack it"

"model the behavior you'd like your partner to exhibit."

"use paper and pencil, or whiteboards, or get people to treat specific predictions and conclusions as immutable objects (if you or they want to change or update the wording, that's encouraged, but make sure that at any given moment, you're working with a clear, unambiguous statement)."

They then break it down into an explicit algorithm and include some tips not seen in SE about how to focus on the true substance of a belief and how to use your own belief structure in a discussion."

They also found the similarities striking. Anthony Magnabosco has found a lot of wording specifics that help people be even more open, Violet Bernarde has been focusing on body-language refinements, and there have been other interesting developments. There are good things for each group to take from the other.

Double Crux was largely a re-invention of Street Epistemology

In what sense do you believe that to be true? Do you believe that CFAR was inspired by Street Epistemology?

[anonymous]

I didn't know about this! Thanks for the link!

[anonymous]

Potential problems with Idea number 1 (thanks to a chat w/ Romeo Stevens):

  • A lot of the ideas being explored are now several inferential steps removed from the baseline rationality literature, meaning that short summaries might not be great. (There'd be lots of dependencies.)

  • The incentive structure as it stands doesn't seem to necessarily encourage this sort of compilation. (Coordination problem / poor returns on effort spent on summaries, esp. if it's all being summarized by one person?)

  • In addition to there being lots of rationality blogs, the emergence of meatspace rationality hubs like Berkeley means that a lot of progress is likely happening offline, and writing things up in a way that bridges inferential gaps is costly.

I follow several programming newsletters, and I don't have the context to fully understand and appreciate most links they share (although I usually have a general idea of what they are talking about). It's still very valuable to me to find out about new stuff in the field.

I'd patreon a few dollars for something like this.

[anonymous]

Thanks for bringing this up again. I think I'll try a weekly roundup next week and see how it goes (along with short summaries).

Idea 4) More systematic tools to master rationality techniques. To my knowledge, only a small handful of people have really tried to apply systematization to learning rationality, of whom Malcolm and Brienne are the most visible. I’d like to see some more attempts at Actually Trying to learn techniques.

Why do you count those two but not CFAR and the Dojos?

[anonymous]

Illusion of transparency, my bad.

CFAR is obviously the Schelling point for teaching rationality.

However, my experience with CFAR is that they are very good at teaching, but their follow-up methodology is suboptimal. They have follow-up buddies, along with some weekly reminders, but I feel like there is more room for scaffolding.

I don't have experience with the Dojos, so I can't speak to those.

[anonymous]

On #2: NYC has a Google Groups mailing list to coordinate with, and that plus email works quite well. What is the use case for an app? What makes a broadcast superior to or different from an email?

[This comment is no longer endorsed by its author]

On #1: Twitter does have its advantages (I would match character counts with them, for compatibility). Compiling tweets and could-have-been-tweets that hold meaningful content seems like a good project, and I think Twitter dominates the newsletter format for this. Doing it in a way that allows karma and extended discussion/comments in response does offer some offsetting advantages.

We could just start with an account that re-tweets liberally when people tweet at it with such thoughts, and see what happens, with people encouraged to post here if there is something worth discussing, and a thread that links to the account.

I could be convinced to do this if enough people make arguments that it is a good idea.

[anonymous]

I'm unsure about Twitter. I would probably like it less if it were just an automated / curated bot, as that seems to lose the "compilation" feel. Correct me if I'm wrong, but I feel like I'd have to do a lot of scrolling to get a quick overview of what's been brewing in people's heads.

I agree that it would need to be someone who thought about what to include and what not to include.