Note I'd like to make: a lot of people around here worry about cult-like behavior. That's not irrational; cult-like abuses have in fact occurred, including in communities at low network distance from this group. Cult-like behavior must be pushed back against specifically, not with vague generalities. Attempting to convince people of the value of aligning multi-agent networks is, IMO, a genuinely valuable direction for humanity, and being able to pursue it without risking cult-like abuses is important.

Key things to avoid include isolating people from their friends, breaking the linguistic association of words to reality, demanding that someone change their linguistic patterns on the spot, etc. - mostly things which Street Epistemology's recommended techniques specifically make harder. I'd suggest that, in future instances where you want to push against cult-like abuses because you worry you might be encouraging them, you inline my point here and state the specifics: for example, that encouraging people to believe things risks being too convincing, and that frequent reminders should be present to ensure people stay connected to their existing social networks unless they have a strong personal reason not to.
Just a thought, anyway.
Returning from the tangent: I agree, convincing people that multi-agent alignment is a critical step for life on Earth does seem like the #1 problem facing humanity, and we're entering an era where the line between human and AI is already blurred. If we are to ensure that no subnetwork of beings replaces another, it is critical to discover and spread the knowledge of how to keep all beings prosocially aligned with each other, at least enough to share whatever our cosmic endowment turns out to be.
Thanks. Yeah, this all sounds extremely obvious to me, but I may not have included such obvious-to-Logan things if I were coaching someone else.
Key things to avoid include isolating people from their friends, breaking the linguistic association of words to reality, demanding that someone change their linguistic patterns on the spot, etc. - mostly things which Street Epistemology's recommended techniques specifically make harder.
Are you saying Street Epistemology is good or bad here? I've only seen a few videos and haven't read through the intro documents or anything.
Good. People have [edit: some] defenses against abusive techniques, and from what I've seen of Street Epistemology, its response to most of those is to knock on the front door rather than trying to sneak in the window, metaphorically speaking.
Have you gotten further with this? It seems like a potentially very impactful thing to me. I also recently had the idea of paying skeptical AI researchers to spend a few hours discussing/debating their reasons for skepticism.
Having a group of people with experience convincing others of alignment helps us:
If you're not convinced of alignment and would be willing to donate one hour of your time to talk with me, I'd appreciate it if you messaged me to schedule a call in the next couple of weeks.
If you'd like to help convince people of alignment and build a curriculum with me, I'd also appreciate a message so we can schedule a call to chat.
What is Street Epistemology?
It's a way of getting to the core disagreements of a conversation, instead of talking in circles, past each other, and never getting anywhere.
Recently I had a conversation that went like this (note: the details are only approximate, but this was the general flow of the conversation; they are a friend who had read Superintelligence but was not working in alignment):
[Note that we could've gone off on tangents, like explaining different people's research agendas or saying "if you truly believed AI could cause an existential risk, then a pay cut's not a big deal."]
He later said, "I could see myself transitioning if I made it a hobby first." A good general question may then be "How could you realistically see yourself transitioning into alignment?", or you could offer very concrete pathways that other people have taken.
The Current Plan
I'll meet up with or call 0-5 people each week to build my personal understanding so that I can better convince others and coach people in the future.
Find at least one other person interested in this to start creating a curriculum together. I would expect them to also have conversations, coach people, and potentially read up on the Street Epistemology site for tips and tricks.
After ~20 calls and building the curriculum, start coaching people on how to do it themselves. One idea is to do 5 sessions of
I also expect to uncover common reasons for skepticism and unwillingness, which I can write about and then convince others to help address.
Isn't This Kind of Weird or Cult-like?
At a high level, I can see the comparison, but when you actually have these conversations, they come across as very honest and intentional.
Call to Action
To repeat the beginning: if you're not convinced of alignment and would be willing to donate one hour of your time to talk with me, I'd appreciate it if you messaged me to schedule a call in the next couple of weeks.
If you'd like to help convince people of alignment and build a curriculum with me, I'd also appreciate a message so we can schedule a call to chat.