"Quantified" or "Quantifiable Altruism" is a pretty good idea; the name sounds both nerdy enough to select for quant-skilled people, and less self-righteous than "Effective Altruism".
Not sure about the word connotations (since I don't know what a new person looking from outside would think), but the current name sounds more... friendly? Emotional? Positive? The letter "E" seems friendlier than "Q". (This is probably not the right level of abstraction to base this decision on, but I figured nobody else would bring it up.)
(Side note: "EA" can make lots of people confuse us with a US-based video game company, while "QA" only makes a few people confuse us with a process step.)
I saw somebody on twitter suggesting "LessBad" to match "LessWrong". https://twitter.com/Kat__Woods/status/1642291750621581312
Funny thing: the intention of "LessWrong" is "less wrong than I was before" but I have heard people take it to mean "less wrong than everyone else is". (LessBad obviously has the same problem.)
AI has become so incredibly important that any utilitarian-based charity should probably be totally focused on AI.
Aimed Altruism is an alternative, but it could confuse people with a different AA group.
How about "strategic altruism" or "rational philanthropy"?
I think that Outcome-Focused Altruism is better.
Comparative Altruism or Compared-Outcome Altruism might be epistemically accurate, but it suffers from the cute puppy/low decoupler/punchable problem. Comparing outcomes is implied.
I appreciate the thought put into these two; can you think of more?
Ezgainzism
It feels to me as if the punchable thing is more "effective altruist" than "effective altruism".
That is: if you say e.g. "I am interested in Effective Altruism", you're saying that you want to do altruism effectively (or to know how to, whether or not you actually do it, or something like that), and that seems pretty unobjectionable. Whereas if you say "I am an Effective Altruist", you're claiming that you actually are doing altruism effectively, and that's more likely to raise hackles. [EDITED much later to add:] You're also implicitly claiming that other people aren't doing altruism effectively, which is also pretty hackle-raising.
But terms like "aspiring effective altruist" are super-clumsy, and if the movement is called "effective altruism" then it's inevitable that the people in it will start calling themselves "effective altruists", which is indeed what has happened, and while inevitable it's a shame.
None of this is much help for deciding whether a better name is needed, or choosing one if so. Except to suggest: consider explicitly how any given proposal sounds both as a name for the movement and as a term for the people in it.
("Quantified" is less clunky than "quantifiable", but "quantified altruist" doesn't make much sense; "quantifiable altruist" kinda-sorta does but, again, clunky. I don't have a better suggestion.)
Relevant forecasting questions that need to be refined to make them empirically falsifiable:
[Note: I'm not part of any EA organization, for at least some of the reasons I'm overstating here.]
I mean, the movement IS fairly self-righteous and punchable. It attempts to take credit for, and support, a very wide range of arbitrary dimensions of improvement, from animal rights to low-development-area disease mitigation to AI hand-wringing, but those things are not similar enough to actually combine into a coherent whole.
And then specific sub-groups attempt to redefine EA to mean their preferred cause, in a motte-and-bailey move to imply they're saving the world and you shouldn't ask inconvenient questions, like actual quantification of near-term results. In fact, unless you're prepared to go all-in on mosquito nets, I'd highly advise against highlighting measurement of results in your branding.
"EA is currently still the final frontier for every vegan everywhere, for example, including those in professional networks such as government policy."
"EA is the ultimate destination for virtually all vegans in the world right now"
I'm not that familiar with the vegan/animal rights community. What do you mean by this? Can you elaborate? I thought animal rights was a large movement in its own right, separate from EA?
What causes are core to EA, and not separate (or at least separable and probably should be separate) movements in their own right?
Reporting on my mental state here: I'm emotionally opposed to the name change; I like "effective altruism". I don't want to do quantifiable good; I want to do effective good. Having x-risk and animal welfare in the same category makes perfect sense to me.
[Disclaimer: this was not an April Fools' joke; people are actually working on this right now, and this is the only critical window to contribute.]
Recent comments by Holden Karnofsky, the leader of Open Philanthropy (and influential enough to initiate a name change for EA):
It would still be better if there were a parent organization based on mutual respect/pragmatism; EA is currently still the final frontier for every vegan everywhere, for example, including those in professional networks such as government policy.
It seems to me like if someone thinks of a good enough name for EA, that people at least accept as a parent organization and an umbrella of mutual respect/pragmatism/sanity, then that will substantially increase the odds that it gets adopted.
Yudkowsky proposed a decent name and gave a decent justification for it, but it's probably worthwhile to have some people put 5 minutes into a name that could work (and spending 5 minutes thinking has a significant chance of yielding a galaxy-brained solution).
"Quantifiable Altruism", for example, seems like it would attract lots of math people and altruistic people, rather than people really into moral philosophy.
That seems great; I'm not sure if EA is currently bottlenecked on quant folk, but if there were tons of quant folk, it would at least make for a substantially larger talent pool for AI alignment upskilling programs to draw from. For ~1.5 years now, I've been working on a personal project that has been heavily bottlenecked by the fact that I don't know anyone I can trust who is really good at math. I'm currently skeptical of the idea that any group of people can have too many math nerds, to the point where they can't utilize most of them for important projects.
There are also other factors at play; EA is the ultimate destination for virtually all vegans in the world right now (hypothetically, a vegan could become dictator of the world and stomp out the meat industry a few years before a viable meat substitute is invented, so I can't say it's the destination for 100% of vegans). A name like Quantified Altruism would be a good name for that too, as basically any vegan could look at their beloved pet and think "wow, that's only +1 animal, I ought to think about numbers" after hearing the words "Quantified Altruism". I suppose you could worry about getting the best possible ratio of geniuses to emotionally unstable people, but it's also the case that competent people want to know precisely what you're talking about when you refer to something for the first time; "Effective Altruism" has some serious problems with this, as Yudkowsky made clear.
It's also important to keep in mind that the ingroup-outgroup mentality is probably the human brain's most exploited zero-day, in all of human history. Since EA will inevitably become an ingroup for some people and an outgroup for others, this fact is highly relevant.
What could EA change its name to?