being dishonest when honesty would violate local norms doesn't necessarily feel like selling your soul. like, concretely: in most normal groups of people, it is considered a norm violation to tell someone their shirt is really ugly, even if this is definitely true. so I would only tell someone this if I were sufficiently confident that they would take it well - maybe I've known them long enough to know they like that kind of honesty, or we're in a social setting where people expect it. imo, it doesn't take galaxy brain consequentialism to arrive at this particular norm, nor does complying with it impugn one's honor.
with respect to the climate change example, it seems instructive to look at the climate people who feel an urge to be maximally doomerish because anything less would be complacent, and ask whether they are actually better at preventing climate change. I'm not very deeply embedded in such communities, so I don't have a very good sense. but I get the vibe that they are in fact less effective at their own goals: they are too prone to dismiss actual progress, lose a lot of productivity to emotional distress, are more susceptible to totalizing "david and goliath" ideological frameworks, descend into purity-spiral infighting, etc. obviously, the facts of AI are different, but this still seems like an instructive case study to look into more deeply.
to be clear, a very important part of the antechamber's culture is encouraging people to spend time in the arena - or, if they are not ready to do so, encouraging them to grow emotionally until they can handle being in the arena.
I don't hear that one as often - what's a good example? in particular, I hear people complain all the time that LW is too critical of ideas, and that when you post anything, a whole bunch of people will appear out of the woodwork to critique you. I don't think I've ever heard anyone say that people on LW are too uncritical and unwilling to challenge things they disagree with.
maybe I should host an antechamber/arena house party: one chill, cozy room with soothing music, where no arguing is allowed and people are strongly encouraged to say kind things and reflect on things they're grateful for and whatnot; and another with harsh fluorescent lights, agitating music, and a big whiteboard full of hot takes, where the conversations all get transcribed by speech-to-text and posted on lesswrong in real time. and guests are given a heart rate monitor that beeps if their HR gets too high, forcing them to spend a few minutes in the chill room before returning to the arena.
I don't think non-sycophantic kindness is quite that difficult to achieve. clearly some groups of people IRL achieve such kindness. generally, people in such communities try to understand each other and why they believe the things they do, without judgment in either direction, and affirm the emotional responses to beliefs rather than the beliefs themselves. you don't have to agree with someone to agree that you'd feel the same in their shoes. somehow, these groups don't inevitably slide into subtle sneering and trolling and sycophancy.
plus, the point of explicitly separating the arena and the antechamber is to make it clear that when you are receiving kindness, you are not receiving updates towards truth. it is clear to you, and to the people around you, that receiving emotional validation in the antechamber is not evidence that your beliefs are correct. it's valid for people to spend all their time in the antechamber, but everyone will see this, and assign correspondingly less weight to their beliefs being true.
I also don't think non-sycophantic kindness causes people to dig into their incorrect beliefs. if anything, it seems more common that people dig into incorrect beliefs out of a sense of antagonism towards others. think about how much more painful it is to concede a point when your interlocutor is being really mean about it, versus when they are thoughtful and hear you out.
while there may exist people who don't have this kind of emoting in the first place, there are far more people who are too emotionally unaware to even realize the emotional drives behind their beliefs and actions. this is often very obvious to the people around them. so I won't take anyone's word on this matter, and will instead only trust a track record of behavior.
inline ASM doesn't seem that bad. I don't write much performance-critical CPU code, but it's very common in performance-critical GPU code.
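for concreteness, here's a minimal sketch of the kind of thing I mean (an illustrative example of my own, not from anyone's codebase): a single PTX `prmt` instruction wrapped in `asm` to reverse the bytes of a word - awkward to express in plain C++, but one line of inline assembly in a CUDA kernel.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// illustrative kernel: reverse the bytes of each word using a single
// PTX prmt (byte permute) instruction via inline asm. (the __byte_perm
// intrinsic compiles to the same instruction; the point here is just
// to show what inline PTX looks like in practice.)
__global__ void byte_reverse(const unsigned *in, unsigned *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    unsigned x = in[i], r;
    // selector 0x0123 picks bytes 3,2,1,0 of x, i.e. a byte swap
    asm("prmt.b32 %0, %1, 0, 0x0123;" : "=r"(r) : "r"(x));
    out[i] = r;
}

int main() {
    unsigned h_in = 0x11223344u, h_out = 0;
    unsigned *d_in, *d_out;
    cudaMalloc(&d_in, sizeof h_in);
    cudaMalloc(&d_out, sizeof h_out);
    cudaMemcpy(d_in, &h_in, sizeof h_in, cudaMemcpyHostToDevice);
    byte_reverse<<<1, 1>>>(d_in, d_out, 1);
    cudaMemcpy(&h_out, d_out, sizeof h_out, cudaMemcpyDeviceToHost);
    printf("%08x -> %08x\n", h_in, h_out);  // expect 11223344 -> 44332211
    cudaFree(d_in);
    cudaFree(d_out);
}
```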
I claim there's a pareto frontier of epistemic correctness vs emotional kindness. some things, like sneering at people and implying that they are foolish, are pareto suboptimal. but once you're on the frontier, there is a genuine tradeoff between kindness and correctness; and what I think should exist is two distinct spaces at different points on this tradeoff curve (and of course nobody should do pareto-suboptimal things).
I mean, sure, maybe "maximally doomerish" is not exactly the right term for me to use. but there's definitely a tendency for people to worry that being insufficiently emotionally scared will make them complacent. to be clear, this is not about your epistemic p(doom); I happen to think AGI killing everyone is more likely than not. but really feeling this deeply, emotionally, is very counterproductive to my actually reducing x-risk.