daig

A voice for the unknown human who hides within the binary.

Comments
daig

Thanks for making this map 🙏


I expect this is a rare moment of clarity because maintaining updates takes a lot of effort and is now subject to optimization pressure.

Also imo most of the "good" alignment work in terms of eventual impact is being done outside the alignment label (eg as differential geometry or control theory) and will be merged in later once the connection is recognized. Probably this will continue to become more true over time.


daig

I’d like to offer some data points without much justification; I hope they might spur some thought/discussion without needing to be taken on faith:

  • I’m not a “vassarite”
  • I’ve never met Vassar
  • I’ve independently come to many of the same observations that are typically attributed to Vassar’s crazy-making, before encountering the rationality community, eg around an “infinitely corrupt” and fundamentally coercive/conformist world. I’m not sure what to make of this yet, but at the least it makes insistence that we all just decide to talk things over rationally a bit tiring. I think this realization interacts poorly with the rationalist starting point of “save the world at all costs” and clear-cut views of cooperation vs defection, which contributed to a lot of the breakdowns around the Vassar folks.
  • My impression is that he’s very angry at the world, has a strong sense of personal importance and agency, and a disregard for others’ psychological well-being. I suspect he relishes “breaking” those unprepared to integrate his personal truth, because at least then they can’t be used in service of the “infinitely corrupt”.
  • I think the way he was expelled from the community relied on some pretty underhanded social gaming, denied the agency of those compelled by him, and completely missed the way in which he was actually harmful, such that the harm now proliferates in other forms hidden from discussion.
  • I have met Jessica and found her one of the clearest thinkers in the rationalist extended community by a large margin (not to say she is the only one; there are others more prominent).
  • I’ve never had a psychotic break or other instability, and yet
  • Her account of psychosis is the most lucid I’ve encountered, and her current epistemic integration of it seems really sound.
  • I think psychosis is quite complicated, and represents a willingness to leave consensus territory more than a magnitude of absolute wrongness (although it is obviously characterized by non-reality-tracking beliefs). In particular, I see many commonly accepted beliefs in the world (some among the rationalists) that are both more wrong and more harmful than most of what psychotics come up with. I think the impact and source of the wrong beliefs should be considered, rather than treating psychosis as a magical departure from agency.
  • I’ve seen prominent members of the rationalist community consistently employ the same tactics of attentional misdirection, disowned power asymmetry, and commitment escalation that they accuse their villains of: Vassar, Geoff, Brent, Ziz, etc.
  • Jessica seems committed to honest truth-seeking detached from political agenda in a way I really appreciate, and to a degree I haven’t seen elsewhere at similar levels in the community.
  • I’ve noticed a semi-systematic pattern of “very sane” people goading people in an unstable space of uncharted territory into being more crazy than they would be otherwise: dismissing their concerns in ways that are obviously erroneous, preemptively treating them as dangerous, etc., driving them further from consensus rather than helping them reintegrate.
  • The less “weird” an organization, and the more members rely on its credibility, the more costly it is to speak against it, so criticism tends to come only from those who can hold their own or who have nothing left to lose (frequently from a damaged psychological state). Leverage is easier to call out than MIRI/CFAR, and there are still more upstanding factions in the same boat. Onlookers might do well to take this availability bias into account when sizing up the accounts on each side.
  • The tightly knit rationalist power hubs have a near monopoly on “weird x-risk mitigation” funding and intellectual capital, so people who take x-risk deathly seriously may see no option other than submitting to the official narrative for fear of getting blacklisted. EDIT: I would also like to add that I’m generally optimistic about the individuals involved with CFAR/MIRI, especially in light of the discussion here, and despite having had similar experiences with them as Jessica in the past. I do worry that nitpicking the specifics of “who’s worse” or hunting “bad actors” detracts from understanding the root dynamics that lead to so many dramatic incidents in the extended community.
daig

Here is my take:
Value is a function of the entire state space, and can't be neatly decomposed as a sum of subgames.
Rather (dually), value on ("quotient") subgames must be confluent with the total value on the joint game.
Eg, there's an "enjoying the restaurant food" game, and a "making your spouse happy" game, but the joint "enjoying a restaurant with your spouse" game has more moves available, and more value terms that don't show up in either subgame, like "be a committed couple".
"Confluence" here means that forgetting whatever you need to forget to zoom in on the "enjoying the restaurant food" subgame makes your value judgements of "enjoying the restaurant food" and "enjoying a restaurant with your spouse, ignoring everything except food" agree.
The individual subgames aren't "closed", they were never closed, their value only makes sense in a larger context, because the primitives used to define that value refer to the larger context. From the perspective of the larger game, no value is "destroyed", it only appears that way when projecting into the subgames, which were only ever virtual.
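A minimal toy sketch of the point above, under assumed numbers and names (`food_value`, `spouse_value`, `joint_value`, and the commitment bonus are all hypothetical, invented purely for illustration): the joint game's value carries a term that neither subgame can see, so the total is not a sum of subgame values, yet projecting the joint game down to "food only" still agrees with the food subgame's own judgement.

```python
# Hypothetical subgame values, judged only on their own primitives.
def food_value(food_quality):
    # "enjoying the restaurant food" subgame
    return food_quality

def spouse_value(spouse_mood):
    # "making your spouse happy" subgame
    return spouse_mood

def joint_value(food_quality, spouse_mood, committed):
    # The joint game has an extra value term ("be a committed couple")
    # whose primitives don't exist in either subgame.
    commitment_bonus = 2 if committed else 0
    return food_value(food_quality) + spouse_value(spouse_mood) + commitment_bonus

state = dict(food_quality=3, spouse_mood=4, committed=True)

# Decomposition fails: the context term is invisible to the subgames.
assert joint_value(**state) != food_value(3) + spouse_value(4)

def joint_value_food_only(food_quality, spouse_mood, committed):
    # Project the joint game by forgetting every primitive the food
    # subgame forgets (spouse_mood, committed).
    return food_value(food_quality)

# Confluence: the projected joint judgement and the subgame judgement agree.
assert joint_value_food_only(**state) == food_value(state["food_quality"])
```

Nothing is "destroyed" here: the commitment term is simply not representable inside either subgame, which is what the projection makes visible.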