I operate by Crocker's rules.
I try not to make people regret telling me things. So in particular:
- I expect it to be safe to ask me whether your post would give AI labs dangerous ideas.
- If you worry that I'll produce such posts, I'll try to keep your worry from making them more likely, even if I disagree with it. Not thinking in that direction will be easier if you don't spell the idea out in the initial contact.
What don't LLMs linearly represent?
Sure, just put everyone in stasis until the batteries are refilled.
The mass of a black hole is proportional to its radius, not to its volume as with rocks. So you can't make an arbitrarily large two-dimensional mesh of black holes, only a one-dimensional one.
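A back-of-the-envelope check (the notation is mine, not the original's): put equal black holes of mass $m$ and Schwarzschild radius $r_s = 2Gm/c^2$ on a $D$-dimensional grid with spacing $d$ and $N$ holes per side, so the mesh spans $L \approx Nd$ and holds $N^D$ holes in total. The mesh stays outside its own combined horizon only while

$$R_s = \frac{2G \cdot N^D m}{c^2} = N^D r_s \lesssim L \approx Nd \iff N^{D-1} \lesssim \frac{d}{r_s}.$$

For $D = 2$ this caps the mesh at $N \lesssim d/r_s$ holes per side; for $D = 1$ the condition doesn't involve $N$ at all, so any length of chain works as long as $d \gtrsim r_s$.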
To capture dark energy, a.k.a. the expansion of the universe: take some masses and let them expand apart; there's your potential energy. If you ever capture all of it, you can go collect the matter that has stopped expanding away from you.
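A rough Newtonian sketch of how much work that yields (notation mine; matter and radiation ignored): a cosmological constant $\Lambda$ pushes a test mass $m$ at distance $r$ outward with acceleration $g = (\Lambda c^2/3)\,r$, so paying out a tether from $r_1$ to $r_2$ harvests

$$W = \int_{r_1}^{r_2} m\,\frac{\Lambda c^2}{3}\,r\;dr = \frac{m \Lambda c^2}{6}\left(r_2^2 - r_1^2\right).$$

Extracting that work is exactly what keeps the mass from accelerating away, which is why you can go collect it afterwards.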
Finally, there is a bit of an antistrategy (exemplified by Athens, Riyadh, and Istanbul), which simply doesn’t work. People punish cooperators at high rates and thereby harm themselves and everyone else. Why would anyone think that’s a good idea?
Good question! I am confused that they didn't ask the participants. This could have just been a translation effect. When the control questions show the participants don't understand the setup, they get to try again until they succeed... I would stare at the raw data to come up with a hypothesis, but I don't see it anywhere... I guess I'll send an author an email.
My purpose for the first is to keep a raider from stealing his entry fee too.
My purpose for the second is to effectively require consensus for any motion to pass, while leaving a way out of deadlock that is neutral in expectation.
You could have the proceeds from new governance tokens go to the governors instead of the treasury. You could make any motion that doesn't reach consensus split the DAO: if p% were in favor, the company has a p% chance of being split up among those in favor and a (100-p)% chance of being split up among those opposed.
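A minimal Python sketch of that split rule (the function and names are mine, not any real DAO framework). It's neutral in expectation because a voter holding stake $s$ on a side with stake fraction $q$ gets an $s/q$ share with probability $q$, which averages to $s$ however they vote:

```python
import random

def resolve_motion(stakes_for: dict, stakes_against: dict) -> dict:
    """Resolve a motion that failed to reach consensus by splitting the DAO.

    With probability p (the stake-weighted fraction in favor), the whole
    company is divided pro rata among the 'for' voters; otherwise it is
    divided among the 'against' voters. Returns each winner's fraction.
    """
    total_for = sum(stakes_for.values())
    total_against = sum(stakes_against.values())
    p = total_for / (total_for + total_against)

    # Pick the winning side with probability proportional to its stake.
    winners = stakes_for if random.random() < p else stakes_against

    # Split the entire company pro rata within the winning side.
    side_total = sum(winners.values())
    return {voter: stake / side_total for voter, stake in winners.items()}

# Example: a voter with 30% of the stake expects a 30% share either way.
# In favor: share 0.3/p with probability p. Opposed: 0.3/(1-p) with
# probability 1-p. Both average to 0.3, so voting carries no expected cost.
```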
You mean, are 100 founders enough that 60% cannot coordinate a raid? You'll have trouble telling whether 60 of the founders are a rich guy in sixty trenchcoats.
I infer they didn't get "The most forbidden technique". Try again with e.g. "Never train an AI to hide its thoughts"?