I don't doubt that LLMs could do this, but has this exact thing actually been done somewhere?
The "one weird trick" to getting the right answers is to discard all stuck, fixed points. Discard all priors and posteriors. Discard all aliefs and beliefs. Discard worldview after worldview. Discard perspective. Discard unity. Discard separation. Discard conceptuality. Discard map, discard territory. Discard past, present, and future. Discard a sense of you. Discard a sense of world. Discard dichotomy and trichotomy. Discard vague senses of wishy-washy flip floppiness. Discard something vs nothing. Discard one vs all. Discard symbols, discard signs, discard waves, discard particles.
All of these things are Ignorance. Discard Ignorance.
Is this the same principle as "non-attachment"?
Write a letter addressed to Governor Newsom using the template here.
For convenience, here is the template:
September [DATE], 2024
The Honorable Gavin Newsom
Governor, State of California
State Capitol, Suite 1173
Sacramento, CA 95814
Via email: leg.unit@gov.ca.gov
Re: SB 1047 (Wiener) – Safe and Secure Innovation for Frontier Artificial Intelligence Models Act – Request for Signature
Dear Governor Newsom,
[CUSTOM LETTER BODY GOES HERE. Consider mentioning:
- Where you live (this is useful even if you don’t live in California)
- Why you care about SB 1047
- What it would mean to you if Governor Newsom signed SB 1047
SAVE THIS DOCUMENT AS A PDF AND EMAIL TO leg.unit@gov.ca.gov
]
Sincerely,
[YOUR NAME]
This matches my memory as well.
I have no idea, but I wouldn't be at all surprised if it's a mainstream position.
My thinking is that long-term memory requires long-term preservation of information, and evolution "prefers" to repurpose things rather than starting from scratch. And what do you know, there's this robust and effective infrastructure for storing and replicating information just sitting there in the middle of each neuron!
The main problem is writing new information. But apparently there's a protein, evolved from a retrotransposon (genetic elements that copy themselves via an RNA intermediate, much as retroviruses insert their RNA into a host's DNA), which is important to long-term memory!
And I've since learned of an experiment with snails which also suggests this possibility. Based on that article, it looks like this is maybe a relatively new line of thinking.
It's good news for cryonics if this is the primary way long-term memories are stored, since we "freeze" sperm and eggs all the time, and they still work.
Do you know if fluid preservation preserves the DNA of individual neurons?
(DNA is on my shortlist of candidates for where long-term memories are stored)
Consider finding a way to integrate Patreon or similar services into the LW UI then. That would go a long way towards making it feel like a more socially acceptable thing to do, I think.
Yeah, that's not what I'm suggesting. The thing I want to encourage is basically just being more reflective, on the margin, about disgust-based reactions (when they concern other people). I agree it would be bad to throw disgust out unilaterally, and that it's probably not a good idea for most people to silence or ignore it. At the same time, I think it's good to treat appeals to disgust with suspicion in moral debates, which was the main point I was trying to make. This goes especially since disgust seems to be a more "contagious" emotion, for reasons that make sense in the context of infectious diseases but usually not beyond that, making appeals to it more "dark arts-y".
As for the more object-level debate on whether disgust is important for things like epistemic hygiene, I expect that to vary from person to person, so I think we probably agree here too.
I meant wrong in the sense of universal human morality (to the extent that's a coherent thing). But yes, on an individual level your values are just your values.
Well, I'm very forgetful, and I notice that I do happen to be myself so... :p
But yeah, I've bitten this bullet too, in my case, as a way to avoid the Boltzmann brain problem. (Roughly: "you" includes lots of information generated by a lawful universe. Any specific branch has small measure, but if you aggregate over all the places where "you" exist (say your exact brain state, though the real thing that counts might be more or less broad than this), you get more substantial measure from all the simple lawful universes that only needed 10^X coincidences to make you instead of the 10^Y coincidences required for you to be a Boltzmann brain.)
I think that what anthropically "counts" is most likely somewhere between conscious experience (I've woken up as myself after anesthesia), and exact state of brain in local spacetime (I doubt thermal fluctuations or path dependence matter for being "me").