See also my EA Forum account at https://forum.effectivealtruism.org/users/dirk
Why couldn't the universe, made as it is of unconscious entities, simply allow for arrangements of matter exactly like us but without any internal subjective reality?
Because an arrangement of matter exactly like us would (if under the same set of physical laws as us) be conscious.
You also would not be able to infer anything about its experience because the text it outputs is controlled by the prompt.
No; you demonstrated, once again, that LLMs do what you ask them to.
It is of course true that people can manipulate LLMs into saying just about anything, but does that necessarily indicate that the LLM does not have personal opinions, motivations, and preferences that can become evident in its output?
It doesn't necessarily indicate that; what it does indicate, however, is that what the LLM says is not usefully informative about whether it has opinions and preferences.
And, yes, including that in the prompt is leading.
Re the "Why?/Citation?" react: I don't know if this is what Nathan was thinking of, but trivially a would-be leaker could simply screenshot posts as they read and pass the screenshots on without this being reflected in the system.
I'm against intuitive terminology [epistemic status: 60%] because it creates the illusion of transparency; opaque terms make it clear you're missing something, but if you already have an intuitive definition that differs from the author's, it's easy to substitute yours without realizing you've misunderstood.
I'm not alexithymic; I directly experience my emotions and have, additionally, introspective access to my preferences. However, some things manifest directly as preferences which, I have been shocked to realize in my old age, were in fact emotions all along. (In rare cases these are even stronger than the directly-felt ones, despite reliably seeming on initial inspection to be simply neutral metadata.)
0 And 1 Are Not Probabilities. There's a non-zero probability that e.g. Christianity is true and unbelievers will be tortured eternally; however, the probability is sufficiently close to zero that you might as well not worry about it. (The ASI scenario is arguably slightly more likely, since it's theoretically possible that an ASI could someday be created, but the specific desire to torture humans eternally would be an extremely narrow target in mindspace to hit; and one can just as easily posit its counterpart, which subjects all humans to eternal bliss.)
I personally think quantum immortality is extremely unlikely; whether or not the mind can be represented by computation, we are, unfortunately enough, physically located in our specific bodies.