Well, yes, it's true, and obviously those things do not necessarily all have genuine infinite value. I think what this really means in practice is not that all non-fungible things have infinite value, but that because they are non-fungible, most judgements involving them are not as easy or straightforward as simple numerical comparisons. Preferences end up being expressed anyway, but just because practical needs force a square peg into a round hole doesn't make it fit any better. I think this manifests in practice as high rates of hesitation or regret for decisions involving such things, and in the general difficulty of really squaring decisions like these. We can agree in one sense that several trillion dollars in charity is a much greater good than someone not having their fingers cut off, and yet we generally wouldn't call that person "evil" for picking the latter option, because we understand perfectly how, to someone, their own fingers might feel more valuable. If we were talking about fungible goods we'd feel very differently. Replace cutting off one's fingers with, e.g., demolishing their house.
Probabilities for physical processes are encoded in quantum wavefunctions one way or another, so I'd put that under the umbrella of "winning a staring contest with the laws of physics", which was basically what the average Spiral Energy user did.
And then again, while optimistic, the series still does show Simon using his power responsibly and essentially renouncing it to avoid causing the Spiral Nemesis. He doesn't just keep growing everything exponentially and decide nothing bad can ever possibly come out of it.
I think the core message of optimism is a positive one, but of course IRL we have to deal with a world whose physical laws do not in fact seem to bend endlessly under sufficient application of MANLY WARRIOR SPIRIT, and thus that forces us to be occasionally Rossiu even when we'd want to be Simon. Memeing ourselves into believing otherwise doesn't really make it true.
People often say that wars are foolish, and that both sides would be better off if they didn't fight. And this is standardly called "naive" by those engaging in realpolitik. Sadly, for any particular war, there's a significant chance the realists are right. Even aside from human stupidity, game theory is not so kind as to allow for peace unending.
Obviously, I'm not saying that ALL conflict ever is avoidable or irrational, but a lot of it is:
And I'd say that just about makes up a good 90% of all conflicts. There's a thing where people who are embedded in specialised domains start seeing the trees ("here is the complex clockwork of cause-and-effect that made this thing happen") and missing the forest ("if we weren't dumb and irrational as fuck none of this would have happened in the first place"). The main point of studying past conflicts should be to distil, here and there, a bit of wisdom about how, in fact, a lot of that stuff is entirely avoidable if people can just stop being absolute idiots now and then.
I think there's one fundamental problem here IMO, which is that not everything is fungible, and thus not everything manages to actually comfortably exist on the same axis of values. Fingers are not fungible. At the current state of technology, once severed, they're gone. In some sense, you could say, that's a limited loss. But for you, as a human being, it may as well be infinite. You just lost something you'll never ever have back. All the trillions and quadrillions of dollars in the world wouldn't be enough to buy it back if you regretted your choice. And thus, while in some sense its value must be limited (it's just the fingers of one single human being after all, no? How many of those get lost every day simply because it would have been a bit more expensive to equip the workshop with a circular saw that has a proper safety stop?), in some other sense the value of your fingers to you is infinite, completely beyond money.
Bit of an aside - but I think this is part of what causes such a visceral reaction in some people to the idea of sex reassignment surgery, which then feeds into transphobic rationalizations and ideologies. The concept of genuinely wanting to get rid of a part of your body that you can't possibly get back feels so fundamentally wrong on some level to many people that it pretty much single-handedly seals the deal for them: you must either be insane or have been manipulated by some kind of evil outside force.
It's also not really a movie as much as a live recording of a stage play. But agree it's fantastic (honestly, I'd be comfortable calling it Aladdin rational fanfiction).
Also a little silly detail I love about it in hindsight:
During the big titular musical number, all big Disney villains show on stage to make a case for themselves and why what they wanted was right - though some of their cases were quite stretched. Even amidst this collection of selfish entitled people, when Cruella De Vil shows up to say "I only wanted a coat made of puppies!" she elicits disgust and gets kicked out by her fellow villains, having crossed a line. Then later on Disney thought it was a good idea to unironically give her the Wicked treatment in "Cruella".
It must be noted that all that subtext is entirely the product of the movie adaptation. The short story absolutely leaves no room for doubt, and in fact concludes on a punchline that rests on that.
This muddies the alienness of AI representation quite a bit.
I don't think that's necessarily it. For example, suppose we build some kind of potentially dangerous AGI. We're pretty much guaranteed to put some safety measures in place to keep it under control. Suppose these measures are insufficient and the AGI manages to deceive its way out of the box - and we somehow still live to tell the tale and ask ourselves what went wrong. "You treated the AGI with mistrust, therefore it similarly behaved in a hostile manner" is guaranteed to be one of the interpretations that pop up (you already see some of this logic in people equating alignment to wanting to enslave AIs, and claiming it is thus more likely to make them willing to rebel). And if you did succeed in making a truly "human" AI (not outside the realm of possibility if you're training it on human content/behaviour to begin with), that would be a possible explanation - after all, it's very much what a human would do. So is the AI so human that it reacted to attempts to control it as a human would - or so inhuman that it merely backstabbed us without the least hesitation? That ambiguity exists with Ava, but I also feel like it would exist in any comparable IRL situation.
Anyway "I am Mother" sounds really interesting, I need to check it out.
Only tangentially related, but one very little-known movie that I enjoyed is the Korean sci-fi "Jung_E". It's not about "alien" AGI but rather about human brain uploads used as AGI. It's quite depressing, along the lines of that qntm story you may have read on the same topic, but it felt like a pretty thoughtful representation of a concept that rarely makes it into mainstream cinema.
Curious - what other AI depictions are you considering/comparing to? I'm not 100% sure about what my best would be, I find good bits and pieces here and there in several movies (Ex Machina, 2001: A Space Odyssey, even the very cheesy but surprisingly not entirely unserious M3gan) but maybe not a single organic example I'd place above the rest.
I think it's a very visible example that right now is particularly often brought up. I'm not saying it's all there is to it, but I think the fundamental visceral reaction to the very idea of self-mutilation is an important and often overlooked element of why some people would be put off by the concept. I actually think it makes the whole thing a lot more understandable, in terms of where it comes from, than the generic "well, they're just bigoted and evil" stuff people come up with in extremely partisan arguments on the topic. These sorts of psychological processes - the fact that we may first have a gut-level reaction, and only later rationalize it by constructing an ideological framework to justify why the thing that repulses us is evil - are very well documented, and happen all over the place. That doesn't mean everyone who disagrees with me does so because of it (nor that everyone who agrees doesn't do it!), but it would be foolish to just pretend this never happens because it sounds a bit offensive to bring up in a debate. The entire concept of rationality is based around the awareness that, yeah, we're constantly affected by cognitive biases like these, and separating the wheat from the chaff is hard work.
And by the way, it's an excellent example of the reverse too. Just like people who are not dysphoric are put off by mutilation, people who are dysphoric are put off by the feeling of having something grafted onto their bodies that doesn't belong. Which is sort of the flip side of it. Essentially, we tend to have a mental image of our bodies and a strong aversion to that shape being altered or disturbed in some way (which makes all kinds of sense evolutionarily, really). Ironically enough, it's probably via the mechanism of empathy that someone can see someone else do something to their body that feels "wrong" and cringe or be grossed out on their behalf (if you think trans issues are controversial, consider the reactions some people can have even to things like piercings in particularly sensitive places).