What you're describing is not really different in principle from using specialized hardware like GPUs for rendering polygons instead of running everything on the same general-purpose CPUs. There are ASICs for hashing (used for Bitcoin mining), FPGAs (real-time signal processing, I think), and of course, TPUs for AI inference. And with cloud computing, would you even know if your computation was actually being done with different physics than you thought?
> If we’re doing our utmost to avoid creating them, then the likelihood of having to destroy one is minimal.
This is an unwarranted assumption about the effectiveness of your preventative policies. It's perfectly plausible that your only enforcement capability is after-the-fact destruction.
No: if one does not "approve of destroying self-aware AIs," the incentives you would create are, first, to try to stop them from being created, yes; but once they are created (or once their creation seems inevitable), to stop you from destroying them.
If you like slavery analogies, what you're proposing is the equivalent of a policy that, to ensure there are no slaves in the country, any slaves found within the borders be immediately gassed/thrown into a shredder. Do you believe the only reason any self-proclaimed abolitionists would oppose this policy is that they secretly wanted slavery after all?
Sure, this is a mainstream enough observation that I'm gonna point you to Ezra Klein and Derek Thompson's new book, Abundance. My ideal solution involves much less government action beyond getting rid of the imposed rules, and would go significantly further in deregulation (delenda FDA, for example), but I agree with most of the problems he lists.
> These regulations, and thousands like them, were never intended to be implemented.
In America, the equivalent stupid rules actually are enforced, which makes them much worse: they would be less harmful if the institutions were as dysfunctional as those of post-Soviet Eastern Europe in your post. Drastically increasing levels of corruption[1], and making it a cultural norm, would ameliorate this, and might be almost as effective as abolishing the agencies that make the regulations, while being less easily reversed.
Which I like to call "the people's deregulation."
> Pleasure and pain of living beings matter because they are a proxy for that being's evolutionary fitness.
This just moves the question sideways. Why should I care about an unrelated[1] organism's evolutionary fitness?
Also, what does this "evolutionary ethics" framework say about enslaving and raping the women of unrelated[1] tribes? Traditionally that wouldn't decrease their reproductive chances (except in the maladaptive case of using contraceptives, of course).
Or so distantly related that one can reasonably approximate it to be so, as you yourself do with leopards later in the thread.
Not quite what you asked, but I found this video visualizing Lagrange multipliers quite helpful. Plausibly it'll help clarify Euler–Lagrange as well.
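To make the multiplier condition concrete, here's a toy worked example (my own illustration, not from the video): maximize f(x, y) = xy subject to g(x, y) = x + y − 1 = 0. Stationarity ∇f = λ∇g gives y = λ and x = λ, so x = y = 1/2 with λ = 1/2.

```python
# Toy Lagrange-multiplier check: maximize f(x, y) = x*y
# subject to g(x, y) = x + y - 1 = 0.
# Solving grad f = λ * grad g by hand gives x = y = λ = 1/2.

def grad_f(x, y):
    return (y, x)          # (∂f/∂x, ∂f/∂y)

def grad_g(x, y):
    return (1.0, 1.0)      # gradient of the constraint

x = y = 0.5
lam = 0.5

# Stationarity: each component of grad f equals λ times that of grad g.
assert all(abs(fi - lam * gi) < 1e-12
           for fi, gi in zip(grad_f(x, y), grad_g(x, y)))

# Feasibility: the constraint is satisfied.
assert abs((x + y) - 1.0) < 1e-12

# Sanity check: nearby feasible points (x+eps, y-eps) score strictly lower.
for eps in (0.1, 0.01):
    assert (x + eps) * (y - eps) < x * y
```

The point of the check at the end is that along the constraint surface, the candidate really is a local maximum, which is what the multiplier condition is certifying.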
Your units question is easy: you get the same dimensionless quantities whatever units you choose. Instead of thinking of units as dimensions, I'd think of them as basis vectors in a five-dimensional space (length, mass, time, current, temperature[1]): you should have exactly five of them, and they need to be "linearly" independent, but beyond that, you can choose any set you like: you could instead have something like the natural units of (speed, gravitationality[2], angular momentum, entropy, charge). In this conception, this 5D space is fundamental: you need at least five dimensions (the ones spanned by c, G, h, k and e), and if you want more, you need to find some new as-yet-unknown independent dimension (maybe baryon/lepton number counts?).
Of the seven SI base units, the other two are the candela, which is some "human visual perception" bullshit masquerading as fundamental, and the mole (amount of substance/Avogadro's constant), which I don't think meaningfully constitutes a "dimension."
You know what I mean: something with the units of G. If you don't like this, just choose mass instead.
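The "basis vectors" picture above can be sketched directly (my own illustration; the exponent vectors and the `det` helper are mine, not anything from the thread): write each quantity's dimensions as an integer exponent vector over (length, mass, time, current, temperature), and the natural-units set (c, G, h, k_B, e) is a valid basis exactly when those five vectors are linearly independent, i.e. their exponent matrix has nonzero determinant.

```python
# Each quantity's dimensions as exponents over the five base dimensions
# (L, M, T, I, Θ). A candidate unit system is a valid basis iff these
# vectors are linearly independent, i.e. the exponent matrix has
# nonzero determinant.
NATURAL = {
    "c":   [1,  0, -1, 0,  0],   # speed: L T^-1
    "G":   [3, -1, -2, 0,  0],   # L^3 M^-1 T^-2
    "h":   [2,  1, -1, 0,  0],   # action: M L^2 T^-1
    "k_B": [2,  1, -2, 0, -1],   # M L^2 T^-2 Θ^-1
    "e":   [0,  0,  1, 1,  0],   # charge: I T
}

def det(m):
    """Integer determinant by Laplace expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    total = 0
    for j, entry in enumerate(m[0]):
        if entry == 0:
            continue
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * entry * det(minor)
    return total

d = det(list(NATURAL.values()))
assert d != 0, "not a valid basis: the exponent vectors are linearly dependent"
```

Swapping in a dependent set (say, replacing e with c, or any quantity whose exponents are a combination of the others) makes the determinant vanish, which is exactly the failure mode of a bad choice of "fundamental" units.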
I believe this was just a call to PimEyes.