No "real" universe running "top-level" simulations is actually necessary, because our observations are explained without need for those concepts.
Compat is not an explanatory theory; it's a predictive one. It's proposed as a consequence of the speed prior rather than as a competitor to it.
Compat is funneling a fraction X of the reality fluid (aka "computational resources") your universe gets from the top-level speed prior into heaven simulations. Simulating heaven requires a fraction Y of the total resources it takes to simulate normal physics for those observers. So just choose X s.t. X / Y > 1, or X > Y
This immediately becomes impossible to follow. As far as I can tell, what you're saying is:
Rah := resources applied to running heaven for Simulant
R := all resources belonging to Host
X := Rah/R
Rap := resources applied to the verbatim initial physics simulations of Simulant
Y := Rah/Rap
Rap < R
so Rah/Rap > Rah/R
so Y > X
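For concreteness, here's a toy check of that derivation (the numbers are mine, picked arbitrarily; any values with Rap < R give the same ordering):

```python
# Toy check of the definitions above, with arbitrary made-up numbers.
R = 100.0   # all resources belonging to Host
Rap = 40.0  # resources applied to the verbatim physics simulation (Rap < R)
Rah = 10.0  # resources applied to running heaven for Simulant

X = Rah / R    # fraction of Host's resources going to heaven -> 0.1
Y = Rah / Rap  # heaven's cost relative to the physics sim    -> 0.25

# Rap < R forces Rah/Rap > Rah/R, i.e. Y > X, the opposite of the quoted "X > Y".
assert Y > X
```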
Which means either you are generating a lot of confusion very quickly to come out with Y < X, or it would take far too much effort for me to noise-correct what you're saying. Try again?
If you are just generating very elaborate confusions very fast (I don't think you are, but if you are), I'm genuinely impressed with how quickly you're doing it, and I think you're cool.
I am getting the gist of a counterargument though, which may or may not be in the area of what you're angling at, but it's worth bringing up.
If we can project the Solomonoff fractal of environmental input generators onto the multiverse and find that they're the same shape, the multiversal measure of higher-complexity universes is so much lower than the measure of lower-complexity universes that it's conceivable that higher universes can't run enough simulations for P(issimulation(loweruniverse)) to break 0.5.
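To make that concrete, here's a toy model. Every modeling choice in it is my own assumption, not something the Solomonoff formalism hands us: give a universe of description-length k measure 2^-k, cap how many simulations it can run, and see whether P(issimulation) breaks 0.5.

```python
# Toy model of the measure argument above; all modeling choices are assumptions.
def p_is_simulation(k_low, k_max, sims_per):
    """Measure-weighted chance that a complexity-k_low universe is simulated.

    Each universe of complexity k gets measure 2**-k; a universe of
    complexity k can run at most sims_per(k) simulations of the k_low one.
    """
    own_measure = 2.0 ** -k_low
    simulated_measure = sum(sims_per(k) * 2.0 ** -k
                            for k in range(k_low + 1, k_max + 1))
    return simulated_measure / (own_measure + simulated_measure)

# With a constant cap of one simulation each, P sits right at the 0.5 line;
# a cap that keeps pace with 2**(k - k_low) already pushes it toward 1.
print(p_is_simulation(10, 200, sims_per=lambda k: 1))              # ~0.5
print(p_is_simulation(10, 200, sims_per=lambda k: 2 ** (k - 10)))  # ~0.995
```

So the whole question comes down to how fast that cap grows with k, which is exactly the part nobody knows how to pin down.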
There are two problems with that counterargument. First, I'm reluctant to project the Solomonoff hierarchy of input generators onto the multiverse, because it's just a heuristic, and we're likely to find better ones the moment we develop brains that can think in formalisms properly at all. Second, I'm not sure how the complexity of physical laws maps to computational capacity in general. We can guess that capacityprovidedby(laws) < capacityrequiredtosimulate(laws) (no universe can simulate itself), but that's about it. We know that the function expectedinternalcomputationalcapacity(simulation_requirements) has a positive gradient, but it could turn out to have a logarithmic curve that allows drypat (a variant of compat that requires P(simulation) to be high) to keep working.
A separate issue, which I think I've been overlooking: drypat isn't everything. Compat with quantum immortality precepts doesn't require P(simulation) to be high at all. For compat to be valuable, P(simulation) just has to be higher than P(path to deleterious quantum immortality). In that case, supernatural intervention is unlikely, but, if non-existence is not an input, finding one's inputs after death to be well predicted by compat is still very likely, because the alternative, QI, is extremely horrible.
The article reads like they're mostly trying to say how awesome their game is.
Is there any TECHNICAL difference between that and, say, a decent roguelike algorithm? I have a feeling it's scaled up rather than a technical step up.
Also, I couldn't see a GitHub link, so I'm assuming this is proprietary, and I therefore have no reason to trust whatever they say.
An example of a technical move forward would be a game world so large it must be procedurally generated, which also has two properties: it is massively multiplayer, and players can arbitrarily alter the environment.
You'd get the technical challenge of reconciling player-made alterations to the environment with the "untouched" version of the environment according to your generative algorithm. Then you'd get the additional challenge of sharing those changes across lots of different players in real time.
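For what it's worth, here's a minimal sketch of one standard way the first challenge gets handled (my illustration, nothing to do with this particular game): regenerate untouched terrain deterministically from the world seed, and store only player edits as a sparse overlay.

```python
# Minimal sketch: deterministic generation plus a sparse edit overlay.
# Everything here (names, tile types) is illustrative, not from the article.
import hashlib

WORLD_SEED = b"example-seed"
edits = {}  # (x, y) -> tile a player placed; the only state that must be synced

def generated_tile(x, y):
    # Same seed + coordinates always yield the same tile, so the "untouched"
    # world never needs to be stored, only recomputed on demand.
    h = hashlib.sha256(WORLD_SEED + f"{x},{y}".encode()).digest()
    return "rock" if h[0] < 128 else "grass"

def tile(x, y):
    # Player edits shadow the generator; everything else falls through.
    return edits.get((x, y), generated_tile(x, y))

edits[(3, 4)] = "tunnel"       # a player digs here
assert tile(3, 4) == "tunnel"  # the edit wins over the generated terrain
assert tile(9, 9) in ("rock", "grass")  # untouched tiles are recomputed, never stored
```

The massively-multiplayer half is then broadcasting deltas of that edit overlay to every player in range in real time, which is where it stops being a solved problem.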
I don't get the sense that either of the two properties (massively multiplayer and alterable environment) are a big part of this game.
If a game with all three properties (procedural generation of a large universe, massively multiplayer, and alterable environment) were to be made, it'd make me take a harder look at simulation arguments.