Does anybody know if dark matter could be explained as artificial systems built from known matter? It fits the description of a stealth civilization well, if there is no way to nullify gravitational interaction (which seems plausible). It would also explain why there is so much dark matter - most of the universe's mass was already used up by alien civs.
Overscrupulous chemistry major here. Both Harry and Snape are wrong. By the Pauli exclusion principle, an orbital can host only two electrons. But at the same time, there is no outermost orbital - valence shells are only an oversimplified description of the atom. Actually, so oversimplified that no one should bother writing it down. Speaking of the HOMOs (highest [in energy] occupied molecular orbitals) of a carbon atom, each hosts only one electron.
My problem with such examples is that they seem more like Dark Arts emotional manipulation than actual argument. What your mind hears is that if you don't believe in God, people will come to your house and kill your family - and if you believed in God, they wouldn't do that, because they'd somehow fear God. I don't see how this is anything but an emotional trick.
I understand that sometimes you need to cut the nuance out of morality thought experiments, like equating taxes to the threat of being kidnapped if you don't regularly pay a racket. But the opposite tactic is creating lurid, graphic visions. Watching a loved one be raped is not as bad as losing a loved one - but it creates a much stronger psychological effect, targeted at emotional blackmail.
Can anybody point me to what the choice of interpretation changes? From what I understand, it is just an interpretation, so Copenhagen and MWI make the same predictions and falsification isn't possible. But for some reason MWI seems to be highly esteemed on LW - why?
A small observation of mine. While watching out for the sunk cost fallacy, it's easy to go too far and assume that repeating the same purchase is the rational thing. Imagine you bought a TV and on the way home you dropped it, destroying it beyond repair. Should you just go buy the same TV, since the cost is sunk? Not necessarily - when you were buying the TV the first time, you were richer by the price of the TV. Since you are now poorer, spending that much money might not be optimal for you.
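The point can be sketched numerically. This is a toy model, assuming log utility of wealth (a standard but arbitrary choice); the wealth, price, and utility numbers are all hypothetical:

```python
import math

def log_utility(wealth):
    """Log utility: each extra dollar matters less the richer you are."""
    return math.log(wealth)

def worth_buying(wealth, price, tv_value_in_utils):
    """Buy the TV iff its utility exceeds the utility cost of the money spent."""
    return tv_value_in_utils > log_utility(wealth) - log_utility(wealth - price)

tv_value = 0.35  # hypothetical utility the TV gives you

print(worth_buying(2_000, 500, tv_value))  # True: before the accident, buying was worth it
print(worth_buying(1_500, 500, tv_value))  # False: now poorer, the same price costs more utility
```

Same TV, same price, but the marginal utility of money has risen for the poorer you - so declining to rebuy is not the sunk cost fallacy.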
First, I wouldn't call it a solution, since the original "you" will not get transported, and the em-you will suffer 1000 times unnecessarily. Second, consider reading about the various anthropic arguments, such as SSA, SIA, and Sleeping Beauty, if you are so inclined.
Big thanks for pointing me to Sleeping Beauty.
It is a solution to me - it doesn't feel like suffering, just as a few minutes of teasing before sex doesn't feel that way.
Let's consider a similar problem. Suppose you've just discovered that you've got cancer. You decide to buy a pill that would erase your memory of the diagnosis. From your new perspective, your estimated chance of having cancer will be back to 2%. In this situation, you can change your knowledge about your odds of having cancer, but you can't change whether you actually have cancer, or reduce the probability that someone with your symptoms has cancer.
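The map/territory split here can be made explicit in a few lines. A minimal sketch, reusing the 2% base rate from above; everything else is illustrative:

```python
# Toy model: erasing the memory of a diagnosis resets your credence
# to the base rate, but leaves the actual state of the world untouched.

actually_has_cancer = True          # fixed fact about the world
base_rate = 0.02                    # P(cancer) for someone with your symptoms

credence_before_pill = 1.0          # you have just seen the diagnosis
credence_after_pill = base_rate     # memory erased: back to the prior

print(credence_after_pill)          # 0.02 -- your new subjective probability
print(actually_has_cancer)          # True -- unchanged by the pill
```

The pill operates only on the first variable; no operation on your credences touches the second.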
What seems to make this situation paradoxical is that when you make a decision, it seems to change the probability that you had before you made the decision. This isn't quite what happens. If you accept determinism, you were always going to make that decision. If you had known that you were going to make this decision before you actually made it, then your probability estimate wouldn't have changed when you made the decision. The reason why the probability changes is that you have gained an additional piece of information, that you are the kind of person to make a vow to simulate yourself. You might have assigned a smaller probability to the chance that you were such a person before you decided to actually do it.
What I had in mind isn't a matter of manually changing your beliefs, but rather making an accurate prediction of whether or not you are in a simulated world (one about to diverge from the "real" world), based on your knowledge of the existence of such simulations. It could just as well be that you asked a friend to simulate 1000 copies of you at that moment, having him teleport you to Hawaii as 11 AM strikes.
What's the probability that you'd be in a tropical paradise in one minute?
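Under the self-sampling-style assumption that each of the 1000 simulated copies and the one original count equally as "you" (an assumption the rest of the thread disputes, not a settled premise), the arithmetic is:

```python
simulated_copies = 1000   # copies your friend promises to run, waking up in Hawaii
originals = 1             # the one instance still sitting in the room

# If every instance counts equally as "you", the chance that this
# instance is one of the soon-to-be-teleported copies:
p_beach = simulated_copies / (simulated_copies + originals)
print(round(p_beach, 4))  # 0.999
```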
Depends on whether you consider the simulated people "you" or not. It also depends on whether "in one minute" means by our world's clock, or by clocks inside the simulation - which may run in the future by our world's clock, but whose internal clocks read one minute from now.
By "me" I mean this particular instance of me, which feels that it sits in a room and which is making this promise - and which might of course be a simulated mind.
Now that I think about it, it seems to be a problem of having a cohesive definition of identity and of the notion of "now".
Two-fluid model of anthropics. The two different fluids are "probability" and "anthropic measure." Probabilities come from your information, and thus you can manipulate your probability by manipulating your information (e.g. by knowing you'll make more copies of yourself on the beach). Anthropic measure (magic reality fluid) measures what reality is - it's how an outside observer would see things. Anthropic measure is more properly possessed by states of the universe than by individual instances of you.
Thus a paradox. Even though you can make yourself expect (probability) to see a beach soon, it doesn't change the fact that you actually still have to sit through the cold (anthropic measure). Promising to copy yourself later doesn't actually change how much magic reality fluid the you sitting there in the cold has, so it doesn't "really" do anything.
Anthropic measure (magic reality fluid) measures what the reality is - it's like how an outside observer would see things. Anthropic measure is more properly possessed by states of the universe than by individual instances of you.
It doesn't look like a helpful notion and seems very tautological. How do I observe this anthropic measure - how can I make any guesses about what the outside observer would see?
Even though you can make yourself expect (probability) to see a beach soon, it doesn't change the fact that you actually still have to sit through the cold (anthropic measure).
Continuing - how do I know I'd still have to sit through the cold? Maybe I am in my simulated past - in the hypothetical scenario, that's a very down-to-earth assumption.
Sorry, but the above doesn't clarify anything for me. I may accept that the concept of probability is out of scope here - that Bayesianism doesn't work for guessing whether one is or isn't in a certain simulation - but I don't know if that's what you meant.
A huge swarm/sphere of solar collectors uses up precious materials (silicon, etc.) that are far more valuable in ultimately compact reversible computers - which don't need much energy to sustain anyway.
You seem to be bottom-lining. Earlier you gave cold reversible-computing civs a reasonable probability (with some doubt); now you seem to treat it as an almost certain scenario for civ development.