If it's worth saying, but not worth its own post, then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should start on Monday and end on Sunday.
4. Unflag the two options "Notify me of new top level comments on this article" and "
If there is One Weird Trick that you should be using right now to game your way around anthropics, simulationism, or deontology, you don't know what that trick is, you won't figure out what that trick is, and it's somewhat likely that you can't figure out what that trick is, because if you did you would get hammered down by the acausal math/simulators/gods.
You also can't know if you're in a simulation, a big quantum world, a big cosmological world, or a reincarnation. Or one or more of those at the same time. And each of those realities would imply a different thing that you should be doing to optimize your... whatever it is you should be optimizing. Which you also don't know.
So really I just go with my gut and try to generally make decisions that I probably won't think are stupid later given my current state of knowledge.
But you can make estimates of the probabilities (EY's estimate of the big quantum world part, for example, is very close to 1).
That just sounds pretty difficult, as my estimate of whether a decision is stupid or not may depend hugely on the assumptions I make about the world. In some...