Comments

I think emotional nihilism is more like a utility function that's locally constant at zero. You have emotional investments, but they're options that are too far out of the money. (Worse is when your short puts and calls are at the money and your longs are out of it.)

Makes me wish initiatives like U of Ottawa's JI-R would turn into real journals that could reliably publish at least quarterly.

This is also very unlike the outcome of donating to a charity, which I can believe is approximately log-normal.

This can't be right, because a log-normal variable is never negative, and charitable interventions do backfire a non-negligible percentage of the time (e.g. Scared Straight, or any health-care program that promotes quackery over real treatment).
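The non-negativity point can be checked directly: a log-normal draw is the exponential of a normal draw, so it is strictly positive by construction. A minimal sketch (using NumPy's `lognormal` for illustration; the parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# A log-normal variable is exp(X) for X ~ Normal(mean, sigma),
# so every sample is strictly positive by construction.
samples = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)
print(samples.min() > 0)  # True: no sample is ever negative
```

Modeling interventions that can backfire therefore needs a distribution with support on the negatives, e.g. a mixture of a log-normal "success" outcome and a negative "harm" outcome.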

That raises an interesting question: is it possible to base a moral code only on what's true in all possible worlds that contain me?