When Will talks about hell, or anything that sounds like a religious concept, you should suppose that in his mind it also has a computational-transhumanist meaning. I hear that in Catholicism, Hell is separation from God; for Will, God might be something like the universal moral attractor for all post-singularity intelligences in the multiverse. So he may be saying (in the great-grandparent comment) that if you are insufficiently attentive to the question of right and wrong, your personal algorithm may never be re-instantiated in a world remade by friendly AI. To round out this guide for the perplexed: one should not think that Will is merely employing traditional language to express a very new concept. Rather, one needs to entertain the idea that there really is significant referential overlap between what he's talking about and what people like Aquinas were talking about - that all that medieval talk about essences, and essences of essences, and all this contemporary talk about programs, and equivalence classes of programs, might actually be referring to the same thing. One could also say something about how Will feels when he writes like this - I'd say it sometimes comes from an advanced state of whimsical despair at ever being understood - but the idea that his religiosity is a double reverse metaphor for computational eschatology is the important one. IMHO.
I have become convinced that problems of this kind are the number one problem humanity has. I'm also pretty sure that most people here, no matter how much they've been reading about signaling, still fail to appreciate the magnitude of the problem.
Here are two major screw-ups and one narrowly averted screw-up that I've been guilty of. See if you can find the pattern.
It may not be immediately obvious, but all three examples have something in common. In each case, I thought I was working for a particular goal (become capable of doing useful Singularity work, advance the cause of a political party, do useful Singularity work). But as soon as I set that goal, my brain automatically and invisibly re-interpreted it as the goal of doing something that gave the impression of doing prestigious work for a cause (spending all my waking time working, being the spokesman of a political party, writing papers or doing something else few others could do). "Prestigious work" could also be translated as "work that really convinces others that you are doing something valuable for a cause".
We run on corrupted hardware: our minds are composed of many modules, and the modules that evolved to make us seem impressive and gather allies have also evolved to subvert the ones holding our conscious beliefs. Even when we believe that we are working on something that may ultimately determine the fate of humanity, our signaling modules may hijack our goals so as to optimize for persuading outsiders that we are working on the goal, instead of optimizing for actually achieving the goal!
You can see this all the time, everywhere:
There's an additional caveat to be aware of: it is actually possible to fall prey to this problem while purposefully attempting to avoid it. You might realize that you have a tendency to only want to do particularly prestigeful work for a cause... so you decide to only do the least prestigeful work available, in order to prove that you are the kind of person who doesn't care about the prestige of the task! But you are still optimizing your actions on the basis of expected prestige and of being able to tell yourself and outsiders an impressive story, not on the basis of your marginal impact.