LessWrongers as a group are often accused of talking about rationality without putting it into practice (for an elaborated discussion of this, see Self-Improvement or Shiny Distraction: Why Less Wrong is anti-Instrumental Rationality). This behavior is particularly insidious because it is self-reinforcing: it will attract more armchair rationalists to LessWrong, who will in turn reinforce the trend in an affective death spiral, until LessWrong is a community of utilitarian apologists akin to the internet communities of anorexics who congratulate each other on their weight loss. It will be a community where, instead of discussing practical ways to "overcome bias" (the original intent of the sequences), we discuss arcane decision theories, who gets to be in our CEV, and the most rational birthday presents (sound familiar?).
A recent attempt to counter this trend, or at least make us feel better about it, was a series of discussions on "leveling up": accomplishing a set of practical, well-defined goals to increment your rationalist "level". It's hard to see how these goals fit into a long-term plan to achieve anything besides self-improvement for its own sake. Indeed, the article begins by priming us with a Renaissance-man-inspired quote, and stands in stark contrast to articles emphasizing practical altruism such as "efficient charity".
So what's the solution? I don't know. However, I can tell you a few things about the solution, whatever it may be:
- It won't feel like the right thing to do; your moral intuitions (being designed to operate in a small community of hunter-gatherers) are unlikely to suggest anything near the optimal task.
- It will be something you can start working on right now, immediately.
- It will disregard arbitrary self-limitations like abstaining from politics or keeping yourself aligned with a community of family and friends.
- Speaking about it would undermine your reputation through signaling. A true rationalist has no need for humility, sentimental empathy, or the absurdity heuristic.
Whatever you may decide to do, be sure it follows these principles. If none of your plans align with these guidelines, then construct a new one, on the spot, immediately. Just do something: every moment you sit, hundreds of thousands are dying and billions are suffering. Under your judgement, your plan can self-modify in the future to overcome its flaws. Become an optimization process; shut up and calculate.
I declare Crocker's rules on the writing style of this post.
"there is no connotation that this authority has no evidential entanglement with the subject of the argument" -- quite correct. Which is why it is fallacious: it is an assertion that this is the case without corroborating that it actually is the case.
If it were the case, then the act would not be an 'appeal to authority'.
This is categorically invalid. Humans are not Bayesian belief networks. In fact, humans are notoriously poor at assessing their own probabilistic estimates of belief. But this, really, is neither here nor there.
The only error that's occurring here is your continued belief that beliefs are relevant to this conversation. They simply aren't. We're not discussing "what should you believe" -- we are discussing "what should you hold to be true."
And that, sir, categorically IS binary. A thing is either true or not true. If you affirm it to be true, you are assigning it a fixed binary state.
You have a point in saying that "12.2485%" is an unlikely number to give to your degree of belief in something, although you could create a scenario in which it is reasonable (e.g. you put 122485 red balls in a bag...). And it's also fair to say that casually giving a number to your degree of belief is often unwise when that number is plucked from thin air, for example if you are just using "90%" to mean "strong belief". The point about belief not being...
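To make the red-ball parenthetical concrete, here is a minimal sketch of how a credence as precise as 12.2485% can fall straight out of a known model rather than being plucked from thin air. The bag total of 1,000,000 balls is my own assumption for illustration; the comment only specifies the 122485 red balls.

```python
# Illustrative sketch (not from the original comment): a precise credence like
# 12.2485% is natural when the underlying model supplies it exactly.
# Assumed setup: a bag with 122485 red balls out of 1,000,000 total
# (the total is an assumption added here for the arithmetic to work out).

red_balls = 122_485
total_balls = 1_000_000

# Rational degree of belief that a uniformly drawn ball is red:
credence = red_balls / total_balls
print(f"P(red) = {credence:.6f} = {credence * 100:.4f}%")
# -> P(red) = 0.122485 = 12.2485%
```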