This article is a deliberate meta-troll. To succeed, I need your trolling cooperation. Now hear me out.
In The Strangest Thing An AI Could Tell You, Eliezer discusses anosognosics, who have one of their arms paralyzed and, most interestingly, are in absolute denial of this: in spite of overwhelming evidence that the arm is paralyzed, they keep coming up with new rationalizations proving that it isn't.
Doesn't that sound like someone else we know? Yes, religious people! In spite of heaps of empirical evidence against the existence of their particular flavour of the supernatural, the internal inconsistency of their beliefs, and well-known, perfectly plausible alternative explanations, something between 90% and 98% of humans believe in a supernatural world, and are in a state of absolute denial not too dissimilar to that of anosognosics. Perhaps billions of people throughout history have even been willing to die for their absurd beliefs.
We are mostly atheists here - we happen not to share this particular delusion. But please take the outside view for a moment: how likely is it that, unlike almost everyone else, we don't have any other such delusions - delusions whose truth we deny absolutely in spite of mounting heaps of evidence?
If a delusion is of the kind that all of us share, we won't be able to find it without building an AI. And we might well have some of those - it's not too unlikely, as we're a small and self-selected group.
What I want you to do is try to trigger the absolute denial macro in your fellow rationalists! Is there anything you consider proven beyond any possibility of doubt, by both empirical evidence and pure logic, that nevertheless triggers an automatic stream of rationalizations when you say it to other people? Yes, I'm pretty much asking you to troll, but it's a good kind of trolling, and I cannot think of any other way to find our delusions.
P(Accomplish goals | get really rich) > P(Accomplish goals | ~get really rich)
vs.
P(Accomplish goals | try to get really rich & (get really rich or ~get really rich)) ?>=<? P(Accomplish goals | ~try to get really rich)
My symbols kind of suck in this format, but you seem to be arguing the former when the latter is the relevant consideration. Your argument also ignores the goal of personal happiness; I would guess that most people in practice put very high coefficients on themselves and their loved ones in their utility functions, regardless of what they profess to believe.
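To make the distinction concrete, here is a minimal Python sketch with made-up numbers (every probability below is an assumption chosen purely for illustration, not an estimate). It shows that the first inequality can hold while the decision-relevant comparison, which conditions on the action of trying and marginalizes over whether the attempt succeeds, goes the other way.

```python
# A minimal sketch of the comparison above, with made-up numbers.
# Every probability here is an assumption for illustration only.

p_rich_given_try = 0.05        # assumed chance that trying to get really rich succeeds
p_goals_given_rich = 0.6       # assumed P(accomplish goals | really rich)
p_goals_given_not_rich = 0.3   # assumed P(accomplish goals | tried but not rich)
p_goals_given_not_try = 0.35   # assumed P(accomplish goals | don't try at all)

# Decision-relevant quantity: condition on the action (trying) and
# marginalize over whether the attempt succeeds (law of total probability).
p_goals_given_try = (p_rich_given_try * p_goals_given_rich
                     + (1 - p_rich_given_try) * p_goals_given_not_rich)

print(p_goals_given_try)                            # 0.315 with these numbers
print(p_goals_given_rich > p_goals_given_not_rich)  # True: the first inequality holds
print(p_goals_given_try > p_goals_given_not_try)    # False: trying still doesn't pay here
```

With other assumed numbers the second comparison can of course come out the other way, which is exactly why you'd need actual numbers before you could shut up and multiply.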
Oh, and the whole claim about not valuing social status enough and, in particular, not valuing sex with extremely attractive women is, well, unsupported, to put it extremely charitably. Unless people here have the goals of "showing people up" or "having sex with extremely attractive women whose interest in them is contingent on their wealth," adopting those values would not be conducive to accomplishing their current goals, so failing to adopt them is hardly an error.
More to the point, claiming that people aren't pursuing wealth specifically because they don't value social status enough is like claiming that people aren't buying a Mercedes because they don't adequately value an all-leather interior. There are many other values that would attain the ends, and there are many other ends that would fulfill the values. I'd go into this at length, but the post did explicitly condone trolling, so I won't take this too seriously.
This is not to say that you're wrong (about wealth being a rational goal for meeting our existing goals); I don't have the numbers to shut up and multiply. I'm just saying you may well not be right.
What kind of numbers do you think you would need to shut up and multiply? No trolling, just an honest question. To clarify, I support Roko here -- I've been thinking along the same lines for some time.