I'm skeptical about trying to build FAI, but not about trying to influence the Singularity in a positive direction. Some people may be skeptical even of the latter because they don't think the possibility of an intelligence explosion is a very likely one. I suggest that even if intelligence explosion turns out to be impossible, we can still reach a positive Singularity by building what I'll call "modest superintelligences", that is, superintelligent entities, capable of taking over the universe and preventing existential risks and Malthusian outcomes, whose construction does not require fast recursive self-improvement or other questionable assumptions about the nature of intelligence. This helps to establish a lower bound on the benefits of an organization that aims to strategically influence the outcome of the Singularity.
- MSI-1: 10^5 biologically cloned humans of von Neumann-level intelligence, highly educated and indoctrinated from birth to work collaboratively towards some goal, such as building MSI-2 (or equivalent)
- MSI-2: 10^10 whole brain emulations of von Neumann, each running at ten times human speed, with WBE-enabled institutional controls that increase group coherence/rationality (or equivalent)
- MSI-3: 10^20 copies of the von Neumann WBE, each running at a thousand times human speed, with more advanced (to be invented) institutional controls and collaboration tools (or equivalent)
(To recall what the actual von Neumann, whom we might call MSI-0, accomplished, open his Wikipedia page and scroll through the "known for" sidebar.)
Building an MSI-1 seems to require a total cost on the order of $100 billion (assuming $1 million for each clone), which is comparable to the Apollo project and about 0.25% of the annual Gross World Product. (For further comparison, note that Apple has a market capitalization of $561 billion and an annual profit of $25 billion.) In exchange for that cost, any nation that undertakes the project has a reasonable chance of obtaining an insurmountable lead in whatever technologies end up driving the Singularity, and with that a large measure of control over its outcome. If no better strategic options come along, lobbying a government to build MSI-1 and/or influencing its design and aims seems to be the least that a Singularitarian organization could do.
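As a rough sanity check on these figures, here is a minimal back-of-the-envelope sketch in Python; the clone count, per-clone cost, and Apple figures are the assumptions quoted above, not precise data:

```python
# Back-of-the-envelope check of the MSI-1 cost figures quoted above.
# All inputs are rough assumptions taken from the text, not measured data.

clones = 10**5                 # von Neumann-level clones in MSI-1
cost_per_clone = 1e6           # assumed cost per clone, in USD
apple_market_cap = 561e9       # Apple market capitalization, USD (figure from the text)
apple_annual_profit = 25e9     # Apple annual profit, USD (figure from the text)

total_cost = clones * cost_per_clone
print(f"Total cost: ${total_cost / 1e9:.0f} billion")                        # ~$100 billion
print(f"Multiple of Apple's annual profit: {total_cost / apple_annual_profit:.1f}x")  # ~4x
print(f"Fraction of Apple's market cap: {total_cost / apple_market_cap:.0%}")         # ~18%
```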
You exaggerate slightly, but if anyone with more than a passing familiarity with the history of the Middle Ages were offered the choice between being tried by a civil court or an ecclesiastical court, they would choose the latter without hesitation.
Also, of particular interest to Less Wrongers, the manuals created by Church lawyers for use by Inquisitors were important to the later development of probability theory. They included such topics as how to calculate "grades of evidence" (which can be contrasted with the "proof" or "no proof" methods of earlier Roman law) and even how much to discount witness testimony (taking into account not only whether the witness saw a particular act directly or only heard it from a room away, but also whether the witness had a grudge against the defendant or some other motive for wanting them punished unjustly).
Who knows, perhaps people like Alicorn who "don't think in numbers" would be better served by a (pre-Pascal) probability theory like the one the Inquisitors used, with rigorously defined verbal probabilities (suspicion, presumption, indication, support, vehement support, conjecture) instead of floating-point numbers.
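Purely for illustration, here is one way such a system of verbal grades could be represented as an ordinal scale rather than as floats; the specific ordering below is my own assumption for the sketch, not a claim about how the historical manuals actually ranked these terms:

```python
from enum import IntEnum

# Illustrative ordinal scale of verbal evidence grades in place of
# floating-point probabilities. The ordering is assumed for this sketch,
# not taken from the Inquisitors' manuals described above.
class EvidenceGrade(IntEnum):
    CONJECTURE = 1
    SUSPICION = 2
    INDICATION = 3
    PRESUMPTION = 4
    SUPPORT = 5
    VEHEMENT_SUPPORT = 6

# Grades can be compared and ranked without ever showing the user a number.
assert EvidenceGrade.VEHEMENT_SUPPORT > EvidenceGrade.SUSPICION
```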