Why aren't [big number] of educated people a superintelligence now?
They are. Many collections of individuals (e.g. tech companies, hedge funds, PACs) seem to do rather a lot more than an individual human could. Likewise, humanity as a whole could be classified as a superintelligence (and possibly a recursively self-improving one: see the Flynn effect). The idea is not that large numbers of intelligent people aren't a superintelligence; it's that 10,000 von Neumanns would be a more powerful superintelligence than most groups of highly intelligent people.
Downvoted for using terms imprecisely; see The Virtue of Narrowness.
Superintelligences are not "any powerful entity"; humanity is not "recursively self-improving". This conversation was over some time in 2009 when Eliezer finally got Tim Tyler to stop applying those terms to things that already exist, as though that meant anything.
I'm skeptical about trying to build FAI, but not about trying to influence the Singularity in a positive direction. Some people may be skeptical even of the latter because they don't think the possibility of an intelligence explosion is a very likely one. I suggest that even if intelligence explosion turns out to be impossible, we can still reach a positive Singularity by building what I'll call "modest superintelligences", that is, superintelligent entities, capable of taking over the universe and preventing existential risks and Malthusian outcomes, whose construction does not require fast recursive self-improvement or other questionable assumptions about the nature of intelligence. This helps to establish a lower bound on the benefits of an organization that aims to strategically influence the outcome of the Singularity.
(To recall what the actual von Neumann, who we might call MSI-0, accomplished, open his Wikipedia page and scroll through the "known for" sidebar.)
Building an MSI-1 (a group on the order of 10,000 von Neumann clones) seems to require a total cost on the order of $100 billion (assuming $10 million for each clone), which is comparable to the Apollo project, and about 0.25% of the annual Gross World Product. (For further comparison, note that Apple has a market capitalization of $561 billion, and annual profit of $25 billion.) In exchange for that cost, any nation that undertakes the project has a reasonable chance of obtaining an insurmountable lead in whatever technologies end up driving the Singularity, and with that a large measure of control over its outcome. If no better strategic options come along, lobbying a government to build MSI-1 and/or influencing its design and aims seems to be the least that a Singularitarian organization could do.
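For the skeptical reader, the cost figures above can be sanity-checked with a few lines of arithmetic. All inputs are the post's own assumptions (the $10 million per-clone cost and the 10,000-clone figure), plus an assumed annual Gross World Product of roughly $40 trillion, which is what the quoted 0.25% share implies:

```python
# Sanity-check the MSI-1 cost estimate. All numbers are assumptions
# taken from the post, not independent estimates.
cost_per_clone = 10e6        # $10 million per clone (post's assumption)
num_clones = 10_000          # the "10,000 von Neumanns" figure

total_cost = cost_per_clone * num_clones
print(f"Total cost: ${total_cost / 1e9:.0f} billion")  # $100 billion

# Implied annual Gross World Product for a 0.25% share (~$40 trillion)
gross_world_product = 40e12
share = total_cost / gross_world_product
print(f"Share of annual GWP: {share:.2%}")  # 0.25%
```

The comparison to Apollo holds up: the Apollo program's commonly cited cost of roughly $25 billion in 1960s dollars is on the same order of magnitude as $100 billion once inflated to 2012 dollars.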