From their site:
OpenAI is a non-profit artificial intelligence research company. Our goal is to advance digital intelligence in the way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return.
The money quote is at the end, literally—$1B in committed funding from some of the usual suspects.
There is no way around this problem, because the run-up to the singularity is going to bring huge economic and military advantages to whoever has slightly better AI than anyone else. Moloch is hard to beat.
OK, we have many nuclear powers in the world but only one main non-proliferation agency, the IAEA, and it somehow works. In the same way we could have many AI projects in the world but a single agency that provides safety guidelines (and it would be logical for that to be MIRI plus Bostrom, since they have done the best-known research on the topic). But if we have many agencies providing different guidelines, or even several AIs with slightly different notions of friendliness, we are doomed.