EDIT: This post no longer reflects my current attitude. I'm now signed up as a volunteer for SIAI and will help them with the website and/or whatever else needs doing. Add a comment or contact me if you're curious as to what my attitude was or why it changed.

What I've learnt: People want something more specific

What I've also learnt: Not to commit to donating money to an organization without carefully reading their website first

Imagine you are a prospective SIAI donor. You've learnt about AI and its risks, about how hardly anyone takes these risks seriously, about how people are fundamentally not mentally equipped to handle issues of existential risk in a sane way. You've looked around and seen that the SIAI is the only organization (or one of just a few) that appears to realise this and wants to do anything about it.

So you go to their website. What are you looking for? You're looking for a reason not to give money to them.

Here's one:

The Singularity Institute exists to carry out the mission of the Singularity-aware – to accelerate the arrival of the Singularity in order to hasten its human benefits; ...

This reflects a somewhat gung-ho attitude that is not consistent with the message on the rest of the site. And this isn't just my misreading or quoting out of context - apparently that page is very out of date and no longer represents the worldview of the more mature, grown-up SIAI.

But people reading the site don't know that. And remember, they're looking for reasons not to give - for reasons to retreat back to their comfort zone where everything's basically OK and the SIAI are just a bunch of weirdos.

The fact that an organization dedicated to shaping the future of humanity can't keep its website up to date would seem to be one of those reasons.

So, if you really believe the SIAI to be the most effective charity right now, you should help them by offering to fix their website for them - in order to help attract more donors.

Some possible objections and counter-objections:

1. If Giles thinks fixing the SIAI's website is so important, he'd already be doing it himself.

Essentially this boils down to the fact that you probably trust the SIAI a lot more than I do. So for me the community-building effort is the higher priority.

2. If the website was so important, the SIAI would already have fixed it. Better just to give them money and they'll spend it on fixing the website when it's optimal to do so.

This assumes that the SIAI behaves in a perfectly rational way. It also ignores the fact that people are going to look at their accounts and try to find evidence that they are actually engaging in saving-the-world type activities. If all they do is "fix our own website and make ourselves look good" then no-one's going to take them seriously. By donating your time to improve their website, you keep that activity off the balance sheet.

3. There are so many more important factors keeping people away from donating to the SIAI. Surely better to address those first?

Maybe - but they need to be fixed one at a time. And I believe the website to be a single point of failure - even people who are otherwise really keen might be put off by a single strange-sounding sentence appearing on the website.

Conclusion:

I don't think the website needs a big overhaul or a massive amount of new information. It just needs a little thought as to people's questions and concerns. Other than the page I mentioned, possible concerns might be:

- All of the issues that arose in the GiveWell interview

- A recognition of the non-strawman criticisms of the SIAI and how they are being addressed

- An answer to the only-game-in-town question: if we recognise that the SIAI is the only organization seriously addressing these issues but aren't sure of its effectiveness, are we better off giving now or waiting for a more effective organization to come along?


Yay being specific!

If I were to volunteer to work on this, how would I go about it? Just post blocks of text here with changes written in and someone will upload them if they think the changes are good?

Note: I don't know enough about the up-to-date positions of the SIAI, but if someone were to give a brief summary of them I could find and change things that conflict with them.

I'd suggest emailing them and asking. You can also ask whether there is a website-rewriting effort already in progress.

You can also ask which pages are most and least up to date. The more up-to-date pages are more likely to reflect their current attitudes and may not need updating.

I'd expect their recent publications would also give a good picture of their current activities and worldview.

In fact, I'd suggest signing up as a volunteer like I did, or at least giving it some thought. (Just email Louie if you're at all interested). There's a lot of stuff that needs doing besides the website.

Thanks for the advice. I just sent the email. With any luck, I can now do some work translating the site into French.

That's awesome! I'll get in touch soon to make sure you're not translating stuff that I'm going to rewrite anyway.

Great, thanks. Right now I'm working on the "Arms Control and Intelligence Explosions" paper. Is that non-obsolete enough to be worthwhile, and if not which paper(s) are?

This all stems from a misunderstanding, which is that "Singularity" in that sentence implicitly means "positive Singularity". You see this usage quite often. To keep things concise, seemingly unnecessary qualifiers are sometimes dropped, but it's true that in the years since then, SIAI has so effectively raised the concept of a negative Singularity that people now think of Singularity as either positive or negative rather than just positive, a la Kurzweil & Singularity Hub. Ironic, though, that the memetic influence of the SIAI itself "obsoleted" its own website and that this could somehow be construed as incompetence in the organization!

This all stems from a misunderstanding, which is that "Singularity" in that sentence implicitly means "positive Singularity".

A misunderstanding on the part of the article's author, presumably. The page continues:

If there's a Singularity effort that has a strong vision of this future and supports projects that explicitly focus on transhuman technologies such as brain-computer interfaces and self-improving Artificial Intelligence, then humanity may succeed in making the transition to this future a few years earlier, saving millions of people who would have otherwise died.

This adds support to the "mad rush -> reap benefits -> yay!" interpretation. That is quite some distance from the current, more sensible position that the main concern should be how well the transition goes.

The SIAI didn't really invent machine intelligence doom-mongering marketing. Hugo de Garis and Kevin Warwick were doing that before them - and machines-gone-wrong has been a staple of science fiction for far longer.

I would welcome specific suggestions for improvements. If you email me updates, I would be open to making them. louie.helm AT singinst.org

You are correct that the website is out-of-date in spots. It's unfortunate... can you help me fix these inconsistencies since it's easier for you to spot them as an outsider?

Please list the URLs of the pages you want to change, then list what you want replaced, and please suggest a new version that is consistent with the Singularity Institute's actual values.

Thanks for your help Giles!

Here's one:

The Singularity Institute exists to carry out the mission of the Singularity-aware – to accelerate the arrival of the Singularity in order to hasten its human benefits; ...

This reflects a somewhat gung-ho attitude that is not consistent with the message on the rest of the site. And this isn't just my misreading or quoting out of context - apparently that page is very out of date and no longer represents the worldview of the more mature, grown-up SIAI.

Machine intelligence is a race. I think everyone involved is aware of the time pressure element. About the only strategy that doesn't involve attempting rapid progress is sabotaging other people's projects - and that looks like a pretty ineffective strategy - not least because such destruction probably won't "get" all of the projects.

I emailed them to ask about that particular sentence, and got back that it was out of date and doesn't accurately reflect their current position.

The issue is a lot more nuanced than just "singularity is bad" or "singularity is good", and these subtleties need to be made clear. Don't assume that your line of thinking will be immediately obvious to readers.

[EDIT: ... obvious to readers of the SIAI website, that is.]

About the only strategy that doesn't involve attempting rapid progress is sabotaging other people's projects - and that looks like a pretty ineffective strategy - not least because such destruction probably won't get all the projects.

I'd go as far as to suspect that sabotage attempts are likely to speed up the rate of research, so they can only be expected to push back the critical date once the situation has got particularly urgent.

Sabotage and negative marketing seem rather common. For example, here is some baseless shit slinging:

And if Novamente should ever cross the finish line, we all die.

I'm not clear what the net effect (if any) of such FUD on the overall rate of progress is, though. Usually such strategies aim at hampering competitors - not at manipulating the overall rate of progress.

I think we should probably discourage the use of negative marketing in this area. It is more likely to be used by organisations with poor moral scruples - of the type we do not want to see gain an advantage. Public disapproval may not eliminate it - but might at least drive it underground.