
Ah, well, paleontologists aren't exactly our target group.

If you target people likely to understand X-risk, they shouldn't include any more crazy-sounding people than the X-risk community already has, should they? Like IT/computer science people, or people with other technical degrees? Sci-fi fans, perhaps? Any kind of technophile?

It's great to have responses more thought out than one's original idea!

As for the people who would misunderstand existential risk: are you thinking it's better to leave them in the dark as long as possible so as not to disturb the early existential risk movement, or that they will be likelier to accept existential risk once there is more academic study? Or both? The downside, of course, is that without publicity you will have fewer resources and brains on the problem.

I agree it is best not to mention far-future stuff. People are already familiar with nuclear war, epidemics, and AI trouble (with Gates, Hawking, and Musk stating their concern), so existential risk itself isn't really that unfamiliar.

For the part about people just seeing the title and moving on: you can use a suitably vague title, but even if you don't, what conclusions can they possibly draw from just a title? I don't think people remember skimming past one.

I have no idea what those search terms mean, but it sounds like a good idea. Perhaps you should run such a campaign?

You seem like a very down-to-earth guy, MarsColony_in10years :)

I'm not sure X-risk needs to be complicated, though. The basic message is just "Future technology may be dangerous and needs to be studied more." That should be enough to support the cause. One doesn't need to go into the complicated things you mentioned, and I don't think Bostrom does.

The part in Bostrom's video where he talks about future people colonizing the galaxy, uploading themselves into computers, and reaching a posthuman condition should probably be cut for mainstream viewers, and maybe the expected utility calculations too. Other than that, I don't see what could turn people off.

I was thinking of using Bostrom's TED talk; if that is successful, you can consider making an ad. The ad-blocker point is interesting and could be polled.

I think that would vary too much depending on the video to make a meaningful comparison. Better to compare the $0.20 to the opportunity costs of word of mouth and other methods of spreading existential-risk awareness, isn't it?

Has anyone tried advertising existential risk?

Bostrom's "End of Humanity" talk, for instance.

It costs about $0.20 per view for a video ad on YouTube, so if 0.2% of viewers gave an average of $100, it would break even. Hopefully people would give more than that.
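As a quick sanity check on that arithmetic, here is a minimal Python sketch. The $0.20 cost per view and $100 average donation are just the assumed figures from above, not measured values; tweak them to see how the break-even donor rate moves.

```python
# Break-even sketch for a YouTube ad campaign (illustrative numbers only).
cost_per_view = 0.20   # assumed dollars paid per ad view
avg_donation = 100.0   # assumed average gift per donating viewer

# Fraction of viewers who must donate for donations to cover ad spend:
# break_even_rate * avg_donation == cost_per_view
break_even_rate = cost_per_view / avg_donation
print(f"Break-even donor rate: {break_even_rate:.2%}")  # prints 0.20%
```

So every viewer "costs" $0.20, and one $100 donor pays for 500 views, which is where the 0.2% figure comes from.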

You can target ads at groups likely to give a lot, by the way, such as the highly educated.