jacob_cannell comments on Less Wrong: Open Thread, September 2010 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
What ideas? I'm pretty sure I find whatever you are talking about interesting and shiny, but I'm not quite sure what it even is.
Any ideas. For the SIAI it would probably be existential risks then UFAI later, in general it could be rationality or evolution or atheism or whatever.
What is the whole industry you speak of? Self-help, religion, marketing? And what additional advertising? I think that spreading the ideas is important as well, I'm just not sure what you are considering.
Advertising/marketing. Short of ashiest bus ads, I can't think of anything that's been done.
All I'm really suggesting is that we focus on mass persuasion in the way it has been proven to be most efficient. What that actually amounts to will depend on the target audience, and how much money is available, among other things.
Did you mean "atheist bus ads"? I actually find strict-universal-atheism to be irrational compared to agnosticism because of the SA and the importance of knowing the limits of certainty, but that's unrelated and I digress.
I've long suspected that writing popular books on the subject would be an effective strategy for mass persuasion. Kurzweil has certainly had a history of some success there, although he also brings some negative publicity due to his association with dubious supplements and the expensive SingUniversity. It will be interesting to see how EY's book turns out and is received.
I'm actually skeptical about how far rationality itself can go towards mass persuasion. Building a rational case is certainly important, but the content of your case is even more important (regardless of its rationality).
On that note I suspect that bridging a connection to the mainstream's beliefs and values would go a ways towards increasing mass marketability. You have to consider not just the rationality of ideas, but the utility of ideas.
It would be interesting to analyze and compare how emphasizing the hope vs doom aspects of the message would affect popularity. SIAI at the moment appears focused on emphasizing doom and targeting a narrow market: a subset of technophile 'rationalists' or atheist intellectuals, and wooing academia in particular.
I'm interested in how you'd target mainstream liberal christians or new agers, for example, or even just the intellectual agnostic/atheist mainstream - the types of people who buy books such as The End of Faith, Breaking the Spell, etc. Although a good portion of that latter demographic is probably already exposed to The Singularity is Near.
I'm not sure what I'd do, but I'm not a marketing expert either. (Though I am experimenting.)
It would probably be possible to make a campaign that took advantage of UFAI in sci-fi. AIs taking over the world isn't a difficult concept to get across, so the ad would just need to persuade people that it's possible in reality, and that there is a group working towards a solution.
I hope you haven't forgotten our long drawn out discussion, as I do think that one is worthwhile.
AIs taking over the world because they have implausibly human-like cognitive architectures and they hate us or resent us or desire higher status than us is an easy concept to get across. It is also, of course, wrong. An AI immediately taking apart the world to use its mass for something else because its goal system is nothing like ours and its utility function doesn't even have a term for human values is more difficult; because of anthropomorphic bias, it will be much less salient to people, even if it is more probable.
They have the right conclusion (plausible AI takeover) for slightly wrong reasons. To "hate [humans] or resent [humans] or desire higher status than [humans]" are slightly different values than ours (even if they mirror the values humans often hold towards other groups).
So we can gradually nudge people closer to the truth a bit at a time by saying "Plus, it's unlikely that they'll value X, so even if they do something with the universe it will not have X."
But we don't have to introduce them to the full truth immediately, as long as we don't base any further arguments on falsehoods they believe.
If someone is convinced of the need for asteroid defense because asteroids could destroy a city, you aren't obligated to tell them that larger asteroids could destroy all humanity when you're asking for money. Even if you believe bigger asteroids to be more likely.
I don't think it's dark epistemology to avoid confusing people if they've already got the right idea.
Writing up high-quality arguments for your full position might be a better tool than "nudging people closer to the truth a bit at a time". Correct ideas have a scholarly appeal due to internal coherence, even if they need to overcome plenty of cached misconceptions, but making that case requires a certain critical mass of published material.
I do see value in that, but I'm thinking of a TV commercial or YouTube video with a Terminator-style look and feel. Though possibly emphasizing that against real superintelligence, there would be no war.
I can't immediately think of a way to simplify "the space of all possible values is huge, and human-like values are a tiny part of it", and I don't think that framing would resonate at all.