
scientism comments on What happens when your beliefs fully propagate - Less Wrong Discussion

20 points · Post author: Alexei · 14 February 2012 07:53AM


Comments (78)


Comment author: scientism · 14 February 2012 04:36:28PM · 6 points

Here's what I was thinking as I read this: Maybe you need to reassess the costs and benefits. Apply the Dark Arts to games and out-Zynga Zynga: highly addictive games with in-game purchases, designed using everything we know about the psychology of addiction, reward, etc. Create negative utility for a small group of people, yes, but siphon off their money to fund FAI.

I think if I really, truly believed FAI was the only and right option I'd probably do a lot of bad stuff.

Comment author: Alex_Altair · 14 February 2012 05:19:17PM · 6 points

Let's start a Singularity Casino and Lottery.

Comment author: JenniferRM · 17 February 2012 01:54:31AM · 4 points

> I think if I really, truly believed FAI was the only and right option I'd probably do a lot of bad stuff.

You might want to read through some decision theory material and ponder it for a while. Even before that, please consider the possibility that your political instincts are optimized to get a group of primates to change a group policy in a way you prefer while surviving the likely factional fight. If you really want to be effective here, or in any other context requiring significant coordination with numerous people, it seems likely to me that you'll need to adjust your goal-directed tactics so that you don't leap into counterproductive actions the moment you decide they are actually worth doing.

Baby steps. Caution. Lines of retreat. Compare and contrast your prediction with: the valley of bad rationality.

I have approached numerous intelligent and moral people who are perfectly capable of understanding the basic pitch for singularity activism but who will not touch it with a ten-foot pole because they are afraid to be associated with anything that has so much potential to appeal to the worst sorts of human craziness. Please do something other than confirm these bleak but plausible predictions.