Comment author: calcsam 16 August 2011 12:59:12AM *  9 points [-]

Not feasible. Let's aim for a more modest goal, say, better PR and functional communities.

Moreover, not this community's comparative advantage. Why do we think we'd be any better than anyone else at running the world? And why wouldn't we be subject to free-riders, power-seekers, and rationalists-of-fortune if we started winning?

Comment author: Arandur 16 August 2011 04:13:42PM 0 points [-]

Functional communities would be nice. I'm not so sure that better PR is the way to go. Why not no PR? Why not subtle induction via existing infrastructure? Let the people who most deserve to be here be the ones who will find us. Let us not go out with blaring trumpet, but with fishing lure.

Comment author: peter_hurford 16 August 2011 07:07:11AM 2 points [-]

By the way, sorry that this comment treats you like you're new to LW -- I can see from going through your comment and post history that you're not. My mistake.

Comment author: Arandur 16 August 2011 04:10:15PM *  9 points [-]

That's quite all right; I'm sure the naivete blossoming forth from the OP makes that an easy mistake to make. :P

I'm well aware of the Discussion Section... which only compounds my error. Yes, this should have been posted there. Losing some eighty Karma (by the way, apparently negative Karma does not exist per se, but perhaps it does de facto) is as good a wakeup call as any for the sin of overconfidence.

I would have traded my karma simply for the advice you've given here. Thank you. And thank you for the compliment on my writing style; nice to see not everything about this experience was negative. I assure you that I will not be leaving any time soon. When I first saw that this post was getting a negative response, I made a split-second decision: should I flee, or should I learn? I choose to learn.

Comment author: orthonormal 16 August 2011 01:50:04PM *  16 points [-]

Conditional on a Conspiracy existing, the probability that they'd reveal themselves to an unknown person asking via e-mail has to be pretty low. What you obviously should have done instead is to brainstorm for five minutes on how you would really recruit new members if you were the Conspiracy, or alternately on what courses of action you could take to benefit the Conspiracy if it existed. But, like I said, it's too late now; instead, you've signaled that you're clever enough to come up with an idea but not disciplined enough to think it through properly, and that's precisely the type of member a Bayesian Conspiracy would wish to avoid.

Comment author: Arandur 16 August 2011 04:04:51PM 6 points [-]

Your chastisement is well taken. Thank you.

Comment author: Mitchell_Porter 16 August 2011 05:16:56AM 30 points [-]

The more time that passes, the likelier it becomes that transhumanism and Singularity futurism will eventually find political expression. It's also likely that the various forms of rationalistic utilitarian altruism existing in certain corners of the Internet will eventually give rise to a distinctive ideology that will take its place in the spectrum of political views that count. It is even possible that some intersection of these two currents - the futurological rationalism on display at this site - will give rise to a politically minded movement or organization. This post, the earlier "Altruist Support" sequence by Giles, a few others show that there's some desire to do this. However, as things stand, this desire is still too weak and formless for anyone to actually do anything, and if anyone did become worked-up and fanatical enough to organize seriously, the result would most likely be an irrelevant farce, a psychodrama only meaningful to half a dozen people.

The current post combines: complete blindness with respect to what's involved in acquiring power at a national or international level; no sense of how embattled and precarious is the situation of futurist causes like cryonics and Friendly AI; and misplaced confidence in the correctness of the local belief system.

Let's start with the political naivete. Rather than taking over openly, it's proposed that the Conspiracy could settle for

a simple infiltration of the world's extant political systems

I love the word "simple". Look, politics isn't a game of hide and seek. Ideological groups have the cohesion that they do because membership in the group depends on openly espousing the ideology. If you get to be head of the politburo of the Tragic Soulfulness League after years of dutifully endorsing the party line, and then, once you're in charge, you announce to your colleagues that you actually believe in Maximum Happiness, what happens is that the next day, the media carry the tragically soulful news of the unfortunate accident which cut you down just at the beginning of your term in office, and your successor, the former deputy head, wiping away a tear, vows to uphold the principles of the tragic soul, just as you would have wanted.

the Conspiracy becomes the only major influence in world politics

A perfect picture of fanaticism... Apparently you think of political influence only in terms of belief systems. The perfect end state is that the one true belief system is triumphant! But political influence is also an expression just of the existence of a group of people; it means that the system knows about them, listens to them, contains their representatives. If the world still contains a billion Indians or three hundred million Americans, then India and America will continue to be major "influences" in world politics.

Now let's turn to the author's innocence regarding the situation of cryonics, etc., in the world.

we should devote fewer of our resources to cryonics and life extension, and focus on saving the lives of those to whom these technologies are currently beyond even a fevered dream

In other words, the microscopic number of highly embattled people who are currently working on these matters should instead take on the causes which are already ubiquitously signposted as Good, and which already receive billions of dollars per year. The rationale proposed for this perspective is that when the Conspiracy is in charge, it will own all the resources of the world, so it will be able to afford to do both things at once.

Arandur, if you take this line of thought, you end up working neither on life extension nor on poverty alleviation, but simply on assuming power, with the plan of doing those promised good works at some unknown time in the future.

In passing, let's consider: what specific proposals are offered here regarding the solution of recognized problems like war and starvation (as opposed to unrecognized problems like ageing or unfriendly AI)? The answers I see are (1) spend even more money on them, and (2) trust us to think of a better approach; we're rationalists, and that means we're better at problem-solving.

At least an explicitly transhumanist agenda would bring something concrete and new to politics. With respect to the existing concerns of politics, this proposal offers no-one any reason to offer you a share of power or to support your aspirations.

Finally, fanatical faith in the correctness of the local philosophy and the way that it is just destined to empower the true believer:

It is demonstrable that one's level of strength as a rationalist has a direct correlation to the probability that the one will make correct decisions.

It is even more demonstrable that one's level of self-identification as a rationalist has a direct correlation to the probability that one is irrelevant to anything of any significance, especially the sort of worldly affairs that you are talking about.

Comment author: Arandur 16 August 2011 06:35:34AM 4 points [-]

I'm being pulled off to bed, but from my skimming this looks like a very, very helpful critique. Thank you for posting it; I'll peruse it as soon as I'm able. One note: I did note after posting this, but too late to make a meaningful change, that "we should support cryonics less" is rather a ridiculous notion, considering the people I'm talking to are probably not the same people who are working hardest on cryonics. So: oops.

Comment author: peter_hurford 16 August 2011 03:08:53AM 6 points [-]

I suppose the biggest question is, is all this realistic? Or is it just an idealist's dream?

While beautifully written, it does all sound like an idealist's dream. Or at least you have said very little to suggest otherwise.

More downvotes would send you to negative karma if there is such a place, and that's a harsh punishment for someone so eloquent. In sparing you a downvote, I encourage you to figure out what went wrong with this post and learn from it.

If there are three things I've found in my little time here, it's that what the community strongly admires in posts is novelty (the post discusses new material or adds to material in a way that is not covered in other posts), specifics (the post explains a plan for action or a set of facts to be learned rather than vague philosophic generalities), and balance (the post examines the pros and cons of making a change, rather than appearing one-sided). Your post seems light on all three.

Comment author: Arandur 16 August 2011 06:32:24AM 2 points [-]

..... I will meditate on this constructive criticism. Thank you very much; I think this is the most useful response I've seen.

Comment author: Kevin 16 August 2011 03:10:07AM 0 points [-]

Bayesian Conspiracy @ Burning Man 2011, a social group? Ha.

Comment author: Arandur 16 August 2011 06:31:32AM 2 points [-]

I do apologize if I've given offense; not having had the opportunity yet to attend, I used the broadest term I could conjure while maintaining applicability.

Comment author: wedrifid 16 August 2011 03:52:43AM 2 points [-]

Downvoted. This is a serious post and this comment adds absolutely nothing to the discussion. Funny references belong on reddit.

Reversed. I liked the comment. You underestimate the relevance.

Comment author: Arandur 16 August 2011 06:30:47AM 3 points [-]

Seconded. I actually found this very relevant, and quite a good point.

Comment author: lessdazed 16 August 2011 03:05:30AM *  2 points [-]

1) I reject the implication that there is no amount of humor that could justify a comment regardless of its other substance (given its length and the context). I accept for consideration the criticism that my comment wasn't funny enough, but not that it was categorically wrong to have a comment that is nothing but humorous.

2) To say that the comment had no substance aside from humor is a fine enough thing to say, because and only because the reader will interpret it as meaning that you didn't see any other substance. It is a fine enough thing to say if one thinks the probability of other substance is sufficiently low...but how close to zero did you think it was? "World domination" really did make me think of Pinky and the Brain, FWIW.

3) The value of a comment with no substance aside from humor here was to somewhat mitigate what I saw as an impending avalanche of critical comments and downvotes.

Comment author: Arandur 16 August 2011 06:30:17AM 1 point [-]

Heh, I appreciate the mitigation.

Comment author: Vaniver 16 August 2011 02:05:04AM 3 points [-]

It seems pretty obvious that Eliezer's view is that FAI is the quick ticket to world domination (in the sense of world states that you talk about), and he seems to be structuring his conspiracy accordingly.

It is demonstrable that one's level of strength as a rationalist has a direct correlation to the probability that the one will make correct decisions.

Really? How would one demonstrate this? What does it mean for a decision to be "correct"? If something is true by definition, is it really demonstrable?

we have a moral obligation to work our hardest on this project

Really? Your plan is to get people interested in world domination by guilting them?

Comment author: Arandur 16 August 2011 06:29:09AM 2 points [-]

It seems pretty obvious that Eliezer's view is that FAI is the quick ticket to world domination...

I hadn't considered that, but now I see it clearly. How interesting.

Really? Your plan is to get people interested in world domination by guilting them?

Ha! If that would work, maybe it'd be a good idea. But no, pointing out a moral obligation is not the same as guilting. Guilting would be me messaging you, saying "See that poor starving African woman? If you had listened to my plan, she'd be happier." But I won't be doing that.

Comment author: lessdazed 16 August 2011 02:51:07AM *  13 points [-]

Imagine, also, how many lives are lost every day due to governmental negligence, and war, and poverty, and hunger

I was watching a hockey game with my ex-girlfriend when a fight broke out (on the ice, not between us). "That shouldn't be allowed!" she said. "It isn't," I responded. "It's a five minute penalty." "But the referees are just watching them fight. They should stop them from fighting!" "That's not an action. They can move their bodies and arms, and step between them, or pull them from behind. But 'making them stop' isn't something that a person can just decide to do. If they step between them now, someone could get hurt."

"Ending negligence" unfortunately isn't an action, unlike, say, typing. It's more like "stopping fighting".

Comment author: Arandur 16 August 2011 06:27:05AM 1 point [-]

That's quite true. But I have a hunch (warning: bare assertion) that much governmental negligence is due to a) self-interest and b) corruption (see: corrupt African dictatorships).
