Comment author: Multiheaded 19 November 2012 02:35:23PM *  0 points [-]

As long as other people are polarized about some issue, your opinion about the conflict in Gaza is essentially a decision to join "team Israel" or "team Palestine". This choice is absolutely unrelated to the actual people killing each other in the desert. This choice is about whether Joe will consider you an ally and Jane an enemy, or the other way around. With high probability, neither Joe nor Jane is personally related to the people killing each other in the desert, and their choices were also based on their preference to be on the same team as some other people.

Data point: you probably know I'm left-wing (in an eccentric way) - and yet, frankly, I'm very "pro-Israel" (although not fanatically so), and think that all the cool, nice, cosmopolitan, compassionate lefty people who protest "Zionist aggression" should go fuck themselves with regard to this particular issue. This includes e.g. Noam Chomsky, whom I otherwise respect highly. And I realize that this lands me in the same position as various far-right types whom I really dislike, yet I'm quite fine with that too.

Yes, I'm not neurotypical. However, you know that I can and do get kinda mind-killed on other political topics. So I'm not satisfied by your explanation.

Comment author: AlphaOmega 19 November 2012 09:51:58PM *  2 points [-]

I think what Viliam_Bur is trying to say in a rather complicated fashion is simply this: humans are tribal animals. Tribalism is perhaps the single biggest mind-killer, as you have just illustrated.

Am I correct in assuming that you identify yourself with the tribe called "Jews"? As for me, having no tribal dog in this particular fight, I can't get too worked up about it, though if the conflict involved, say, Irish people, I'm sure I would feel rather differently. This is just a reality that we should all acknowledge: our attempts to "overcome bias" with respect to tribalism are largely self-delusion, and perhaps even irrational.

Comment author: michaelcurzi 18 November 2012 10:50:01PM *  1 point [-]

It could be evidence that the questioner isn't worth engaging, because the conversation is unlikely to be productive. The questioner might have significantly motivated cognition or have written the bottom line.

Comment author: AlphaOmega 19 November 2012 02:20:39AM *  -5 points [-]

On the contrary, adversarial questioners are often highly productive. I've already incited one of the best comments you've seen on LessWrong, haven't I?

Yes, my cognition is significantly motivated along these lines. Doesn't Hitler deserve some of the credit for the rapid development of computers and nuclear bombs? Perhaps I or someone like me will play a similar role in the development of AI?

Comment author: AlphaOmega 17 November 2012 01:37:18AM 1 point [-]

Just a gut reaction, but this whole scenario sounds preposterous. Do you guys seriously believe that you can create something as complex as a superhuman AI, and prove that it is completely safe before turning it on? Isn't that as unbelievable as the idea that you can prove that a particular zygote will never grow up to be an evil dictator? Surely this violates some principles of complexity, chaos, quantum mechanics, etc.? And I would also like to know who these "good guys" are, and what will prevent them from becoming "bad guys" when they wield this much power. This all sounds incredibly naive and lacking in common sense!

Comment author: AlphaOmega 08 November 2012 10:38:00PM 0 points [-]

I can conceive of a social and technological order where transhuman power exists, but you may or may not want to live in it. This is a world where there are god-like entities doing wondrous things, and humanity lives in a state of awe and worship at what they have created. To like living in this world would require that you adopt a spirit of religious submission, perhaps not so different from modern-day monotheists who bow five times a day to their god. This may be the best post-Singularity order we can hope for.

In response to: against "AI risk"
Comment author: AlphaOmega 12 April 2012 05:48:26PM *  -1 points [-]

I am going to assert that fearing unfriendly AI more than the threats you mention is a product of the same cognitive bias which makes us more fascinated by evil dictators and fictional dark lords than by more mundane villains. The quality of "evil mind" is what really frightens us, not the impersonal swarm of "mindless" nanobots, viruses or locusts. However, since this quality of "mind," which encapsulates such qualities as "consciousness" and "volition," is so poorly understood by science and so totally undemonstrated by our technology, I would further assert that unfriendly AI is pure science fiction, and that it should be far down our list of concerns compared to more clear and present dangers.

Comment author: TheOtherDave 10 April 2012 09:09:56PM 6 points [-]

Robots taking human jobs is another step toward bringing the curtain down permanently on the dead-end primate dramas

Well, so is large-scale primate extermination leaving an empty husk of a planet.

The question is not so much whether the primates exist in the future, but what exists in the future and whether it's something we should prefer to exist. I accept that there probably exists some X such that I prefer (X + no humans) to (humans), but it certainly isn't true that for all X I prefer that.

So whether bringing that curtain down on dead-end primate dramas is something I would celebrate depends an awful lot on the nature of our "mind children."

Comment author: AlphaOmega 11 April 2012 07:08:48PM *  -2 points [-]

OK, but if we are positing the creation of artificial superintelligences, why wouldn't they also be morally superior to us? I find this fear of a superintelligence wanting to tile the universe with paperclips absurd; why is that likely to be the summum bonum to a being vastly smarter than us? Aren't smarter humans generally more benevolent toward animals than stupider humans are? Why shouldn't this hold for AIs? And if you say that the AI might be so much smarter than us that we will be like ants to it, then why would you care if such a species decides that the world would be better off without us? From a larger cosmic perspective, at that point we will have given birth to gods, and can happily meet our evolutionary fate knowing that our mind children will have vastly more interesting lives than we ever could have. So I don't really understand the problem here. I guess you could say that I have faith in the universe's capacity to evolve life toward more intelligent and interesting configurations, because for the last several billion years this has been the case, and I don't see any reason to think that this process will suddenly reverse itself.

Comment author: AlphaOmega 10 April 2012 06:30:22PM *  -3 points [-]

It seems to me that humanity is faced with an epochal choice in this century, whether to:

a) Obsolete ourselves by submitting fully to the machine superorganism/superintelligence and embracing our posthuman destiny, or

b) Reject the radical implications of technological progress and return to various theocratic and traditionalist forms of civilization which place strict limits on technology and consider all forms of change undesirable (see the 3000-year reign of the Pharaohs, or the million-year reign of the hunter-gatherers)

Is there a plausible third option? Can we really muddle along for much longer with this strange mix of religious “man is created in the image of God”, secular humanist “man is the measure of all things” and transhumanist “man is a bridge between animal and Superman” ideologies? And why do even Singularitarians insist that there must be a happy ending for homo sapiens, when all the scientific evidence suggests otherwise? I see nothing wrong with obsoleting humanity and replacing them with vastly superior “mind children.” As far as I’m concerned this should be our civilization’s summum bonum, a rational and worthy replacement for bankrupt religious and secular humanist ideals. Robots taking human jobs is another step toward bringing the curtain down permanently on the dead-end primate dramas, so it’s good news that should be celebrated!

Comment author: timtyler 25 March 2012 12:51:28AM 0 points [-]

I have a pretty clear vision of such a cult, its ideology, activities and structure, and would like to know if anyone here is interested in such a thing.

Sure. I'm interested in all end-of-the-world cults. The more virulent their memes, the better.

For example, from the point of view of the "Cult of Omega", the extinction of humanity is an all but inevitable and desirable outcome, as we march ineluctably toward the Singularity.

A cult with no respect for history? Those who don't remember the past are doomed to repeat it.

Comment author: AlphaOmega 25 March 2012 02:03:45AM *  -5 points [-]

Excellent! Perhaps you can be BetaOmega ;)

As far as history goes, there are some chapters that might be worth repeating. For example, what possessed the ancient Egyptians, suddenly and out of the stone age, to build huge monuments of great precision which still awe us after 4,500 years? Some crazy pharaonic cult made that possible, and even though it seems totally irrational, I'm glad they did it! So maybe this is what we need today: a cult of the Machine which gives our technology an ideology, and even a religion. Otherwise it all seems rather pointless, doesn't it?

Please don't be too put off by my web site, by the way -- I was in a comic book supervillain phase when I created it, which I'm finally getting over. Nor am I here to troll LessWrong; I think what has been created here is brilliant, and though it's often accused of being cultish, maybe the real problem is that it isn't cultish enough!

Comment author: XiXiDu 24 March 2012 07:02:26PM 2 points [-]

Out of curiosity, what are your current thoughts on the arguments you've laid out here?

Strong enough to justify the existence of an organisation like SIAI. Everything else is a matter of expected utility calculations. Which I am not able to handle. Not given my current education and not given my psyche.

I know that what I am saying is incredibly repugnant to some people here. I see no flaws, but I can't help flinching away from taking all those ideas seriously, although I am currently trying hard. I suppose the post above is a baby step.

This video is pretty much a window into my soul. Do you see how something can be completely rational yet feel ridiculous?

Less Wrong opens up the terrifying vistas of reality that I have tried to flee from since a young age.

The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age.

-- The Call of Cthulhu

I felt compelled to try and see if I could make it all vanish.

Comment author: AlphaOmega 24 March 2012 11:26:37PM *  -7 points [-]

I think I understand how you feel. Here is what I propose, for people who find these vistas of reality terrifying, and who may feel a need to approach them from a more "spiritual" (for lack of a better word) perspective: a true Singularity cult. By that I mean no more pretending that you are a mere rationalist, coolly calculating the probabilities of heaven and hell, but rather embracing the quasi-religious nature of this subject matter in all its glory. I have a pretty clear vision of such a cult, its ideology, activities and structure, and would like to know if anyone here is interested in such a thing. What I have in mind would be rather extreme and terrifying to the profane, and hence is better discussed in a more cult-like environment. For example, from the point of view of the "Cult of Omega", the extinction of humanity is an all but inevitable and desirable outcome, as we march ineluctably toward the Singularity. I believe that if it were done well, such a cult could become the nexus of a powerful new religion which could totally remake the world.

Comment author: AlphaOmega 15 June 2011 09:42:51PM *  1 point [-]

How useful are these surveys of "experts", given how wrong they've been over the years? If you conducted a survey of experts in 1960 asking questions like this, you probably would've gotten a peak probability for human-level AI around 1980, and all kinds of scary scenarios happening long before now. Experts seem to be some of the most biased and overly optimistic people around with respect to AI (and many other technologies). You'd probably get more accurate predictions by taking a survey of taxi drivers!
