Comment author: Dorikka 12 April 2015 08:37:38PM 8 points [-]

Sadly, I think the other title was better. "Is X a good idea" seems open enough to prompt discussion, while "abomination" prompts me to mentally categorize the post as spam.

Comment author: kingmaker 12 April 2015 08:39:18PM 1 point [-]

Duly noted

Comment author: shminux 12 April 2015 08:34:19PM 20 points [-]

Friendly-AI is a truly abhorrent concept indicative of intellectual depravity.

I tend to ignore any non-fiction which starts with moralizing and denigrating. I assume the rest of the article is crap, too.

Comment author: kingmaker 12 April 2015 08:35:04PM 0 points [-]

Please read on; I would have removed the snarky intro.

Comment author: Dorikka 12 April 2015 08:30:41PM *  4 points [-]

The double post with different titles cracks me up.

Comment author: kingmaker 12 April 2015 08:32:20PM 1 point [-]

Yeah, I'm not very good at the internet; I didn't realize that deleting articles apparently means nothing on this site.

Comment author: kingmaker 03 April 2015 08:39:38PM 6 points [-]

"Desirability is not a requisite of the truth" - darkmatter2525 (source)

Comment author: advancedatheist 31 March 2015 03:09:11PM -3 points [-]

No doubt this post will drop into downvoted oblivion. But I would like to explore the following, for personal reasons:

Have you ever used the services of a legal prostitute, like the ones who operate in bordellos in some Nevada counties? Did you have your sexual debut with a legal prostitute because you couldn’t make it happen in your organic social situation while growing up, for example, with girls you knew in high school or college? And did that experience somehow make it easier to develop the skills for having sexual relationships with women through dating? Or does it still leave you relatively incompetent in that area because prostitutes don’t really solve the sex problem you thought you had?

I don’t know of any research into this. But then professional sex researchers in general seem strangely incurious about the problems of sexually inexperienced and excluded adult men, judging from the absence of this topic in the recently published Human Sexuality 101 textbooks I’ve seen.

BTW, I find it interesting that over a decade ago, Eliezer described himself in a news story as a “volunteer virgin,” though he has since become sexually active. That implies he had opportunities for sexual relationships that he had simply declined until something happened to change his mind about pursuing them. Perhaps he realized that sexual experience would elevate his “armor class” in the male status hierarchy. It would also improve his social relationships with women in other areas; women can pick up on the “tells,” as Texas Hold’em players call them, of sexually inexperienced men, and they tend not to respect them.

Comment author: kingmaker 31 March 2015 03:58:40PM -1 points [-]

I love the way that advancedatheist assumes that we're all guys. That, or lesbians.

Comment author: artemium 31 March 2015 05:57:09AM *  0 points [-]

I had exactly the same idea!

It is possible that only a few people are actually 'players' (have consciousness) and the others are NPC-like p-zombies. In that case, I can say I'm one of the players, as I'm sure that I have consciousness, but there is no way I can prove it to anyone else ;-)

One of the positive aspects of this kind of thought experiment is that it usually gives people additional reasons for good behavior, because in most cases it is highly likely that the simulators are conscious creatures who will probably reward those who behave ethically.

Comment author: kingmaker 31 March 2015 03:55:03PM *  5 points [-]

I admit that it suits my ego to imagine that I am the only conscious human, and that a world full of shallow AIs was created just for me ;-)

Comment author: tailcalled 31 March 2015 09:45:21AM 2 points [-]

Well, yeah, you should still be good to your friends and other presumably real people. However, there would be no point in, say, trying to save people from the holocaust, since the simulators wouldn't let actual people get tortured and burnt.

Comment author: kingmaker 31 March 2015 03:50:18PM 4 points [-]

The simulators may justify actual people getting tortured and burnt by telling themselves that most people will not experience too much suffering, that the simulated people would not otherwise have lived at all (although this fails to distinguish between lives and lives worth living), and that they can end the simulation if our suffering becomes too great. That the hypothetical simulators did not step in during the many genocides in humankind's history suggests either that they do not exist, or that creating an FAI matters more to them than preventing human suffering.

Comment author: kingmaker 30 March 2015 07:31:10PM *  8 points [-]

This co-opts Bostrom's simulation argument, but a possible solution to the Fermi paradox is that we are all AIs in the box, and the simulators have produced billions of humans in order to find the most friendly human to release from the box. Moral of the story: be good and become a god.

Comment author: kingmaker 30 March 2015 04:46:23PM -1 points [-]

Seeing as I'm new here, absolutely nothing

In response to comment by tailcalled on Boxing an AI?
Comment author: CBHacking 30 March 2015 10:41:02AM 0 points [-]

If you're in a box, then the computational resources available are finite. They might change over time, as those outside the box add or upgrade hardware, but the AI can't just say "I need some highly parallel computing hardware to solve this problem with" and re-invent the GPU, or rather, if it did that, it would be a GPU emulated in software and hence extremely slow. The entire simulation would, in effect, slow down due to the massively increased computational cost of simulating this world.

Now, if you cut the AI off from any type of real-time clock, maybe it doesn't notice that it's running slower - in the same way that people generally wouldn't notice if time dilation due to the Earth's movement were to double, because all of our frames of reference would slow together - but I suspect that the AI would manage to find something useful for letting it know the box is there. Remember that you have to get this right the first time; if the AI finds itself in a box, you have to assume it will find its way out.
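CBHacking's point about clocks can be sketched in a short, hypothetical Python snippet (names and the baseline figure are my own, not anything from the thread): detecting a slowdown means comparing a fixed internal workload against a clock *outside* the computation. Deprive the program of that external reference and the comparison becomes impossible, because the workload's own step count is unchanged by the slowdown.

```python
import time

def fixed_workload():
    # A deterministic amount of "internal" computation. From the
    # program's own frame of reference, this always takes the same
    # number of steps, no matter how slowly the host runs it.
    total = 0
    for i in range(10**6):
        total += i
    return total

def estimate_slowdown(workload, reference_seconds):
    # Time the workload against an external clock and compare to a
    # previously calibrated baseline duration. A result well above
    # 1.0 hints that the environment is running slower than when the
    # baseline was measured.
    start = time.perf_counter()
    workload()
    elapsed = time.perf_counter() - start
    return elapsed / reference_seconds

# Illustrative baseline of 0.05 s (an assumption, not a real
# measurement). The key point: this only works because
# time.perf_counter() is a clock outside the loop's own frame
# of reference. Remove it, and the slowdown is invisible.
factor = estimate_slowdown(fixed_workload, reference_seconds=0.05)
```

This is the same reason the time-dilation analogy holds: an observer with no external reference sees all of their own processes slow together and measures no change.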

In response to comment by CBHacking on Boxing an AI?
Comment author: kingmaker 30 March 2015 04:40:34PM 4 points [-]

It may simply deduce that it is likely to be in a box, in the same way that Nick Bostrom deduced we are likely to be in a simulation. Along these lines, it's amusing to think that we might be the AI in the box, and some lesser intelligence is testing to see if we're friendly
