All of MichaelG's Comments + Replies

Tim, do we have any idea what is required for uploads? Do we have any idea what is required for AGI? How can you make those comparisons?

If we thin-section and scan a frozen brain, the result is an immense amount of data, but it at least potentially captures everything you need to know about that brain. This is a solvable technological problem. If we understand neurons well enough, we can simulate that mapped brain. Again, that's just a matter of compute power. I'm sure there's a huge distance from a simulated scan to a functional virtual human, but it doesn't stri...

Eliezer, I understand the logic of what you are saying. If AI is an existential threat, then only FriendlyAI can save us. Since any self-improving AI can quickly become unstoppable, FriendlyAI must be developed first and deployed as soon as it is developed. The team that developed it would in fact have a moral imperative to deploy it without risking consultation with anyone else.

I assume you also understand where I'm coming from. Out here in the "normal" world, you sound like a zealot who would destroy the human race in order to save it. Any...

Moshe Gurvich, thanks for the encouragement. I can never decide if my problem is Depression as a disease, or just a reaction to my particular life circumstances.

There are people who recommend purely cognitive approaches to depression, including a lot of self-monitoring. Finding a project that engages you, so that you don't dwell on your depression, is a different approach, although also purely cognitive.

My point on the original post, though, was that you might naively assume that people would be scared of self-modification. But then you see people using Pro...

I'm depressed about the coming end of the human race. Got a solution for that? :-)

Dojan
I'd say that is an accurate feeling. You should not want it to go away by any means other than making the coming end of the human race go away.
MichaelG

Eliezer, I'm aware of nanotech. And I know you think the human race will be obsolete when AI comes along. And I also think that you might be right, and that people like you might have the power to make it so.

But I also believe that if the rest of the human race really thought that was a possibility, you'd be burned at the stake.

Do you have any regard for the opinions of humanity at all? If you were in the position of having an AI in front of you, that you had convinced yourself was friendly, would you let it out of the box without bothering to consult anyone else?

mamert
The term "obsolete" as used here confuses me. It seems to imply a purpose, one that individuals - or humanity - or whatever other "intelligence collective" there may be - could get behind. What might that purpose be? Not survival, is it?

So do you think it's possible to deal with depression by thinking "oh, just ignore that mood. It's just a defective portion of my brain speaking"?

Or is the act of getting an antidepressant med basically acting on the desire to change your own brain?

What does it say about our regard for self and willingness to change our mental structure that so many people take antidepressants? If we were uploaded, would we freely modify our minds, or fear losing ourselves in the process?

I forgot I posted over here the other day, and so I didn't check back. For anyone still reading this thread, here's a bit of an email exchange I had on this subject. I'd really like a "FriendlyAI scenarios" thread.


From the few sentences I read on CEV, you are basically saying “I don’t know what I want or what the human race wants, but here I have a superintelligent AI. Let’s ask it!” This is clever, even if it means the solution is completely unknown at this point. Still, there are problems. I envision this as a two-step process. First, ...

There's that old quote: "Never let your sense of morality keep you from doing what you know is right."

I'd still like an answer to the most basic Friendly AI question: what do you want it to do? Forget the implementation problems for a second, and just give me a scenario where the AI is doing what you want it to do. What does that world look like? Because I don't even know what I want from that future.

Here's a doubt for you: I'm a nerd, I like nerds, I've worked on technology, and I've loved techie projects since I was a kid. Grew up on SF, all of that.

My problem lately is that I can't take Friendly AI arguments seriously. I do think AI is possible, that we will invent it. I do think that at some point in the next few hundred years, it will be game over for the human race. We will be replaced and/or transformed.

I kind of like the human race! And I'm forced to conclude that a human race without that tiny fraction of nerds could last a good long tim...

tlhonmey
I would say that the non-nerds can't save the human race either, though. Without nerds, our population never exceeds what can be supported by hunting, gathering, and maybe some primitive agriculture, which isn't much. We'd be constantly hovering just short of being wiped out by some global cataclysm, and there's some evidence that we've narrowly missed just that at least once in our history. If we want to survive long-term, we need to get off this rock, and then we need to find at least one other solar system. After that we can take a breather while we think about finding another galaxy to colonize.

Yes, we might destroy ourselves with new technology. But we're definitely dead without it. And if you look at how many new technologies have been denounced as harbingers of the end of the world versus how many times the world has actually ended, I'd have to think that gut feelings about which technologies are the most dangerous and how badly we'll handle them are probably wrong more often than they're right.
JohnH
Have you ever heard of the term hubris? If you can't imagine ways in which the human race could be destroyed by non-nerds, that shows a lack of imagination, not that it cannot be done. Also, nerds and non-nerds aren't actually different species; people who don't have a natural aptitude for a subject are still capable of learning it. If the nerds all moved to nerdtopia, other people would study what material there was on the subject and attempt to carry on. If that is not possible, then you have applied the term "nerd" so broadly that it covers the majority of people, and all that would be left are people incapable of fully taking care of themselves without some form of outside assistance, who would thus destroy the human race by sheer ineptitude at basic survival skills.