
DanielLC comments on A few thoughts on a Friendly AGI (safe vs friendly, other minds problem, ETs and more) - Less Wrong Discussion

Post author: the-citizen 19 October 2014 07:59AM


Comments (44)


Comment author: DanielLC 20 October 2014 07:44:48PM 0 points

You don't need accuracy down to the subatomic level. You just need a human. The same human would be nice, since it would mean the FAI managed to keep all of those people from dying, but unless it's programmed to only value currently living people, that's not a big deal.

Also, you make it sound as though we won't develop that storage capability until well after we develop the AGI. It's the AGI that will be developing technology. What we can do just before we build it is not a good indicator of what it can do.

Comment author: the-citizen 21 October 2014 07:10:12AM 0 points

Because neural pathways and other structures of the brain are quite small, I think you'd need extremely high resolution. However, I take your point that a breeding population would be enough to at least keep the species going, so I acknowledge that. Still, I'm hoping we can make something that does more than that.

Your second point depends on how small the AGI can make reliable storage tech, I guess.

Perhaps this whole point is moot, because it's unlikely an intelligence explosion will take long enough for there to be time for other researchers to construct an alternative AGI.

Comment author: DanielLC 21 October 2014 05:18:11PM 0 points

> Still, I'm hoping we can make something that does something in addition to that.

Their children will be fine. You don't even need a breeding population. You just need to know how to make an egg, a sperm, and an artificial uterus.

> Perhaps this whole point is moot, because it's unlikely an intelligence explosion will take long enough for there to be time for other researchers to construct an alternative AGI.

It might encounter another AGI as it spreads, although I don't think this point will matter much in the ensuing war (or treaty, if they agree to one).