PhilGoetz comments on Ben Goertzel: The Singularity Institute's Scary Idea (and Why I Don't Buy It) - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (432)
This post doesn't show up under "NEW", nor does it show up under "Recent Posts".
ADDED: Never mind. I forgot I had "disliked" it, and had "do not show an article once I've disliked it" set.
(I disliked it because I find it kind of shocking that Ben, who's very smart, and who I'm pretty sure has read the things I would refer him to on the subject, would say that the Scary Idea hasn't been laid out sufficiently. Maybe some people need every detail spelled out for them, but Ben isn't one of them. Also, he is committing the elementary error of not considering expected value.
ADDED: Now that I've read Ben's entire post, I upvoted rather than downvoted this post. Ben was not committing the error of not considering expected value, so much as responding to many SIAI-influenced people who are not considering expected value. And I agree with most of what Ben says. I would add that Eliezer's plan to construct something that will provably follow some course of action - any course of action - chosen by hairless primates, is likely to be worse in the long run than a hard-takeoff AI that kills all humans almost immediately. Explaining what I mean by "worse" is problematic; but no more problematic than explaining why I should care about propagating human values.)
I also disagree about what the Scary Idea is - to me, the idea that the AI will choose to keep humans around for all eternity is scarier than that it will not. But that is something Eliezer either disagrees with, or has deliberately made obscure.
Wouldn't it make sense to keep some humans around for all eternity - in the history simul-books? That seems reasonable - and not especially scary.
Sure. Tiling the universe largely with humans is the strong scary idea. Locking in human values for the rest of the universe is the weak scary idea. Unless the first doesn't imply the second, in which case I don't know which is more scary.
It does now for me. Strange.
Oops. My mistake. It's a setting I had that I forgot about.
It doesn't?
It's off the front page of NEW/Recent Posts, as there have been more than ten other posts since it was posted, but it's still there.
Weird. It's there for me.