"If computing power doubles every eighteen months, what happens when computers are doing the research?"
And this sounds horrifyingly naive to my present ears.
TMOL was freaking brilliant. This post was awesome. It blew me away. Can't wait to see the follow-up.
I know this whole comment was kind of vacuous, but yeah. Wow. I don't even have to tell you that you make more sense than virtually everyone on the planet.
"In real life, I'd expect someone to brute-force an unFriendly AI on one of those super-ultimate-nanocomputers, followed in short order by the end of the world."
indeed.
"You should at least explain why you think his theory wrong"
Please don't ask that... I've heard the parables too many times to count, about how you can't build an AI by putting in individual pieces of knowledge; you need something that generates those pieces of knowledge itself, and so on.
CYC has become a clichéd absurdity...
Yeah, I mean... who needs Friendly AI anyway?
No big deal, we can just put that off until later, or better yet- indefinitely!
There is no need for a single Friendly AI researcher AT ALL!
Right? Am I reading this wrong?
"But a vote for a losing candidate is not "thrown away"; it sends a message to mainstream candidates that you vote"
Ah, but one of my sayings is this:
"Never throw yourself off a cliff to send a message."
NORMALLY this wouldn't be an important phrase; however, in this election it is very applicable. Looks like we are going off the cliff anyway.
It takes a very healthy sense of humor I suppose :)
I was trying to figure out why you are having dialogues with complete fools, but apparently this guy is famous or something.
huh.
How do you suffer morons like this...?
" " Mathew C: "And the biggest threat, of course, is the truth that the self is not fundamentally real. When that is clearly seen, the gig is up."
Spot on. That is by far the biggest impasse I have faced anytime I try to convey a meta-ethics denying the very existence of the "singularity of self" in favor of the self of agency over increasing context. I usually to downplay this aspect until after someone has expressed a practical level of interest, but it's right there out front for those who can see it. "
I think you are misinterpreting things here. I would call it a false dichotomy."
What I mean is, just because there is no "ontological" self doesn't mean there isn't a really complex "self-like" process that is highly dependent on, correlated with, and based in a single, individual brain - a process that simply is not isomorphic to a group of such processes, especially with respect to the Singularity.
http://yudkowsky.net/obsolete/tmol-faq.html