All of sophiesdad's Comments + Replies

@Eliezer: Sophiesdad, you should be aware that I'm not likely to take your advice, or even take it seriously. You may as well stop wasting the effort.

Noted. No more posts from me.

@sophiesdad: Autodidacticism may be a superior approach for the education of certain individuals, but it allows the individual to avoid one element crucial to production: discipline.

@Pyramid Head: Eliezer (who, in my opinion, doesn't lack discipline)

My comment about discipline was not meant to be inflammatory, nor even especially critical. Rather, it was meant to be descriptive of one aspect of autodidacticism. By comparison, suppose that Mr. Yudkowsky were working toward his PhD at (say) the University of Great Computer Scientists. His chosen topic for his dissertation...

@Pyramid Head: I don't see how he can hope to save the world by writing blog posts...

Ditto. Autodidacticism may be a superior approach for the education of certain individuals, but it allows the individual to avoid one element crucial to production: discipline. Mr. Yudkowsky's approach, his resistance to working with others, and his view that it is his job to save the world and that no one else can do it, all suggest an element of savantism. Hardly a quality one would want in a superhuman intelligence.

I, too, enjoy his writing, but the fact that he discove...

@Eliezer Yudkowsky said: Spindizzy and sophiesdad, I've spent quite a while ramming headlong into the problem of preventing the end of the world. Doing things the obvious way has a great deal to be said for it; but it's been slow going, and some help would be nice. Research help, in particular, seems to me to probably require someone to read all this stuff at the age of 15 and then study on their own for 7 years after that, so I figured I'd better get started on the writing now.

I have posted this before without receiving an answer, but I'll try again. You are working a...

I second spindizzy, yet hope that something major is happening.

I don't think Deep Blue "knew" that it was trying to beat Garry Kasparov in the game of chess. It was programmed to search through the possible moves and evaluate the outcome of each in terms of eventually capturing Kasparov's king. The human brain is elegant, but it's not fast, and unquestionably no human could have evaluated all those possible moves within the time limit. Deep Blue is quaint compared to the Universal Machines of the near future. David Deutsch claims that quantum computers will be able to factor numbers that would require...
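For readers curious what that kind of move evaluation looks like, here is a minimal, runnable sketch of minimax game-tree search, the general family of technique behind engines like Deep Blue. To keep it self-contained it plays the toy game of Nim rather than chess, and it is only a sketch: the real Deep Blue added alpha-beta pruning, custom hardware, and a heuristic evaluation at a fixed search depth rather than searching games to the end.

```python
# A minimal sketch of minimax game-tree search -- the general idea behind
# engines like Deep Blue -- demonstrated on the toy game of Nim so that it
# actually runs: players alternately take 1-3 stones from a pile, and
# whoever takes the last stone wins. (Illustrative toy example only; this
# is not Deep Blue's actual code.)

def minimax(stones, maximizing):
    """Score a position: +1 if the maximizing player can force a win."""
    if stones == 0:
        # The previous player took the last stone and won,
        # so the side now to move has lost.
        return -1 if maximizing else 1
    moves = range(1, min(3, stones) + 1)
    scores = [minimax(stones - take, not maximizing) for take in moves]
    return max(scores) if maximizing else min(scores)

def best_move(stones):
    """Pick the take whose resulting position scores best for us."""
    return max(range(1, min(3, stones) + 1),
               key=lambda take: minimax(stones - take, maximizing=False))

print(best_move(10))  # -> 2: leaving 8 stones puts the opponent in a lost position
```

The sketch makes the point above concrete: the machine never "knows" it is trying to win; it mechanically assigns each candidate move the score of the position it leads to, assuming both sides play as well as possible.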

sophiesdad: As I understand it, it is not possible for a human to design a machine that is "smarter-than-human", by definition.

Caledonian: Your understanding is mistaken.

Mr. Caledonian, I'm going to stick by my original statement. Smarter-than-human intelligence will be designed by machines with "around human" intelligence running recursive self-improvement code. It will not start with a human-designed superhuman intelligence. How could a human know what that is? That's why I'm not sure that all the years of thought going into the meaning...

Mr. Yudkowsky says: But does that question really make much difference in day-to-day moral reasoning, if you're not trying to build a Friendly AI?

Here is a quote that I think speaks to that:

The Singularity is the technological creation of smarter-than-human intelligence. Ask what “smarter-than-human” really means. And as the basic definition of the Singularity points out, this is exactly the point at which our ability to extrapolate breaks down. We don’t know because we’re not that smart. -- Eliezer Yudkowsky

As I understand it, it is not possible for a human...

Unknown wrote:
As I've stated before, we are all morally obliged to prevent Eliezer from programming an AI.

As Bayesians educated by Mr. Yudkowsky himself, we all know the probability of such an event is quite low. In 2004, in the most moving and intelligent eulogy I have ever read, Mr. Yudkowsky stated: "When Michael Wilson heard the news, he said: 'We shall have to work faster.' Any similar condolences are welcome. Other condolences are not." Somewhere, some person or group is working faster, but at the Singularity Institute, all the...

Interestingly, there are categories of habitual liars (as distinguished from pathological liars) who have no fear of common knowledge whatsoever. They lie in preference to telling the truth and, if caught, suffer no embarrassment or remorse. I once encountered such a person telling a story about a Wildlife Officer using an AK-47 assault rifle to kill a grizzly bear in West Virginia. When informed that no wild grizzly has ever been reported east of the continental divide, and that a state agency certainly would not issue a Russian weapon to its officers...