Bugmaster comments on Thoughts on the Singularity Institute (SI) - Less Wrong

Post author: HoldenKarnofsky | 11 May 2012 04:31AM | 256 points




Comment author: Bugmaster 10 May 2012 08:18:55PM 0 points

I also find it likely that certain practical problems would be prohibitively difficult (if not outright impossible) to solve without an AGI of some sort. Fluent machine translation seems to be one of these problems, for example.

Comment author: [deleted] 13 May 2012 09:38:55AM 3 points

This belief is mainstream enough for Wikipedia to have an article on AI-complete.

Comment author: Alsadius 13 May 2012 03:57:35AM * 2 points

Given some of the translation debates I've heard, I'm not convinced it would be possible even with AGI. You can't give a clear translation of a vague original, to name the most obvious problem.

Comment author: NancyLebovitz 13 May 2012 04:35:51AM 1 point

Is matching the vagueness of the original a reasonable goal?

Comment author: Alsadius 15 May 2012 12:56:29AM 1 point

True, but good luck getting folks to agree on whether you'd done so.

Comment author: [deleted] 13 May 2012 09:36:35AM * 1 point

(I'm taking reasonable to mean ‘one which you would want to achieve if it were possible’.) Yes. You don't want to introduce false precision.

Comment author: dlthomas 15 May 2012 12:59:21AM 0 points

One complication here is that you ideally want it to be vague in the same ways the original was vague; I am not convinced this is always possible while still having the results feel natural/idiomatic.

Comment author: Bugmaster 15 May 2012 01:01:37AM 1 point

IMO it would be enough to translate the original text in such a fashion that some large proportion (say, 90%) of humans who are fluent in both languages would look at both texts and say, "meh... close enough".
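The acceptance criterion described here can be expressed as a simple threshold test. This is only an illustrative sketch of the idea, not anything proposed in the thread: the function name and the boolean representation of judges' verdicts are invented for the example.

```python
def translation_accepted(verdicts, threshold=0.9):
    """Return True if at least `threshold` of the bilingual judges
    said the translation was 'close enough'.

    `verdicts` is a list of booleans, one per judge; in practice these
    would come from human raters fluent in both languages.
    """
    if not verdicts:
        raise ValueError("need at least one judge")
    return sum(verdicts) / len(verdicts) >= threshold

# Example: 9 of 10 judges accept, meeting the 90% bar.
print(translation_accepted([True] * 9 + [False]))  # True
```

The 90% figure is the "large proportion" suggested in the comment; any such threshold is a judgment call about how much disagreement among fluent judges is tolerable.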

Comment author: dlthomas 15 May 2012 02:23:47AM 0 points

My point was just that there are a whole lot of little issues pulling in various directions if you're striving for the ideal. What is or isn't close enough can depend very much on context. Certainly, for any particular purpose, something less than that will be acceptable; how gracefully it degrades no doubt depends on context, and likely won't be uniform across various types of difference.

Comment author: Bugmaster 15 May 2012 02:26:14AM 1 point

Agreed, but my point was that I'd settle for an AI that can translate texts as well as a human could (though hopefully a lot faster). You seem to be thinking in terms of an AI that can do this much better than a human could, and while that is a worthy goal, it's not what I had in mind.