Eliezer_Yudkowsky comments on Call for new SIAI Visiting Fellows, on a rolling basis - Less Wrong

29 Post author: AnnaSalamon 01 December 2009 01:42AM


Comment author: Liron 01 December 2009 10:42:14PM 2 points [-]

This was written circa 2002 when Java was at least worthy of consideration compared to the other options out there.

Comment author: Eliezer_Yudkowsky 01 December 2009 10:50:23PM 8 points [-]

Yup. The logic at the time went something like, "I want something that will be reasonably fast, scale to lots of processors, run in a tight sandbox, and have been thoroughly debugged with enterprise-scale muscle behind it, and which above all is not C++, and in a few years (note: HAH!) when we start coding, Java will probably be it." There were lots of better-designed languages out there, but they didn't have the promise of enterprise-scale muscle behind their implementation of things like parallelism.

Also at that time, I was thinking in terms of a much larger eventual codebase, and was much more desperate to use something that wasn't C++. Today I would say that if you can write AI at all, you can write the code parts in C, because AI is not a coding problem.

Mostly in that era there weren't any good choices, so far as I knew then. Ben Goertzel, who was trying to scale a large AI codebase, was working in a mix of C/C++ and a custom language running on top of C/C++ (I forget which), which I think he had transitioned to from either Java or something else, because nothing else was fast enough or handled parallelism correctly. Lisp, he said at that time, would have been way too slow.

Comment author: kpreid 01 December 2009 11:15:35PM 6 points [-]

Today I would say that if you can write AI at all, you can write the code parts in C, because AI is not a coding problem.

I'd rather the AI have a very low probability of overwriting its supergoal by way of a buffer overflow.

Comment author: Nick_Tarleton 02 December 2009 04:14:26AM 6 points [-]

Proving no buffer overflows would be nothing next to the other formal verification you'd be doing (I hope).

Comment author: DanArmak 02 December 2009 02:12:57AM 2 points [-]

I fully agree that C++ is much, much worse than Java. The wonder is that people still use it for major new projects today. At least there are better options than Java available now (I don't know the state of the art in 2002 that well).

If you got together an "above-genius-level" programming team, they could design and implement their own language while they were waiting for your FAI theory. Probably they would do it anyway on their own initiative. Programmers build languages all the time - a majority of today's popular languages started as a master programmer's free-time hobby. (Tellingly, Java is among the few that didn't.)

A custom language built and maintained by a star team would be at least as good as any existing general-purpose one, because you would borrow design you liked and because programming language design is a relatively well explored area (incl. such things as compiler design). And you could fit the design to the FAI project's requirements: choosing a pre-existing language means finding one that happens to match your requirements.

Incidentally, all the good things about Java - including the parallelism support - are actually properties of the JVM, not of Java the language; they're best used from other languages that compile to the JVM. If you had said "we'll probably run on the JVM", that would have sounded much better than "we'll probably write in Java". Then you'd only have to contend with the CLR and LLVM fans :-)

Comment author: Eliezer_Yudkowsky 02 December 2009 04:27:23AM 4 points [-]

I don't think it will mostly be a coding problem. I think there'll be some algorithms, potentially quite complicated ones, that one will wish to implement at high speed, preferably with reproducible results (even in the face of multithreading and locks and such). And there will be a problem of reflecting on that code, and having the AI prove things about that code. But mostly, I suspect that most of the human-shaped content of the AI will not be low-level code.

Comment author: Eliezer_Yudkowsky 02 December 2009 09:17:08AM 0 points [-]

How's the JVM on concurrency these days? My loose impression was that it wasn't actually all that hot.

Comment author: mattnewport 02 December 2009 09:51:12AM *  2 points [-]

I think it's pretty fair to say that no language or runtime is that great on concurrency today. Coming up with a better way to program for many-core machines is probably the major area of research in language design today and there doesn't appear to be a consensus on the best approach yet.

I think a case could be made that the best problem a genius-level programmer could devote themselves to right now is how to effectively program for many-core architectures.

Comment author: Henrik_Jonsson 02 December 2009 07:50:39PM *  0 points [-]

My impression is that the JVM is worse at concurrency than every other approach that's been tried so far.

Haskell and other functional programming languages have many promising ideas but aren't widely used in industry, AFAIK.

This presentation gives a good short overview of the current state of concurrency approaches.

Comment author: anonym 02 December 2009 08:29:48AM 0 points [-]

Speaking of things that aren't Java but run on the JVM, Scala is one such (really nice) language. It's designed and implemented by one of the people behind the javac compiler, Martin Odersky. The combination of excellent support for concurrency and functional programming would make it my language of choice for anything that I would have used Java for previously, and it seems like it would be worth considering for AI programming as well.

Comment author: komponisto 03 December 2009 08:54:34PM 1 point [-]

Today I would say that if you can write AI at all, you can write the code parts in C, because AI is not a coding problem.

Exactly -- which is why the sentence sounded so odd.

Comment author: Eliezer_Yudkowsky 03 December 2009 09:18:32PM 7 points [-]

Well, yes, Yudkowsky-2002 is supposed to sound odd to a modern LW reader.