vi21maobk9vp comments on Do people think Less Wrong rationality is parochial? - Less Wrong

Post author: lukeprog 28 April 2012 04:18AM




Comment author: vi21maobk9vp 05 May 2012 05:20:50PM 2 points

As for computation theory, he didn't skip all the fundamentals, only parts of some of them. There are some red flags, though.

By the way, I wonder where the "So you want to become Seed AI programmer" article from http://acceleratingfuture.com/wiki (long broken) can be found. It would be useful to have it available, or to have it publicly disclaimed by Eliezer Yudkowsky: it helped me decide whether I saw any value in SIAI's plans.

Comment author: private_messaging 05 May 2012 07:36:08PM * 1 point

There's an awful lot of fundamentals, though... I replied to a comment of his very recently. It's not a question of what he skipped; it's a question of the few things he didn't skip. If you have 100 outputs with 10 possible values each, you get 10^100 possible actions (and that's not even large for innovation). There's nothing mysterious about being unable to implement something that deals with that naively. And if you are to use better methods than brute-force maximizing, well, some functions are easier to find the maximums of analytically; nothing mysterious about that either. Ultimately, you don't find successful autodidacts among people who had the opportunity to obtain an education the normal way at a good university.
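The combinatorial point above can be sketched in a few lines of Python. The 100-output, 10-value setup comes from the comment itself; the `brute_force_argmax` helper is a hypothetical illustration of the naive approach, runnable only on a toy instance:

```python
import itertools

# Size of the action space for 100 outputs, each taking one of 10 values.
n_outputs = 100
n_values = 10
action_space = n_values ** n_outputs  # 10^100 candidate actions

def brute_force_argmax(utility, n_outputs, n_values):
    """Exhaustively score every assignment of values to outputs.
    Infeasible when n_values ** n_outputs is astronomically large;
    shown here only for a tiny toy case."""
    return max(itertools.product(range(n_values), repeat=n_outputs),
               key=utility)

# Toy case small enough to enumerate (3 outputs, 4 values = 64 actions).
# With sum() as the utility, the best action picks the largest value everywhere.
best = brute_force_argmax(sum, n_outputs=3, n_values=4)
print(best)  # (3, 3, 3)
```

The gap between the 64-action toy case and the 10^100-action real case is exactly why "just maximize the function" hides all the difficulty in how the maximization is done.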

Comment author: vi21maobk9vp 06 May 2012 04:37:24AM 2 points

At this point you are being somewhat mean. It does look like honest sloppy writing on his part. With a minimum of goodwill I can accept that he meant "effectively maximizing the expectation of". Also, it would still be somewhat interesting if only precisely one function could be maximized; at least some local value manipulations could be possible, after all. So it is not that obvious.

About autodidacts: the problem here is that even getting an education in some reputed place can still leave you with a lot of skipped fundamentals.

Comment author: private_messaging 06 May 2012 06:29:33AM * 1 point

If he means "effectively maximizing the expectation of", then there is nothing mysterious about different levels of "effectively" being available for different functions, and his rhetorical point with "mysteriously" falls apart.

I agree that education also allows for skipped fundamentals. Self-education can be good if one has good external critique, such as learning to program and having the computer tell you when you're wrong. Blogging, not so much. Internal critique is possible but rarely works, and doesn't work at all for things that are in the slightest bit non-rigorous.