Wei_Dai comments on The Level Above Mine - Less Wrong

Post author: Eliezer_Yudkowsky 26 September 2008 09:18AM


Comment author: Eliezer_Yudkowsky 26 September 2008 05:54:06PM 9 points

Manuel, "enroll in a grad program for AI" != "you're smart, you should go to college".

Kragen, the short answer is, "It's easy to talk about the importance of effort if you happen to be Hamming." If you can make the ante for the high-stakes table, then you can talk about how little the ante counts for, and the importance of playing your cards well. But if you can't make the ante...

Robin, it's not blind faith in math or math for the sake of impressiveness, but a specific sense that the specific next problems I have to solve will require more math than I've used up to this point. Not Andrew J. Wiles math, but Jaynes doesn't use Wiles-math either. I quite share your prejudice against math for the sake of looking impressive, because that gets you the wrong math. (Formality isn't about Precision?)

Ken, it's exclusively my work that gives me the motivation to keep working on something for years, but things like pride can give me the motivation to keep working on something for the next minute. I'll take whatever sources of motivation I can get (er, that aren't outright evil, of course).

Douglas, yes, my father changed at 40. But one of my primary sources of hope is that people have been known to do basic research later than this if they changed fields late in life, which suggests that it actually can be a matter of approach/outlook/methodology and avoiding serving on prestigious committees.

Retired, I don't understand the apparent contradiction you see. I participated in the Midwest Talent Search at a young age (not "Northwestern" anything; maybe you're confusing it with Northwestern University?) and scored second-best for my grade category, but at that point I'd skipped a grade. But I think I can recall hearing about someone who got higher SAT scores than mine, at age nine. That would be decisive, if the SAT were a perfect noiseless measurement of ability to work on AI.

Vassar: You see, for many, many people it is possible to choose a weighting scheme among a dozen or so factors that contribute to intellectual work such that they are the best.

Yes, this is the well-known phenomenon where asking someone "How dumb are you?" produces a different answer than "How smart are you?" because they recall a different kind of evidence. But the question I'm trying to answer is "How much potential do you have to solve the remaining FAI problems you know about?" As I said to Robin, I do think this is going to involve taking a step up in math level.

To all commenters who observed that I don't seem to stand out from 10 other smart people they know, either you didn't comprehend the entirety of today's post, or you have very high confidence that you occupy the highest possible rank of human ability.

Comment author: Wei_Dai 06 March 2011 09:33:31AM 17 points

Robin, it's not blind faith in math or math for the sake of impressiveness, but a specific sense that the specific next problems I have to solve will require more math than I've used up to this point.

I'm curious if this is still your sense, and if so, what kind of math are you talking about?

My sense is that currently the main problems in FAI are philosophical. Skill in math is obviously very useful, but secondary to skill in philosophy, because most of the time it's still "I have no idea how to approach this problem" instead of "Oh, if I can just solve this math problem, everything will be clear".

...or I'm strictly dumber than Conway, dominated by him along all dimensions. Maybe, if I could find a young proto-Conway and tell them the basics, they would blaze right past me, solve the problems that have weighed on me for years, and zip off to places I can't follow.

Marcello observed, "In terms of philosophical intuition, you are head and shoulders above Conway." Making progress in FAI theory seems to require a combination of rationality, good philosophical intuition, math talent, motivation, and prerequisite background knowledge. (Am I leaving out anything?) Of these, good philosophical intuition is perhaps the rarest, in large part because we don't know how to teach it (or screen for it at a young age). Is this a problem you've considered?