
paper-machine comments on The Singularity Institute's Arrogance Problem - Less Wrong Discussion

63 Post author: lukeprog 18 January 2012 10:30PM



Comment author: [deleted] 19 January 2012 01:15:49AM 2 points [-]

Since EY claims to be doing math, he should be posting at least a couple of papers a year on arxiv.org (cs.DM or similar), properly referenced and formatted to conform with the prevailing standard (probably LaTeXed), and submitting them to conference proceedings and/or peer-reviewed journals. Anything less would be less than rational.

I agree, wholeheartedly, of course -- except the last sentence. There's a not very good argument that the opportunity cost of EY learning LaTeX is greater than the opportunity cost of having others edit afterward. There's also a not very good argument that EY doesn't lose terribly much from his lack of academic signalling credentials. Together these combine to a weak argument that the current course is in line with what EY wants, or perhaps would want if he knew all the relevant details.

Comment author: Maelin 19 January 2012 01:30:18AM 28 points [-]

For someone who knows how to program, learning LaTeX to a perfectly serviceable level should take at most one day's worth of effort, and most of it would likely be spread diffusely throughout the process of using it, with maybe a couple of hours' dedicated introduction to begin with.

It is quite possible that, considering the effort required to find an editor and organise for that editor to edit an entire paper into LaTeX, compared with the effort required to write the paper in LaTeX in the first place, the additional effort cost of learning LaTeX may in fact pay for itself after less than one whole paper. It's very unlikely that it would take more than two.

Comment author: dbaupp 19 January 2012 02:12:35AM 5 points [-]

It is quite possible that, considering the effort required to find an editor and organise for that editor to edit an entire paper into LaTeX, compared with the effort required to write the paper in LaTeX in the first place, the additional effort cost of learning LaTeX may in fact pay for itself after less than one whole paper

And one gets all the benefits of a text document while writing it (grep-able, version control, etc.).

(It should be noted that if one is writing LaTeX, it is much easier with a LaTeX specific editor (or one with an advanced LaTeX mode))

Comment author: lukeprog 19 January 2012 01:28:56AM 3 points [-]

I'm not at all confident that writing (or collaborating on) academic papers is the most x-risk-reducing way for Eliezer to spend his time.

Comment author: Bugmaster 21 January 2012 03:44:12AM 7 points [-]

Speaking of arrogance and communication skills: your comment sounds very similar to, "Since Eliezer is always right about everything, there's no need for him to waste time on seeking validation from the unwashed academic masses, who likely won't comprehend his profound ideas anyway". Yes, I am fully aware that this is not what you meant, but this is what it sounds like to me.

Comment author: lukeprog 21 January 2012 03:48:48AM 1 point [-]

Interesting. That is a long way from what I meant. I just meant that there are many, many ways to reduce x-risk, and it's not at all clear that writing papers is the optimal way to do so, and it's even less clear that having Eliezer write papers is so.

Comment author: Bugmaster 21 January 2012 03:58:25AM 4 points [-]

Yes, I understood what you meant; my comment was about style, not substance.

Most people (myself included, to some non-trivial degree) view publication in academic journals as a very strong test of one's ideas. Once you publish your paper (or so the belief goes), the best scholars in the field will do their best to pick it apart, looking for weaknesses that you might have missed. Until that happens, you can't really be sure whether your ideas are correct.

Thus, by saying "it would be a waste of Eliezer's time to publish papers", what you appear to be saying is, "we already know that Eliezer is right about everything". And by combining this statement with saying that Eliezer's time is very valuable because he's reducing x-risk, you appear to be saying that either the other academics don't care about x-risk (in which case they're clearly ignorant or stupid), or that they would be unable to recognize Eliezer's x-risk-reducing ideas as being correct. Hence, my comment above.

Again, I am merely commenting on the appearance of your post, as it could be perceived by someone with an "outside view". I realize that you did not mean to imply these things.

Comment author: wedrifid 21 January 2012 04:34:27AM *  2 points [-]

Thus, by saying "it would be a waste of Eliezer's time to publish papers", what you appear to be saying is, "we already know that Eliezer is right about everything".

That really isn't what Luke appears to be saying. It would be fairer to say "a particularly aggressive reader could twist this so that it means..."

It may sometimes be worth optimising speech such that it is hard to even willfully misinterpret what you say (or interpret based on an already particularly high prior for 'statement will be arrogant') but this is a different consideration to trying not to (unintentionally) appear arrogant to a neutral audience.

Comment author: JoshuaZ 27 January 2012 08:27:49PM 3 points [-]

That really isn't what Luke appears to be saying. It would be fairer to say "a particularly aggressive reader could twist this so that it means..."

For what it is worth, I had an almost identical reaction when reading the statement.

Comment author: Bugmaster 21 January 2012 04:48:39AM 0 points [-]

Fair enough; it's quite possible that my interpretation was too aggressive.

Comment author: wedrifid 21 January 2012 04:51:57AM 0 points [-]

It's the right place for erring on the side of aggressive interpretation. We've been encouraged (and primed) to do so!

Comment author: [deleted] 19 January 2012 01:30:45AM 6 points [-]

I thought we were talking about the view from outside the SIAI?

Comment author: lukeprog 19 January 2012 01:45:22AM 5 points [-]

Clearly, Eliezer publishing technical papers would improve SI's credibility. I'm just pointing out that this doesn't mean that publishing papers is the best use of Eliezer's time. I wasn't disagreeing with you; just making a different point.

Comment author: shminux 19 January 2012 02:55:40AM 13 points [-]

Publishing technical papers would be one of the better uses of his time, editing and formatting them probably is not. If you have no volunteers, you can easily find a starving grad student who would do it for peanuts.

Comment author: [deleted] 20 January 2012 03:50:40PM 2 points [-]

Well, they've got me for free.

Comment author: shminux 20 January 2012 06:26:24PM 0 points [-]

You must be allergic to peanuts.

Comment author: [deleted] 20 January 2012 06:38:51PM 0 points [-]

Not allergic, per se. But I doubt they would willingly throw peanuts at me, unless perhaps I did a trick with an elephant.

Comment author: [deleted] 19 January 2012 01:47:00AM *  0 points [-]

I'm not disagreeing with you either.

Comment author: mwengler 19 January 2012 04:11:45PM 7 points [-]

I think the evolution is towards a democratization of the academic process. One could say the cost of academia was so high in the middle ages that the smart move was filtering the heck out of participants to at least have a chance of maximizing the utility of those scarce resources. And now those costs have been driven to nearly zero, with the largest cost being the signal-to-noise problem: how does a smart person choose what to look at.

I think putting your signal into locations where the type of person you would like to attract gathers is the best bet. Web publication of papers is one. Scientific meetings are another. I don't think you can find an existing institution more chock full of people you would like to be involved with than the Math-Science-Engineering academic institutions. Market in them.

If there is no one who can write an academic math paper that is interested enough in EY's work to translate it into something somewhat recognizable as valuable by his peers, then the emperor is wearing no clothes.

As a PhD Caltech applied physicist who has worked with optical interferometers both in real life and in QM calculations (published in journals), EY's stuff on interferometers is incomprehensible to me. I would venture to say "wrong" but I wouldn't go that far without discussing it in person with someone.

Robin Hanson's endorsement of EY is the best credential he has for me. I am a Caltech grad and I love Hanson's "freakonomics of the future" approach, but his success at being associated with great institutions is not a trivial factor in my thinking I am right to respect him.

Get EY or lukeprog or Anna or someone else from SIAI on Russ Roberts' podcast. Robin has done it.

Overall, SIAI serves my purposes pretty well as is. But I tend to view SIAI as pushing a radical position about some sort of existential risk and beliefs about AI, where the real value is probably not quite as radical as what they push. An example from history would be BF Skinner and behaviorism. No doubt behavioral concepts and findings have been very valuable, but the extreme "behaviorism is the only thing, there are no internal states" behaviorism of its genius pusher BF Skinner is way less valuable than an eclectic theory that includes behaviorism as one piece.

This is a core dump since you ask. I don't claim to be the best person to evaluate EY's interferometry claims, as my work was all single-photon (or linear anyway) stuff and I have worked only a small bit with two-photon formalisms. And I am unsophisticated enough to think MWI doesn't pass the smell test no matter how much Less Wrong I've read.

Comment author: Adele_L 03 June 2012 08:05:17AM 3 points [-]

Robin Hanson's endorsement of EY is the best credential he has for me.

Similarly, the fact that Scott Aaronson and John Baez seem to take him seriously is a significant credential he has for me.

Comment author: shminux 19 January 2012 02:51:18AM *  1 point [-]

I would see what the formatting standards are in the relevant journals and find a matching document class or a LyX template. Someone other than Eliezer can certainly do that.
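To give a sense of how little is needed before a journal class is even chosen: the skeleton below is a minimal sketch, not any particular journal's template, and `article` here is just a stand-in for whatever document class the target venue's author guidelines actually specify.

```latex
% Minimal paper skeleton. Replace `article` with the journal's
% own document class once the target venue is known.
\documentclass[11pt]{article}
\usepackage{amsmath}   % standard math environments

\title{Paper Title}
\author{Author Name}

\begin{document}
\maketitle

\begin{abstract}
One-paragraph summary of the result.
\end{abstract}

\section{Introduction}
Body text, with inline math such as $P(A \mid B)$.

\bibliographystyle{plain}
% \bibliography{refs}  % uncomment once refs.bib exists
\end{document}
```

Swapping the class and bibliography style is usually the bulk of the reformatting work when moving between venues, which is exactly the part someone other than the author can do.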