
Stuart_Armstrong comments on The Doomsday argument in anthropic decision theory - Less Wrong Discussion

Post author: Stuart_Armstrong 31 August 2017 01:44PM


Comments (54)


Comment author: Stuart_Armstrong 01 September 2017 11:36:27AM 4 points

If you're a published academic in field X, you're part of the community of published academics in field X, which, in many but not all cases, is the entirety of people doing serious work in field X. "Prestige" mainly translates to "people who know stuff in this area take me seriously".

Comment author: Wei_Dai 01 September 2017 01:10:48PM 5 points

When I was involved in crypto, there were forums where both published academics and unpublished hobbyists participated and took each other seriously. If that isn't true in a field, it makes me doubt that intellectual progress is still the highest priority there. If I were a professional philosopher working in anthropic reasoning, I don't see how I could justify refusing to take a paper about anthropic reasoning seriously until it had passed peer review by anonymous reviewers whose ideas and interests may be very different from my own. How many such papers could I possibly come across per year, that I'd justifiably need to outsource my judgment about them to unknown peers?

(I think peer review does have a legitimate purpose in measuring research productivity. University administrators have to count something to decide whom to hire and promote, and the number of papers that pass peer review is perhaps one of the best measures we have. It can also help outsiders know who can be trusted as an expert in a field, which is what I meant by "prestige". But there's no reason for people who are already experts in a field to rely on it instead of their own judgment.)

Comment author: Stuart_Armstrong 01 September 2017 01:23:17PM 3 points [-]

If I were a professional philosopher working in anthropic reasoning, I don't see how I could justify refusing to take a paper about anthropic reasoning seriously

Depends on how many cranks there are in anthropic reasoning (lots), and on how many semi-serious people post ideas that have already been addressed or refuted in existing papers (in philosophy in general, this is huge; in anthropic reasoning, I'm not sure).

Comment author: Wei_Dai 01 September 2017 05:10:58PM 0 points

Lots of places attract cranks and semi-serious people, including the crypto forums I mentioned, LW, and everything-list, a mailing list I created with anthropic reasoning as one of its main topics, and they're not that hard to deal with. Basically it doesn't take a lot of effort to detect cranks and previously addressed ideas; everyone can ignore the cranks, and the more experienced hobbyists can educate the less experienced ones.

EDIT: For anyone reading this, the discussion continues here.

Comment author: Stuart_Armstrong 02 September 2017 06:23:45PM 0 points

Basically it doesn't take a lot of effort to detect cranks and previously addressed ideas

This is news to me. Encouraging news.

Comment author: turchin 02 September 2017 12:11:01PM 0 points

If a dedicated journal for anthropic reasoning existed (or, say, for AI risks and other x-risks), would it improve the quality of peer review and research? Or would it be no more useful than LessWrong?

Comment author: Stuart_Armstrong 02 September 2017 06:27:13PM 2 points

If I were a professional philosopher working in anthropic reasoning, I don't see how I could justify refusing to take a paper about anthropic reasoning seriously

But there are no/few philosophers working in "anthropic reasoning" - there are many working in "anthropic probability", to which my paper is an interesting irrelevance. It essentially asks and answers the wrong question while claiming that their question is meaningless (and does so without citing some of the probability/decision theory work that might back up the "anthropic probabilities don't exist/matter" claim from first principles).

I expected the paper would get published, but I always knew it would be a bit of a challenge, because it didn't fit inside the right silos. And the main problem with academia here is that people tend to stay in their silos.

Comment author: Wei_Dai 05 September 2017 02:47:50PM 1 point

But there are no/few philosophers working in "anthropic reasoning" - there are many working in "anthropic probability", to which my paper is an interesting irrelevance. It essentially asks and answers the wrong question while claiming that their question is meaningless

Seems like a good explanation of what happened to this paper specifically.

(and does so without citing some of the probability/decision theory work that might back up the "anthropic probabilities don't exist/matter" claim from first principles)

I guess that would be the thing to try next, if one was intent on pushing this stuff back into academia.

And the main problem with academia here is that people tend to stay in their silos.

By doing that, they can better know what the fashionable topics are, what referees want to see in a paper, and so on, which helps them maximize their chances of getting papers published. This seems to be another downside of the current peer review system, as well as of the larger publish-or-perish academic culture.