Comment author: CuSithBell 27 May 2012 08:53:36PM 0 points [-]

Missed the point. Do you understand that you shouldn't have been confident you knew why cousin_it felt a particular way? Beyond that, personally I'm not all that interested in theorizing about the reasons, but if you really want to know you could just ask.

Comment author: sufferer 28 May 2012 05:43:28PM 0 points [-]

Sorry, I wasn't implying very strong confidence. I would give a probability of, say, 65% that my reason is the principal cause of cousin_it's feelings.

Comment author: CuSithBell 27 May 2012 08:38:22PM 0 points [-]

That you may have discovered the reason that you felt this way does not mean that you have discovered the reason another specific person felt a similar way. In fact, they may not even be aware of the causes of their own feelings.

Comment author: sufferer 27 May 2012 08:41:35PM *  0 points [-]

Sure. That's why I said: "I welcome alternative theories" (including theories about there being multiple different reasons which may apply to different extents to different people). Do you have one?

Comment author: CuSithBell 18 May 2012 02:55:59AM 1 point [-]

Probably not wise to categorically tell someone the reasons behind their feelings when you're underinformed, and probably not kind to ruminate on the subject when you can expect it to be unpleasant.

Comment author: sufferer 27 May 2012 08:22:02PM 1 point [-]

I have personally felt the same feelings and I think I have pinned down the reason. I welcome alternative theories, in the spirit of rational debate rather than polite silence.

Comment author: cousin_it 13 May 2012 01:20:16PM *  9 points [-]

Not sure about the others, but as for me, at some point this spring I realized that talking about saving the world makes me really upset and I'm better off avoiding the whole topic.

Comment author: sufferer 17 May 2012 06:18:29PM *  0 points [-]

It's because talking about the singularity and the end of the world in near mode for long stretches of time makes you alieve that it's going to happen, in the same way that it actually happening would make you alieve it, whereas talking about it once, believing it, and then never thinking about it explicitly again wouldn't.

Comment author: jsteinhardt 11 May 2012 05:51:45PM 0 points [-]

I'm not sure what you mean by

Hanson is on the same page as the rest of SIAI with regard to expected utility

As Holden and Eliezer both explicitly state, SIAI itself rejects the "but there's still a chance" argument.

Comment author: sufferer 13 May 2012 07:17:28PM 1 point [-]

It all depends on how small that small chance is. Pascal's mugging is typically done with probabilities that are exponentially small, e.g. 10^-10 or so.

But what if Holden refuses to recommend SIAI for donations even when there's a 1% or 0.1% chance of it making that big difference?
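The arithmetic behind this distinction can be made concrete. A minimal sketch, with the caveat that every payoff and probability figure below is invented for illustration and is not drawn from Holden's or SIAI's actual estimates:

```python
# Illustrative only: all numbers here are made-up assumptions,
# not estimates endorsed by anyone in this thread.

def expected_value(p_success: float, value_if_success: float) -> float:
    """Expected value of an intervention that succeeds with probability p."""
    return p_success * value_if_success

# A stake of "astronomical" size (e.g. averting an existential
# catastrophe), in arbitrary units.
BIG = 1e15

# A classic Pascal's-mugging probability vs. the 1% / 0.1% range
# being discussed: the conclusions differ by many orders of magnitude.
for p in (1e-10, 1e-3, 1e-2):
    print(f"p = {p:g}: EV = {expected_value(p, BIG):g}")
```

The point of the sketch is that dismissing a 10^-10 chance and dismissing a 1% chance are very different moves; the expected values differ by a factor of 10^8 even when the stake is held fixed.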

Comment author: TheOtherDave 11 May 2012 04:21:29PM 6 points [-]

Robin Hanson has been listed as the other major "intelligent/competent" critic of SIAI. That he criticises what seems to be the keystone of Holden's argument should be cause for concern for Holden.

So, I stipulate that Robin, whom Eliezer considers the only other major "intelligent/competent" critic of SI, disagrees with this aspect of Holden's position. I also stipulate that this aspect is the keystone of Holden's argument, and without it all the rest of it is irrelevant. (I'm not sure either of those statements is actually true, but they're beside my point here.)

I do not understand why these stipulated facts should be a significant cause for concern for Holden, who may not consider Eliezer's endorsement of what is and isn't legitimate criticism of SI particularly significant evidence of anything important.

Can you expand on your reasoning here?

Comment author: sufferer 11 May 2012 04:39:47PM *  0 points [-]

I suspect that Holden would also consider Robin Hanson a competent critic. This is because Robin is smart, knowledgeable and prestigiously accredited.

But your comment has alerted me to the fact that even if Hanson comes out as a flat-earther tomorrow, the supporting posts are still weak.

The issue of the two most credible critics of SIAI disagreeing with each other is logically independent of the issue of Holden's wobbly argument against the utilitarian argument for SIAI. Many thanks.

Comment author: sufferer 11 May 2012 04:13:58PM *  1 point [-]

But if there's even a chance …

Holden cites two posts (Why We Can’t Take Expected Value Estimates Literally and Maximizing Cost-effectiveness via Critical Inquiry). They are supposed to support the argument that small or very small changes to the probability of an existential risk event occurring are not worth caring about or donating money towards.

I think that these posts both have serious problems (see the comments, esp Carl Shulman's). In particular Why We Can’t Take Expected Value Estimates Literally was heavily criticised by Robin Hanson in On Fudge Factors.

Robin Hanson has been listed as the other major "intelligent/competent" critic of SIAI. That he criticises what seems to be the keystone of Holden's argument should be cause for concern for Holden. (after all, if "even a chance" is good enough, then all the other criticisms melt away).

This would be a much more serious criticism of SIAI if Holden and Hanson could come to agreement on what exactly the problem with SIAI is, and if Holden could sort out the problems with these two supporting posts*

(*of course they won't do that without substantial revision of one or both of their positions because Hanson is on the same page as the rest of SIAI with regard to expected utility, see On Fudge Factors. Hanson's disagreement with SIAI is a different one; approximately that Hanson thinks ems first is likely and that a singleton is both bad and unlikely, and Hanson's axiology is significantly unintuitive to the extent that he is not really on the same page as most people with regard to what counts as a good or bad outcome)

Comment author: [deleted] 30 April 2012 06:54:38AM *  11 points [-]

Vladimir_M, what makes you think that elite universities have the desire and money/power to proselytize their "output"?

Mencius Moldbug has convincingly argued on his blog that intellectual fashion among the ruling class follows intellectual fashion at Harvard with an offset of about one generation. A generation after that, the judicial and journalist class exiles any opposition to such thought from public discourse, and most educated people move significantly towards it. A generation after that, through public schools and the by-then decades-long exposure to media issuing normative statements on the subject, such beliefs are marginalized even among the uneducated, making any populist opposition to society-wide changes in stated values or policy a futile gesture destined to live only one season.

It is indeed a wonderful machine for generating political power through opinion in Western-type societies. While I generally have no qualms about Harvard being an acceptable truth-generating machine when it comes to, say, Physics, in areas where it has a conflict of interest, like economics or sociology, let alone political science or ethics, it is not a reliable truth-generating machine. It is funny how these fields get far more energetic promotion than, say, Physics or Chemistry.

I am fairly certain the reason creationism is still around as a political force in some US states is that creationism is not a serious threat to The Cathedral. However, I do think modern-style rationality is the honest interest of only a small part of academia; a far larger fraction of academia is busily engaged in producing anti-knowledge, in its most blatant form in something-studies departments, and in the more convoluted form of ugh-field rationalizations found in everything from biology to philosophy.

Academia taken as a whole has no incentive to promote modern rationality. Cargo-cult rationality and worship of science, perhaps, so that anti-knowledge factories can siphon status from, say, Computer Scientists or Mathematicians; but not actual rationality.

So yes, LessWrong should spend effort on promoting that; however, it should not abstain from challenging and criticizing academia in places where it is systematically wrong.

Comment author: sufferer 30 April 2012 04:42:25PM *  -1 points [-]

As Moldbug has convincingly argued on his blog, intellectual fashion among the ruling class follows intellectual fashion on Harvard by an offset of about one generation. A generation after that the judicial and journalist class exiles any opposition to such thought from public discourse

then

creationism is still around

Contradiction much?

because creationism is not a serious threat to The Cathedral

If the "judicial and journalist class" only attacks popular irrational ideas which are "a serious threat to The Cathedral", then what other irrationalities will get through? Maybe very few irrational ideas are a "serious threat to The Cathedral", in which case you have just admitted that academia cannot proselytize its "output". What about antivax? Global warming denial? Theism? Anti-nuclear-power irrationality? Crazy, ill-thought-through, knee-jerk anticapitalism of the OWS variety? So many popular irrational beliefs...

Comment author: Barry_Cotter 29 April 2012 09:45:43AM 3 points [-]

Vladimir_M, what makes you think that elite universities have the desire and money/power to proselytize their "output"?

I reversed a downvote to this because other people should also suffer by seeing a question this stupid. Fifteen members of the 111th Congress earned bachelor's degrees from Harvard, 11 current congressmen called Stanford home during their undergraduate days, and ten members of Congress got their bachelors from Yale. This includes neither MBAs nor JDs. Source here

Comment author: sufferer 29 April 2012 02:12:17PM *  1 point [-]

But as I said in my comment, there are numerous issues (creationism, the moon landing hoax, antivax, global warming denial, and I should add theism) where a large amount of public opinion is highly divergent from the opinions of the vast majority of academics. So clearly the elite universities are not actually that good at proselytizing their output.

Perhaps it has been downvoted because people see elite universities with large endowments and lots of alumni in Congress? But still, that money cannot be spent on proselytizing. And how exactly is a politician who went to Stanford or Harvard supposed to have the means and motive to come out against a popular falsehood? Somehow science is not doing so well against creationism. As an example, Rick Santorum went to Penn State (a Public Ivy), but then expressed the view that humans did not evolve from "monkeys". Newt Gingrich actually was a lecturer, and said intelligent design should be taught in school.

EDIT: Also, Yes, I am stupid in an absolute sense. If I were smart, I would be rich & happy ;-0

Comment author: Vladimir_M 28 April 2012 07:29:01PM *  17 points [-]

I don't think "parochial" is the right word here -- a more accurate term for what you're describing would be "contrarian."

In any case, insofar as there exists some coherent body of insight that can be named "Less Wrong rationality," one of its main problems is that it lacks any really useful methods for separating truth from nonsense when it comes to the output of contemporary academia and other high-status intellectual institutions. I find this rather puzzling: on the one hand, I see people here who seem seriously interested in forming a more accurate view of the world -- but at the same time, living in a society that has vast, powerful, influential, and super-high-status official intellectual institutions that deal with all imaginable topics, they show little or no interest in the question of what systematic biases and perverse incentives might be influencing their output.

Now, the point of your post seems to be that LW is good because its opinion is in line with that of these high-status institutions. (Presumably thanks to the fact that both sides have accurately converged onto the truth.) But then what exactly makes LW useful or worthwhile in any way? Are the elite universities so marginalized and powerless that they need help from a blog run by amateurs to spread the word about their output? It really seems to me that if a forum like LW is to have any point at all, it can only be in identifying correct contrarian positions. Otherwise you might as well just cut out the middleman and look at the mainstream academic output directly.

Comment author: sufferer 29 April 2012 12:47:14AM *  3 points [-]

Are the elite universities so marginalized and powerless that they need help from a blog run by amateurs to spread the word about their output?

Vladimir_M, what makes you think that elite universities have the desire and money/power to proselytize their "output"? I mean, you surely know about the trouble they are having trying to win the propaganda fight against creationism, and against global warming denial. And then there's anti-vaccination and the moon landing conspiracy.

In fact the statement that I quoted seems to so obviously deserve the answer "yes, they are unable to spread the word" that I wonder whether I am missing something.

Perhaps you were thinking that elite universities only need to spread the word among, say, the smartest 10% of the country for it to matter. But even in that demographic, I think you will find that few people know about this stuff. Perhaps with the release of books such as Predictably Irrational things have improved. But such books still seem somewhat inadequate, since they don't aim to cleanly teach rationality; rather they aim to give a few cute rationality-flavoured anecdotes. If someone reads Predictably Irrational, I doubt that they would be able to perform a solid analysis of the Allais Paradox (because Predictably Irrational doesn't teach decision theory in a formal way), and I doubt that their calibration would improve (because Predictably Irrational doesn't make them play calibration games).
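For what it's worth, the formal analysis in question is short enough to sketch. A minimal example using the standard payoffs and probabilities of the Allais Paradox (the `ev` helper below is my own illustration, not taken from any book mentioned here):

```python
# The Allais Paradox, standard formulation.
# Each gamble is a list of (probability, payoff-in-dollars) pairs.

def ev(gamble):
    """Expected monetary value of a gamble."""
    return sum(p * x for p, x in gamble)

gamble_1A = [(1.00, 1_000_000)]                                  # sure thing
gamble_1B = [(0.89, 1_000_000), (0.10, 5_000_000), (0.01, 0)]
gamble_2A = [(0.11, 1_000_000), (0.89, 0)]
gamble_2B = [(0.10, 5_000_000), (0.90, 0)]

print(ev(gamble_1A), ev(gamble_1B))  # ~1,000,000 vs ~1,390,000
print(ev(gamble_2A), ev(gamble_2B))  # ~110,000  vs ~500,000

# The paradox: most people prefer 1A over 1B but 2B over 2A.
# Since 1B and 2B each differ from 1A and 2A only by removing the
# same 0.89 chance of $1M, that pair of preferences violates the
# independence axiom of expected utility theory -- no single utility
# function over outcomes can produce both.
```

Seeing why that preference pattern is inconsistent requires exactly the kind of formal decision theory that pop-rationality books tend to skip.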

There are very, very few people employed at universities whose job description is "make the public understand science". As far as I am aware, there is literally no-one in the world whose job title is "make the public understand cognitive-biases-style rationality".
