JonahSinick comments on Beyond Statistics 101 - LessWrong

Post author: JonahSinick 26 June 2015 10:24AM




Comment author: JonahSinick 26 June 2015 03:02:10PM *  4 points [-]

Why did you have this impression?

Groupthink, I guess: other people I knew didn't think it was so important (despite being people who are very well educated by conventional standards, top ~1% of elite colleges).

Tell me how exactly you're planning to use PCA day-to-day?

Disclaimer: I know that I'm not giving enough evidence to convince you. I've thought about this for thousands of hours (including working through many quantitative examples), and it's taking me a long time to figure out how to organize what I've learned.

I've already been using dimensionality reduction (qualitatively) in my day-to-day life, and I've found that it's greatly improved my interpersonal relationships, because it's made it much easier to guess where people are coming from. (Before, people's social behavior had seemed like a complicated blur, because I saw so many variables without having started to correctly identify the latent ones.)

I think the sooner you lose the need for everything to resonate deeply or have a concise insightful summary, the better.

You seem to be making overly strong assumptions with insufficient evidence: how would you know whether this was the case, never having met me? ;-)

Comment author: minusdash 26 June 2015 03:29:36PM *  5 points [-]

Qualitative day-to-day dimensionality reduction sounds like woo to me. Not a bit more convincing than quantum woo (Deepak Chopra et al.). Whatever you're doing, it's surely not like doing SVD on a data matrix or eigen-decomposition on the covariance matrix of your observations.

Of course, you can often identify motivations behind people's actions. A lot of psychology is basically trying to uncover these motivations. Basically an intentional interpretation and a theory of mind are examples of dimensionality reduction in some sense. Instead of explaining behavior by reasoning about receptors and neurons, you imagine a conscious agent with beliefs, desires and intentions. You could also link it to data compression (dimensionality reduction is a sort of lossy data compression). But I wouldn't say I'm using advanced data compression algorithms when playing with my dog. It just sounds pretentious and shows a desperate need to signal smartness.

So, what is the evidence that you are consciously doing something similar to PCA in social life? Do you write down variables and numbers? If not, how should I imagine qualitative dimensionality reduction? How is it different from somebody just forming an opinion intuitively and then justifying it afterwards?
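To be concrete about what "actually doing PCA" means (a toy sketch of my own, purely illustrative): the SVD of a centered data matrix and the eigen-decomposition of its covariance matrix give the same principal components, and that computation is what the term refers to.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))   # 100 observations of 5 variables
Xc = X - X.mean(axis=0)         # center each column

# Route 1: SVD of the centered data matrix
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Route 2: eigen-decomposition of the covariance matrix
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals = np.linalg.eigvalsh(cov)[::-1]   # eigvalsh returns ascending; reverse to descending

# The scaled squared singular values are exactly those eigenvalues
print(np.allclose(s ** 2 / (len(Xc) - 1), eigvals))   # True
```

Whatever the qualitative analogue of this is, it isn't obvious to me that it deserves the same name.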

Comment author: JonahSinick 26 June 2015 06:26:56PM *  1 point [-]

See Rationality is about pattern recognition, not reasoning.

Your tone is condescending, far outside of politeness norms. In the past I would have uncharitably written this off to you being depraved, but I've realized that I should be making a stronger effort to understand other people's perspectives. So can you help me understand where you're coming from on an emotional level?

Comment author: minusdash 26 June 2015 07:14:26PM 6 points [-]

You asked about emotional stuff so here is my perspective. I have extremely weird feelings about this whole forum that may affect my writing style. My view is constantly popping back and forth between different views, like in the rabbit-duck gestalt image. On one hand I often see interesting and very good arguments, but on the other hand I see tons of red flags popping up. I feel that I need to maintain extreme mental efforts to stay "sane" here. Maybe I should refrain from commenting. It's a pity because I'm generally very interested in the topics discussed here, but the tone and the underlying ideology is pushing me away. On the other hand I feel an urge to check out the posts despite this effect. I'm not sure what aspect of certain forums has this psychological effect on my thinking, but I've felt it on various reddit communities as well.

Comment author: [deleted] 26 June 2015 11:19:26PM 5 points [-]

On one hand I often see interesting and very good arguments, but on the other hand I see tons of red flags popping up. I feel that I need to maintain extreme mental efforts to stay "sane" here.

Seconded, actually, and it's particular to LessWrong. I know I often joke that posting here gets treated as submitting academic material and skewered accordingly, but that is very much what it feels like from the inside. It feels like confronting a hostile crowd of, as Jonah put it, radical agnostics, every single time one posts, and they're waiting for you to say something so they can jump down your throat about it.

Oh, and then you run into the issue of having radically different priors and beliefs, so that you find yourself on a "rationality" site where someone is suddenly using the term "global warming believer" as though the IPCC never issued multiple reports full of statistical evidence. I mean, sure, I can put some probability on, "It's all a conspiracy and the official scientists are lying", but for me that's in the "nonsense zone" -- I actually take offense to being asked to justify my belief in mainstream science.

As much as "good Bayesians" are never supposed to agree to disagree, I would very much like if people would be up-front about their priors and beliefs, so that we can both decide whether it's worth the energy spent on long threads of trying to convince people of things.

Comment author: Vaniver 26 June 2015 08:10:19PM 2 points [-]

I feel that I need to maintain extreme mental efforts to stay "sane" here. Maybe I should refrain from commenting. It's a pity because I'm generally very interested in the topics discussed here, but the tone and the underlying ideology is pushing me away.

I would be very interested in hearing elaboration on this topic, either publicly or privately.

Comment author: minusdash 26 June 2015 09:45:15PM 14 points [-]

I prefer public discussions. First, I'm a computer science student who took courses in machine learning, AI, wrote theses in these areas (nothing exceptional), I enjoy books like Thinking Fast and Slow, Black Swan, Pinker, Dawkins, Dennett, Ramachandran etc. So the topics discussed here are also interesting to me. But the atmosphere seems quite closed and turning inwards.

I feel similarities to reddit's Red Pill community. Previously "ignorant" people feel the community has opened a new world to them, they lived in darkness before, but now they found the "Way" ("Bayescraft") and all this stuff is becoming an identity for them.

Sorry if it's offensive, but I feel as if many people had no success in the "real world" matters and invented a fiction where they are the heroes by having joined some great organization much higher above the general public, who are just irrational automata still living in the dark.

I dislike the heavy use of insider terminology that makes communication with "outsiders" about these ideas quite hard, because you get used to referring to these things by the in-group terms. So you get kind of isolated from your real-life friends, as you feel "they won't understand, they'd have to read so much", when actually many of the concepts are not all that new and could be phrased in a way that the "uninitiated" can also get.

There are too many cross references in posts and it keeps you busy with the site longer than necessary. It seems that people try to prove they know some concept by using the jargon and including links to them. Instead, I'd prefer authors who actively try to minimize the need for links and jargon.

I also find the posts quite redundant. They seem to be reiterations of the same patterns in very long prose with people's stories intertwined with the ideas, instead of striving for clarity and conciseness. Much of it feels a lot like self-help for people with derailed lives who try to engineer their life (back) to success. I may be wrong but I get a depressed vibe from reading the site too long. It may also be because there is no lighthearted humor or in-jokes or "fun" or self-irony at all. Maybe because the members are just like that in general (perhaps due to mental differences, like being on the autism spectrum, I'm not a psychiatrist).

I can see that people here are really smart and the comments are often very reasonable. And it makes me wonder why they'd regard a single person such as Yudkowsky in such high esteem as compared to established book authors or academics or industry people in these areas. I know there has been much discussion about cultishness, and I think it goes a lot deeper than surface issues. LessWrong seems to be quite isolated and distrusting towards the mainstream. Many people seem to have read stuff first from Yudkowsky, who often does not reference earlier works that basically state the same stuff, so people get the impression that all or most of the ideas in "The Sequences" come from him. I was quite disappointed several times when I found the same ideas in mainstream books. The Sequences often depict the whole outside world as dumber than it is (straw man tactics, etc).

Another thing is that discussion is often too meta (or meta-meta). There is discussion on Bayes theorem and math principles but no actual detailed, worked out stuff. Very little actual programming for example. I'd expect people to create github projects, IPython notebooks to show some examples of what they are talking about. Much of the meta-meta-discussion is very opinion-based because there is no immediate feedback about whether someone is wrong or right. It's hard to test such hypotheses. For example, in this post I would have expected an example dataset and showing how PCA can uncover something surprising. Otherwise it's just floating out there although it matches nicely with the pattern that "some math concept gave me insight that refined my rationality". I'm not sure, maybe these "rationality improvements" are sometimes illusions.
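The kind of concrete demonstration I mean would look something like this (a toy sketch of my own, not anything from the post): a synthetic dataset in which five observed variables are all noisy readouts of a single hidden factor, where PCA's first component recovers that factor almost exactly.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
latent = rng.normal(size=n)                       # the hidden variable
loadings = np.array([0.9, 0.8, -0.7, 0.6, 0.5])   # how each observed variable tracks it
X = np.outer(latent, loadings) + 0.3 * rng.normal(size=(n, 5))

Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]                                  # scores on the first principal component

# The first component is almost perfectly correlated with the hidden factor
r = abs(np.corrcoef(pc1, latent)[0, 1])
print(r > 0.95)
```

Ten lines like this would do more to make the claim testable than pages of meta-discussion.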

I also don't get why the rationality stuff is intermixed with friendly AI and cryonics and transhumanism. I just don't see why these belong that much together. I find them too speculative and detached from the "real world" to be the central ideas. I realize they are important, but their prevalence could also be explained as "escapism" and it promotes the discussion of untestable meta things that I mentioned above, never having to face reality. There is much talk about what evidence is but not much talk that actually presents evidence.

I needed to develop a sort of immunity against topics like acausal trade that I can't fully specify how they are wrong, but they feel wrong and are hard to translate to practical testable statements, and it just messes with my head in the wrong way.

And of course there is also that secrecy around and hiding of "certain things".

That's it. This place may just not be for me, which is fine. People can have their communities in the way they want. You just asked for elaboration.

Comment author: Vaniver 27 June 2015 12:35:23AM 9 points [-]

Thanks for the detailed response! I'll respond to a handful of points:

Previously "ignorant" people feel the community has opened a new world to them, they lived in darkness before, but now they found the "Way" ("Bayescraft") and all this stuff is becoming an identity for them.

I certainly agree that there are people here who match that description, but it's also worth pointing out that there are actual experts too.

the general public, who are just irrational automata still living in the dark.

One of the things I find most charming about LW, compared to places like RationalWiki, is how much emphasis there is on self-improvement and your mistakes, not mistakes made by other people because they're dumb.

It seems that people try to prove they know some concept by using the jargon and including links to them. Instead, I'd prefer authors who actively try to minimize the need for links and jargon.

I'm not sure this is avoidable, and in full irony I'll link to the wiki page that explains why.

In general, there are lots of concepts that seem useful, but the only way we have to refer to concepts is either to refer to a label or to explain the concept. A number of people read through the sequences and say "but the conclusions are just common sense!", to which the response is, "yes, but how easy is it to communicate common sense?" It's one thing to be able to recognize that there's some vague problem, and another thing to be able to say "the problem here is inferential distance; knowledge takes many steps to explain, and attempts to explain it in fewer steps simply won't work, and the justification for this potentially surprising claim is in Appendix A." It is one thing to be able to recognize a concept as worthwhile; it is another thing to be able to recreate that concept when a need arises.

Now, I agree with you that having different labels to refer to the same concept, or conceptual boundaries or definitions that are drawn slightly differently, is a giant pain. When possible, I try to bring the wider community's terminology to LW, but this requires being in both communities, which limits how much any individual person can do.

I also don't get why the rationality stuff is intermixed with friendly AI and cryonics and transhumanism.

Part of that is just seeding effects--if you start a rationality site with a bunch of people interested in transhumanism, the site will remain disproportionately linked to transhumanism because people who aren't transhumanists will be more likely to leave and people who are transhumanists will be more likely to find and join the site.

Part of it is that those are the cluster of ideas that seem weird but 'hold up' under investigation--most of the reasons to believe that the economy of fifty years from now will look like the economy of today are just confused, and if a community has good tools for dissolving confusions you should expect them to converge on the un-confused answer.

A final part seems to be availability; people who are convinced by the case for cryonics tend to be louder than the people who are unconvinced. The annual surveys show that the perception of LW one gets from just reading posts (or posts and comments) is skewed relative to the picture one gets from the survey results.

Comment author: JonahSinick 27 June 2015 01:46:52AM *  3 points [-]

One of the things I find most charming about LW, compared to places like RationalWiki, is how much emphasis there is on self-improvement and your mistakes, not mistakes made by other people because they're dumb.

I agree that LW is much better than RationalWiki, but I still think that the norms for discussion are much too far in the direction of focus on how other commenters are wrong as opposed to how one might oneself be wrong.

I know that there's a selection effect (with respect to the more frustrating interactions standing out). But people not infrequently believe, with very high confidence, that I'm wrong about things that I know much more about than they do, and in such instances I find the connotations that I'm unsound exasperating.

I don't think that this is just a problem for me rather than a problem for the community in general: I know a number of very high quality thinkers in real life who are uninterested in participating on LW explicitly because they don't want to engage with commenters who are highly confident that their own positions are incorrect. There's another selection effect here: such people aren't salient because they're invisible to the online community.

Comment author: Vaniver 27 June 2015 02:22:30AM 2 points [-]

I know that there's a selection effect (with respect to the more frustrating interactions standing out).

I agree that those frustrating interactions both happen and are frustrating, and that it leads to a general acidification of the discussion as people who don't want to deal with it leave. Reversing that process in a sustainable way is probably the most valuable way to improve LW in the medium term.

Comment author: Risto_Saarelma 27 June 2015 01:03:57PM *  5 points [-]

There's also the whole LessWrong-is-dying thing that might contribute to the vibe you're getting. I've been reading the forum for years and it hasn't felt very healthy for a while now. A lot of the impressive people from earlier have moved on, we don't seem to be getting that many new impressive people coming in, and hanging out a lot on the forum turns out not to make you that much more impressive. What's left is turning increasingly into a weird sort of cargo cult of a forum for impressive people.

Comment author: V_V 27 June 2015 02:13:56PM *  4 points [-]

Actually, I think that LessWrong used to be worse when the "impressive people" were posting about cryonics, FAI, many-world interpretation of quantum mechanics, and so on.

Comment author: Risto_Saarelma 27 June 2015 06:00:51PM 5 points [-]

It has seemed to me that a lot of the commenters who come with their own solid competency are also less likely to get unquestioningly swept away following EY's particular hobbyhorses.

Comment author: [deleted] 26 June 2015 11:37:12PM 2 points [-]

I needed to develop a sort of immunity against topics like acausal trade that I can't fully specify how they are wrong, but they feel wrong and are hard to translate to practical testable statements, and it just messes with my head in the wrong way.

The applicable word is metaphysics. Acausal trade is dabbling in metaphysics to "solve" a question in decision theory, which is itself mere philosophizing, and thus one has to wonder: what does Nature care for philosophies?

By the way, for the rest of your post I was going, "OH MY GOD I KNOW YOUR FEELS, MAN!" So it's not as though nobody ever thinks these things. Those of us who do just tend to, in perfect evaporative cooling fashion, go get on with our lives outside this website, being relatively ordinary science nerds.

Comment author: VoiceOfRa 27 June 2015 01:21:05AM *  -1 points [-]

The applicable word is metaphysics.

Sorry, avoiding metaphysics doesn't work. You just end up either reinventing it (badly) or using a bad fifth-hand version of some old philosopher's metaphysics. Incidentally, Eliezer also tried avoiding metaphysics and wound up doing the former.

Comment author: TheAncientGeek 28 June 2015 09:28:47PM 2 points [-]

It's insufficiently appreciated that physicalism is metaphysics too.

Comment author: [deleted] 28 June 2015 07:02:08PM *  2 points [-]

I don't like Eliezer's apparent mathematical/computational Platonism myself, but most working scientists manage to avoid metaphysical buggery by simply dealing only with those things with which they can actually causally interact. I recall an Eliezer post on "Explain/Worship/Ignore", and would add myself that while "Explain" eventually bottoms out in the limits of our current knowledge, the correct response is to hit "Ignore" at that stage, not to drop to one's knees in Worship of a Sacred Mystery that is in fact just a limit to current evidence.

EDIT: This is also one of the reasons I enjoy being in this community: even when I disagree with someone's view (eg: Eliezer's), people here (including him) are often more productive and fun to talk to than someone who hits the limits of their scientific knowledge and just throws their hands up to the tune of "METAPHYSICS, SON!", and then joins the bloody Catholic Church, as if that solved anything.

Comment author: VoiceOfRa 28 June 2015 08:09:57PM 1 point [-]

I don't like Eliezer's apparent mathematical/computational Platonism myself, but most working scientists manage to avoid metaphysical buggery by simply dealing only with those things with which they can actually causally interact.

That works up until the point where you actually have to think about what it means to "causally interact" with something. Also questions like "does something that falls into a black hole cease to exist since it's no longer possible to interact with it"?

Comment author: ChristianKl 27 June 2015 12:09:35PM 3 points [-]

My view is constantly popping back and forth between different views

That sounds like you engage in binary thinking and don't value shades of grey of uncertainty enough. You feel the need to judge arguments as either true or false, and don't have a mental category for "might be true, might not be true".

Jonah makes strong claims for which he doesn't provide evidence. He's clear about the fact that he hasn't provided the necessary evidence.

Given that, you pattern-match him to "crackpot" instead of putting Jonah in the mental category where you don't know whether what he says is right or wrong. If you put a lot of claims into the "I don't know" pile, you don't constantly pop between belief and non-belief. Popping back and forth means that your updates when presented with new evidence are too large.

Being able to say "I don't know" is part of genuine skepticism.

Comment author: minusdash 27 June 2015 12:32:25PM 2 points [-]

I'm not talking about back and forth between true and false, but between two explanations. You can have a multimodal probability distribution and two distant modes are about equally probable, and when you update, sometimes one is larger and sometimes the other. Of course one doesn't need to choose a point estimate (maximum a posteriori), the distribution itself should ideally be believed in its entirety. But just as you can't see the rabbit-duck as simultaneously 50% rabbit and 50% duck, one sometimes switches between different explanations, similarly to an MCMC sampling procedure.
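To sketch the analogy (a toy illustration, nothing more): a random-walk Metropolis sampler on a two-mode target lingers near one mode for a while, then hops to the other, which is roughly how the switching between explanations feels.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_p(x):
    # unnormalized bimodal target: equal mixture of N(-3, 1) and N(3, 1)
    return np.logaddexp(-0.5 * (x + 3) ** 2, -0.5 * (x - 3) ** 2)

# Random-walk Metropolis: accept a proposal with probability min(1, p(prop)/p(x))
x, samples = 0.0, []
for _ in range(20000):
    proposal = x + rng.normal(scale=3.0)
    if np.log(rng.uniform()) < log_p(proposal) - log_p(x):
        x = proposal
    samples.append(x)

samples = np.array(samples)
# The chain visits both modes rather than settling on one
print((samples < -2).any() and (samples > 2).any())
```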

I don't want to argue this too much because it's largely a preference of style and culture. I think the discussions are very repetitive and it's an illusion that there is much to be learned by spending so much time thinking meta.

Anyway, I evaporate from the site for now.

Comment author: JonahSinick 26 June 2015 10:04:21PM *  3 points [-]

Thanks so much for sharing. I'm astonished by how much more fruitful my relationships have become since I've started asking.

I think that a lot of what you're seeing is a cultural clash: different communities have different blindspots and norms for communication, and a lot of times the combination of (i) blindspots of the communities that one is familiar with and (ii) respects in which a new community actually is unsound can give one the impression "these people are beyond the pale!" when the actual situation is that they're no less rational than members of one's own communities.

I had a very similar experience to your own coming from academia, and wrote a post titled The Importance of Self-Doubt in which I raised the concern that Less Wrong was functioning as a cult. But since then I've realized that a lot of the apparently weird beliefs on LWers are in fact also believed by very credible people: for example, Bill Gates recently expressed serious concern about AI risk.

If you're new to the community, you're probably unfamiliar with my own credentials which should reassure you somewhat:

  • I did a PhD in pure math under the direction of Nathan Dunfield, who coauthored papers with Bill Thurston, who formulated the geometrization conjecture which Perelman proved and in doing so won one of the Clay Millennium Problems.

  • I've been deeply involved with math education for highly gifted children for many years. I worked with the person who won the American Mathematical Society prize for best undergraduate research when he was 12.

  • I worked at GiveWell, which partners with Good Ventures, Dustin Moskovitz's foundation.

  • I've done fullstack web development, making an asynchronous clone of StackOverflow (link).

  • I've done machine learning, rediscovering logistic regression, collaborative filtering, hierarchical modeling, the use of principal component analysis to deal with multicollinearity, and cross validation. (I found the expositions so poor that it was faster for me to work things out on my own than to learn from them, though I eventually learned the official versions.) You can read some details of things that I found here. I did a project implementing Bayesian adjustment of Yelp restaurant star ratings using their public dataset here.
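To give a flavor of what "Bayesian adjustment" means in that last project (this is the general idea only, with made-up numbers, not the project's actual code): each restaurant's average is shrunk toward a prior mean, with the prior counting as a fixed number of pseudo-reviews.

```python
def adjusted_rating(ratings, prior_mean=3.7, prior_weight=10):
    """Posterior-mean-style shrinkage: a restaurant with few reviews stays
    near prior_mean; one with many reviews is pulled to its own average."""
    n = len(ratings)
    return (prior_weight * prior_mean + sum(ratings)) / (prior_weight + n)

print(adjusted_rating([5.0, 5.0]))       # two 5-star reviews land near 3.9, not 5.0
print(adjusted_rating([4.0] * 200))      # 200 reviews: close to the raw average of 4.0
```

The point of the adjustment is that a perfect average over two reviews is much weaker evidence than a good average over two hundred.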

So I imagine that I'm credible by your standards. There are other people involved in the community who you might find even more credible. For example: (a) Paul Christiano who was an international math olympiad medalist, wrote a 50 page paper on quantum computational complexity with Scott Aaronson as an undergraduate at MIT, and is a theoretical CS grad student at Berkeley. (b) Jacob Steinhardt, a Hertz graduate fellow who does machine learning research under Percy Liang at Stanford.

So you're not actually in some sort of twilight zone. I share some of your concerns with the community, but the groupthink here is no stronger than the groupthink present in academia. I'd be happy to share my impressions of the relative soundness of the various LW community practices and beliefs.

Comment author: minusdash 26 June 2015 10:20:44PM 3 points [-]

Those are indeed impressive things you did. I agree very much with your post from 2010. But the fact that many people have this initial impression shows that something is wrong. What makes it look like a "twilight zone"? Why don't I feel the same symptoms for example on Scott Alexander's Slate Star Codex blog?

Another thing I could pinpoint is that I don't want to identify as a "rationalist", I don't want to be any -ist. It seems like a tactic to make people identify with a group and swallow "the whole package". (I also don't think people should identify as atheist either.)

Comment author: ChristianKl 27 June 2015 12:29:20PM 3 points [-]

Another thing I could pinpoint is that I don't want to identify as a "rationalist", I don't want to be any -ist.

Nobody forces you to do so. Plenty of people in this community don't self identify that way.

Comment author: JonahSinick 27 June 2015 01:33:50AM 3 points [-]

I'm sympathetic to everything you say.

In my experience there's an issue of Less Wrongers being unusually emotionally damaged (e.g. relative to academics) and this gives rise to a lot of problems in the community. But I don't think that the emotional damage primarily comes from the weird stuff that you see on Less Wrong. What one sees is them having borne the brunt of the phenomenon that I described here disproportionately relative to other smart people, often because they're unusually creative and have been marginalized by conformist norms.

Quite frankly, I find the norms in academia very creepy: I've seen a lot of people develop serious mental health problems in connection with their experiences in academia. It's hard to see it from the inside: I was disturbed by what I saw, but I didn't realize that math academia is actually functioning as a cult; that's based on retrospective impressions, and in fact on the implicit consensus of the best mathematicians in the world (I can give references if you'd like).

Comment author: Pentashagon 30 June 2015 04:03:17AM 5 points [-]

I was disturbed by what I saw, but I didn't realize that math academia is actually functioning as a cult

I'm sure you're aware that the word "cult" is a strong claim that requires a lot of evidence, but I'd also issue a friendly warning that to me at least it immediately set off my "crank" alarm bells. I've seen too many Usenet posters who are sure they have a P=/!=NP proof, or a proof that set theory is false, or etc. who ultimately claim that because "the mathematical elite" are a cult that no one will listen to them. A cult generally engages in active suppression, often defamation, and not simply exclusion. Do you have evidence of legitimate mathematical results or research being hidden/withdrawn from journals or publicly derided, or is it more of an old boy's club that's hard for outsiders to participate in and that plays petty politics to the damage of the science?

Grothendieck's problems look to be political and interpersonal. Perelman's also. I think it's one thing to claim that mathematical institutions are no more rational than any other politicized body, and quite another to claim that it's a cult. Or maybe most social behavior is too cult-like. If so, perhaps don't single out mathematics.

I've seen a lot of people develop serious mental health problems in connection with their experiences in academia.

I question the direction of causation. Historically many great mathematicians have been mentally and socially atypical and ended up not making much sense with their later writings. Either mathematics has always had an institutional problem or mathematicians have always had an incidence of mental difficulties (or a combination of both; but I would expect one to dominate).

Especially in Thurston's On Proof and Progress in Mathematics I can appreciate the problem of trying to grok specialized areas of mathematics. The terminology and symbology is opaque to the uninitiated. It reminds me of section 1 of the Metamath Book which expresses similar unhappiness with the state of knowledge between specialist fields of mathematics and the general difficulty of learning mathematics. I had hoped that Metamath would become more popular and tie various subfields together through unifying theories and definitions, but as far as I can tell it languishes as a hobbyist project for a few dedicated mathematicians.

Comment author: JonahSinick 30 June 2015 05:22:04AM 2 points [-]

I'm sure you're aware that the word "cult" is a strong claim that requires a lot of evidence, but I'd also issue a friendly warning that to me at least it immediately set off my "crank" alarm bells.

Thanks, yeah, people have been telling me that I need to be more careful in how I frame things. :-)

Do you have evidence of legitimate mathematical results or research being hidden/withdrawn from journals or publicly derided, or is it more of an old boy's club that's hard for outsiders to participate in and that plays petty politics to the damage of the science?

The latter, but note that that's not necessarily less damaging than active suppression would be.

Or maybe most social behavior is too cult-like. If so; perhaps don't single out mathematics.

Yes, this is what I believe. The math community is just unusually salient to me, but I should phrase things more carefully.

I question the direction of causation. Historically many great mathematicians have been mentally and socially atypical and ended up not making much sense with their later writings. Either mathematics has always had an institutional problem or mathematicians have always had an incidence of mental difficulties (or a combination of both; but I would expect one to dominate).

Most of the people who I have in mind did have preexisting difficulties. I meant something like "relative to a counterfactual where academia was serving its intended function." People of very high intellectual curiosity sometimes approach academia believing that it will be an oasis and find this not to be at all the case, and that the structures in place are in fact hostile to them.

This is not what the government should be supporting with taxpayer dollars.

Especially in Thurston's On Proof and Progress in Mathematics I can appreciate the problem of trying to grok specialized areas of mathematics.

What are your own interests?

Comment author: Pentashagon 01 July 2015 06:08:37AM 1 point [-]

The latter, but note that that's not necessarily less damaging than active suppression would be.

I suppose there's one scant anecdote for estimating this; cryptography research seemed to lag a decade or two behind actively suppressed/hidden government research. Granted, there was also less public interest in cryptography until the 80s or 90s, but it seems that suppression can only delay publication, not prevent it.

The real risk of suppression and exclusion both seem to be in permanently discouraging mathematicians who would otherwise make great breakthroughs, since affecting the timing of publication/discovery doesn't seem as damaging.

This is not what the government should be supporting with taxpayer dollars.

I think I would be surprised if Basic Income was a less effective strategy than targeted government research funding.

What are your own interests?

Everything from logic and axiomatic foundations of mathematics to practical use of advanced theorems for computer science. What attracted me to Metamath was the idea that if I encountered a paper that was totally unintelligible to me (say Perelman's proof of the Poincaré conjecture or Wiles's proof of Fermat's Last Theorem) I could backtrack through sound definitions to concepts I already knew, and then build my understanding up from those definitions. Even just having a cross-reference of related definitions between various fields would be helpful. I take it that model theory is the place to look for such a cross-reference, and so that is probably the next thing I plan to study.

Practically, I realize that I don't have enough time or patience or mental ability to slog through formal definitions all day, and so it would be nice to have something even better. A universal mathematical educator, so to speak. Although I worry that without a strong formal understanding I will miss important results/insights. So my other interest is building the kind of agent that can identify which formal insights are useful or important, which sort of naturally leads to an interest in AI and decision theory.

Comment author: eternal_neophyte 28 June 2015 07:58:31PM 2 points [-]

I would like to see some of those references (simply because I have no relation to Academia, and don't like things I read somewhere to gestate into unfounded intuitions about a subject).

Comment author: Mirzhan_Irkegulov 28 June 2015 07:49:41PM 0 points [-]

In my experience there's an issue of Less Wrongers being unusually emotionally damaged (e.g. relative to academics) and this gives rise to a lot of problems in the community.

I think you're just projecting.

Comment author: Viliam 08 July 2015 07:55:31PM *  9 points [-]

I would probably use different words, but I believe I fit Jonah's description. Before finding LW, I felt strongly isolated. Like, surrounded by human bodies, but intellectually alone. Thinking about topics that people around me considered "weird", so I had no one to debate them with. Having a large range of interests, and while I could find people to debate individual interests with, I had no one to talk with about the interesting combinations I saw there.

I felt "weird", and from people around me I usually got two kinds of feedback. When I didn't try to pretend anything, they more or less confirmed that I am weird (of course, many were gentle, trying not to hurt me). When I tried to play a role of someone "less weird" (that is, I ignored most of the things I considered interesting, and just tried to fit)... well, it took a lot of time and practice to do this correctly, but then people accepted me. So, for a long time it felt like the only way to be accepted would be to suppress a large part of what I consider to be "myself"; and I suspect that it would never work perfectly, that there would still be some kind of intellectual hunger.

Then I found LW and I was like: "whoa... there actually are people like me! too bad they are on the other side of the planet though". Then I found some of them living closer, and... going to meetups feels incredibly refreshing. First time in my life, I don't have to suppress anything, to play any role. I just am... in an environment that feels natural. I finally started understanding how people can enjoy having social contacts.

Now let's imagine that in a parallel universe, those LessWrongers who live in a city near to mine, would instead be my neighbors since my childhood, or that we would be classmates at high school. I believe my life would be very different. (I believe there are people like this in my city, but the problem is finding those few dozen individuals among the hundreds of thousands, especially when there is no word in a public vocabulary to describe "us".)

I can't find the article now, but I believe it was written by Lewis Terman, who observed how successful highly intelligent people are. He found a difference between those who were "intelligent people in an intelligent environment" and those who were "isolated intelligent people". The former were usually very successful in life: they could talk with their parents and friends as equals, share their algorithms for life success, fit into their environment. The latter felt isolated, and often burned out at some moment of their lives. The conclusion was that for a highly intelligent person, having similarly highly intelligent family and friends makes a huge difference in their lives. -- When you observe the difference between "academia" and "LessWrong", it may be related to this.

It is easier to be academically successful when your parents are. You can pick good habits and strategies from them; you can debate your work and problems with them. If you are the only academically inclined person in the family, you lead a double life: the "real life" outside of school, and the "academic life" inside. The more you focus on your work, the more it feels like you are withdrawing from everything else. On the other hand, if you come from the same culture, focusing on the work makes you fit into the culture.

I am going to break a taboo here, but I don't know how to tell it otherwise. I have IQ about four or five sigma above the average. The difference between me and the average Mensa member is larger than the difference between Mensa and the general population. Many people in Mensa seem kind of dense to me, and average people, those are sometimes like five-year-old children. (I believe for many people on LessWrong it feels the same.) Sure, intelligence is not everything: other people have skills and traits that I lack, sometimes have more success than me, and I admire that. It's just... so difficult to talk with them like with adult people. But when I go to a LW meetup, it's like "whoa... finally a group of adult people, how amazing!".

But I'm already an old man, relatively speaking. Now I am 39; I found LW when I was 35. Finally I have a company of my peers (still not in my own city), but it can't fix the three decades of my life that already passed in isolation. It can make my life better, but I will always have the emotional scars of chronic loneliness. Oh, how much I envy those lucky kids who can go to LW meetups as teenagers. Makes me wonder how much my own life could be different; I probably wouldn't recognize myself.

Of course, this is just one data point; I don't know how typical or atypical I am within the LW community.

Comment author: Mirzhan_Irkegulov 09 July 2015 07:48:59AM 1 point [-]

Thank you for sharing your story; it was moving and candid. My question is: are you planning to be successful now? Suppose you're gonna die at 80: you have 40 bloody years, that's a lot of time. Most likely you won't win a Fields Medal, but science and human life have so much low-hanging fruit yet to be picked. Do you plan to reach maximum productivity and do something to change the world? Or are you already doing it?

Comment author: JonahSinick 29 June 2015 07:16:56AM 1 point [-]

I'm speaking based on many interactions with many members of the community. I don't think this is true of everybody, but I have seen a difference at the group level.

Comment author: Mirzhan_Irkegulov 29 June 2015 07:51:14AM 2 points [-]

I didn't question that you were interacting with many members of the community. I'm saying you're projecting. Maybe people are either normal or slightly depressed/anxious/bitter/etc, meaning, they have the same emotional problems as any other human being. You, however, see them as unusually emotionally damaged.

Typical Mind Fallacy, to understand other people we model them just like ourselves. You said yourself you had emotional problems before, so I believe your perception of the community is skewed. Maybe you see signs of emotional damage in other people, just like insecure promiscuous people seemingly spot depravity in other people.

Comment author: minusdash 27 June 2015 01:46:24AM *  0 points [-]

I don't really understand what you mean about math academia. Those references would be appreciated.

Comment author: JonahSinick 27 June 2015 02:02:48AM *  8 points [-]

The top 3 answers to the MathOverflow question Which mathematicians have influenced you the most? are Alexander Grothendieck, Mikhail Gromov, and Bill Thurston. Each of these have expressed serious concerns about the community.

  • Grothendieck was actually effectively excommunicated by the mathematical community and then was pathologized as having gone crazy. See pages 37-40 of David Ruelle's book A Mathematician's Brain.

  • Gromov expresses strong sympathy for Grigory Perelman having left the mathematical community starting on page 110 of Perfect Rigor. (You can search for "Gromov" in the pdf to see all of his remarks on the subject.)

  • Thurston made very apt criticisms of the mathematical community in his essay On Proof and Progress In Mathematics. See especially the beginning of Section 3: "How is mathematical understanding communicated?" Terry Tao endorses Thurston's essay in his obituary of Thurston. But the community has essentially ignored Thurston's remarks: one almost never hears people talk about the points that Thurston raises.

Comment author: itaibn0 28 June 2015 11:38:58PM 3 points [-]

I don't know about Grothendieck, but the two other sources appear to have softer criticism of the mathematical community than "actually functioning as a cult".

Comment author: ZoltanBerrigomo 29 June 2015 06:15:17AM *  2 points [-]

The links you give are extremely interesting, but, unless I am missing something, it seems that they fall short of justifying your earlier statement that math academia functions as a cult. I wonder if you would be willing to elaborate further on that?

Comment author: Will_Sawin 27 June 2015 11:29:34PM 1 point [-]

Thank you for all these interesting references. I enjoyed reading all of them, and rereading in Thurston's case.

Do people pathologize Grothendieck as having gone crazy? I mostly think people think of him as being a little bit strange. The story I heard was that because of philosophical disagreements with military funding and personal conflicts with other mathematicians he left the community and was more or less refusing to speak to anyone about mathematics, and people were sad about this and wished he would come back.

Comment author: [deleted] 28 June 2015 07:32:50PM 0 points [-]

Quite frankly, I find the norms in academia very creepy: I've seen a lot of people develop serious mental health problems in connection with their experiences in academia. It's hard to see it from the inside: I was disturbed by what I saw, but I didn't realize that math academia is actually functioning as a cult, based on retrospective impressions, and in fact by implicit consensus of the best mathematicians of the world (I can give references if you'd like).

I've only been in CS academia, and wouldn't call that a cult. I would call it, like most of the rest of academia, a deeply dysfunctional industry in which to work, but that's the fault of the academic career and funding structure. CS is even relatively healthy by comparison to much of the rest.

How much of our impression of mathematics as a creepy, mental-health-harming cult comes from pure stereotyping?

Comment author: ChristianKl 28 June 2015 07:40:31PM *  2 points [-]

How much of our impression of mathematics as a creepy, mental-health-harming cult comes from pure stereotyping?

Jonah happens to be a math PhD. How can you engage in pure stereotyping of mathematicians while getting your own PhD?

Comment author: [deleted] 28 June 2015 11:26:35PM 0 points [-]

I was more positing that it's a self-reinforcing, self-creating effect: people treat Mathematics in a cultish way because they think they're supposed to.

Comment author: ZoltanBerrigomo 30 June 2015 12:11:28AM *  0 points [-]

For what it's worth, I have observed a certain reverence in the way great mathematicians are treated by their less-accomplished colleagues that can often border on the creepy. This seems especially pronounced in math: it exists in other disciplines, but with lesser intensity.

But I agree, "dysfunctional" seems to be a more apt label than "cult." May I also add "fashion-prone?"

Comment author: RichardKennaway 29 June 2015 12:25:43PM 0 points [-]

How much of our impression of mathematics as a creepy, mental-health-harming cult

Er, what? Who do you mean by "we"?

comes from pure stereotyping?

The link says of Turing:

Finally, Alan Turing, the great Bletchley Park code breaker, father of computer science and homosexual, died trying to prove that some things are fundamentally unprovable.

This is a staggeringly wrong account of how he died.

Comment author: [deleted] 29 June 2015 11:07:17PM 1 point [-]

This is a staggeringly wrong account of how he died.

Hence my calling it "pure stereotyping"!

Comment author: JonahSinick 29 June 2015 07:35:01AM 0 points [-]

I don't have direct exposure to CS academia, which, as you comment, is known to be healthier :-). I was speaking in broad brushstrokes; I'll qualify my claims and impressions more carefully later.

Comment author: [deleted] 26 June 2015 11:24:18PM 1 point [-]

Another thing I could pinpoint is that I don't want to identify as a "rationalist", I don't want to be any -ist.

I've always thought that calling yourself a "rationalist" or "aspiring rationalist" is rather useless. You're either winning or not winning. Calling yourself by some funny term can give you the nice feeling of belonging to a community, but it doesn't actually make you win more, in itself.

Comment author: [deleted] 26 June 2015 11:24:53PM 2 points [-]

There are other people involved in the community who you might find even more credible. For example: (a) Paul Christiano who was an international math olympiad medalist, wrote a 50 page paper on quantum computational complexity with Scott Aaronson as an undergraduate at MIT, and is a theoretical CS grad student at Berkeley. (b) Jacob Steinhardt, a Hertz graduate fellow who does machine learning research under Percy Liang at Stanford.

Of course, Christiano tends to issue disclaimers with his MIRI-branded AGI safety work, explicitly stating that he does not believe in alarmist UFAI scenarios. Which is fine, in itself, but it does show how people expect someone associated with these communities to sound.

And Jacob Steinhardt hasn't exactly endorsed any "Twilight Zone" community norms or propaganda views. Errr, is there a term for "things everyone in a group thinks everyone else believes, whether or not they actually do"?

Comment author: JonahSinick 27 June 2015 01:50:52AM 3 points [-]

I'm not claiming otherwise: I'm merely saying that Paul and Jacob don't dismiss LWers out of hand as obviously crazy, and have in fact found the community to be worthwhile enough to have participated substantially.

Comment author: [deleted] 28 June 2015 07:10:24PM 3 points [-]

I think in this case we have to taboo the term "LWers" ;-). This community has many pieces in it, and two large parts of the original core are "techno-libertarian Overcoming Bias readers with many very non-mainstream beliefs that they claim are much more rational than anyone else's beliefs" and "the SL4 mailing list wearing suits and trying to act professional enough that they might actually accomplish their Shock Level Four dreams."

On the other hand, in the process of the site's growth, it has eventually come to encompass those two demographics plus, to some limited extent, almost everyone who's willing to assent that science, statistical reasoning, and the neuro/cognitive sciences actually really work and should be taken seriously. With special emphasis on statistical reasoning and cognitive sciences.

So the core demographic consists of Very Unusual People, but the periphery demographics, who now make up most of the community, consist of only Mildly Unusual People.

Comment author: JonahSinick 29 June 2015 07:18:41AM 0 points [-]

Yes, this seems like a fair assessment of the situation. Thanks for disentangling the issues. I'll be more precise in the future.

Comment author: JonahSinick 26 June 2015 06:36:30PM *  0 points [-]

See my edit. Part of where I'm coming from is realizing how socially undeveloped people in our reference class tend to be, such that apparent malice often comes from misunderstandings.

Comment author: InquilineKea 28 June 2015 07:42:29PM 1 point [-]

(before people's social behavior had seemed like a complicated blur because I saw so many variables without having started to correctly identify the latent ones).

Interesting - what are some examples of the latent ones?