Viliam comments on Welcome to Less Wrong! (9th thread, May 2016) - Less Wrong

Post author: Viliam 17 May 2016 08:26AM




Comment author: Babson 20 May 2016 02:23:24AM 4 points

I discovered SSC and LW a couple of months ago, from (I think) a Startpage search that led me to Scott's lengthy article on IQ. I only browsed for a while, but last night I rediscovered this after reading Doing Good Better and visiting the EA website. I remember CFAR from a Secular Student Alliance conference two years ago.

I like Scott's writing, but unfortunately I have no hard-science training.

I have realized that I've become rather used to my comfort zone and have let my innate intelligence stagnate, when I like to think it still has room to grow. Psychological testing six years ago put my IQ at 131, which, if I interpret the survey results correctly, puts me near the bottom of this community? Despite that, I find the philosophical elements of Yudkowsky fascinating (not so much the more mathematical stuff). At the very least, this site has made me sit at a computer longer than I'm accustomed to.

It seems from EY's writing that LW wanted to be a homogeneous community of like-minded (in both senses) people, but I am curious to what extent rationalists engage in outreach (other than CFAR I guess) towards more average individuals. Because that changes how one writes. Or is there a tacit resignation that more average people just won't care or won't grok it; that smarter individuals should focus on their own personal growth and happiness? But then I remember Scott's writing and seeming compassion, and also the percentage of users who are social-democratic, so it seems there would be higher demand for actually communicating with the outgroup.

I entered the humanities because I wanted to be a professor: I like to write, I like foreign languages, and I didn't think I would be interested in heavier things (I took some psychology and philosophy as a postbac). But now I'm too far into my MA to be confident I could get into an additional Master's program in something meaty and then pursue a better, more intellectually stimulating career.

Ultimately I just want to teach and "help" people. So, that's where I'm at. I read/skimmed DGB yesterday in one sitting while in the middle of yet another existential depression that my shrink thinks was caused by going off an opioid. I can't remember the last time I consumed a book in one sitting.

This was longer than I intended. Thank you.

Comment author: Viliam 21 May 2016 10:07:22PM 1 point

I think the most important part of rationality is doing the basic stuff consistently. Things like noticing the problem that needs to be solved and actually spending five minutes trying to solve it, instead of just running on autopilot. At some level of IQ, having the right character traits (or habits, which can be trained) provides more added value than extra IQ points; and I believe you are already there.

"I find the philosophical elements of Yudkowsky fascinating"

Does it also make you actually do something in your life differently? Otherwise it's merely "insight porn". (This is not a criticism aimed specifically at you; I suspect this is how most readers of this website use it.)

"I am curious to what extent rationalists engage in outreach (other than CFAR I guess) towards more average individuals. Because that changes how one writes."

I think the main problem is that we don't actually know how to make people more rational. Well, CFAR is running lessons, trying to measure the impact on their students, and adjusting the lessons accordingly, so they probably already have some partial results by now. It is not a simple task; for comparison, teaching critical thinking at universities does not actually increase students' critical thinking abilities.

So, at this moment we want to attract people who have a chance of contributing meaningfully to developing the Art of making people more rational. Then, once we have the Art, we can approach average people and apply it to them.

Comment author: Babson 23 May 2016 12:39:19AM 0 points

"to compare, teaching critical thinking at universities actually does not increase the critical thinking abilities of the students"

That's sad to hear.

Thank you for the advice. My primary concern is definitely to establish more rational habits, and also to learn how to learn better.

Comment author: Viliam 23 May 2016 10:02:44AM 0 points

Just as the Sequences say somewhere, putting a label "cold" on a refrigerator will not actually make it cold. Similarly, calling a lesson "critical thinking" does not, by itself, accomplish anything.

When I studied psychology, we had a lesson called "logic". All I remember is drawing truth tables for the boolean expressions "A and B", "A or B", "A implies B", and "not A", and filling them with ones and zeroes. If you could fill the table correctly for a complex expression, you passed. It was a completely mechanical activity; no one understood why the hell we were doing it, and it was unconnected to anything else we studied. So I guess that kind of lesson didn't actually make anyone more "logical".
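The mechanical exercise described above really is mechanical: a few lines of Python can enumerate every assignment of ones and zeroes and fill in the tables. This is just an illustrative sketch (the expression names and the `truth_table` helper are mine, not anything from the course):

```python
# Enumerate every 0/1 assignment to A and B and evaluate the basic
# boolean expressions from the "logic" lesson described above.
from itertools import product

# "A implies B" is evaluated as "(not A) or B".
expressions = {
    "A and B":     lambda a, b: a and b,
    "A or B":      lambda a, b: a or b,
    "A implies B": lambda a, b: (not a) or b,
    "not A":       lambda a, b: not a,
}

def truth_table(name):
    """Return rows (A, B, value) for one expression, over all 0/1 inputs."""
    return [(a, b, int(expressions[name](a, b)))
            for a, b in product([0, 1], repeat=2)]

for name in expressions:
    print(name, truth_table(name))
```

Passing the exam amounted to doing by hand what this loop does automatically, which is perhaps the clearest sign that the exercise taught mechanics rather than understanding.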

Instead, we could have spent the time learning about cognitive biases, even the trivial ones, and how they apply to the specific material we study. For example, psychologists are prone to see "A and B" and conclude "A implies B" if it fits their prejudice. A single lesson that gave you a dozen examples of "A and B" and made you write "maybe A causes B, or maybe B causes A, or maybe some unknown C causes both A and B, or maybe it's just a coincidence" would probably be more useful than the whole semester of "logic"; it could be an antidote against all that "computer games cause violence / sexism" stuff, if anyone remembered the exercise.

But even when you teach cognitive biases, people are likely to apply them selectively, to the things they want to disbelieve. I am already tired of seeing people abuse Popper this way (for example, any probabilistic hypothesis can be dismissed as "not falsifiable" and therefore "not scientific"), and I don't want to give them even more ammunition.

I suspect that at some level this is an emotional decision: either you truly care about what is true and what is bullshit, or you prefer to seem clever and be popular. A university lesson cannot really change that.