Comment author: orbenn 16 February 2015 08:19:11AM *  0 points [-]

I recently watched this Coursera course on learning how to learn, and your post uses different words for some of the same concepts.

The course described what you call "shower-thoughts" as "diffuse mode" thinking, the opposite of "focused mode" thinking; the brain can only do one at a time. Focused mode uses ideas that are already clustered together to solve familiar problems, while diffuse mode tries to find useful connections between unclustered ideas to solve new problems in new ways. I'm not sure whether these are the formal/correct terms from the literature behind the class, but if so it might be worth using them instead of making up our own jargon.

As for the class, it definitely had some stuff that I still try to keep in mind, but it also had some things that I haven't quite figured out how to incorporate (chunking) or didn't find useful (some of the interviews). There is some overlap with what CFAR seems to be trying to teach. Overall I'd recommend taking a look if you have an hour or so per week over a month for it.

Comment author: Manfred 13 December 2014 05:54:03AM *  18 points [-]

I really liked the level of subtextual snark (e.g. almost every use of the word 'rational'). This level of skepticism and mockery is, frankly, about what should be applied, and was fun to read.

I was surprised at the density of weirdness, not because it makes bad journalism, but because it's difficult for the audience to understand (e.g. /r/hpmor is just dropped in there and the reader is expected to deal). I like Sarunas' explanation for this. Fairness-wise, this was better than I expected, though with occasional surrenders to temptation (The glaring one for me was Will and Divia Eden).

Michael Vassar as our face was inevitable if disappointing. The writing about him was great. I feel like the descriptions of his clothing are the author making him a little funnier - nobody else gets clothing description.

The author's initial inability to read lesswrong makes me think we may need a big button at the top that says "First time? Click here!" and just dumps you into a beginner version of the Sequences page.

Comment author: orbenn 14 December 2014 09:16:53AM 0 points [-]

I agree. The difficult thing about introducing others to Less Wrong has always been that even if the new person remembers to say "It's my first time, be gentle," Less Wrong still has the girth of a rather large horse. You can't make it smaller without losing much of its necessary function.

Comment author: orbenn 09 December 2014 04:43:06AM 0 points [-]

Updated link to Piers Steel's meta-analysis on procrastination research (at least I think it's the correct paper): http://studiemetro.au.dk/fileadmin/www.studiemetro.au.dk/Procrastination_2.pdf

Comment author: DanArmak 01 March 2014 12:27:52PM *  5 points [-]

Everyone (and every group) thinks they are rational. This is not a distinctive feature of LW. Christianity and Buddhism make a lot of their rationality.

To the contrary, lots of groups make a big point of being anti-rational. Many groups (religious, new-age, political, etc.) align themselves in anti-scientific or anti-evidential ways. Most Christians, to take one example, assign supreme importance to (blind) faith that triumphs over evidence.

But more generally, humans are a-rational by default. Few individuals or groups are willing to question their most cherished beliefs, to explicitly provide reasons for beliefs, or to update on new evidence. Epistemic rationality is not the human default and needs to be deliberately researched, taught and trained.

And people, in general, don't think of themselves as being rational because they don't have a well-defined, salient concept of rationality. They think of themselves as being right.

Comment author: orbenn 01 March 2014 05:16:22PM *  1 point [-]

I think we're getting some word-confusion. Groups that, as you say, "make a big point of being anti-rational" are against the things with the label "rational". However, they do tend to think of their own beliefs as being well thought out (i.e. rational).

Comment author: orbenn 01 March 2014 04:59:07PM 1 point [-]

"rationality" branding isn't as good for keeping that front and center, especially compared to, say the effective altruism meme

Perhaps a better branding would be "effective decision making", or "effective thought"?

As I've already explained, there's a difficult problem here about how to be appropriately modest about our own rationality. When I say something, I never think it's stupid, otherwise I wouldn't say it. But at least I'm not so arrogant as to go around demanding other people acknowledge my highly advanced rationality. I don't demand that they accept "Chris isn't saying anything stupid" as an axiom in order to engage with me.

I think this is the core of what you are disliking. Almost all of my reading on LW is in the Sequences rather than the discussion areas, so I haven't been in a position to notice anyone's arrogance. But I'm a little sadly surprised by your experience, because for me the result of reading the Sequences has been to trust less that my own level of sanity is high. I'm significantly less certain of my correctness in any argument.

We know that knowing about biases doesn't remove them, so instead of increasing our estimate of our own rationality, it should correct our estimate downwards. This shouldn't even cost us any pride, since we're also adjusting our estimates of everyone else's sanity down by a similar amount. As a check that we're doing things right: the result should be less time spent arguing and more time spent thinking about how we might be wrong and how to check our answers. Basically, it should remind us to use type 2 thinking more whenever possible, and to seek effectiveness training for our type 1 thinking whenever available.

In response to On saving the world
Comment author: orbenn 01 February 2014 04:41:46AM 6 points [-]

This was enjoyable to me because "saving the world", as you put it, is completely unmotivating for me (luckily I have other sources of motivation). It's interesting to see what drives other people and how the source of their drive changes their trajectory.

I'd definitely be curious to see a sequence, or at least a short feature list, about your model for a government that structurally ratchets better instead of worse. That's something that's never been achieved consistently in practice.

Comment author: Gurkenglas 11 January 2014 08:48:43PM 0 points [-]

What does "direct revival" mean? If the slices were properly reconnected, the function of the brain should be unchanged.

Comment author: orbenn 12 January 2014 06:38:02PM 0 points [-]

I think he means "create a functional human you, while primarily sourcing the matter from your old body". He's commenting that slicing the brain makes this more difficult, but it sounds like the alterations caused by current vitrification techniques make it impossible either way.

Comment author: orbenn 27 June 2012 05:19:02PM 3 points [-]

The problem here seems to be that the theories don't take everything we value into account, so it's less certain whether their functions actually match our morals. If you calculate utility using only some of your utility values, you're not going to get the correct result. If you're trying to sum the set {1,2,3,4} but you only use 1, 2 and 4 in the calculation, you're going to get the wrong answer. Outside of special cases like "multiply each item by zero" it doesn't matter whether you add, subtract or divide; the answer will still be wrong. For example, the calculations given for total utilitarianism fail to include values for continuity of experience.
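The arithmetic point above can be sketched in a few lines of Python. The value names and weights here are hypothetical, purely for illustration: a theory that leaves one of our values out of its utility sum gets a different (wrong) total, no matter how carefully it computes the rest.

```python
def total_utility(values):
    """Sum utility over every value we actually hold."""
    return sum(values.values())

def theory_utility(values, modeled):
    """Sum utility over only the values the moral theory models."""
    return sum(v for name, v in values.items() if name in modeled)

# Hypothetical utility weights for a single outcome.
values = {
    "pleasure": 1,
    "fairness": 2,
    "continuity_of_experience": 3,
    "autonomy": 4,
}

full = total_utility(values)  # 10
# A theory that omits continuity of experience:
partial = theory_utility(values, {"pleasure", "fairness", "autonomy"})  # 7

print(full, partial)  # prints "10 7": the impoverished input changes the answer
```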

This isn't to say that ethics are easy, but we're going to have a devil of a time testing them with impoverished input.

Comment author: SilasBarta 29 March 2012 08:16:09PM *  8 points [-]

7b) Is there any evidence I'll be glad I went that a Christian retreat could not produce just as easily?

Edit: Okay, 15 seconds to this being downvoted was a little hasty.

Comment author: orbenn 29 March 2012 08:58:57PM *  1 point [-]

If the primary motivation for attending is the emotional reward of meeting others interested in rationality and of feeling you've learned how to be more rational, then yes, a Christian brainwashing retreat would make you glad you attended in the same way, but only if you are (or became) Christian, since non-Christians likely wouldn't enjoy a Christian brainwashing retreat.

That said, since many of us have little or no data on changes in rationality (if any) among attendees, attending is the only real way you have to test whether it helps. Confirmation bias would make a positive result weak evidence, but it'd be relatively important given the lack of other evidence. Luckily, even if the retreat doesn't benefit your objective level of rationality, it sounds worthwhile on the undisputed emotional merits.

I think what SilasBarta is trying to ask is: do we have any objective measurements yet from the previous minicamp that add weight to the hypothesis that this camp does in fact improve rationality or life achievement over either the short or long term?

If not, then I'm still curious: are there any plans to study the rationality of attendees and non-attendees to establish such evidence?

In response to comment by [deleted] on Rationality Quotes February 2012
Comment author: JoachimSchipper 01 February 2012 05:13:11PM 11 points [-]

Is this true? Naive Googling yields this, which suggests (non-authoritatively) that blood sugar and moods are indeed linked (in diabetics, but it's presumably true in the general population). However, despair is not noted and the effects generally seem milder than that (true despair is a rather powerful emotion!)

Comment author: orbenn 01 February 2012 06:56:44PM *  6 points [-]

Anecdotally: I'm not diabetic that I know of, but my mood is highly dependent on how well and how recently I've eaten. I get very irritable and can break down into tears easily if I'm more than four hours past due.
