In response to Post Request Thread
Comment author: curiousepic 11 April 2013 01:54:47AM 16 points [-]

I'd be interested in lukeprog's (or CFAR's) thoughts on how to implement "tight feedback loops" into everyday instrumental rationality (as opposed to running a business or project).

Comment author: apophenia 11 April 2013 02:01:48AM *  4 points [-]

I'd be interested in writing this one. I don't think your divide is a real one; it's basically the same skill. But it's still worth talking about in that context.

Comment author: apophenia 11 April 2013 01:51:44AM 0 points [-]

I just launched the alpha of forget.io, a service for developing habits and recording data in self-experimentation. It texts you on your phone; you text it back. My stereotypical question (and the one I invented it for) is "How happy are you on a scale of 1-10?" Free to minicamp participants; costs a small fee for everyone else (although only enough to pay for the text messages).

Comment author: Metus 08 November 2012 12:17:11AM 3 points [-]

Can you point us to the more interesting checklist resources?

Comment author: apophenia 18 November 2012 12:45:23AM *  1 point [-]

Absolutely. I can give better resources if you can be more specific as to what you're looking for.

I recommend The Checklist Manifesto first as an overview, as well as a basic understanding of akrasia, and trying and failing to make and use some checklists yourself.

The resources I spent most of my time with were very specific to what I was working on, and so I wouldn't recommend them. However, just in case someone finds it useful, Human Factors of Flight-Deck Checklists: The Normal Checklist draws attention to some common failure modes of checklists outside the checklist itself.

Comment author: [deleted] 07 November 2012 04:16:57PM 2 points [-]

This is awesome. I might remove the examples, print out the rest of the list, and read it every morning when I get up and every night before going to sleep. OTOH I have a few quibbles with some examples:

Recent example from Anna: Jumping off the Stratosphere Hotel in Las Vegas in a wire-guided fall. I knew it was safe based on 40,000 data points of people doing it without significant injury, but to persuade my brain I had to visualize 2 times the population of my college jumping off and surviving. Also, my brain sometimes seems much more pessimistic, especially about social things, than I am, and is almost always wrong.

For some reason my brain is more comfortable working with numbers than with visualizations. That can be bad for signalling: a few years ago there was a terrorist attack in London which affected IIRC about 300 people; my mother told me “you should call [your friend who's there] and ask him if he's all right”, and I answered “there are 10 million people in London, so the probability that he was involved is about 1 in 30,000, which is less than the probability that he would die naturally in...”; my mother called me heartless before I even finished the sentence.
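For what it's worth, that estimate is easy to check; a one-liner, using the comment's own (admittedly rough) figures:

```python
# Back-of-the-envelope check of the "1 in 30,000" figure.
# Both numbers are the ones from the comment: ~300 people
# affected, ~10 million people in Greater London.
affected = 300
population = 10_000_000

p_involved = affected / population
print(f"P(friend involved) ≈ 1 in {round(1 / p_involved):,}")  # ≈ 1 in 33,333
```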

Recent example from Anna's brother: Trying to decide whether to move to Silicon Valley and look for a higher-paying programming job, he tried a reframe to avoid the status quo bias: If he was living in Silicon Valley already, would he accept a $70K pay cut to move to Santa Barbara with his college friends? (Answer: No.)

There's a huge difference: someone living in Silicon Valley on $70K + x and considering whether to stay there or move to Santa Barbara and earn x would be used to living on $70K + x; whereas someone living in Santa Barbara on x and considering whether to move to Silicon Valley and earn x + $70K or stay there would be used to living on x. This would affect how much each of them would enjoy a given amount of money. Also, the former would already have a social circle in Silicon Valley, and the latter wouldn't.

Recent example from Anna: I noticed that every time I hit 'Send' on an email, I was visualizing all the ways the recipient might respond poorly or something else might go wrong, negatively reinforcing the behavior of sending emails. I've (a) stopped doing that (b) installed a habit of smiling each time I hit 'Send' (which provides my brain a jolt of positive reinforcement). This has resulted in strongly reduced procrastination about emails.

Huh, no. If they are likely to respond badly, I want to believe they are likely to respond badly. If they aren't likely to respond badly, I want to believe they aren't likely to respond badly. What is true is already so; owning up to it doesn't make it worse. The solution to that problem is to think twice, re-read the email, and think about ways to make it less likely to be interpreted in an unintended way before hitting Send.

In response to comment by [deleted] on Checklist of Rationality Habits
Comment author: apophenia 07 November 2012 10:05:51PM *  6 points [-]

This is awesome. I might remove the examples, print out the rest of the list, and read it every morning when I get up and every night before going to sleep.

Interesting you should say that. About a week ago I simplified this into a more literal checklist designed to be used as part of a nightly wind-down, to see if it could maintain or instill habits. I designed the checklist based largely on empirical results from NASA's review of the factors behind the effectiveness of pre-flight safety checklists used by pilots, although I chased down a number of other checklist-related resources. I'm currently actively testing effects on myself and others, both to make sure it would actually be used and to get the completion time down to the minimum possible (it's hovering around two minutes).

P.S. I'm not associated with CFAR but the checklist is an experiment on their request.

If you were to test your suggestion for two weeks, I would be interested to hear the results. My prediction (with 80% certainty) is: Lbh jvyy trg cbfvgvir erfhygf sbe n avtug be gjb. Jvguva gra qnlf, lbh jvyy svaq gur yvfg nirefvir / gbb zhpu jbex naq fgbc ernqvat vg, ortva gb tynapr bire vg jvgubhg cebprffvat nalguvat, be npgviryl fgbc gb svk bar bs gur nobir ceboyrzf. (Gur nezl anzr znxrf zr yrff pregnva guna hfhny--zl fgrerbglcr fnlf lbh znl or oberq naq/be qvfpvcyvarq.)

Comment author: Kingreaper 13 August 2011 08:17:26AM 0 points [-]

No, by via rationality, I mean via rationality. You cannot use the rational part of their brain to convince them that it is good to be rational, because the rational part of them already knows that, it's just not in charge.

Convincing them, through the rational part of themselves, that eating a certain food gives them a stomachache is often easy. But that's a completely different problem, with no real relation to the problem I was talking about.

Comment author: apophenia 20 August 2011 09:49:16AM 2 points [-]

So, let's call the thing I'm talking about "winning". It is EXTREMELY helpful although not logically necessary to think winning is a good idea in order to win. I'm talking about how to convince people of that helpful step, so they can, next, learn how to win, and finally, apply the knowledge and win.

Either you're talking about a rationality that doesn't consist of winning, or I'm hearing: "You cannot use the 'winning' part of their brain to convince them that it is good to win, because the 'winning' part of them already knows that, it's just not in charge." Why on earth should I restrict myself to some arbitrary 'winning' part of their brain, if such a thing existed, to convince them that it's good to win? That sounds silly.

Please let me know if I even make sense.

Comment author: apophenia 07 August 2011 12:37:39PM *  7 points [-]
  • Figure out your goals, and then make plans for when you get off work to optimize for those. Working as a cashier doesn't seem optimal for almost any purpose--maybe you could start by figuring out how to make money more efficiently, if that's your goal?
  • Learn the major system or the memory palace technique. This would let you store a list of things to think about or do while at work. It's also quite easy to practice while at work, once you get the basics down. I'd recommend this first, if you really won't be allowed to write.
  • Solve problems. See what problem-solving methods work and which don't. See what kinds of problems you are worst/best at, and become better at those. Math problems, world-modeling (prediction and underlying event deduction), and introspection are especially easy to do in your head.
  • Try to figure out why stuff around you is the way it is. (Why did that person buy that item?). Make predictions. Calibrate and get higher accuracy as well.
  • Introspect. Find out why you believe what you believe, and whether you should.
  • Don't improve your rationality, do something else with your time.
  • Optimize your job as a cashier, as much as is possible. Figure out how to do stuff in the least time. Experiment when interacting with customers to see if you can get tips or interesting conversation. Get a different job (manager?) at the same establishment somehow. A useful problem will motivate you more than a non-useful problem.
  • Combine all these.
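Since the major system comes up above: it encodes each digit as a family of consonant sounds, from which you build memorable peg words. A minimal sketch of the standard mapping (the helper name `consonants_for` is made up for illustration):

```python
# The standard major-system digit-to-consonant mapping.
# You turn a number into consonant sounds, then pad with
# vowels to invent a memorable word.
MAJOR_SYSTEM = {
    "0": "s/z",
    "1": "t/d",
    "2": "n",
    "3": "m",
    "4": "r",
    "5": "l",
    "6": "j/sh/ch",
    "7": "k/g",
    "8": "f/v",
    "9": "p/b",
}

def consonants_for(number: str) -> list[str]:
    """Return the consonant-sound family for each digit of `number`."""
    return [MAJOR_SYSTEM[d] for d in number if d in MAJOR_SYSTEM]

# "42" -> r + n, suggesting a peg word like "rain".
print(consonants_for("42"))  # ['r', 'n']
```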
Comment author: Solvent 07 August 2011 05:39:00AM 0 points [-]

Why not practise mental arithmetic, like 45 times 23? It's not really rationality, but it can't hurt. It's probably good for your brain somehow.

Or you could try doing fun pointless economics or physics calculations. If you're a cashier at a supermarket, you could calculate how far the chemical potential energy in a can of soup or whatever would propel it into the air, and do the calculation for as many products as you can find. Or figure out what proportion of the money that passes through your register you would have had to steal and invest twenty years ago in order to match your current salary. Or something like that. I dunno.
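That last puzzle is easy to set up, though every number in the sketch below is a made-up assumption (the return rate, horizon, salary, and register throughput are all illustrative, not from the comment):

```python
# One way to set up the "steal and invest" puzzle. Assumed
# numbers: 7% annual return, 20-year horizon, $30k/year salary,
# $500k/year passing through the register.
annual_return = 0.07
years = 20
salary = 30_000
cash_through_register = 500_000  # per year

# Lump sum you'd have needed to invest 20 years ago so that the
# investment now throws off `salary` per year at the same return:
principal = (salary / annual_return) / (1 + annual_return) ** years

proportion = principal / cash_through_register
print(f"Steal about {proportion:.1%} of one year's register takings")
```

Under these assumptions the answer comes out to roughly a fifth of a year's takings, which is a good reminder of why the puzzle is "fun pointless economics" rather than career advice.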

(Note: Here in Australia, cashier might have a different meaning. I hope I didn't offend you by implying you were a checkout guy in a supermarket.)

Comment author: apophenia 07 August 2011 12:24:13PM 3 points [-]

Downvoted for "It's probably good for your brain somehow."

Comment author: [deleted] 07 August 2011 10:21:39AM 0 points [-]

Yeah, I don't watch TED anymore. Any other specific suggestions?

In response to comment by [deleted] on Do Humans Want Things?
Comment author: apophenia 07 August 2011 10:37:48AM 0 points [-]

I can't give another suggestion unless you tell me what's undesirable about watching TED. There's a transcript on the site, but he uses graphics copiously, so I'm curious how useful it is. Less Wrong says it is too long to post as a comment.

Comment author: Perplexed 13 December 2010 06:10:24PM *  22 points [-]

I would like to see sequences of top level postings providing semi-technical tutorials on topics of interest to rationalists.

As one example of a topic: Game Theory

Actually, there is material here for several sequences, dealing with several sub-topics. We need a sequence on games with incomplete information, on iterated games, on two-person cooperative games (we have a couple of articles already, but we haven't yet covered Nash's 1953 paper with threats), and on multi-person cooperative games (Shapley value, Core, Nucleolus, and all that).

Comment author: apophenia 07 August 2011 10:08:53AM 1 point [-]

I've studied game theory and rationality, and I don't use game theory even when applying rationality to game design! I've used some of the nontechnical results (threats, from Schelling's book) to negotiate and precommit, but that's about it. Has someone else used game theory in real life?

Unless someone else responds to this comment, my guess is that this topic is of more interest to readers than of actual use.

Comment author: Kingreaper 14 December 2010 03:33:22AM 11 points [-]

I think we need more (Defence against the) Dark Arts discussion.

And yes I do think we need to learn to use them, as well as defend against them. An irrational person cannot be convinced that rationality is good through the use of rationality.

Comment author: apophenia 07 August 2011 09:47:44AM *  -1 points [-]

By "via rationality" I assume you mean "via logical argument or sound science", which is an absurd substitution. Rationalists should win. The Dark Arts therefore are a type of instrumental rationality. That said, I still disagree, at least for some irrational people (let's roughly say anyone I could convince to stop eating a food that gives them a stomachache).

They can be convinced they should study [instrumental] rationality, it just requires you present unreasonably large amounts of evidence and don't use logical inference or experiments. (And when I say unreasonably large, that's for people in college studying science. For merely average twenty-somethings, you may need to beat them over the head with solid bricks of evidence.) Caveat: I do not often interact with allegedly common people who don't meet the minimum bar of adjusting expectations based on (sufficient) observation, so this comment does not apply to such persons. It is still a useful comment.

I.e. look, I used this thingy called rationality and I made/saved thousands of dollars, got a boyfriend, and fixed significant mental problems. Seemed to work for me okay. You need to go REALLY overkill on the evidence for non-science folks, though. Again, beat them over the head with it. Make it something that will help them personally, too. I've found it useful to get people to agree (not verbally and aloud, though that's an interesting experiment) that whatever mysterious method I used to do that, it would be a good thing to learn, BEFORE I revealed that the answer is something weird or "educational"-sounding. This second half is only slightly dark-artsy (consistency bias).
