Personally, I prefer more produced podcasts, in the style of Serial, Freakonomics, etc., because very few people are good interviewees. I would like to listen to more episodes if you could improve the microphone quality; I couldn't make out some words, even on relistening. I'm sure the person behind the HPMOR Podcast would offer more tips if you contacted him.
Fascinating. Maybe he's been talking to Shane Legg of DeepMind, who also has much sooner timelines than I do.
Do you mind revealing what Shane's timelines are, and the probability he assigns to playing a role in AGI himself?
(This is Dan from CFAR again)
We have a fair amount of data on the experiences of people who have been to CFAR workshops.
First, systematic quantitative data. We send out a feedback survey a few days after the workshop which includes the question "0 to 10, are you glad you came?" The average response to that question is 9.3. We also sent out a survey earlier this year to 20 randomly selected alumni who had attended workshops in the previous 3-18 months, and asked them the same question. 18 of the 20 filled out the survey, and their average response to that question was 9.6.
Less systematically but in more fleshed-out detail, there are several reviews that workshop alumni have posted to their blogs (A, B+pt2, C+pt2) or to LW (1, 2, 3). Ben Kuhn's (also linked above under "C") seems particularly relevant here, because he went into the workshop assigning a 50% probability to the hypothesis that "The workshop is a standard derpy self-improvement technique: really good at making people feel like they’re getting better at things, but has no actual effect."
In-person conversations that I've had with alumni (including some interviews that I've done with alumni about the impact that the workshop had on their life) have tended to paint a similar picture to these reviews, from a broader set of people, but it's harder for me to share those data.
We don't have as much data on the experiences of people who have been to test sessions or shorter events. I suspect that most people who come to shorter events have a positive experience, and that there's a modest benefit on average, but that it's less uniformly positive. That's partly because a lot of what happens at a full workshop doesn't fit into a briefer event: more time for conversations between participants to digest the material, more time for one-on-one conversations with CFAR staff to sort through things, followups after the workshop to help with implementing things in your daily life, etc. The full workshop is also more practiced and polished, having been through many more iterations - much more so than a test session; one-day events fall in between (the ones advertised as alpha tests of a new thing are closer to the test-session end of the spectrum).
Hey Dan, thanks for responding. I wanted to ask a few questions:
You noted the non-response rate for the 20 randomly selected alumni. What about the non-response rate for the feedback survey?
"0 to 10, are you glad you came?" This is a biased question, because you frame that the person is glad. A similar negative question may say "0 to 10, are you dissatisfied that you came?" Would it be possible to anonymize and post the survey questions and data?
We also sent out a survey earlier this year to 20 randomly selected alumni who had attended workshops in the previous 3-18 months, and asked them the same question. 18 of the 20 filled out the survey, and their average response to that question was 9.6.
It's great that you're following up with people long after the workshops end. Why not survey all alumni? You have their emails.
I've read most of the blog posts about CFAR workshops that you linked to - they were one of my main motivations for attending a workshop. I notice that all of the reviews are from people who had already participated in LessWrong and related communities (all of them refer to CFAR, EA, or rationality-related topics from before they attended). Also, in-person conversations seem heavily subject to availability bias: people who attended workshops, know people who work at MIRI/CFAR, or are involved in LW meetups in Berkeley and surrounding areas would skew those conversations positive. Evaporative cooling may also play a role, in that people who weren't satisfied with the workshop would leave the group. Are there reviews from people who are not already familiar with LW or CFAR staff?
Also, I agree with MTGandP. It would be nice if CFAR could write a blog post or paper on how effective their teachings are compared to a control group. Perhaps two one-day events, with subjects randomized across the two days, would work well as a starting point.
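A minimal sketch of what the randomization and analysis for such a starting-point experiment might look like (the outcome scores and group sizes below are entirely hypothetical, and a real study would need a pre-registered outcome measure and a power analysis):

```python
import random
import statistics

def randomize(subjects, seed=0):
    """Randomly split subjects into treatment (workshop) and control groups."""
    rng = random.Random(seed)
    shuffled = subjects[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

def welch_t(a, b):
    """Welch's t statistic for two independent samples of outcome scores."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    return (mean_a - mean_b) / ((var_a / len(a) + var_b / len(b)) ** 0.5)

# Hypothetical post-event outcome scores (e.g. some habit-change measure)
treatment_scores = [7.1, 6.4, 8.0, 5.9, 7.5]
control_scores = [6.8, 6.2, 6.9, 6.1, 6.5]
print(round(welch_t(treatment_scores, control_scores), 2))
```

With samples this small the statistic is mostly illustrative; the point is that random assignment plus a pre-specified comparison is enough to start distinguishing "felt helpful" from "had an effect."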
Do you think it was unhelpful because you already had a high level of knowledge on the topics they were teaching and thus didn't have much to learn or because the actual techniques were not effective? Do you think your experience was typical? How useful do you think it would be to an average person? An average rationalist?
Do you think it was unhelpful because you already had a high level of knowledge on the topics they were teaching and thus didn't have much to learn or because the actual techniques were not effective?
I don't believe I had a high level of knowledge on the specific topics they were teaching (behavior change, and the like). I did study some cognitive science in my undergraduate years, and I take issue with the 'science'.
Do you think your experience was typical?
I believe that the majority of people don't get much, if anything, from CFAR's rationality lessons. However, after the lesson, people may be slightly more motivated in the short term to accomplish whatever they want to, simply because they've paid money for a course intended to increase their motivation.
How useful do you think it would be to an average person?
There was one average person at one of the workshops I attended - i.e., someone who had never read LessWrong or other rationality material. He fell asleep a few hours into the lesson, and I don't think he gained much from attending. I'm hesitant to extrapolate, because I'm not exactly sure what "an average person" entails.
An average rationalist?
I haven't met many rationalists, but I believe they wouldn't benefit much, if at all.
These days, most of my time on Anki is on Japanese (which I'm learning for fun) and Chinese (which I already know, but I'm brushing up on tones and characters).
Looking through my decks, I also have decks on:
- Algorithms and data structures (from a couple books I read on that)
- Communication (misc. tips on storytelling, giving talks, etc.)
- Game Design (insights and concepts that seemed valuable)
- German
- Git and Unix Command Line commands
- Haskell
- Insight (misc. stuff that seemed interesting/important)
- Mnemonics
- Productivity (notes from Lukeprog's posts and various other sources)
- Psychology and neuroscience
- Rationality Habits (one of the few decks I have that came pre-made, from Anna Salamon I think, though I also added some cards and deleted others)
- Statistics
- Web Technologies (some stuff on Angular JS and CSS that I got tired of looking up all the time)
(also a few minor decks with very few cards)
I review those pretty much every day (I sometimes leave a few unfinished, depending on how much idle time I have in queues, transport, etc.)
That's fantastic. How many cards total do you have, and how many minutes a day do you study?
What made it poor use of your time?
I didn't learn anything useful. They taught, among other things, "here's what you should do to gain better habits." I tried it and it didn't work for me. YMMV.
One thing that really irked me was the use of cognitive 'science' to justify their lessons 'scientifically'. They did this by using big scientific words that felt like an attempt to impress us with their knowledge. (I'm not sure what the correct phrase is - the words weren't constraining beliefs? They didn't pay rent? They could have made up scientific-sounding words and it would have had the same effect.)
Also, they had a giant 1-2 page list of citations that they used to back up their lessons. I asked some extremely basic questions about papers and articles from the list that I'd previously read, and they had absolutely no idea what I was talking about.
ETA: I might go to another class in a year or two to see if they've improved. Not convinced that they're worth donating money towards at this moment.
Those who are currently using Anki on a mostly daily or weekly basis: what are you studying/ankifying?
To start: I'm working on memorizing programming languages and frameworks because I have trouble remembering parameters and method names.
I'd like to ask LessWrong's advice. I want to benefit from CFAR's knowledge on improving one's instrumental rationality, but as a poor graduate student I don't have several thousand dollars in disposable income, nor a quick way to acquire it. I've read >90% of the Sequences, but despite having read lukeprog's and Alicorn's sequences, I am aware that I do not know what I do not know about motivation and akrasia. How can I best improve my instrumental rationality on the cheap?
Edit: I should clarify, I am asking for information sources: blogs, book recommendations, particularly practice exercises and other areas of high quality content. I also have a good deal of interest in the science behind motivation, cognitive rewiring and reinforcement. I've searched myself and I have a number of things on my reading list, but I wanted to ask the advice of people who have already done, read or vetted said techniques so I can find and focus on the good stuff and ignore the pseudoscience.
I've been to several of CFAR's classes over the last two years (some test classes and some more 'official' ones), and I feel it wasn't a good use of my time. Spend your money elsewhere.
Is there a listing of Yvain/slatestarcodex's fiction? I just finished reading The Study of Anglophysics, and I want more.
Thinking about a quote from HPMOR (the podcast is quite good, if anyone was interested):
...
Besides the quoted "Chimpanzee Politics" are there any other references to this hypothesis? I've tried Googling around for 5 minutes and I couldn't find anything.
Edit: it seems I was using the wrong keywords. Wikipedia has a short paragraph on the evolution of the human brain being driven by competitive social behavior, but I'd still like to see any other articles on the matter.