
Can drugs improve your rationality?

I’m not sure, but it seems likely.

Remember the cognitive science of rationality. Often, irrationality is a result of ‘mindware gaps’ or ‘contaminated mindware’ — missing pieces of knowledge like probability theory, or wrong ideas like supernaturalism. Alas, we cannot yet put probability theory in a pill and feed it to people, nor can a pill deprogram someone from supernaturalism.

Another cause of irrationality is ‘cognitive miserliness’. We default to automatic, unconscious, inaccurate processes whenever possible. Even if we manage to override those processes with slow deliberation, we usually perform the easiest deliberation possible — deliberation with a ‘focal bias’ like confirmation bias.

What will increase the likelihood of cognitive override and decrease the effect of focal biases? First, high cognitive capabilities (IQ, working memory, etc.) enable a brain to do the computationally difficult processing required for cognitive override and avoidance of focal bias. Second, a disposition for cognitive reflectiveness makes it more likely that someone will choose to use those cognitive capabilities to override automatic reasoning processes and reason with less bias.[1]

Thus, if drugs can increase cognitive capability or cognitive reflectiveness, they may be capable of increasing one’s rationality.

First: Can drugs increase cognitive capability?

Yes. Many drugs have been shown to increase cognitive capability. Here are a few of them:[2]

  • Modafinil improves working memory, digit span, visual pattern recognition, spatial planning, and reaction time.[3]
  • Because glucose is the brain’s main energy source,[4] increases in glucose availability via sugar ingestion should improve memory performance.[5]
  • Creatine improves cognitive performance.[6]
  • Donepezil improves memory performance, but perhaps only after it has been taken for 21 days.[7]
  • Dopamine agonists like d-amphetamine, bromocriptine, and pergolide have all been found to improve working memory and executive function,[8] but perhaps only in those with poor memory performance.[9]
  • Guanfacine has shown mixed effects on cognition.[10] Methylphenidate (Ritalin) has also shown mixed results for cognitive enhancement,[11] though the most commonly reported motive for illicit use of prescription stimulants like methylphenidate is to enhance concentration and alertness for studying.[12]
  • Piracetam is usually prescribed to treat cognitive deficits and other problems, but it has also shown some cognitive benefits in healthy individuals.[13]

Second: Can drugs increase cognitive reflectiveness?

I’m not sure. I’m not yet aware of any drugs that have been shown to increase one’s cognitive reflectiveness.

So, can drugs improve your rationality? I haven’t seen any experimental studies test whether particular drugs improve performance on standard tests of rationality like the Cognitive Reflection Test (CRT). However, our understanding of how human irrationality works suggests that improvements in cognitive capability and cognitive reflectiveness (via drugs or other means) should increase one’s capacity to think and act rationally. That said, current drugs probably can’t improve rationality as much as demonstrated debiasing practices can.

Should we use drugs for cognitive enhancement? Scholars debate whether such modifications to human functioning are ethical or wise,[14] but I think the simplicity of the transhumanist position is pretty compelling:

If we can make things better, then we should, like, do that.[15]

 

 

Notes

[1] For a review, see Stanovich (2010), ch. 2.

[2] For a broader overview, see de Jongh et al. (2008); Normann & Berger (2008); Sandberg (2011).

[3] Muller et al. (2004); Turner et al. (2004); Gill et al. (2006); Caldwell et al. (2000); Finke et al. (2010); Repantis et al. (2010).

[4] Fox et al. (1988).

[5] Foster et al. (1998); Sunram-Lea et al. (2002).

[6] Rae et al. (2003); McMorris et al. (2006); Watanabe et al. (2002).

[7] Gron et al. (2005).

[8] D-amphetamine: Mattay et al. (2000); Mattay et al. (2003); Barch & Carter (2005). Bromocriptine: Kimberg et al. (1997); Kimberg et al. (2001); Mehta et al. (2001); Roesch-Ely et al. (2005); Gibbs & D’Esposito (2005a). Pergolide: Muller et al. (1998); Kimberg & D’Esposito (2003).

[9] Kimberg et al. (1997); Mehta et al. (2001); Mattay et al. (2000); Mattay et al. (2003); Gibbs & D’Esposito (2005a, 2005b).

[10] Muller et al. (2005); de Jongh et al. (2008).

[11] de Jongh et al. (2008).

[12] Teter et al. (2006).

[13] Dimond & Brouwers (1976); Mondadori (1996).

[14] Savulescu & Bostrom (2009).

[15] I think I first heard Louie Helm put it this way.

 

References

Barch & Carter (2005). Amphetamine improves cognitive function in medicated individuals with schizophrenia and in healthy volunteers. Schizophrenia Research, 77: 43–58.

Caldwell, Caldwell, et al. (2000). A double-blind, placebo-controlled investigation of the efficacy of modafinil for sustaining the alertness and performance of aviators: A helicopter simulator study. Psychopharmacology (Berlin), 150: 272–282.

de Jongh, Bolt, Schermer, & Olivier (2008). Botox for the brain: Enhancement of cognition, mood, and pro-social behavior and blunting of unwanted memories. Neuroscience and Biobehavioral Reviews, 32: 760–776.

Dimond & Brouwers (1976). Increase in the power of human memory in normal man through the use of drugs. Psychopharmacology, 49: 307–309.

Finke, Dodds, et al. (2010). Effects of modafinil and methylphenidate on visual attention capacity: a TVA-based study. Psychopharmacology, 210: 317–329.

Foster, Lidder, & Sunram (1998). Glucose and memory: fractionation of enhancement effects? Psychopharmacology, 137: 259–270.

Fox, Raichle, et al. (1988). Nonoxidative glucose consumption during focal physiologic neural activity. Science, 241: 462–464.

Gibbs & D’Esposito (2005a). Individual capacity differences predict working memory performance and prefrontal activity following dopamine receptor stimulation. Cognitive & Affective Behavioral Neuroscience, 5: 212–221.

Gibbs & D’Esposito (2005b). A functional MRI study of the effects of bromocriptine, a dopamine receptor agonist, on component processes of working memory. Psychopharmacology (Berlin), 180: 644–653.

Gill, Haerich, et al. (2006). Cognitive performance following modafinil versus placebo in sleep-deprived emergency physicians: A double-blind randomized crossover study. Academic Emergency Medicine, 13: 158–165.

Gron, Kirstein, et al. (2005). Cholinergic enhancement of episodic memory in healthy young adults. Psychopharmacology (Berlin), 182: 170–179.

Kimberg, D’Esposito, & Farah (1997). Effects of bromocriptine on human subjects depend on working memory capacity. Neuroreport, 8: 3581–3585.

Kimberg, Aguirre, et al. (2001). Cortical effects of bromocriptine, a D-2 dopamine receptor agonist, in human subjects, revealed by fMRI. Human Brain Mapping, 12: 246–257.

Kimberg & D’Esposito (2003). Cognitive effects of the dopamine receptor agonist pergolide. Neuropsychologia, 41: 1020–1027.

Mattay, Callicott, et al. (2000). Effects of dextroamphetamine on cognitive performance and cortical activation. Neuroimage, 12: 268–275.

Mattay, Goldberg, et al. (2003). Catechol O-methyltransferase val158-met genotype and individual variation in the brain response to amphetamine. Proceedings of the National Academy of Sciences USA, 100: 6186–6191.

McMorris, Harris, et al. (2006). Effect of creatine supplementation and sleep deprivation, with mild exercise, on cognitive and psychomotor performance, mood state, and plasma concentrations of catecholamines and cortisol. Psychopharmacology, 185: 93–103.

Mehta, Swainson, et al. (2001). Improved short-term spatial memory but impaired reversal learning following the dopamine D(2) agonist bromocriptine in human volunteers. Psychopharmacology (Berlin), 159: 10–20.

Mondadori (1996). Nootropics: Preclinical results in the light of clinical effects; comparison with tacrine. Critical Reviews in Neurobiology, 10: 357–370.

Muller, von Cramon, & Pollmann (1998). D1- versus D2-receptor modulation of visuospatial working memory in humans. Journal of Neuroscience, 18: 2720–2728.

Muller, Steffenhagen, et al. (2004). Effects of modafinil on working memory processes in humans. Psychopharmacology, 177: 161–169.

Muller, Clark, et al. (2005). Lack of effects of guanfacine on executive and memory functions in healthy male volunteers. Psychopharmacology (Berlin), 182: 205–213.

Normann & Berger (2008). Neuroenhancement: status quo and perspectives. European Archives of Psychiatry and Clinical Neuroscience, 258 Supplement 5: 110–114.

Rae, Digney, et al. (2003). Oral creatine monohydrate supplementation improves brain performance: a double-blind, placebo-controlled, cross-over trial. Proceedings of the Royal Society of London Series B, Biological Sciences, 270: 2147–2150.

Repantis, Schlattmann, et al. (2010). Modafinil and methylphenidate for neuroenhancement in healthy individuals: A systematic review. Pharmacological Research, 62: 187–206.

Roesch-Ely, Scheffel, et al. (2005). Differential dopaminergic modulation of executive control in healthy subjects. Psychopharmacology (Berlin), 178: 420–430.

Sandberg (2011). Cognition enhancement: Upgrading the brain. In Savulescu, ter Meulen, & Kahane (eds.), Enhancing Human Capacities. Wiley-Blackwell.

Savulescu & Bostrom (2009). Human Enhancement. Oxford University Press.

Stanovich (2010). Rationality and the Reflective Mind. Oxford University Press.

Sunram-Lea, Foster, et al. (2002). Investigation into the significance of task difficulty and divided allocation of resources on the glucose memory facilitation effect. Psychopharmacology, 160: 387–397.

Teter, McCabe, et al. (2006). Illicit use of specific prescription stimulants among college students: Prevalence, motives, and routes of administration. Pharmacotherapy, 26: 1501–1510.

Turner, Clark, Dowson, Robbins, & Sahakian (2004). Modafinil improves cognition and response inhibition in adult attention-deficit/hyperactivity disorder. Biological Psychiatry, 55: 1031–1040.

Watanabe, Kato, et al. (2002). Effects of creatine on mental fatigue and cerebral hemoglobin oxygenation. Neuroscience Research, 42: 279–285.

Comments (122)

"Drug X improves performance measure Y" will in general be an incomplete description of the effects of drug X.

To be a rationalist is to be the kind of person who mentally adds "among other as yet undiscovered effects" to every single bullet point above.

Upvoted for naming what was bothering me.

Of course I imagine some drugs are rather well understood by now. But Lukeprog's post doesn't seem to touch on the safety and potential downsides of taking this stuff, which would be useful.

(Also, creepy pill-man is creepy.)

5[anonymous]12y
What makes that mental addition a "rationalist" thing to do, rather than simply a good thing to do?
4NancyLebovitz12y
It's specifically about having a more accurate model of the universe. It's not the same sort of thing as brushing your teeth, even though that's also a good thing to do.
2NancyLebovitz12y
General principle: definitions put a thing into a category, and then explain how that thing is different from other things in the same category. I don't think definitions are how people generally use words -- prototype theory seems more accurate. Prototype theory says that people have best examples of concepts, and then rank actual things according to how close they are to the prototype. It would be nice to have a theory about how to decide when to use definitional thinking and when to use prototypes, but I don't have one.
1Morendil12y
It's a five-second skill - you have to train yourself to do it.
3DanielLC12y
And effects that lukeprog didn't bother to state.
1Morendil12y
Those count as "undiscovered" too - undiscovered by at least me. :) This article cited by Luke has more nuanced appreciations of drugs like Donepezil, and generally a more balanced take on the subject. For instance, they report that The same article goes on to suggest that perhaps 14 days is too short a timeframe for the beneficial effects to be felt. However one can also find studies like this one (not cited by Luke) which show detrimental effects on cognition in healthy subjects over four weeks of treatment. Neither Luke nor de Jongh et al. report on the frequent side-effects, which (Wikipedia says) include bradycardia, nausea, diarrhea, anorexia, abdominal pain, and vivid dreams.
35[anonymous]12y

I'd like to share one day's worth of experience with modafinil.

I noticed a huge difference in alertness. I was filled with an urge to be doing something every second. I don't believe I was more intelligent (some of the work I did that day turned out to be low quality) but I was much more productive. And happy. I felt like I was just "riding the day" -- that going through life, minute by minute, running errands, checking items off my to-do list, and seeing what happened next, was boundlessly fascinating.

I suspect that, at least for me, and maybe for others, most unhappiness is really fatigue, coupled with the guilt of not having accomplished much in a state of fatigue. Simply not being tired makes me deliriously happy. I am not surprised by the study that coffee reduces depression in women, though I know to be suspicious of medical study methodology. The symptoms of clinical depression look a lot like the symptoms of chronic sleep deprivation (fatigue, inability to concentrate, clumsiness, weight gain or weight loss, dramatic and irrational emotions). It's possible that some people with symptoms of depression are actually sleep deprived (or that a typical amount of sleep for a modern-day working or student life is too little for their biological needs.) I had a year when I thought I was losing my mind; in retrospect, it may have had something to do with getting no more than five hours of sleep a night.

9Nisan12y
This, with some qualifiers, beautifully describes my experience.
7Swimmer963 (Miranda Dixon-Luinenburg) 12y
Five hours of sleep a night for a whole year? I'm amazed you functioned! One five-hour night and I'm moderately functional, maybe a slightly shorter attention span and more mood swings than usual. Two nights in a row and I'm a zombie unless I drink a lot of coffee. Three nights and I'm a zombie anyway no matter how much coffee I drink. Unless I get 9+ hours of sleep every night, I will feel sleepy at various points during the day.
3vi21maobk9vp12y
It is highly personal. 9+ hours of sleep per night for a month will probably make me feel bad. An average of 6 hours per night may be slowly wearing me out, but that rate seems to be sustainable indefinitely. But then, if I do not do anything stressful, I can do with 4 hours per night for a month.
2matt12y
4.5hrs of sleep every 24 on everyman 3 since January and I've never felt better! [full disclosure: the first couple of months were tough and involved much experimentation with schedules close to everyman 3.]
4Crux12y
Never felt better? Do you do any hard exercise?
0matt12y
Not "hard". A Four Hour Body-inspired exercise routine. I'm fit and healthy with as little exercise as I can get away with (pushups, situps, etc. 3 days per week; 2km walk with sprints 3 days per week).
0[anonymous]12y
Do you do anything hard involving your long-term memory? Do you use spaced repetition, and if so, has it suffered?
2matt12y
I'm a programmer and manager of programmers. I don't use spaced repetition (I mean to… I've cron'd it to open every morning… but I close it every morning that I figure I don't have time… and that's every morning). I've not noticed any memory deficit. I think that amounts to: no information.
0Swimmer963 (Miranda Dixon-Luinenburg) 12y
Neat. However, how regimented does your sleep schedule have to be in order for it to work? (My main problem with sleeping enough isn't that I have trouble going to bed early enough, like seems to be true for a lot of people... It's that some days I have shifts at work that start at 6 am and then I'm busy until 10 pm, and some days I get home after 11 pm and have to work at 6 am the next day, and somehow even though I sleep 8-10 hours a night on the other days, I never really seem to catch up.) (Also, I can't nap during the day, at least not on demand. I taught myself to do it a bit during first-year university, but my schedule no longer allows napping anyway.)
0matt12y
I can usually move naps ±90 minutes with very little negative consequence (±30mins with no consequences). I can skip a nap with coffee at the cost of adding an extra hour of sleep the following night (I had to give up coffee to make normal naps work - trace caffeine doesn't stop me from napping, but does stop the naps from being effective). Re: "can't nap during the day… on demand" - the adaptation period will fix that.
1juliawise12y
What are "irrational emotions"?
0AndHisHorse11y
An emotion is irrational if it is not appropriate to the situation - for example, social anxiety is irrational if it causes one to avoid pursuing some social opportunities which have a positive expected value (for any utility function, which may or may not carry a heavier penalty for failure than a bonus for success).
1Lumifer11y
Who decides (and how) which emotion is appropriate to which situation?
4AndHisHorse11y
See above. If your emotional state (and I assume the ability to distinguish a state of heightened emotion from a resting state) causes you to act in ways which do not reflect your evidence-based assessments, it is causing you to act against your rational decisions and is therefore irrational.

I would say that the ability to make this judgement belongs to the best-informed rationally-acting observer: someone who has knowledge of your mental state in both emotional states and who can, from the available evidence, estimate whether or not the difference in behavior can be attributed to emotional causes. This observer may very well be you yourself, in a resting state, once you have regained your perspective, as you have a lot more information on your own mental state.

To expand on the example I gave above, someone experiencing social anxiety may suddenly focus on the various ways in which a social interaction can go horribly wrong, even if these futures are not very probable. Basically, anxiety hijacks the availability heuristic, causing an overestimation of the probability of catastrophe. Because this adjustment in probability is not based in evidence (though this point could be argued), it is irrational.

This definition of "irrational emotions" does not depend on the utility function used. If someone weights failure more heavily than success, and will go home unhappy at the end of the night if they have 9 successful conversations and 1 boring dud, they are not necessarily irrational. However, if on previous nights they have gone 10 for 10 with substantial frequency, and before entering a conversation they freeze in fear - then their expected value has changed without sufficient reason. That is irrational emotion.
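
To make the last point concrete, here is a toy sketch in Python of how an inflated probability estimate alone can flip a decision. All the numbers (utilities and probabilities) are invented for illustration; they are not drawn from the comment:

```python
def expected_value(p_bad, loss, gain):
    """EV of attempting an interaction: gain with probability (1 - p_bad),
    loss with probability p_bad."""
    return (1 - p_bad) * gain + p_bad * loss

# Evidence-based estimate: duds are rare; payoffs are the same either way.
print(expected_value(p_bad=0.1, loss=-5, gain=2))  # +1.3 -> worth attempting

# Anxiety-inflated estimate of the identical situation:
print(expected_value(p_bad=0.6, loss=-5, gain=2))  # -2.2 -> freeze
```

The payoffs never change; only the probability estimate does, which is exactly the sense in which the emotion, rather than the utility function, is doing the irrational work.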
-5Lumifer11y
1Dorikka12y
Do you take modafinil on a regular basis? If not, what made you choose not to, given your positive experience? If so, have you noticed any other effects that would be good to note?
4[anonymous]12y
I had a one-time trial and I'm planning to see my doctor for more as soon as I can.

If you don't mind sharing, how do you plan to do this? Is it as simple as "this controlled substance makes my life better, will you prescribe it for me?" Or are you "fortunate" enough to have a condition that warrants its prescription?

I ask because I've had similar experiences with Modafinil (my nickname for it is "executive lubricant"), and it is terribly frustrating to be stuck without a banned goods store.

1Douglas_Knight12y
I'm skeptical of this. Yes, five hours of sleep is bad for your mental health, but usually in a different direction. Did you have depressive symptoms that year? A key symptom of depression is lack of willpower - depressives don't normally have the willpower not to sleep. Quite the opposite, they sleep more the than normal. This would solve simple sleep deprivation. It's possible that they lack something more specific that normal people are able to get by sleeping, but even that does not sound terribly likely to me. ETA: As various people comment, this is largely backwards. I particularly regret suggesting that people who spend a lot of time in bed get useful sleep. So maybe sleep deprivation contributes to some of the symptoms of depression. But there are other symptoms and I am skeptical that the two are confused.

A key symptom of depression is lack of willpower - depressives don't normally have the willpower not to sleep.

For me personally, and I suspect also for a significant number of other people, it takes willpower to go to sleep as well as to wake up early enough. In the morning, the path of least resistance for me is to sleep in, but in the evening, it is to do something fun until I'm overcome with overwhelming sleepiness, which won't happen until it's far too late to maintain a normal sleeping schedule. Therefore, if I were completely deprived of willpower, my "days" would quickly degenerate into cycles of much more than 24 hours, falling asleep as well as waking up at a much later hour each time.

Now, the incentive to wake up early enough (so as not to miss work etc.) is usually much stronger than the incentive to go to bed early enough, which is maintained only by the much milder and more distant threat of feeling sleepy and lousy next day. So a moderate crisis of willpower will have the effect of making me chronically sleep-deprived, since I'll still muster the willpower to get up for work, but not the willpower to go to bed instead of wasting time until the wee hours.

(This is exacerbated by the fact that when I'm sleep-deprived, I tend to feel lousy and wanting to doze off through the day, but then in the evening I suddenly start feeling perfectly OK and not wanting to sleep at all.)

9Jordan12y
I suffer from this as well. It is my totally unsubstantiated theory that this is a stress response. Throughout the whole day your body is tired and telling you to go to sleep, but the Conscious High Command keeps pressing the KEEP-GOING-NO-MATTER-WHAT button until your body decides it must be in a war zone and kicks in with cortisol or adrenaline or whatever.
9taelor12y
This has been my experience as well.
3multifoliaterose12y
Me too!
8Scott Alexander12y
Depressed people can have either insomnia or hypersomnia; insomnia is significantly more common. Depression-related insomnia is usually "terminal" - people wake up very early and can't get back to sleep. Strangely enough, there have been some studies showing that depriving depressed people of sleep has a strong positive effect on their mood, but of course then they're too sleep-deprived to enjoy it.
4Swimmer963 (Miranda Dixon-Luinenburg) 12y
Actually, according to my nursing textbooks, depression can manifest either by sleeping more or less than usual. So five hours of sleep a night could, for some people, be a symptom of depression. And I do remember reading somewhere about first-year college or university students developing clinical depression after a few months of unaccustomed stress and lack of sleep. And for most university students, it probably takes willpower to go to bed early, since nearly everyone I know who is my age seems to be on a longer-than-24-hour natural sleep schedule. So lack of sleep could cause depression, although once you were depressed, you might find yourself wanting to sleep more (and having an even harder time keeping up with classes). Personal anecdote: long periods of sleep deprivation can mess up your neurotransmitter levels enough to cause an episode of psychosis. This actually happened to one of my good friends. (When you're waking up at 4:30 am every day for swim practice, and staying up late for whatever reason including just wanting to have a life, sleep deprivation can very quickly get out of hand.) You probably have to be genetically predisposed, but still...it scares me.
4Vaniver12y
It seems likely that this is a combination of youthful endurance plus a lack of night cues (computer screens make fake-sunlight at any time of the night), rather than young people actually having a circadian rhythm that's longer by hours.
9gwern12y
I disagree. The circadian rhythms of students in middle school and up are very well established; please see all the links & citations in http://www.gwern.net/education-is-not-about-learning#school-hours That it is not a mere preference but a biological reality is one of the reasons I regard melatonin as so useful - fight fire with fire. EDIT: Of course, it's also true that artificial light and computer screens are not helpful in the least: see the second paragraph in http://www.gwern.net/Melatonin#health-performance So you might say for young people, it's a many-edged problem: they naturally want to go to bed late, their electronic devices exacerbate the original biological problem, and then all the social dynamics can begin to contribute their share of the problem...
4Douglas_Knight12y
I think Vaniver is objecting to the narrow claim of a cycle longer than 24 hours. Without clicking through on your sources, they seem to say that teens have a shifted cycle, not a longer cycle. In particular, that shifting school later improves sleep suggests that teens have a shifted cycle. If they had an unmoored cycle of longer than 24 hours, the greater light exposure of an earlier start would probably be better.
0Vaniver12y
Douglas_Knight is correct; I'm not challenging "young people want to go to bed late and get up late" but "young people want to sleep six times a week rather than seven" (or, more reasonably, 13 times every two weeks).
5Swimmer963 (Miranda Dixon-Luinenburg) 12y
I do remember reading in a variety of places that young people, especially teenagers, tend to have more trouble sticking to an earlier sleep schedule. But you're right that this isn't necessarily biological in origin. It could just be that young people have a) greater benefits to gain from staying up late, since that's when a lot of socializing takes place, and b) less practice using willpower to force themselves to go to bed, and maybe less incentive, since with their "youthful endurance" they can push through on 2-3 hours of sleep. And being able to do this, or for example get really drunk and still make it to work early the next morning, is definitely a status thing that people are almost competitive about. Maybe some kind of signalling at work, too: "I'm so healthy and strong, I can afford to get really, really drunk and hardly get any sleep and still function...I must have awesome genes." That could explain how being a complete idiot and passing out on my friend's floor in front of my supervisor when I had an exam the next day somehow made me cooler to all the staff.
1[anonymous]12y
I don't think it's everybody -- certainly there are cases of severe depression where the person sleeps 20 hours a day. Maybe it's more that sleep deprivation can masquerade as depression. That is, if you're tired, slow, unmotivated, hopeless, lethargic, plunged in gloom, and you're sleeping four or five hours a night, your problems might be related to your sleep patterns.
0Douglas_Knight12y
Sure, fatigue can cause unhappiness, but I don't think it looks like clinical depression. You seem to be holding yourself up as an example. Did anyone think you clinically depressed when sleep deprived?

I've been self-experimenting with piracetam the past few months.

I usually study from a site called USMLEWorld with a selection of difficult case-based medical questions. For example, it might give a short story about a man coming into a hospital with a certain set of symptoms, and explain a little about his past medical history, and then ask multiple choice questions about what the most likely diagnosis is, or what medication would be most helpful. These are usually multi-step reasoning questions - for example, they might ask what side effect a certain patient could expect if given the ideal treatment for his disease, and before answering you need to determine what disease he has, what's the ideal treatment, and then what side effects that treatment could cause. My point is they're complicated (test multiple mental skills and not just simple recall) and realistic (similar to the problems a real doctor would encounter on the job).

I've tried comparing my performance on these questions on versus off piracetam. My usual procedure is to do twenty questions, take 2400 mg piracetam + 600 mg lecithin-derived choline, go do something fun and relaxing for an hour (about the time I've been to... (read more)

Wouldn't a comparison between control-then-piracetam days with control-then-control days tell us a bit more about how effective piracetam is, accounting for possible fatigue?

0D_Malik10y
The main claimed benefit for piracetam is not backwards recall right after supplementation; this seems to be a benefit, but it's small. The main claimed benefit is reduction of long-term cognitive decline with high-dose piracetam over time. See for instance http://examine.com/supplements/Piracetam/#main_clinical_results . (You probably know this; this is directed at the other people reading your comment.)
0Alexei12y
I've taken Piracetam + Choline combination daily for a week (twice) and I've never noticed any positive effects. If anything, I was more irritated and prone to head-aches. Although, I didn't have a solid method of measuring the difference like you, so this is purely anecdotal.
0Jayson_Virissimo12y
If you get headaches you should probably up the choline dosage or use a more bioavailable form like CDP choline.

Lukeprog, I noticed in your last two posts you've used a stock photo to represent the subject of the post. I may be different from everyone else, but despite the usefulness of this design choice, I associate it with probloggers or whatever you would call them. So, personally (and this is only personal taste), I would try to use them very sparingly. I hope you don't mind my suggestion.

You aren't the target audience for the stock photo; it's a random person seeing Less Wrong for the first time. People like pictures.

3eugman12y
I felt I was quite humble in giving my opinion (or maybe just self-effacing). Still, I'm willing to logically concede the point.
2Kevin12y
Yes, you were, my slightly antagonistic tone was because there have been highly modded comments saying the same thing in several previous Lukeprog posts and I thought it was getting old.
5eugman12y
Upvoted for owning up to things. I wish I had known someone else had made the same argument, or I wouldn't have posted anything. Thanks!
3printing-spoon12y
Yeah, and the texture in this picture makes my skin crawl. The pills look like growths or something.

Lately I've been extraordinarily surprised at how effective potassium and potassium salt are. By which I mean that simple potassium is probably the most positively mind altering supplement I've ever tried.

7wedrifid12y
Wait... Potassium AND potassium salt? You have actually tried using non-salt forms of potassium as a mind altering supplement? That's seriously badass!
2Kevin12y
I told you I was hardcore.
2Scott Alexander12y
In what form did you take the potassium?
3Kevin12y
Potassium chloride initially, but I've since bought some potassium gluconate pills... research indicates you don't want to consume large amounts of chloride (just moderate amounts).
2anonym12y
Please elaborate. In what ways have you found it to be mind-altering?
9Kevin12y
About 15 minutes after consumption, it manifests as a kind of pressure in the head or temples or eyes, a clearing up of brain fog, increased focus, and the kind of energy that is not jittery but the kind that makes you feel like exercising would be the reasonable and prudent thing to do. I have done no tests, but "feel" smarter from this in a way that seems much stronger than piracetam or any of the conventional weak nootropics. It is not just me -- I have been introducing this around my inner social circle, and 7 of the 10 people felt immediately noticeable effects. The 3 that didn't notice much were vegetarians and less likely to have been deficient. Now that I'm not deficient, it is of course not noticeable as mind altering, but still serves to be energizing, particularly for sustained mental energy as the night goes on.
1pjeby12y
How much did you take?
3Kevin12y
Initially 1 teaspoon of potassium salt in water.
5Kevin12y
Note: I now consider 1 teaspoon at once to be a dose larger than necessary. I only recommend such a large dose at once if it is important to you to be able to viscerally sense potassium coursing through your body. I'd recommend drinking at least 24 ounces of water with that much potassium. Definitely, definitely don't eat it straight with no water.
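
For scale, here is the rough arithmetic behind "larger than necessary", as a quick sketch; the per-teaspoon mass and the adequate-intake figure are assumptions I'm supplying, not numbers from the comment:

```python
# Back-of-the-envelope potassium arithmetic.
KCL_GRAMS_PER_TSP = 6.0          # assumed mass of a level teaspoon of KCl
ADEQUATE_INTAKE_G = 4.7          # commonly cited adult adequate intake per day
K_MASS_FRACTION = 39.10 / 74.55  # molar mass of K over molar mass of KCl

k_grams = KCL_GRAMS_PER_TSP * K_MASS_FRACTION
print(f"~{k_grams:.1f} g elemental potassium per teaspoon of KCl")
print(f"that is ~{k_grams / ADEQUATE_INTAKE_G:.0%} of a day's adequate intake at once")
```

Under those assumptions a single teaspoon delivers on the order of two-thirds of a day's potassium in one swallow, which is consistent with the advice above to spread smaller doses through the day.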
0Nick_Tarleton12y
How long does the increased energy last?
1Kevin12y
A few hours? But though I'm recommending the one teaspoon initially for people to be able to viscerally feel the effects of potassium, I've switched to trying to keep my dosage lower and steadier throughout the day.
2gwern12y
Well, I can't say I've heard that one before; resources? And are you sure you simply didn't have a deficiency? Fixing a deficiency can have dramatic effects but be useless for everyone else (eg. that LWer who fell in love with sulbutiamine because he had B-vitamin problems, even though sulbutiamine is only a very mild stimulant to me).
0Kevin12y
I'm sure I was deficient but most people are deficient if they don't eat fruits and vegetables as a voluminous staple of their diet.

Having experimented with nootropics (using gwern's site as a guide), I can report there is little exciting in the way of "being smarter" - but there is plenty of low-hanging fruit in the stimulants! Being more alert and motivated is a pretty good proxy for being smarter to boot.

Maybe also this: Single Dose of 'Magic Mushrooms' Hallucinogen May Create Lasting Personality Change

A single high dose of the hallucinogen psilocybin, the active ingredient in so-called "magic mushrooms," was enough to bring about a measurable personality change lasting at least a year in nearly 60 percent of the 51 participants in a new study, according to the Johns Hopkins researchers who conducted it.

From the abstract:

A large body of evidence, including longitudinal analyses of personality change, suggests that core personality traits are predominantly stable after age 30. To our knowledge, no study has demonstrated changes in personality in healthy adults after an experimentally manipulated discrete event. Intriguingly, double-blind controlled studies have shown that the classic hallucinogen psilocybin occasions personally and spiritually significant mystical experiences that predict long-term changes in behaviors, attitudes and values. (...) Consistent with participant claims of hallucinogen-occasioned increases in aesthetic appreciation, imagination, and creativity, we found significant increases in Openness following a high-dose psilocybin session. In partici

... (read more)

My own impression on reading that yesterday was that your average LWer doesn't really need Openness; what we need is Conscientiousness!

EDIT: I've posted an article based on Spent dealing with Openness: http://lesswrong.com/lw/82g/on_the_openness_personality_trait_rationality/

2Metus12y
Now if we only had a drug that increases conscientiousness.
6wedrifid12y
Stimulants in general. And most (other) things that increase dopamine or norepinephrine can be expected to help to some extent. Pramiracetam. Many anabolic steroids increase motivation as a side effect, a significant component of conscientiousness.
3NancyLebovitz12y
I think amphetamines can do that, at least for people with ADD. Is anything known about a physical basis for conscientiousness?
2VincentYu12y
DeYoung and Gray (2009) wrote a review on the neuroscience of the Big Five traits in The Cambridge handbook of personality. The two relevant paragraphs on conscientiousness: It seems like high levels of serotonin and blood-glucose are associated with high levels of some specific facets of conscientiousness.

The statistics in the linked paper are very badly done: see Does psilocybin cause changes in personality? Maybe, but not so fast.

Creatine improves cognitive performance.

Isn't this primarily true for vegetarians? I was under the impression that most people have all the creatine their brains can make use of.

1gwern12y
Not just vegetarians; if you had clicked through to my page, you'd see my summary:
3Vaniver12y
I did click through to your page; I decided not to quote it directly, which was a mistake. My impression is that of LWers, vegetarians are the most common group (though perhaps there are lots of sleep-deprived people). Overall, I was disappointed with taking a qualified statement ("creatine deficiency causes intelligence problems; make sure you have enough") and turning it into an unqualified statement ("creatine improves cognitive performance").
0gwern12y
But still not very common. Vegetarian LWers would be, what, 10% maybe? (Not sure any surveys have covered it, but I don't see it discussed very often). /shrug That's Luke's description, not mine. I've edited the page to include specific citations for each group and some PDF links, incidentally.

What about assuefaction (habituation)? Drugs to which the brain can adjust and compensate don't seem to offer good long-term improvement.

I decided to try some of the suggestions here. There was a piracetam powder I ordered. How on earth are you supposed to ingest that crap?!! I have such a powerful negative taste reaction, even disguising the ~2g in 3 glasses of water, or 2 glasses of milk, or in a mouthful of food (though it suggests consuming on an empty stomach)... that even if it was prescribed by a doctor to cure aging, I'd be hard-pressed to take the recommended dosage on a daily basis. What can I do to continue this experiment without having to annihilate my taste buds?

1gwern12y
You mustn't have done much reading about piracetam, because everyone complains about the taste - it's impressively nasty, isn't it? (BTW, if you purchased the powder, I guess you noticed the price difference between the powders and the pills; you should have wondered why there was such a price difference and not then been surprised at the taste.) Anyway, what you can do about it:
1. hide it with citrus fruit juices (eg. unsweetened grapefruit juice)
2. cap the piracetam powder (might be a bit expensive if you don't already own a capsule machine and empty pills)
3. 'parachute' (make pills using toilet paper)
2pengvado12y
I would have expected any price difference to have something to do with the cost of making pills. If that's not the case... is there a competitive market in powder but not a competitive market in pills, or do all the sellers agree on this same method of price discrimination, or what?
1gwern12y
There's usually a cost to the convenience of pills, yes; but the greater the difference, the more convenience is being provided (because otherwise people would just cap their own or not buy the substance at all). Piracetam seems to have unusual differentials, pointing to some greater convenience being provided - which I believe to be related to its revolting taste.
0wedrifid12y
I'm inclined to agree. A large difference between the price of pills and the price of the powder tells us a whole lot more about the depth of the market than about taste. If the market for piracetam were large one would far more closely track the other.
0beriukay12y
Grapefruit juice worked great! I also tested and found V8 to work pretty well. The powder floated on the top and I barely noticed it, even using just half a glass.
0beriukay12y
You are correct. I basically wiki'd it, glanced at some of the LW material, browsed amazon and saw largely positive reviews for the product. Maybe those people all have capsule-makers. I'll report back with my experiences with citrus and/or parachuting later. Thanks for the tips!
0[anonymous]12y
Can't you just chew on a cracker, spit that out, and make a 'pill' out of that? Though I suppose some people might find that more unappetizing than eating paper. [edit: never mind, grandparent discussed eating it with food and said it was suggested against.]

The post would be more actionable if you could give a bit more analysis on what side effects are reported for each substance.

Could you expand the "CRT" initialism? I'm not finding it in the linked text on a quick scan. Thanks!

1lukeprog12y
Ah. That was a copy-paste fail. The link is now fixed.
2Paul Crowley12y
Thanks! Think it might still be best to expand the initialism in the text, but now I know what you mean.

What about improving rationality with neurofeedback? The theory is that if you can see some kind of representation of your own brain activity (EEG for example), you should be able to learn to modify it. It has been shown that people could learn to control pain by watching the activity of their pain centres (http://www.newscientist.com/article/mg18224451.400-controlling-pain-by-watching-your-brain.html). Neurofeedback is also used to treat ADHD, increase concentration, and "it has been shown that it can improve medical students' memory and make them feel calmer before exams."

5GilPanama12y
I did quite a bit of EEG neurofeedback at the age of about 11 or 12. I may have learned to concentrate a little better, but I'm really not sure. The problem is that once I was off the machine, I stopped getting the feedback! Consider the following interior monologue: "Am I relaxing or focusing in the right way? I don't have the beeping to tell me, how do I know I am doing it right?" In theory, EEG is a truly rational way to learn to relax, because one constantly gets information about how relaxed one is and can adjust one's behavior to maximize relaxation. In practice, I'm not sure if telling 12-year-old me that I was going to have access to electrical feedback from my own brain was the best way to relax me. The EEG did convince me that physicalism was probably true, which distressed me because I had a lot of cached thoughts about how it is bad to be a soulless machine. My mother, who believed in souls at the time, reassured me that if I really was a machine that could feel and think, there'd be nothing wrong with that. I wonder how my rationality would have developed if, at that point, she had instead decided to argue against the evidence?
0[anonymous]9y

There are several key reasons that rationalists may underestimate the dangers of drugs. I can also think of one good reason, other than the obvious one (which is the application of rationality techniques including calling on helpful social influence)

hypothesised risk factors

  1. There is a big distance between the kinds of inferences that can be made from the health literature and from popular social commentary. Rationalists may be biased toward basing their decision to use or continue to use drugs on medical evidence, without incorporating evidence from commo

... (read more)
[This comment is no longer endorsed by its author]

Well, it's not surprising that drugs can help with cognition. But we have to be very careful about two things: the effects they have on other parts of the body, and the long-term effects, both on the body and on the brain itself.

The human body is a very complex and delicate machine, and the human brain the most delicate part of it... it's very easy to create long term problems in it by trying to push it a bit too much. Just look at professional athletes, and how badly damaged they are after a few years of taking drugs to enhance their performance.

T... (read more)

5wedrifid11y
The analogy works well when considering stimulants. However, when considering drugs or supplements that are neuroprotective or that actively promote neurogenesis, the analogy becomes fallacious. Cerebrolysin, for example, is more analogous to opening up your computer and replacing the CPU and RAM with more powerful and more reliable components. Sure, it is invasive and requires caution and knowledge to do, but the life expectancy of the core components is increased, not decreased.
4AndHisHorse11y
I am of the impression that the reason for the health problems of professional athletes is the degree to which they push their bodies (which, perhaps, might not be possible/feasible without drugs and supplements) rather than a direct effect from the drugs themselves. Further, while I share your caution regarding the risks of causing damage to the body or brain through some unknown mechanism or weakness, there is a point at which I believe people would be best advised to take supplemental drugs. Further, whether or not you have a problem depends on your reference point: as a young man of moderate resources in a developed country, I am not of below-average health for the human race, but I am also not optimizing my physical and mental faculties. (For the time being, anyway). And there may be some drugs which might reasonably be expected to provide a health benefit which outweighs the probability of "increasing bugs", which it would be rational to take given all but the most extremely loss-averse utility functions. I would say that a superior metaphor would be upgrading your CPU - the process may have unintended side effects, but it may not, and there is fair evidence that it will have some positive outcomes. The difficulty lies in weighing these expectations, which I think is inhibited by setting a hard limit.
5wedrifid11y
Your impression approximately matches my research.
3Lumifer11y
"Drug" is a fuzzy concept. Specifically, I don't see a well-defined boundary between "drugs" and "food" (and/or drink). Obvious substances that straddle that boundary are psychoactives -- coffee, alcohol, qat, etc. But if we think about human biochemistry, I can affect my metabolism -- pretty radically, too -- purely by varying my diet. For example, I can switch my body into ketosis by not eating carbs. No drugs involved, and yet I am seriously messing with the "very complex and delicate machinery" of my body. Add exercise. "Runners high" is well-known phenomenon. Is running a drug? Add lifestyle, etc. Stress, sleep patterns, etc. all strongly affect your body. So what's so special about pills and capsules that you have to be so very careful about taking them, while the remaining multitude of way to affect your body and mind gets a free pass?
0AndHisHorse11y
Because pills and capsules are things which have substantial effects on the human body in ways which bypass common natural pathways. Variations in diet (to an extent) and lifestyle changes (of certain kinds) were common in the ancestral environment, which means that they have been reliably and exhaustively tested on the human race over the course of our entire history as a species.

There are some artificial things which have been well-tested. Alcohol, for example, has effects which are well-known; enough people make use of it that there have been substantial incentives to research it exhaustively, at least to the point where we feel fairly comfortable imbibing it in moderation without fear of catastrophic side effects. The same goes for a lot of other substances, some of which are natural (I would assume that natural psychoactive substances were more likely to be discovered before societies grew very picky about what they put in their bodies, i.e. before the introduction of many regulatory bodies).

The issue with the remaining substances is that we don't have enough knowledge about (some of) them to justify what could be existential risk. There have been enough drug recalls over the years that it has become apparent that a successful clinical trial, possibly funded by the same company which has an incentive to bring the drug to market quickly, is not sufficient evidence to dismiss the possibility. As a result, it is not irrational to take extra care - acquiring extra information and erring on the side of the status quo (which happens to include oneself, currently still alive and well).
-1Lumifer11y
That is not true with respect to a large part of the contemporary Western diet. Things like refined sugar, hydrogenated oils, and a wide variety of food preservatives, flavourings, and colorings are new; they appeared an instant ago on the evolutionary time scale. To give a basic example, take a look at the ingredients of Coke: high-fructose corn syrup, phosphoric acid, caramel color, caffeine -- I don't think you can make an argument that humans evolved to drink this.

The same goes for lifestyle: sitting pretty motionless on a chair for 10+ hours per day is not something evolution prepared our bodies for.

My point is precisely that people in the Western world routinely consume large amounts of these "remaining substances" without a second thought. Why is eating hydrogenated soybean oil, for example, not risky?

By the way, do you consider over-the-counter supplements drugs? Do your arguments apply to them?
0AndHisHorse11y
That is why I included these qualifiers. Things such as alcohol and relatively sedentary lifestyles are either common enough to be well-studied, or pervasive enough to be unavoidable. There are some risks that come with our environment that we do not evaluate in the same way as we evaluate the choice to start a new medication, because the costs of disentangling ourselves from these incredibly common things are higher (in what I estimate to be a typical human utility function with normal time-discounting; your results may vary) than the opportunity costs of declining to try a new supplement. Further, there is a sort of natural experimentation occurring with these substances which a large number of people consume; if there are substantial negative side effects to them, odds are good that they will become obvious in others before they become a problem for some given person. We reassure ourselves that, since this has not happened, we have some fairly decent evidence that these popular substances are not terrible. New, rare, and unpopular drugs do not have this "natural experiment" advantage.
0Lumifer11y
You're basically making an argument against anything "new, rare, and unpopular", but that argument applies equally well to drugs, food, and lifestyle. Remember the original issue? "Drugs are risky", but what is a drug? If I decide that ketosis is great and convert my diet to 80% saturated fat, is that less risky than starting to take a baby aspirin per day just because the first is "food" and the second one is a "drug"? If I decide to take doses of naringin that's dangerous because naringin is a drug, right? But if I eat a lot of grapefruits to get an equivalent dose, that's OK because grapefruits are food?
2AndHisHorse11y
I wouldn't argue against taking an aspirin a day any more than I would argue against converting your diet to 80% saturated fats; both aspirin and saturated fats are commonly ingested substances. If you decide to take a supplement which is found in natural foods, I would not assign that any more risk than eating the equivalent amount of food. Either way, the issue would seem to be in the dosage, provided that the food has been proven safe. If it takes 100 grapefruits to equal a single dose of naringin, however, I would be worried - because you are consuming it in excess of what would ordinarily be expected.

The reason I am less worried about things such as dietary changes is that individuals experience dietary variation fairly frequently, and even from personal experience we know that we have mechanisms which alert us when our diet is lacking (sometimes). However, I do not believe that they are without risk, or that one should simply try out an extreme dietary change without prior research. It is substances which have been relatively untested, but are in fact designed to subvert our body's mechanisms, which I have reason to worry about. Not to disavow, but to worry about, and to examine more intensely than substances which are probably, as a class, less harmful.
0Jiro11y
I think you should worry about a diet consisting of 80% fat; however, you should worry about it on different grounds than you would about untested substances.
2Jayson_Virissimo11y
Why?
1AndHisHorse11y
Fair. I neglected to include 80% fat as having a standing similar to 100 grapefruits' worth of naringin.
-3Eugine_Nier11y
And the same logic applies to them as well. There's this thing called the organic food movement, you may have heard of it. It is.

On the other hand, you should consider what evolution can do. Evolution is not the world's best algorithm for inventing things. However, it is an excellent optimising algorithm. Balancing multiple considerations to decide the optimum amount of substance A in your body is the sort of problem that algorithm should solve really well.

Essentially the only exception to this rule is when your cells are reacting to DNA/RNA that doesn't belong to you. If cold virus RNA is making your nose run, stop it by all means. But you should trust your own body on most other matt... (read more)

Or as Eliezer puts it:

Algernon's Law: Any simple major enhancement to human intelligence is a net evolutionary disadvantage.

But here's gwern writing about loopholes in Algernon's Law.

On the other hand, you should consider what evolution can do.

It frustrates me how often this argument against using mind enhancing substances is used and, more importantly, the weight it is given. Not only is evolution optimizing for different criteria (which DuncanS mentions), it is also optimising for an entirely different environment. Further, our expectation that random chemicals will be bad for us is to a massive extent screened off when we go ahead and test them and find that they make things better!

Yet another situation in which evolution should not be expected to give superior results to what we can come up with with science is when we know what we are going to be doing at a specific time. What is best as a general baseline is not going to be the best state when studying for a test. Which is in turn going to be less good when doing unpleasant and potentially traumatic things that you don't want to remember.

-3DuncanS12y
Consuming chemicals that have been tested is certainly an improvement on consuming chemicals that haven't been. Consuming chemicals to make your brain work better seems to me to be a rather similar activity to overclocking a computer. Let's add more voltage. Let's pour liquid nitrogen into it. Perhaps it will go faster! Perhaps it will, but will it still be working in 5 years' time?

First of all, note just how crude these efforts are compared to the technological research undertaken by the companies that actually make microchips. The same is true of the brain - it can make dopamine and deliver it at synapses, exact points of contact throughout the brain. Yet you see people discussing just adding more dopamine everywhere, and thinking that this is in some sense improving on nature in a clever way.

I have to mention a point against myself - which is that I do take general anaesthetics, which, while not intelligence enhancers, are definitely intelligence modifiers for specific circumstances. However, turning brain function off is arguably simpler than trying to make it better.

It is possible, definitely, to improve human intelligence by combining it with a computer. So I'm not against the idea that it's possible to improve on the natural intelligence we all have - it obviously is. What I'm pointing out is that all of these drug ideas are bound to be something that evolution has at some point tried out, and thrown away. And they are really unsophisticated ideas compared with those the brain has actually adopted.

Even the situation-dependent argument isn't as strong as you might think - your brain has a lot of adaptations to cover the "unpleasant and potentially traumatic things" situation, for example - and these adaptations generally disagree with your view that you shouldn't remember them. It's probably the case that intelligence tests are a novel environment, however...
8lessdazed12y
Gwern has a good overview of this argument.
0Randolf12y
Well, there could be many reasons why evolution has "thrown them out". Maybe they are harmful in the long term, maybe their use consumes precious energy, or maybe they just aren't "good enough" for evolution to have kept them. That is, maybe they just don't give any significant evolutionary advantage. Evolution doesn't create perfect beings; it creates beings which are good enough to survive.

There can be harmful side-effects, and that topic is not covered by the article; on the other hand, the pure evolutionary argument can be doubted because of the changed environment.

If I stimulate my brain, it is natural to assume my brain requires more energy now. So I probably need more glucose. In an evolutionarily relevant context, that would make me more likely to starve - after all, I would need more highly valued energy, and thinking clearly wouldn't make a killed bull magically appear before me.

This is still true for most of the Earth's population. It is not true for many LessWrong readers, though. There are some primarily-mental jobs now (in some places of the world - the places where LessWrong readers come from). Keeping more things in your mind means being a better programmer, teacher, or scientific researcher. Being better at your profession often helps you to evade starvation. And getting the needed amount of calories - if you already know where to get all these vitamins and microelements - is trivial in these parts of the world.

So, this modification was not a benefit earlier, and it was quite costly; both factors are significantly reduced in some parts of the modern world.

Of course, increased mental capability can lead to some personality traits that make it harder to reproduce; but that is again a question of side-effects and not a self-evident thing. If you consider it harmful, you can try to spend effort on fighting these side-effects - some people report significant success.

9dlthomas12y
Maybe inclusive genetic fitness is not my utility function.
3soreff12y
Same here. As a childfree human, maximizing the number of copies of my DNA is right up there with paperclip maximization on my list of priorities. :-) One other category of exceptions: we aren't in the EEA anymore. In particular, we have much looser constraints on available calories than in the EEA, and that changes the optimal settings even for reproductive success.
5handoflixue12y
There's this really pretty large class of issues called "genetic disorders", and a wide variety of other ways the body fails just fine without encountering foreign DNA/RNA... I'm assuming insulin for diabetics also has unexpected drawbacks and isn't really in our best interests? Or, put succinctly: "Scientists are so ignorant! If it was possible to cure cancer, why didn't we just evolve to not have cancer in the first place?"
-5DuncanS12y