Solving sleep: just a toe-dipping
In the quest to become more effective and productive, sleep is an enormously important process to optimize. Most of us spend (or at least think we should spend) 7.5 to 8.5 hours in bed every night: a third of a 24-hour day. Not sleeping well and not sleeping enough have known and large drawbacks, including decreased attention, greater irritability, depressed immune function, and generally weakened cognitive ability. If you're looking for more time, whether for subjective life-extension or to get more done in a day, it is highly valuable to learn to sleep efficiently: to spend no more than the required time in bed while getting the full benefit of the rest.
Understanding the inner mechanisms of this process can let us work around them. Sleep, baffling as it is (and it is extremely baffling), is not a black box. Knowing how it works, you can organize your behavior to accommodate the world as it is, just as taking advantage of the principles of aerodynamics, thrust, and lift enables one to build an airplane.
The most important thing to know about sleep and wakefulness is that alertness is the result of a dual process: how alert a person feels is determined by two different and opposing functions. The first is termed the homeostatic sleep drive (also: homeostatic drive, sleep load, sleep pressure, or process S), which is determined solely by how long it has been since an individual last slept fully. The longer one has been awake, the greater one's sleep drive. It is the brain's biological need to sleep. Just as sufficient need for calories produces hunger, sufficient sleep drive produces sleepiness. Sleeping decreases sleep drive, and sleep drive drops faster (when sleeping) than it rises (when awake).
Neuroscience is complicated, but the chemical correlate of sleep drive appears to be the build-up of adenosine in the basal forebrain, which the brain uses as its internal measure of how badly one needs sleep.1 (Caffeine makes us feel alert by competing with adenosine for its receptors' binding sites, blocking adenosine's sleep-promoting effect.)
This is only half the story, however. Adenosine levels are much higher (and sleep drive correspondingly greater) in the evening, when one has been awake for a while, than in the middle of the night, when one has just slept for several hours. If sleepiness were determined only by sleep drive, sleep would be much more fragmented: you would sleep several times during the day and wake up several times during the night. Instead, humans typically stay awake through the day and sleep through the whole night. This is due to the second influence on wakefulness: the circadian alerting signal.
For most of human history, there was little that could be done at night. Darkness made it much more difficult to hunt or gather than during the day. Given that the brain requires some fraction of the nychthemeron (a 24-hour period) asleep, it is evolutionarily preferable to concentrate that fraction in the nighttime, freeing the day for other things. For this reason, there is also a cyclical component to one's alertness: independent of how long it has been since an individual last slept, there are times in the nychthemeron when he or she will feel more or less tired.
Roughly, the circadian alerting signal (also known as process C) counters the sleep drive: as sleep drive builds up during the day, the rising alerting signal keeps alertness roughly constant, and as sleep drive dissipates over the course of the night, the falling alerting signal keeps the individual asleep.
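The interaction of the two processes can be sketched as a toy model. This is only an illustration: the functional forms, time constants, and the alertness scale below are assumptions chosen for simplicity, not values from the sleep literature.

```python
import math

def sleep_pressure(hours_awake, tau=18.0):
    """Process S: homeostatic pressure builds toward an asymptote while awake.
    The time constant tau is an illustrative guess, not a measured value."""
    return 1.0 - math.exp(-hours_awake / tau)

def alerting_signal(hour_of_day, wake_hour=7.0):
    """Process C: a crude sinusoid that starts rising at habitual wake time."""
    phase = 2 * math.pi * (hour_of_day - wake_hour) / 24.0
    return 0.5 + 0.5 * math.sin(phase)

def alertness(hour_of_day, hours_awake):
    """Net alertness: the circadian alerting signal countering sleep pressure."""
    return alerting_signal(hour_of_day) - sleep_pressure(hours_awake)

# Pulling an all-nighter: by 1 p.m. the next day you have been awake longer,
# yet the rising alerting signal props alertness up relative to 4 a.m.
print(alertness(13, 30))  # afternoon after the all-nighter
print(alertness(4, 21))   # the middle of that same night
```

Even this crude sketch reproduces the qualitative behavior described above: alertness stays roughly level across a normal waking day, and an all-nighter leaves you paradoxically more alert the next afternoon than you were in the small hours.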
The alerting signal is synchronized to circadian rhythms, which are in turn attuned to light exposure. The circadian clock is set so that the alerting signal begins to increase again (after a night of sleep) at the time when the optic nerve is first exposed to light in the morning (or rather, when the optic nerve has habitually first been exposed to light, since it takes up to a week to reset circadian rhythms), and it increases alongside the sleep drive until about 14 hours after it started rising.
This is why if you pull an “all-nighter” you might find it difficult to fall asleep during the following day, even if you feel exhausted. Your sleep drive is high, but the alerting signal is triggering wakefulness, which makes it hard to fall asleep.
For unknown reasons, there is a dip in the circadian alerting signal about 8 hours after the beginning of the cycle. This is why people sometimes experience that "2:30 feeling." It is also the time at which biphasic cultures typically take an afternoon siesta. This is useful to know, because it is the best time to take a nap if you want to make up sleep missed the night before.
The neurochemistry of the circadian alerting signal is more complex than that of the sleep drive, but one of the key chemicals of process C is melatonin, which is secreted by the pineal gland about 12 hours after the start of the circadian cycle (two hours before habitual bedtime). It is mildly sleep-inducing.
This is why taking melatonin tablets before bed is recommended by gwern and others. I second this recommendation. Though not FDA-approved, melatonin seems to have little in the way of negative side effects, and it makes it much easier to fall asleep.
The natural release of melatonin is inhibited by light, and in particular blue light (which is why it is beneficial to use applications that red-shift the light of your computer screen, like f.lux or Redshift, or to wear red-tinted goggles, before bed). By limiting light exposure in the late evening you allow natural melatonin secretion, which both stimulates sleep and prevents the circadian clock from shifting (which would make it even more difficult to fall asleep the following night). Recent studies have shown that bright screens at night do demonstrably disrupt sleep.2
What interests me about the fact that alertness is controlled by both process S and process C is that it may be possible to modulate each of those processes independently. It would be enormously useful to be able to "turn off" the circadian alerting signal on demand, so that a person could fall asleep at any time of the day, making up sleep loss whenever is convenient. Instead of accommodating circadian rhythms when scheduling, we could adjust the circadian effect to better fit our lives. When you know you'll need to be awake all night, for instance, you could turn off the alerting signal around midday and sleep until your sleep drive is reset. In fact, I suspect that those people who are able to live successfully on a polyphasic sleep schedule get the benefits by retraining the circadian influence. In the coming posts, I want to outline a few of the possibilities and (significant) problems in that direction.
Meetup : Chicago Rationality Training Group - Meeting 1: "Why Rationality?" and Using the Inner Simulator
The first meeting of the rationality training group will be at one o'clock on Sunday April 5th, in room 145 of Harper Memorial Library at the University of Chicago.
My ideas for this group:
I would like to meet on a weekly basis to learn and practice both epistemic rationality (critical thinking and calibration: "what do you know and why do you think you know it?") and CFAR-style instrumental rationality ("what do I need to do to win?"). We'll start by working our way through the CFAR curriculum (circa January 2015) and some key parts of the LessWrong sequences, but I would like this group to evolve over time, and we are not constrained to that material. Ultimately I'd love for this to turn into less a series of lessons and more a "rationality dojo": a place to practice what we know, consider ways to apply it better, and experiment with original techniques. If you have any interest in this at all, I recommend you come to this first meeting, since I'll explain what I'm thinking in more detail and we can collectively begin to decide what we want this group to be.
The topics of this first meeting will be “Why bother with rationality?” and using one’s inner simulator, which is fundamental to several CFAR techniques.
The suggested readings for this week are:
http://lesswrong.com/lw/go/why_truth_and/
http://lesswrong.com/lw/2p5/humans_are_not_automatically_strategic/
http://lesswrong.com/lw/8ma/how_rationality_can_make_your_life_more_awesome/
http://lesswrong.com/lw/nb/something_to_protect/
Each of these is pretty short. They shouldn't take very long to read, but I'd like you to spend some time thinking about each one.
I’m sending this email, I’m posting to meetup.com, and I’m listing this event on LessWrong. If you have other ideas for how to publicize this event, let me know. Pass along this invite to anyone who you think might be interested in this.
If you have any questions for me, feel free to ask.
Interesting (or semi-interesting) part time jobs?
I'm currently taking time off from school to focus on my education. I'm reading (a lot), mastering some skills, and finishing some projects.
It takes money to live, so I need money. I was considering what my options were for jobs that would keep me engaged, and I thought I'd ask LessWrong.
Constraints:
1. I don't yet have a bachelor's degree. I am, however, an intelligent and courteous student at a prestigious university, who doesn't drink, smoke, or do drugs.
2. I need at least $800/month (500 for rent, internet, and bus fares; 150 for food; 150 for savings).
3. I'm looking for less than 16 hours a week, or else taking time off to focus on learning becomes sort of moot. However, that is on average; it is feasible for me to work many hours one week and then little to none the next.
Optimization criteria:
1. Something interesting, especially something where I would learn something new. This may come in all kinds of forms (for instance, puts me in close contact with the sorts of people I wouldn't usually talk to), including some that I haven't thought of yet. It may even be a new approach to a generic job that makes it challenging or engaging. Jobs that will let me just sit and read without distraction, or even just listen to audio books while I work, would be great.
2. The fewer hours I have to work, the better.
I'm currently running experiments (mostly surveys) for a decision research lab. The work itself is a little boring, but I do get to spend some of my time around marketing Ph.D. students who are interested in behavioral economics, and I get paid $12/hour. It works, but I'm open to other options.
Any ideas?
The buildup to Jaynes?
Not too long ago, I asked LessWrong which math topics to learn. Eventually, I want to ask for what the prerequisites for each of those topics are and how I should go about learning them. This is a special case of that.
I'm rereading the sequences and Eliezer seems to love E.T. Jaynes. As part of my rationality self-study, I want to work my way through his Probability Theory: the Logic of Science. What math topics do I already need to understand to prepare myself for this? I learned calculus once upon a time, but not fantastically well, and I plan to start by reviewing that.
Also,
Despite Eliezer's praise of the "thousand-year-old vampire," is there a better book for learning probability theory?
Does anyone want to learn this (or the other math from my post above) with me? I'd love to have a partner or maybe even a work group. Location is no obstacle. [Two caveats: 1. I'm busy with stuff and may not be able to get into this for a few months. 2. I work hard, but I am incredibly slow at computation (such that on every math test I have ever taken, it took me at least 3 times as long as the second-slowest person in the class to finish). You might find that I go too slow for you.]
Memory Improvement: Mnemonics, Tools, or Books on the Topic?
I want a perfect eidetic memory.
Unfortunately, such things don't exist, but that's not stopping me from getting as close as possible. It seems as if the popular solutions are spaced repetition and memory palaces. So let's talk about those.
Memory Palaces: Do they work? If so what's the best resource (book, website etc.) for learning and mastering the technique? Is it any good for memorizing anything other than lists of things (which I find I almost never have to do)?
Spaced Repetition: What software do you use? Why that one? What sort of cards do you put in?
It seems to me that memory programs and mnemonic techniques each address one of three parts of the problem of memory: memorizing, recalling, and not forgetting.
"Not forgetting" is the long term problem of memory. Spaced repetition seems to solve the problem of "not forgetting." You feed the information you want to remember into your program, review frequently, and you won't forget that information.
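To make "review frequently" concrete, here is a sketch of how such programs space the reviews. It follows the shape of the SM-2 algorithm (the basis of SuperMemo and Anki), with SM-2's default ease factor of 2.5; real implementations also adjust the ease per card based on how well you answer, which I've omitted.

```python
def next_interval(review_count, prev_interval, ease=2.5):
    """SM-2-style scheduling: each successful review pushes the next one
    further out, so a card costs less and less time as you learn it."""
    if review_count == 1:
        return 1   # review again after 1 day
    if review_count == 2:
        return 6   # then after 6 days
    return round(prev_interval * ease)  # then geometrically growing gaps

# Days between successive reviews of a card answered correctly each time:
interval, schedule = 0, []
for n in range(1, 6):
    interval = next_interval(n, interval)
    schedule.append(interval)
print(schedule)  # [1, 6, 15, 38, 95]
```

The geometric growth is the point: after a handful of reviews, a card only comes back every few months, which is what makes maintaining thousands of facts tractable.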
Memory Palaces seem to deal with the "memorizing" part of the problem. When faced with new information that you want to be able to recall, you put it in a memory palace, vividly emphasized so as to be affective and memorable. This is good for short-term encoding of information that you know you want to keep. You might put it into your spaced repetition program later, but you just want to not forget it until then.
The last part is the problem of "recalling." Both of the previous facets of the problem of memory had a distinct advantage: you knew in advance the information that you wanted to remember. However, we frequently find ourselves in situations in which we need or want to remember something that we know (or perhaps don't know) we encountered, but didn't consider particularly important at the time. Under this heading falls the situation of making connections when learning, or being reminded of old information by new information: when you learn y, you have the thought, "hey, isn't that just like x?" This is the facet of the memory problem that I am most interested in, but I know of scarcely anything that can reliably improve ease of recall of information in general. Do you know of anything?
I'm looking for recommendations: books on memory, specific mnemonics, or practices that are known to improve recall, or anything else that might help with any of the three parts of the problem.
Neo-reactionaries, why are you neo-reactionary?
Through LessWrong, I've discovered the neo-reactionary movement. Surveys say that there are some of you here.
I'm curious: what led you to accept the basic premises of the movement? What is the story of your personal "conversion"? Was there some particular insight or piece of information that was important in convincing you? Was it something that just "clicked" for you, or that you had always felt in a vague way? Were any of you "raised in it"?
Feel free to forward my questions to others or direct me towards a better forum for asking this.
I hope that this is in no way demeaning or insulting. I'm genuinely curious, and my questioning is value-free. If you point me towards compelling evidence of the neo-reactionary premise, I'll update on it.
Can science come to understand consciousness? A problem of philosophical zombies (Yes, I know, P-zombies again.)
In response to the classic Mysterious Answers to Mysterious Questions, I express some skepticism that consciousness can be understood by science. I postulate (with low confidence) that consciousness is "inherently mysterious," in that it is philosophically and scientifically impenetrable. The mysteriousness is a fact about our state of mind, but that state of mind is due to a fundamental epistemic feature of consciousness and is impossible to resolve.
My issue with understanding the cause of consciousness involves p-zombies. Any experiment with the goal of understanding consciousness would have to be able to detect consciousness, which seems to me to be philosophically impossible. To be more specific, any scientific investigation of the cause of consciousness would have (to simplify) an independent variable that we could manipulate to see how this manipulation affects the dependent variable, the presence or absence of consciousness. We assume that those around us are conscious, and we have good reason to do so, but we can't rely on that assumption in any experiment in which we are investigating consciousness. Before we ask “what is causing x?”, we first have to know that x is present.
As Eliezer points out, that an individual says he's conscious is a pretty good signal of consciousness, but we can't necessarily rely on that signal for non-human minds. A conscious AI may never talk about its internal states, depending on its structure. (Humans have a survival advantage from social sharing of internal realities; an AI will not be subject to that selection pressure. There's no reason for it to have any sort of emotional need to share its feelings, for example.) On the flip side, a savvy but non-conscious AI may talk about its "internal states," not because it actually has internal states, but because it is "guessing the teacher's password" in the strongest way imaginable: it has no understanding whatsoever of what those states are, but computes that aping internal states will accomplish its goals. I don't know how we could possibly know whether the AI is aping consciousness for its own ends or whether it actually is conscious. If consciousness is thus undetectable, I can't see how science can investigate it.
That said, I am very well aware that "throughout history, every mystery ever solved has turned out to be not magic,"* and that every single time something has seemed inscrutable to science, a reductionist explanation eventually surfaced. Knowing this, I seriously downgrade my confidence that "No, really, this time it is different. This phenomenon really is beyond the grasp of science." I look forward to someone coming forward with something clever that dissolves the question, but even so, it does seem inscrutable.
*- Though, to be fair, this is a selection bias. Of course all the solved mysteries weren't magic. All the mysteries that actually are magic remain unsolved, because they're magic! This is NOT to say I believe in magic, just that it's hardly saying much to claim that all the things we've come to understand were in principle understandable. To steelman: I do understand that with each mystery that was once declared magical, then later shown not to be, our collective prior for the existence of magical things decreases. (There is a sort of halting problem here: if a question has remained unsolved since the dawn of asking questions, is that because it is unsolvable, or because we're right around the corner from solving it?)
How to build the skill and the habit of experimentation?
I want to make regular experimentation a part of my life and don't really know how. I thought that I should associate with and assist people who run experiments (I'm interning with a psych lab and a paranormal investigator, and hope to work with some behavioral economists who run field experiments), but I realized that I haven't taken the time to consider whether that is actually a good approach or whether there is something else I should be doing in addition.
How do I gain proficiency with experimental methods and build the habit of running simple experiments regularly? I suppose that there's a certain kind of phenomenon that to the educated mind is automatically flagged as ripe for experimentation (I'm thinking of Feynman's curiosity about the ants in his room, from Surely You're Joking, or Harry James Potter-Evans-Verres testing with his army to find out what the optimal way to fight is, prior to the first of Quirrell’s battles), but I don't have that intuition, yet.
What are the key insights, procedures, or guidelines that I need to know in order to experiment fruitfully? How do I build that intuition?
I'm looking either for recommendations or critiques. Perhaps personal experimentation is not as useful as my veneration of science in general leads me to believe? It seems to me that it would be beneficial if, when faced with a problem, confusion, or dispute, one of my go-to approaches were to run an experiment.
Suggestions?
The "best" mathematically-informed topics?
Recently, I asked LessWrong about the important math of rationality. I found the responses extremely helpful, but thinking about it, I think there’s a better approach.
I come from a new-age-y background. As such, I hear a lot about “quantum physics.”

Accordingly, I have developed a heuristic that I have found broadly useful: if a field involves math, and you cannot do the math, you are not qualified to comment on that field. If you can't solve the Schrödinger equation, I discount whatever you may say about what quantum physics reveals about reality.
Instead of asking which fields of math are "necessary" (or useful) to "rationality," I think it's more productive to ask, "what key questions or ideas, involving math, would I like to understand?" Instead of going out of my way to learn the math that I predict will be useful, I'll just embark on trying to understand the problems that I'm learning the math for, and work backwards to figure out what math I need for any particular problem. This has the advantage of never causing me to waste time on extraneous topics: I'll come to understand best the concepts I need most frequently, because I'll encounter them most frequently (for instance, I think I'll quickly realize that I need a solid understanding of calculus, and so study calculus, but there may be parts of math that don't crop up much, so I'll effectively skip those). While I usually appreciate the aesthetic beauty of abstract math, I think this sort of approach will also help keep me focused and motivated. Note that at this point, I'm trying to fill in the gaps in my understanding and attain "mathematical literacy" rather than a complete and comprehensive mathematical understanding (a worthy goal that I would like to pursue, but which is of lesser priority to me).
I think even a cursory familiarity with these subjects is likely to be very useful: when someone mentions, say, an economic concept, I suspect that even just vaguely remembering having solved a basic version of the problem will give me significant insight into what the person is talking about, instead of a hand-wavy, non-mathematical conception.
Eliezer said in the simple math of everything:
It seems to me that there's a substantial advantage in knowing the drop-dead basic fundamental embarrassingly simple mathematics in as many different subjects as you can manage. Not, necessarily, the high-falutin' complicated damn math that appears in the latest journal articles. Not unless you plan to become a professional in the field. But for people who can read calculus, and sometimes just plain algebra, the drop-dead basic mathematics of a field may not take that long to learn. And it's likely to change your outlook on life more than the math-free popularizations or the highly technical math.
(Does anyone with more experience than me foresee problems with this approach? Has this been tried before? How did it work?)
So, I’m asking you: what are some mathematically-founded concepts that are worth learning? Feel free to suggest things for their practical utility or their philosophical insight. Keep in mind that there is a relevant cost benefit analysis to consider: there are some concepts that are really cool to understand, but require many levels of math to get to. (I think after people have responded here, I’ll put out another post for people to vote on a good order to study these things, starting with those topics that have the minimal required mathematical foundation and working up to the complex higher level topics that require calculus, linear algebra, matrices, and analysis.)
These are some things that interest me:
- The math of natural selection and evolution
- The Schrödinger equation
- The math governing the dynamics of political elections
- Basic optimization problems of economics? Other things from economics? (I don’t know much about these. Are they interesting? Useful?)
- The basic math of neural networks (or "the differential equations for gradient descent in a non-recurrent multilayer network with sigmoid units") (Eliezer says it's simpler than it sounds, but he was also a literal child prodigy, so I don't know how much that counts for.)
- Basic statistics
- Whatever the foundations of bayesianism are
- Information theory?
- Decision theory
- Game theory (does this even involve math?)
- Probability theory
- Things from physics? (While I like physics, I don't think learning more of it would significantly improve my understanding of the macro-level processes that would impact my decisions. It's not as interesting to me as some of the other things on this list, right now. Tell me if I'm wrong, or which particular sub-fields of physics are most worthwhile.)
- Some common computer science algorithms (What are these?)
- The math that makes reddit work?
- Is there a math of sociology?
- Chaos theory?
- Musical math
- “Sacred geometry” (an old interest of mine)
- Whatever math is used in meta analyses
- Epidemiology
I’m posting most of these below. Please upvote and downvote to tell me how interesting or useful you think a given topic is. Please don’t vote on how difficult they are, that’s a different metric that I want to capture separately. Please do add your own suggestions and any comments on each of the topics.
Note: looking around, I found this. If you're interested in this post, go there. I'll be starting with it.
Edit: Looking at the page, I fear that putting a sort of "vote" in the comments might subtly dissuade people from commenting and responding in the usual way. Please don't be dissuaded. I want your ideas and comments and explicitly your own suggestions. Also, I have a karma sink post under
Edit2: If you know of the specific major equations, problems, theorems, or algorithms that relate to a given subject, please list them. For instance, I just added Price's Equation as a comment to the listed "math of natural selection and evolution" and the Median Voter Theorem has been listed under "the math of politics."
Should we go all in on existential risk? - Considering Effective Altruism
Apparently, at a recent EA summit, Robin Hanson berated the attendees for giving to more than one charity. I think his critique is salient: given our human scope insensitivity, giving all your charity money to one cause feels like helping with only *one* thing, even if that one organization does vastly more good, much more efficiently, than any other group, so that every dollar given to it does more good than anything else that could be done with that dollar. It is more rational and more effective to find the most efficient charity and give only to that charity, until it has achieved its goal so completely that it is no longer the most efficient charity.
That said, I feel that there are at least some circumstances under which it is appropriate to divide one's charity dollars: those that include risky investments.
If a positive singularity were to occur, the impact would be enormous: it would swamp any other good that I could conceivably do. Yet I don't know how likely a positive singularity is; it seems to be a long shot. Furthermore, I don't know how much my charity dollars affect the probability one way or another. It may be that a positive singularity will either happen or it won't, and there's not much I can do about it. There's a huge pay-off but high uncertainty. In contrast, I could (for instance) buy mosquito nets for third world countries, which has a lower, but much more certain, pay-off.
Some people are more risk-seeking than others, and it seems to be a matter of preference whether one takes risky bets or more certain ones. However, there are "irrational" answers, since one can calculate the expected pay-off of a gambit by mere multiplication. It is true that it is imprudent to bet one's life savings on an unlikely chance of unimaginable wealth, but this is because of quirks of human utility calculation: losses are more painful than gains are enjoyable, and there is a law of diminishing marginal returns in play (to most of us, a gift of a billion dollars does not feel very different from a gift of two billion, and we would not be indifferent between a 100% chance of getting a billion dollars on the one hand, and a 50% chance of getting two billion dollars with a 50% chance of getting nothing on the other. In fact, I would trade a 50/50 chance of a billion for a 100% certainty of 10 million). But we would do well to stick to mathematically calculated expected pay-offs for any "games" that are small enough or frequent enough that improbable flukes will cancel out on net.
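The arithmetic here can be made concrete. A minimal sketch: raw expected value is just probability times payoff, while diminishing marginal returns can be modeled (as one common, assumed choice) by log utility. The starting-wealth figure is arbitrary.

```python
import math

def expected_value(outcomes):
    """Expected payoff: sum over outcomes of probability * payoff."""
    return sum(p * x for p, x in outcomes)

def expected_log_utility(outcomes, wealth=1e5):
    """Diminishing marginal returns modeled by log utility of final wealth.
    Log utility is an illustrative assumption, not the only possible model."""
    return sum(p * math.log(wealth + x) for p, x in outcomes)

sure_billion = [(1.0, 1e9)]           # 100% chance of $1 billion
coin_flip    = [(0.5, 2e9), (0.5, 0)]  # 50% chance of $2 billion, else nothing

# The raw expected values are identical ($1 billion each)...
print(expected_value(sure_billion), expected_value(coin_flip))
# ...but under log utility the certain billion is strictly preferred.
print(expected_log_utility(sure_billion) > expected_log_utility(coin_flip))  # True
```

This is why the intuition against the coin flip is not irrational for a single life-changing bet, while for small, repeated "games" the curvature of utility matters little and raw expected value is the right guide.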
Let's say you walk into the psychology department and Kahneman and Tversky offer you a trade-off: you can save 50 lives, or you can "sell" some or all of those lives, each for a 0.005% increase in the probability of an outcome in which no one ever dies again, every problem that has ever plagued humanity is solved, and post-humans impregnate the universe with life. That sounds fantastic, but at best you can increase the probability of such an outcome by only a quarter of a percent. Is any ratio of "lives saved" to "incremental increases in the probability of total awesomeness" rational? Is it just a matter of personal preference how much risk you personally decide to take on? Ought you to determine your conversion factor between human lives and increases in the probability of a p-singularity, and go all in based on whether the ratio you are offered is above or below your own (i.e., whether you're getting a "good deal")?
I feel like there's a good chance that we'll screw it all up and be extinct in the next 200 years. I want to stop that, but I also want to hedge my bets. If it does all go boom, I want to have spent at least some of my resources making the time we have better for as many people as possible. It even seems selfish not to help those in need so that I can push up the probability of an awesome, but highly uncertain, future. That feels almost like making reckless investments with other people's money. But maybe I just haven't gotten myself out of the cognitive trap that Robin accused us of.