Recently, I asked LessWrong about the important math of rationality. I found the responses extremely helpful, but on reflection, I think there's a better approach.
I come from a new-age-y background. As such, I hear a lot about “quantum physics.”
Accordingly, I have developed a heuristic that I have found broadly useful: if a field involves math, and you cannot do the math, you are not qualified to comment on that field. If you can't solve the Schrödinger equation, I discount whatever you may say about what quantum physics reveals about reality.
Instead of asking which fields of math are "necessary" (or useful) to "rationality," I think it's more productive to ask, "what key questions or ideas, involving math, would I like to understand?" Instead of going out of my way to learn the math that I predict will be useful, I'll just embark on trying to understand the problems that I'm learning the math for, and work backwards to figure out what math I need for any particular problem. This has the advantage of never causing me to waste time on extraneous topics: I'll come to understand the concepts I'll need most frequently best, because I'll encounter them most frequently (for instance, I think I'll quickly realize that I need a solid understanding of calculus, and so study calculus, but there may be parts of math that don't crop up much, so I'll effectively skip those). While I usually appreciate the aesthetic beauty of abstract math, I think this sort of approach will also help keep me focused and motivated. Note that, at this point, I'm trying to fill in the gaps in my understanding and attain "mathematical literacy" rather than a complete and comprehensive mathematical understanding (a worthy goal that I would like to pursue, but which is of lesser priority to me).
I think even a cursory familiarity with these subjects is likely to be very useful: when someone mentions, say, an economic concept, I suspect that even just vaguely remembering having solved a basic version of the problem will give me significant insight into what the person is talking about, instead of a hand-wavy, non-mathematical conception.
Eliezer said in The Simple Math of Everything:
It seems to me that there's a substantial advantage in knowing the drop-dead basic fundamental embarrassingly simple mathematics in as many different subjects as you can manage. Not, necessarily, the high-falutin' complicated damn math that appears in the latest journal articles. Not unless you plan to become a professional in the field. But for people who can read calculus, and sometimes just plain algebra, the drop-dead basic mathematics of a field may not take that long to learn. And it's likely to change your outlook on life more than the math-free popularizations or the highly technical math.
(Does anyone with more experience than me foresee problems with this approach? Has this been tried before? How did it work?)
So, I'm asking you: what are some mathematically-founded concepts that are worth learning? Feel free to suggest things for their practical utility or their philosophical insight. Keep in mind that there is a relevant cost-benefit analysis to consider: there are some concepts that are really cool to understand, but require many levels of math to get to. (I think after people have responded here, I'll put out another post for people to vote on a good order to study these things, starting with those topics that have the minimal required mathematical foundation and working up to the complex higher-level topics that require calculus, linear algebra, matrices, and analysis.)
These are some things that interest me:
- The math of natural selection and evolution
- The Schrödinger equation
- The math governing the dynamics of political elections
- Basic optimization problems of economics? Other things from economics? (I don’t know much about these. Are they interesting? Useful?)
- The basic math of neural networks (or "the differential equations for gradient descent in a non-recurrent multilayer network with sigmoid units") (Eliezer says it's simpler than it sounds, but he was also a literal child prodigy, so I don't know how much that counts for.)
- Basic statistics
- Whatever the foundations of bayesianism are
- Information theory?
- Decision theory
- Game theory (does this even involve math?)
- Probability theory
- Things from physics? (While I like physics, I don't think learning more of it would significantly improve my understanding of the macro-level processes that would impact my decisions. It's not as interesting to me as some of the other things on this list, right now. Tell me if I'm wrong, or which particular sub-fields of physics are most worthwhile.)
- Some common computer science algorithms (What are these?)
- The math that makes reddit work?
- Is there a math of sociology?
- Chaos theory?
- Musical math
- “Sacred geometry” (an old interest of mine)
- Whatever math is used in meta analyses
- Epidemiology
I’m posting most of these below. Please upvote and downvote to tell me how interesting or useful you think a given topic is. Please don’t vote on how difficult they are, that’s a different metric that I want to capture separately. Please do add your own suggestions and any comments on each of the topics.
Note: looking around, I found this. If you're interested in this post, go there. I'll be starting with it.
Edit: Looking at the page, I fear that putting a sort of "vote" in the comments might subtly dissuade people from commenting and responding in the usual way. Please don't be dissuaded. I want your ideas and comments and explicitly your own suggestions. Also, I have a karma sink post under
Edit2: If you know of the specific major equations, problems, theorems, or algorithms that relate to a given subject, please list them. For instance, I just added Price's Equation as a comment to the listed "math of natural selection and evolution" and the Median Voter Theorem has been listed under "the math of politics."
+1 to this post.
Learn about first and second derivatives and finding a maximum of a function. Then think about how you might find a maximum if you can only make little hops at a time.
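This "little hops" idea is essentially gradient ascent. Here's a minimal sketch, using a made-up example function f(x) = -(x - 3)² + 5 (peak at x = 3), where each hop follows the sign of the derivative uphill:

```python
# Gradient ascent by "little hops": the derivative tells us which
# way is uphill, so we repeatedly take a small step in that direction.
# The function here is invented for illustration: f(x) = -(x - 3)**2 + 5,
# whose derivative is f'(x) = -2 * (x - 3) and whose maximum is at x = 3.

def f_prime(x):
    return -2.0 * (x - 3.0)

def gradient_ascent(x0, step=0.1, iters=100):
    x = x0
    for _ in range(iters):
        x += step * f_prime(x)  # hop a little in the uphill direction
    return x

x_max = gradient_ascent(x0=0.0)
print(round(x_max, 3))  # converges to 3.0
```

If the step size is too large the hops overshoot the peak and oscillate; too small and convergence is slow. That trade-off is exactly what "learning rate" tuning in machine learning is about.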
Learn a little linear algebra (what matrix inverses, determinants, etc. are). Understand the relationship between solving a system of linear equations and the matrix inverse. Then think about what you might want to do if you have more equations than unknowns (you can't invert exactly, but you can find something that's "as close to an inverse as possible" in some sense). A huge chunk of stuff that falls under the heading of "statistics/machine learning/neural networks/etc." is basically variations on that idea.
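A toy version of "more equations than unknowns," with invented data: fit a single unknown slope a so that a·x ≈ y for several (x, y) pairs at once. No exact solution exists, so least squares minimizes the total squared error instead:

```python
# With more equations than unknowns, there's usually no exact solution.
# Least squares finds the "as close as possible" one: for a * x ≈ y,
# minimizing sum((a*x - y)**2) over a gives a = sum(x*y) / sum(x*x).
# The data below is made up for illustration (roughly y = 2x plus noise).

def least_squares_slope(xs, ys):
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]
a = least_squares_slope(xs, ys)
print(round(a, 2))  # close to 2.0
```

The matrix version of this same formula, (AᵀA)⁻¹Aᵀy, is the "pseudo-inverse" that shows up all over statistics and machine learning.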
Read Structure and Interpretation of Computer Programs: it has one of the highest concept-per-page densities of any computer science book.
Important algorithmic ideas are, in my opinion: hashing, dynamic programming/memoization, divide and conquer by recursion, splitting up tasks to be done in parallel, and locality (things you want at a particular point are often close in space and time).
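Two of these ideas fit in a few lines. A sketch, using the classic Fibonacci example: naive recursion recomputes the same subproblems exponentially many times, while memoization (caching results in a hash table, so hashing shows up here too) makes it linear:

```python
# Divide and conquer by recursion, plus memoization: fib(n) splits
# into fib(n-1) and fib(n-2), and lru_cache stores each subresult
# in a hash table so it is computed only once. Without the cache,
# fib(50) would take billions of redundant calls.
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(50))  # instant with memoization
```

Dynamic programming is essentially this same trick done bottom-up: solve the small subproblems first and build upward, never recomputing anything.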
Locality is sort of like "a smoothness assumption on access." The reason your laptop is fast even though your hard disk is slow is due to locality being generally true.
"I will always link to my ingroup", says Scott. So it is with me: I always recommend learning about association vs. causation. If you are into learning by doing, try to find some media articles that make claims of the form "scientists report that to get [Y], do [X]," then look up the original study and think about whether the media claim actually follows (it generally does not). This will also give you practice reading empirical papers, which is a good skill to have. The stuff the authors do in such papers isn't magic, after all: the set of statistical ideas that comes up over and over again in them is fairly small.
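You can see association without causation in a ten-line simulation. In this invented setup, a hidden confounder Z drives both X and Y; X and Y end up strongly correlated even though neither causes the other:

```python
# Association without causation via a confounder: Z causes both
# X and Y, and there is no arrow from X to Y at all. Yet the
# sample correlation between X and Y comes out strongly positive.
# All variables here are synthetic, made up for illustration.
import random

random.seed(0)
n = 10_000
z = [random.gauss(0, 1) for _ in range(n)]
x = [zi + random.gauss(0, 0.5) for zi in z]  # X is caused by Z
y = [zi + random.gauss(0, 0.5) for zi in z]  # Y is caused by Z, not by X

def corr(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    va = sum((ai - ma) ** 2 for ai in a)
    vb = sum((bi - mb) ** 2 for bi in b)
    return cov / (va * vb) ** 0.5

print(round(corr(x, y), 2))  # around 0.8, despite no causal link
```

A media headline written from this data ("X linked to Y!") would be literally true and causally empty; intervening on X would do nothing to Y.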
Don't think like that. There are no wizards, just people doing sensible things.
+1 for Structure and Interpretation of Computer Programs (aka SICP, aka "the wizard book") - this is a legendary programming book. Here is an interactive version: https://xuanji.appspot.com/isicp/.
I also agree on the important algorithmic ideas, with one addition: algorithmic analysis. Just as you can describe the movement of the planets with a few simple equations, and that's beautiful, you can describe any sequence of steps to finish a task as an algorithm. And you can mathematically analyze the efficiency of that sequence: as the task gets lar...
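The flavor of algorithmic analysis can be felt by counting steps directly. A sketch comparing two ways to search a sorted list of a million items: linear search grows with n, binary search with log₂(n):

```python
# Counting steps makes growth rates concrete: on a sorted list of
# n items, linear search may need n steps, while binary search
# (halving the remaining range each time) needs about log2(n).

def linear_search_steps(items, target):
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            break
    return steps

def binary_search_steps(items, target):
    lo, hi, steps = 0, len(items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            break
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

items = list(range(1_000_000))
print(linear_search_steps(items, 999_999))  # 1,000,000 steps
print(binary_search_steps(items, 999_999))  # about 20 steps
```

Doubling the list doubles the linear count but adds only one step to the binary count; that gap between n and log n is the whole point of the analysis.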