Comment author: PeterCoin 29 May 2016 08:14:26AM 0 points [-]

I'm confused here: you seem to be analyzing a troubleshooting process. How exactly did the troubleshooting process fail? I can see that there are some criticisms of what was done, but I don't see how this troubleshooting process resulted in disaster.

Comment author: fowlertm 29 May 2016 04:52:03PM 0 points [-]

Because I missed numerous implications, needlessly increased causal opacity, and failed to establish a baseline before I started fiddling with variables. Those are poor troubleshooting practices.

Comment author: fowlertm 06 December 2015 04:22:28PM 0 points [-]

So a semi-related thing I've been casually thinking about recently is how to develop what basically amounts to a hand-written programming language.

Like a lot of other people, I make to-do lists and take detailed notes, and I'd like to develop a written notation that not only captures basic tasks but maybe also simple representations of the knowledge/emotional states of other people (e.g., employees).

More advanced than that, I've also been trying to think of ways I can take notes in a physical book that will allow a third party to make Anki flashcards or Evernote entries based on my script. It has to be extremely dense to fit in the margins of a book, and must capture distinct commands like "make a single cloze deletion card for this sentence" and "make four separate cards for this sentence, cloze deleting a different piece of information for each card but otherwise leaving everything intact" and so on.
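A rough sketch of how such a notation could be machine-expanded, assuming a hypothetical scheme where the margin holds "C" plus a count ("C1" = one cloze card, "C4" = four cards, each cloze-deleting a different fragment) and the transcriber wraps the fragments in braces when typing the sentence up — the code, function name, and brace convention are all invented for illustration:

```python
import re

# Hypothetical margin notation (not an established scheme): "C" plus a count,
# e.g. "C1" = one cloze card for this sentence, "C4" = four cards, each
# cloze-deleting a different fragment. Fragments to delete are wrapped in
# braces when the sentence is transcribed.

def expand_annotation(code, sentence):
    """Turn a margin code and its transcribed sentence into Anki cloze texts."""
    match = re.fullmatch(r"C(\d+)", code)
    if not match:
        raise ValueError(f"unknown margin code: {code!r}")
    n = int(match.group(1))
    fragments = re.findall(r"\{([^}]+)\}", sentence)[:n]
    plain = re.sub(r"[{}]", "", sentence)  # sentence with marker braces removed
    # One card per fragment; everything else is left intact, per the command above.
    return [plain.replace(frag, "{{c1::" + frag + "}}", 1) for frag in fragments]
```

So `expand_annotation("C2", "The {mitochondria} is the {powerhouse} of the cell")` would yield two cards, each cloze-deleting one fragment while leaving the rest of the sentence untouched.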

Any thoughts?

Comment author: iarwain1 04 October 2015 09:08:31PM 2 points [-]

Why do you say Carnegie Mellon? I'm assuming it's because they have the Center for Formal Epistemology and a very nice-looking degree program in Logic, Computation and Methodology. But don't some other universities have comparable programs?

Do you have direct experience with the Carnegie Mellon program? At one point I was seriously considering going there because of the logic & computation degree, and I might still consider it at some point in the future.

Comment author: fowlertm 12 October 2015 03:56:48PM 1 point [-]

I mentioned CMU for the reasons you've stated and because Lukeprog endorsed their program once (no idea what evidence he had that I don't).

I have also spoken to Katja Grace about it, and there is evidently a bit of interest in LW themes among the students there.

I'm unaware of other programs of a similar caliber, though there are bound to be some. If anyone knows of any, by all means list them; that was the point of my original comment.

Comment author: fowlertm 04 October 2015 04:07:43PM 4 points [-]

I think there'd be value in just listing graduate programs in philosophy, economics, etc., by how relevant the research already being done there is to x-risk, AI safety, or rationality. Or by whether or not they contain faculty interested in those topics.

For example, if I were looking to enter a philosophy graduate program, it might take me quite some time to realize that Carnegie Mellon probably has the best program for people interested in LW-style reasoning about something like epistemology.

Comment author: fowlertm 01 June 2015 03:15:20AM 1 point [-]

Data point/encouragement: I'm getting a lot out of these, and I hope you keep writing them.

I'm one of those could-have-beens who dropped mathematics early on despite a strong interest and spent the next decade thinking he sucked at math, before rediscovering his numerical proclivities in his early 20s because FAI theory caused him to peek at Discrete Mathematics.

In response to FOOM Articles
Comment author: lukeprog 05 March 2015 09:58:13PM 4 points [-]

Besides Superintelligence, the latest "major" publication on the subject is Yudkowsky's Intelligence explosion microeconomics. There are also a few articles related to the topic at AI Impacts.

In response to comment by lukeprog on FOOM Articles
Comment author: fowlertm 06 March 2015 02:41:37AM 3 points [-]

Both unknown to me, thanks :)

Comment author: PhilGoetz 21 February 2015 12:10:12AM 1 point [-]

I disagree. Masculinity is an especially important and problematic set of values.

Comment author: fowlertm 21 February 2015 03:48:19PM 1 point [-]

Why? What's wrong with wanting to be masculine?

Comment author: SanguineEmpiricist 20 February 2015 03:09:32AM 2 points [-]

"Now, I deliberately compare two future versions of myself, one armed with the technique I just discovered and one without. Seeing how much farther along I will be results in a net gain of motivation."

Isaac Levi, one of the founders of formal epistemology, does something similar called "mild contraction", from the title of one of his books:

"Mild Contraction: Evaluating Loss of Information Due to Loss of Belief". His epistemology, constructed from decision theory, is very advanced, if not the most advanced.

http://www.amazon.com/Mild-Contraction-Evaluating-Information-Belief-ebook/dp/B00DZO8P4G/ref=sr_1_1?s=books&ie=UTF8&qid=1424401748&sr=1-1&keywords=mild+contraction

Comment author: fowlertm 21 February 2015 03:30:55PM *  1 point [-]

Interesting tie-in, thanks.

Incidentally, how cool would it be to be able to say "my epistemology is the most advanced"? If nothing else it'd probably be a great pickup line at LW meetups.

Comment author: John_Maxwell_IV 04 February 2015 01:34:02AM 0 points [-]

I suffer from mild Carpal Tunnel (or something masquerading as CT) which makes progress in programming slow. When I feel down about this fact I imagine how hard programming would be without hands.

This book solved my crippling carpal tunnel syndrome, FWIW.

Comment author: fowlertm 04 February 2015 04:39:41AM 0 points [-]

It's worth a lot, I'll look into it.

Comment author: LawrenceC 01 February 2015 05:45:36PM *  1 point [-]

I tried making one just for the math behind rationality/decision theory back in October, but I never got around to finishing it. The main problems I ran into were:

  • Where should the skill tree start? I'm sure that basic topics like algebra, geometry, and trig are all really useful, but I'm not sure about the dependencies between them. I ended up lumping them all into "basic mathematics".

  • How should the skill tree split subjects? Many subjects are best learned iteratively - for example, it's probably best to get a rudimentary understanding of probability theory, then learn more probability theory later on once you've picked up other related subjects (linear algebra, multivariate calculus, etc.), and then again after more subjects (measure theory). The complication is that these other subjects are often split into different "levels". I found that I didn't have enough familiarity with math to split subjects naturally.

One method that seems promising is taking a bunch of textbooks/courses, and trying to figure out the dependencies between them.
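That dependency-mapping method can be prototyped as a small directed graph plus a topological sort, which turns the prerequisites into a valid study order. The subjects and edges below are illustrative assumptions, not a vetted curriculum — a minimal sketch using Python's standard library:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Illustrative prerequisite graph (assumed edges, not a vetted curriculum):
# each subject maps to the subjects that should be studied before it, with
# "round II/III" nodes standing in for iterative passes over a subject.
prereqs = {
    "basic mathematics": set(),
    "linear algebra": {"basic mathematics"},
    "multivariate calculus": {"basic mathematics"},
    "probability round I": {"basic mathematics"},
    "probability round II": {"probability round I", "linear algebra",
                             "multivariate calculus"},
    "measure theory": {"multivariate calculus"},
    "probability round III": {"probability round II", "measure theory"},
}

# static_order() yields one valid study order: prerequisites always come first.
study_order = list(TopologicalSorter(prereqs).static_order())
print(study_order)
```

Swapping in real textbooks or courses as the nodes, with edges extracted from their stated prerequisites, would give the same structure at the level this comment is proposing.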

Comment author: fowlertm 03 February 2015 05:28:36PM 1 point [-]

Agreed. I think, in light of the fact that a lot of this stuff is learned iteratively, you'd want to unpack 'basic mathematics'. I'm not sure of the best way to graphically represent iterative learning, but maybe you could have arrows going back to certain subjects, or you could have 'statistics round II' as one of the nodes in the network.

It seems like insights are what you're really aiming at, so maybe instead of 'probability theory' you have a node for 'distributions' and 'variance' at some early point in the tree, then later you have 'Bayesian v. Frequentist reasoning'.

This would also help you unpack basic mathematics, though I don't know much about the dependencies either. I hope to, soon :)
