
Luke is doing an AMA on Reddit

18 Post author: Spurlock 15 August 2012 05:38PM

I'm sure most of us are used to just being able to badger him about things in the comments here on LW, but for anyone interested here's the link.

Comments (40)

Comment author: [deleted] 15 August 2012 10:43:30PM 24 points [-]

In contrast to some sibling commenters, I'm glad EY isn't doing an AMA. It sometimes didn't turn out well in the past when people unfamiliar with the sequences tried to ask him questions.

Comment author: Eliezer_Yudkowsky 16 August 2012 09:44:47PM 14 points [-]

That was exactly my reaction to reading Luke's AMA - "no, I probably shouldn't try this."

Comment author: FiftyTwo 16 August 2012 04:39:21PM 6 points [-]

What past instances are you referring to?

Comment author: jaibot 16 August 2012 06:36:23PM 0 points [-]

I wonder if he'd be willing to do an AMA on /r/HPMOR

Comment author: RobertLumley 16 August 2012 06:44:49PM 1 point [-]

I don't think there would be anything to gain by this. Generally speaking, good questions about HPMOR get answered either in r/HPMOR or in the LW discussion threads on HPMOR. He would probably ignore bad questions anyway.

Comment author: Suryc11 15 August 2012 09:15:32PM *  10 points [-]

I was very pleasantly surprised to see the AMA announcement on Reddit's frontpage, given how relatively non-mainstream the S.I. is and how many page views Reddit gets (and gives).

Also, although there is a large inferential distance between Luke and most Redditors (as siodine noted), I thought Luke did a great job trying to bridge the intuition gap--with the usual abundance of links and all.

Comment author: Emile 17 August 2012 08:14:07AM 8 points [-]

The actual link is here.

Comment author: siodine 15 August 2012 06:30:02PM 10 points [-]

Jesus, those comments are very eye-opening; there's a huge inferential distance even between LW/SIers and fellow futurologists. I hope there isn't a similar distance between futurologists and the general public.

Comment author: [deleted] 15 August 2012 06:51:55PM 6 points [-]

There probably is.

Comment author: dbaupp 15 August 2012 06:58:50PM 9 points [-]

Possibly larger.

Comment author: FiftyTwo 15 August 2012 09:53:49PM 7 points [-]

Very definitely; it's easy to forget the level of background knowledge necessary to work on this stuff. For example, I recently realised that in a room of competitive debaters (college-educated, well-read people), no one knew what I meant by epistemic uncertainty. And very few philosophers know anything about QM or neurology...

TL;DR Illusion of transparency is a bitch.

Comment author: Wei_Dai 16 August 2012 05:18:51AM 9 points [-]

For example I recently realised that in a room of competitive debaters (college educated well read people) no-one knew what I meant by epistemic uncertainty.

Wait, what do you mean by "epistemic uncertainty"? The top Google results for the phrase contrast it with "aleatoric uncertainty" which is so esoteric that it's not even in LW's vocabulary (zero results for "aleatoric" on LW search).

Comment author: [deleted] 16 August 2012 10:28:57PM 0 points [-]

"Epistemic uncertainty" sounds like a fancy way of saying "ignorance". "Aleatoric" I think means "stochastic" (the cognate of that word in Italian is not terribly uncommon).

Comment author: fubarobfusco 16 August 2012 11:33:41PM *  0 points [-]

Wikipedia says:

Aleatoric uncertainty, aka statistical uncertainty, which is unknowns that differ each time we run the same experiment. For example, in simulating the take-off of an airplane, even if we could exactly control the wind speeds along the runway, if we let 10 planes of the same make start, their trajectories would still differ due to fabrication differences. Similarly, if all we knew were the average wind speed, letting the same plane start 10 times would still yield different trajectories, because we do not know the exact wind speed at every point of the runway, only its average. Aleatoric uncertainties are therefore something an experimenter cannot do anything about: they exist, and they cannot be suppressed by more accurate measurements.
Epistemic uncertainty, aka systematic uncertainty, which is due to things we could in principle know but don't in practice. This may be because we have not measured a quantity sufficiently accurately, or because our model neglects certain effects, or because particular data are deliberately hidden.

http://en.wikipedia.org/wiki/Uncertainty_quantification

Comment author: tim 17 August 2012 01:03:18AM 0 points [-]

Could we say that aleatoric uncertainty would be akin to not knowing whether a coin will land heads or tails (but we know the odds are 1:1) and epistemic uncertainty would be akin to not knowing the odds of the coin at all?
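The coin framing above can be made concrete with a short simulation sketch (an illustrative toy of my own construction, assuming a uniform Beta prior as the model of our ignorance; none of this is from the thread): flipping a coin with a known bias leaves aleatoric uncertainty in each individual outcome, while uncertainty about the bias itself is epistemic and shrinks as observations accumulate.

```python
import random

random.seed(0)

def flip(p):
    """One coin flip with heads-probability p (aleatoric randomness)."""
    return random.random() < p

# Aleatoric: even with p known exactly, individual outcomes stay random.
p_known = 0.5
outcomes = [flip(p_known) for _ in range(1000)]

# Epistemic: we don't know p, but observing flips narrows it down.
# With a uniform (Beta(1, 1)) prior, the posterior after h heads in
# n flips is Beta(1 + h, 1 + n - h); its variance shrinks as n grows.
def posterior_variance(heads, n):
    a, b = 1 + heads, 1 + n - heads
    return a * b / ((a + b) ** 2 * (a + b + 1))

few = posterior_variance(heads=3, n=6)        # little data: wide posterior
many = posterior_variance(heads=500, n=1000)  # lots of data: narrow posterior
```

The asymmetry is the point: more flips narrow the posterior over p (epistemic uncertainty falls), but no amount of data makes an individual flip predictable (the aleatoric part stays).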

Comment author: Vaniver 21 August 2012 09:38:35PM 0 points [-]

Aleatoric uncertainty is basically seeing randomness as a property of the universe, rather than a property of minds. Unless you verge into quantum territory, basically all randomness is actually epistemic uncertainty, and even if you verge into quantum territory, you can view quantum randomness as epistemic uncertainty.

Bayesians are comfortable viewing all uncertainties as epistemic. Non-Bayesians aren't, and all of the people I know who do professional decision-making under uncertainty dread someone even mentioning aleatoric uncertainty because it's a dead giveaway that the person mentioning it isn't Bayesian, and thus a long, unproductive philosophical discussion may be necessary before they can get anywhere.

Comment author: fubarobfusco 17 August 2012 02:18:16AM -1 points [-]

The Wikipedia definition makes it sound more like aleatoric uncertainty is not knowing whether it will land heads or tails (because it will do something different each time), and epistemic uncertainty is not having a camera accurate enough to see whether it has landed heads or tails.

Comment author: jswan 21 August 2012 09:04:54PM 2 points [-]

I realize that LW collectively doesn't like unreferenced definitions, but in this case maybe it's OK... a friend of mine whose PhD is in decision theory explained aleatory uncertainty to me as the uncertainty of chance with known parameters: if you roll a normal six-sided die, you know it's going to come up with a value in the range 1-6, but you don't know what it will be. There's no chance it will come up 7. Epistemic uncertainty is the uncertainty of chance with unknown parameters: there may not be enough data to know the bounds of an event, or it may have such large and random bounds that trying to place them is not very meaningful.

Comment author: Miller 17 August 2012 02:41:28AM -1 points [-]

You could probably mad-lib any two buzzwords together, though. How about quantum rationality?

Comment author: [deleted] 16 August 2012 05:13:23AM *  4 points [-]

epistemic uncertainty

I find myself in the embarrassing position of not knowing what that term refers to...

EDIT: A few upvotes but no definitions. In case it wasn't clear, can someone tell me what "epistemic uncertainty" means, if it is a thing?

Comment author: Suryc11 16 August 2012 09:02:31AM *  3 points [-]

Isn't it simply the extent to which one is not certain about some (piece of) knowledge? At least that was my intuition when I first read that.

After googling, the closest definition I could find was on Wikipedia under systematic uncertainty--in contrast to statistical uncertainty (aleatoric uncertainty), apparently.

Comment author: shminux 16 August 2012 05:48:06AM 1 point [-]

Welcome to the club!

Comment author: NancyLebovitz 16 August 2012 05:47:57AM *  1 point [-]

Do you mean they weren't familiar with the phrase "epistemic uncertainty" or they didn't know the concept?

Comment author: FiftyTwo 16 August 2012 04:30:26PM *  0 points [-]

The phrase. In context, the argument I was making wasn't that complicated (uncertainty about the moral status of a fetus), but the inferential gap was in not realising that the phrasing I found natural was fairly incomprehensible.

Comment author: Raemon 16 August 2012 04:07:28AM 1 point [-]

If you need to do TL;DR for a single paragraph...

Dunno. Feels like there's some kind of joke opportunity here for inferential distance but I can't quite nail it.

Comment author: FiftyTwo 16 August 2012 04:33:49PM 1 point [-]

The TL;DR was mainly for the purposes of humour in this instance rather than actual ease of reading. It also seems a generally useful thing to be reminded of.

Comment author: J_Taylor 16 August 2012 11:58:52PM 0 points [-]

very few philosophers know anything about QM or neurology

Very few philosophers need to know anything about QM or neurology.

Comment author: loup-vaillant 17 August 2012 01:35:17PM *  1 point [-]

QM potentially answers cool philosophical questions like "does cut-and-paste transportation preserve identity?" (it looks like it does, since our universe doesn't seem to encode any identity at all).

Neurology will most probably tell us nearly everything we will ever know about how humans actually work. I expect many questions formerly considered "philosophical" will be answered by this branch of science.

Therefore, I think nearly all philosophers need to know some QM and neurology.

Comment author: [deleted] 17 August 2012 01:53:56PM 1 point [-]

Therefore, I think nearly all philosophers need to know some QM and neurology.

The question is whether knowing a little QM and neurology is more or less harmful than knowing none at all.

Comment author: FiftyTwo 18 August 2012 01:20:17AM 0 points [-]

Nothing can protect you from people who fail to apply their knowledge well. Partial knowledge at least makes them aware that there is more to learn.

Comment author: J_Taylor 17 August 2012 11:42:26PM 0 points [-]

I agree with your first statement.

However, as for your second statement, I would really like an example, because I am not entirely sure what you mean. (I am sincerely requesting examples.)

Unfortunately, I strongly disagree with your third statement. The time it would take to learn QM with sufficient rigor to be interesting could be better spent reading the findings of experimental psychology or learning more mathematics. For the majority of philosophers, their subject matter simply does not overlap with QM in such a way that knowing rigorous QM would help them.

Further, I agree with what paper-machine seemed to imply in their post. A little QM can make a philosopher stupid.

Of course, in certain subjects, knowing QM or neurology should be mandatory.

Comment author: FiftyTwo 18 August 2012 01:17:14AM *  3 points [-]

However, as for your second statement, I would really like an example, because I am not entirely sure what you mean. (I am sincerely requesting examples.)

Few quick examples:

  • A lot of philosophy of mind assumes there is a singular unified self, whereas neurology might lead you to think of the mind as a group of systems, and this could resolve some dilemmas.

  • Lots of traditional moral theories assume people make choices in certain ways not backed by observation of their brains.

  • Your willingness to accept materialist explanations for the mind probably increases exponentially the more you know about the mechanics of the brain. (Are there any dualist neuroscientists?)

  • A lot of philosophy uses 'armchair' reflection and introspection to get foundational intuitions and make judgements. Knowing the hardware you're running that on is probably helpful. (E.g. showing how easy it is to trigger people's intuitions one way or the other changed the debate about Gettier cases massively.)

Comment author: J_Taylor 18 August 2012 08:58:15PM 3 points [-]

I see and concede. I had been thinking at an excessively low-level.

Comment author: John_Maxwell_IV 16 August 2012 04:43:22AM 3 points [-]

Well remember, there are probably lots of people coming from /r/IAmA and leaving questions.

Comment author: buckwheats 17 August 2012 11:33:22PM *  0 points [-]

The AMA may have received comments from curious people outside of r/futurology, since there was an announcement for it on the front page. One thing about r/futurology, too, is that it recently tripled in size--only a few months ago it had around 6k subscribers. A lot of the growth came a week or two ago from a thread featured on r/bestof that got a lot of attention. Those things probably contributed to the inferential distance... If the AMA had happened a few months ago it may have been less, or indeed if it had happened a few months from now, counting on there being significant attrition of those new subscribers.

Comment author: FiftyTwo 15 August 2012 09:48:58PM *  1 point [-]

Although in theory we can badger Luke whenever we like, it's nice to have a socially approved opportunity to ask 'stupid' or off-topic questions.

Comment author: Locke 15 August 2012 09:31:15PM 1 point [-]

Why is it just Luke doing the AMA? Eliezer already has an account for HPMOR, after all.

Comment author: RobertLumley 15 August 2012 10:09:42PM 1 point [-]

This is such low hanging fruit that I'm embarrassed it never occurred to me before. Props to Luke for doing this. One by EY might be worth the time as well, especially given how popular HPMOR is on Reddit.

Comment author: RobertLumley 16 August 2012 03:40:40AM 1 point [-]

Comment author: TylerJay 16 August 2012 12:02:58AM 0 points [-]

Well done