Comment author: iarwain1 14 January 2016 06:35:52PM 3 points [-]

I'm an undergrad going for a major in statistics and minors in computer science and philosophy. I also read a lot of philosophy and cognitive science on the side. I don't have the patience to read through all of the LW sequences. Which LW sequences / articles do you think are important for me to read that I won't get from school or philosophy reading?

Meetup : Baltimore Area: Epistemology of Disagreement

1 iarwain1 12 January 2016 01:33PM

Discussion article for the meetup : Baltimore Area: Epistemology of Disagreement

WHEN: 31 January 2016 03:00:00PM (-0500)

WHERE: 1852 Reisterstown Rd, Pikesville, MD 21208

This is the second recent meetup in the Baltimore area. We'll be meeting at the Panera Bread in Pikesville.

Discussion topic is whether and how much to update your opinions when you discover that people whom you respect as epistemic peers or superiors disagree with you.

My contact info:

  • Cell: 443-453-6673 (might not pick up if I don't recognize the number, so leave a message)
  • Email: nyratynaqre@tznvy.pbz (rot13'd to avoid spam)
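(For anyone unfamiliar with rot13: it shifts each letter 13 places, so applying it twice gets you back where you started. Python can decode it with the built-in `rot_13` codec; the snippet below demonstrates on a placeholder address, not the one above.)

```python
import codecs

# rot13 is its own inverse: encoding an already-encoded string decodes it.
obfuscated = codecs.encode("hello@example.com", "rot_13")
print(obfuscated)                           # uryyb@rknzcyr.pbz
print(codecs.decode(obfuscated, "rot_13"))  # hello@example.com
```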

UPDATE: Meetup was pushed off for one week due to snow forecast.


Comment author: OrphanWilde 11 January 2016 08:59:18PM *  0 points [-]

Some political predictions (Edited for formatting):

  • Another stock market slump within the next year: 50% (70% within two years)
  • Cor: Average stock value collapse, given slump, of 70%, +- 10%: 90%
  • Trump to get Republican nomination: 65%
  • Cruz to get Republican nomination: 35%
  • Hillary to get Democratic nomination: 30%
  • Rel: Hillary to be indicted on criminal charges: 50%
  • Sanders to get Democratic nomination: 60%
  • Republicans to win 2016 presidential race, regardless of nomination: 80%
  • Republicans to win moderate majority in both houses in 2016: 80%
  • Republicans to keep moderate majority in congress in 2018 given economic crash: 60%
  • Republicans to keep at least parity in congress in 2018 given economic crash: 80%
  • Republicans to keep moderate majority in congress in 2018 without economic crash: 30%
  • Republicans to keep at least parity in congress in 2018 without economic crash: 60%
  • Democrats to win 2020 presidential election without economic crash: 40%
  • Democrats to win 2020 presidential election with economic crash: 10%
  • Democrats to win 2024 presidential election, given loss of 2020 presidential election, without economic crash: 80%
  • Democrats to win 2024 presidential election, given loss of 2020 presidential election, with economic crash: 70%
  • US National Health Database goes online in next ten years: 30%
  • WHO to change Health Index ranking rules substantially given the US national health database goes online: 60%
  • WHO to change Health Index ranking rules substantially given the US national health database doesn't go online: 10%
  • Average global temperatures to warm by more than .7 degrees (Celsius) over the next ten years: 0%
  • Average global temperatures to warm by more than .5 degrees (Celsius) over the next ten years: 0%
  • Average global temperatures to warm by more than .3 degrees (Celsius) over the next ten years: 10%
  • Average global temperatures to warm by more than .1 degrees (Celsius) over the next ten years: 20%
  • Average global temperatures to warm by more than .07 degrees (Celsius) over the next ten years: 30%
  • Average global temperatures to warm by more than .04 degrees (Celsius) over the next ten years: 70%
  • Major military conflict between two first world nations over the next ten years: 10%
  • Threat of major military conflict between two first world nations over the next ten years: 70%
Comment author: iarwain1 11 January 2016 09:32:52PM *  2 points [-]

So probability of either Trump or Cruz is 100%?
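The coherence worry can be checked mechanically (a quick sketch; the probabilities are copied from the list above, and nominations within one party are mutually exclusive):

```python
# Mutually exclusive outcomes should sum to at most 1, ideally with some
# probability mass left over for other candidates.
republican = {"Trump": 0.65, "Cruz": 0.35}
democrat = {"Hillary": 0.30, "Sanders": 0.60}

print(round(sum(republican.values()), 2))  # 1.0 -- no mass left for anyone else
print(round(sum(democrat.values()), 2))    # 0.9 -- leaves 10% for others
```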

Comment author: Lumifer 07 January 2016 03:58:07PM 3 points [-]

Moderately.

On the plus side it's forcing people to acknowledge the uncertainty involved in many numbers they use.

On the minus side it's treating everything as a normal (Gaussian) distribution. That's a common default assumption, but it's not necessarily a good assumption. To start with an obvious problem, a lot of real-world values are bounded, but the normal distribution is not.

Comment author: iarwain1 07 January 2016 10:23:26PM 0 points [-]

It's open source. Right now I only know very basic Python, but I'm taking a CS course this coming semester and I'm going for a minor in CS. How hard do you think it would be to add in other distributions, bounded values, etc.?
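Probably not too hard, at least in sketch form. Assuming the tool propagates uncertainty by Monte Carlo sampling (I haven't read its source; this is a minimal standard-library Python illustration, not its actual code), supporting a bounded distribution mostly means swapping the sampler:

```python
import random

random.seed(0)
N = 10_000

# A normal with mean 10 and sd 8 will sometimes go negative, even if the
# quantity it models (say, a price) cannot.
normal_samples = [random.gauss(10, 8) for _ in range(N)]

# A lognormal is bounded below by zero, so every sample is a valid value.
lognormal_samples = [random.lognormvariate(2.0, 0.7) for _ in range(N)]

negative_normals = sum(s < 0 for s in normal_samples)
print(negative_normals > 0)                   # the normal model produces impossible values
print(all(s > 0 for s in lognormal_samples))  # the bounded model never does
```

The harder part is presumably the UI for choosing distributions and eliciting their parameters, not the sampling itself.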

Comment author: iarwain1 07 January 2016 02:27:14PM *  5 points [-]
Comment author: IlyaShpitser 31 December 2015 11:48:25PM 1 point [-]

"Everydayrationalism."

Comment author: iarwain1 01 January 2016 12:26:54AM 0 points [-]

?

[LINK] 52 Concepts To Add To Your Cognitive Toolkit

7 iarwain1 31 December 2015 02:35PM

Excellent list by Brenton Mayer and Peter McIntyre: http://mcntyr.com/52-concepts-cognitive-toolkit/

I think the list can also serve as a useful index and/or introduction to a lot of LessWrong concepts.

A note of caution: I find that brief lists like this can actually be counterproductive, since they make you feel like you understand the issues when all you did was read a short definition and peg a name on the concept. I'd recommend the following:

  • Look through the list carefully and slowly.
  • If you've already read a lot about a concept, move on to the next one, but do check where the link points in case it's an interesting article you haven't seen before.
  • If you haven't read a lot about the concept, ideally click the link and read all about it.
  • If you're pressed for time, at least take a few moments to reflect on each concept and think about how it might apply to you. If you have even a slight suspicion that the concept contains something that wasn't completely obvious to you before, click the link even though you're pressed for time.
  • If you're so time-constrained that you can't even do this, consider just bookmarking the list and coming back to it later. Personally I think it's better to read it carefully later than to read it now and think you understand it when you really don't.

Comment author: IlyaShpitser 24 December 2015 07:04:55PM *  3 points [-]

Absolutely agree it is important for scientists to know about cognitive biases. Francis Bacon, the father of the empirical method, explicitly used cognitive biases (he called them "idols," and even classified them) as a justification for why the method was needed.

I always said that Francis Bacon should be LW's patron saint.

Comment author: iarwain1 24 December 2015 08:11:35PM 3 points [-]

So it sounds like you're only disagreeing with the OP in degree. You agree with the OP that a lot of scientists should be learning more about cognitive biases, better statistics, epistemology, etc., just as we are trying to do on LW. You're just pointing out (I think) that the "informed laymen" of LW should have some humility because (a) in many cases (esp. for top scientists?) the scientists have indeed learned lots of rationality-relevant subject matter, perhaps more than most of us on LW, (b) domain expertise is usually more important than generic rationality, and (c) top scientists are very well educated and very smart.

Is that correct?

Comment author: IlyaShpitser 24 December 2015 06:02:28PM *  1 point [-]

I don't have any problem with Bayesian epistemology at all. You can have whatever epistemology you want.

What I do have a problem with is this "LW myopia" where people here think they have something important to tell people like Ed Witten about how people like Ed Witten should be doing their business. This is basically insane, to me. It's strong evidence that the type of culture that gets produced here isn't particularly sanity-producing.


Solomonoff induction is useless to know about for anyone who has real work to do (let's say with actual data, like physicists). What would people do with it?

Comment author: iarwain1 24 December 2015 06:58:51PM 2 points [-]

In many cases I'd agree it's pretty crazy, especially if you're trying to go up against top scientists.

On the other hand, I've seen plenty of scientists and philosophers claim that their peers (or they themselves) could benefit from learning more about things like cognitive biases, statistical fallacies, philosophy of science, etc. I've even seen experts claim that a lot of their peers make elementary mistakes in these areas. So it's not that crazy to think that by studying these subjects you can have some advantages over some scientists, at least in some respects.

Of course that doesn't mean you can be sure that you have the advantage. As I said, probably in most cases domain expertise is more important.

Comment author: IlyaShpitser 23 December 2015 05:12:56PM *  1 point [-]

jacob_cannell above seems to think it is very important for physicists to know about Solomonoff induction.

Solomonoff induction is one of those ideas that keeps circulating here, for reasons that escape me.


If we are talking about Bayesian methods for data analysis, almost no one on LW who is breathlessly excited about Bayesian stuff actually knows what they are talking about (with 2-3 exceptions, who are stats/ML grad students or up). And when called on it, they retreat to the "Bayesian epistemology" motte.


Bayesian methods didn't save Jaynes from being terminally confused about causality and the Bell inequalities.

Comment author: iarwain1 24 December 2015 03:00:20PM 1 point [-]

I still haven't figured out what you have against Bayesian epistemology. It's not like this is some sort of LW invention - it's pretty standard in a lot of philosophical and scientific circles, and I've seen plenty of philosophers and scientists who call themselves Bayesians.

Solomonoff induction is one of those ideas that keeps circulating here, for reasons that escape me.

My understanding is that Solomonoff induction is usually appealed to as one of the more promising candidates for a formalization of Bayesian epistemology that uses objective and specifically Occamian priors. I haven't heard Solomonoff promoted as much outside LW, but other similar proposals do get thrown around by a lot of philosophers.
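For a toy flavor of the Occamian part (a sketch only: the real Solomonoff prior sums over all programs and is uncomputable, and the description lengths below are made up for illustration):

```python
# Toy Solomonoff-style prior: each hypothesis h gets unnormalized weight
# 2 ** -len(h), where len(h) is its description length in bits, so shorter
# (simpler) hypotheses dominate the prior.
hypotheses = {"h_simple": 10, "h_medium": 20, "h_complex": 30}

weights = {h: 2.0 ** -bits for h, bits in hypotheses.items()}
total = sum(weights.values())
prior = {h: w / total for h, w in weights.items()}

# A hypothesis 10 bits shorter is 2**10 = 1024 times more probable a priori.
ratio = prior["h_simple"] / prior["h_medium"]
print(round(ratio))  # 1024
```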

Bayesian methods didn't save Jaynes from being terminally confused about causality and the Bell inequalities.

Of course Bayesianism isn't a cure-all by itself, and I don't think that's controversial. It's just that it seems useful for many fundamental issues of epistemology. But in any given domain outside of epistemology (such as causation or quantum mechanics), domain-relevant expertise is almost certainly more important. The question is rather whether Bayesianism adds anything on top of domain expertise, and I'd imagine that depends on the specific field. Certainly in fundamental physics Bayesianism is often viewed as at least somewhat useful (based on the conference linked by the OP and on a lot of other things I've seen quoted from professional physicists).
