Comment author: IlyaShpitser 31 December 2015 11:48:25PM 1 point

"Everydayrationalism."

Comment author: iarwain1 01 January 2016 12:26:54AM 0 points

?

Comment author: IlyaShpitser 24 December 2015 07:04:55PM *  3 points

Absolutely agree it is important for scientists to know about cognitive biases. Francis Bacon, the father of the empirical method, explicitly used cognitive biases (he called them "idols," and even classified them) as a justification for why the method was needed.

I always said that Francis Bacon should be LW's patron saint.

Comment author: iarwain1 24 December 2015 08:11:35PM 3 points

So it sounds like you're only disagreeing with the OP in degree. You agree with the OP that a lot of scientists should be learning more about cognitive biases, better statistics, epistemology, etc., just as we are trying to do on LW. You're just pointing out (I think) that the "informed laymen" of LW should have some humility because (a) in many cases (esp. for top scientists?) the scientists have indeed learned lots of rationality-relevant subject matter, perhaps more than most of us on LW, (b) domain expertise is usually more important than generic rationality, and (c) top scientists are very well educated and very smart.

Is that correct?

Comment author: IlyaShpitser 24 December 2015 06:02:28PM *  1 point

I don't have any problem with Bayesian epistemology at all. You can have whatever epistemology you want.

What I do have a problem with is this "LW myopia" where people here think they have something important to tell to people like Ed Witten about how people like Ed Witten should be doing their business. This is basically insane, to me. This is strong evidence that the type of culture that gets produced here isn't particularly sanity-producing.


Solomonoff induction is useless to know about for anyone who has real work to do (let's say with actual data, like physicists). What would people do with it?

Comment author: iarwain1 24 December 2015 06:58:51PM 2 points

In many cases I'd agree it's pretty crazy, especially if you're trying to go up against top scientists.

On the other hand, I've seen plenty of scientists and philosophers claim that their peers (or they themselves) could benefit from learning more about things like cognitive biases, statistics fallacies, philosophy of science, etc. I've even seen experts claim that a lot of their peers make elementary mistakes in these areas. So it's not that crazy to think that by studying these subjects you can have some advantages over some scientists, at least in some respects.

Of course that doesn't mean you can be sure that you have the advantage. As I said, probably in most cases domain expertise is more important.

Comment author: IlyaShpitser 23 December 2015 05:12:56PM *  1 point

jacob_cannell above seems to think it is very important for physicists to know about Solomonoff induction.

Solomonoff induction is one of those ideas that keeps circulating here, for reasons that escape me.


If we are talking about Bayesian methods for data analysis, almost no one on LW who is breathlessly excited about Bayesian stuff actually knows what they are talking about (with 2-3 exceptions, who are stats/ML grad students or up). And when called on it, they retreat to the "Bayesian epistemology" motte.


Bayesian methods didn't save Jaynes from being terminally confused about causality and the Bell inequalities.

Comment author: iarwain1 24 December 2015 03:00:20PM 1 point

I still haven't figured out what you have against Bayesian epistemology. It's not like this is some sort of LW invention - it's pretty standard in a lot of philosophical and scientific circles, and I've seen plenty of philosophers and scientists who call themselves Bayesians.

Solomonoff induction is one of those ideas that keeps circulating here, for reasons that escape me.

My understanding is that Solomonoff induction is usually appealed to as one of the more promising candidates for a formalization of Bayesian epistemology that uses objective and specifically Occamian priors. I haven't heard Solomonoff promoted as much outside LW, but other similar proposals do get thrown around by a lot of philosophers.
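The Occamian-prior idea can be sketched in a few lines. This is a hypothetical toy illustration only, not actual Solomonoff induction (which is uncomputable): each hypothesis is assigned prior weight 2^-(description length in bits), so shorter descriptions start out favored, and then the prior is updated on data with ordinary Bayes. The hypotheses, bit-lengths, and likelihoods below are all made up for the example.

```python
def occamian_posterior(hypotheses, data):
    """hypotheses: list of (name, description_length_bits, likelihood_fn).

    Returns a normalized posterior distribution over hypothesis names,
    using the Occamian prior 2^-(description length)."""
    weights = {}
    for name, length_bits, likelihood in hypotheses:
        prior = 2.0 ** -length_bits          # shorter descriptions dominate
        weights[name] = prior * likelihood(data)
    total = sum(weights.values())
    return {name: w / total for name, w in weights.items()}

# Two made-up hypotheses about a coin: "fair" (short description) vs.
# "biased 0.9 toward heads" (assumed to need a longer description, so it
# pays an Occam penalty up front).
hypotheses = [
    ("fair",   10, lambda d: 0.5 ** len(d)),
    ("biased", 20, lambda d: (0.9 ** d.count("H")) * (0.1 ** d.count("T"))),
]

print(occamian_posterior(hypotheses, "HHHHHHHH"))
```

With only eight heads the Occam penalty still outweighs the likelihood advantage, so "fair" keeps most of the posterior; feed in a much longer all-heads run and "biased" eventually wins. That tension between description length and fit is the point usually being gestured at.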

Bayesian methods didn't save Jaynes from being terminally confused about causality and the Bell inequalities.

Of course Bayesianism isn't a cure-all by itself, and I don't think that's controversial. It's just that it seems useful in many fundamental issues of epistemology. But in any given domain outside of epistemology (such as causation or quantum mechanics), domain-relevant expertise is almost certainly more important. The question is more whether domain expertise plus Bayesianism is at all helpful, and I'd imagine it depends on the specific field. Certainly for fundamental physics it appears that Bayesianism is often viewed as at least somewhat useful (based on the conference linked by the OP and by a lot of other things I've seen quoted from professional physicists).

In response to comment by username2 on LessWrong 2.0
Comment author: So8res 13 December 2015 11:55:29PM 12 points

I have the requisite decision-making power. I hereby delegate Vaniver to come up with a plan of action, and will use what power I have to see that that plan gets executed, so long as the plan seems unlikely to do more harm than good (but regardless of whether I think it will work). Vaniver and the community will need to provide the personpower and the funding, of course.

In response to comment by So8res on LessWrong 2.0
Comment author: iarwain1 14 December 2015 01:00:57AM 1 point

and the funding

A Kickstarter, perhaps?

Comment author: Bryan-san 09 December 2015 08:21:06PM 1 point

Whoever is running the meetup needs to make Meetup Posts for each meeting before they show up on the sidebar. IIRC regular meetups are often not posted there if the creator forgets about it. You can ask the person who runs the meetups to post them on LW more often or ask them if you can post them in their stead.

I run the San Antonio meetup and you are very welcome to attend here if it's the nearest one to you!

Comment author: iarwain1 09 December 2015 09:26:43PM 1 point

Not sure what you mean by this. I actually posted the meeting for the Baltimore area myself.

The Baltimore and Washington DC meetups do show up if I click on "Nearest Meetups", just that they appear in the 5th and 8th spots. That list appears to be sorted first by date and then alphabetically. The San Antonio meetup appears at the #4 slot, and the Durham meetup does not appear at all.

Basically the "nearest" part of nearest meetups seems to be completely broken.

Comment author: iarwain1 09 December 2015 04:56:58PM *  1 point

I'm from Baltimore, MD. We have a Baltimore meetup coming up Jan 3 and a Washington DC meetup this Sun Dec 13. So why do the two meetups listed in my "Nearest Meetups" sidebar include only a meetup in San Antonio for Dec 13 and a meetup in Durham NC for Sep 17 2026 (!)?

Comment author: iarwain1 06 December 2015 03:31:13PM 0 points

On the science of how to learn: Make It Stick.

Comment author: OrphanWilde 03 December 2015 09:49:40PM 0 points

I predict pushback -- LW won't like that idea one little bit :-)

I've had this particular post in my drafts for... oh... over a year, now? For pretty much that reason.

I'm still not sure, though, that by the prediction metric science will look as bad as you hint and religions will shine.

Which group has higher average happiness levels, religious or non-religious?

Comment author: iarwain1 03 December 2015 10:22:01PM 0 points

See this article (full article available from sidebar), which argues that although conventional wisdom gives religion the advantage here, the reality may not be so clear-cut.

Comment author: ChristianKl 02 December 2015 06:23:46PM 0 points

I meant when philosophers themselves claim they aren't looking at things in a probabilistic way. [...] This was one of those discussions where he didn't understand why I was so confused.

The point isn't that you don't do either.

He claimed that although he's comfortable talking about credences and probabilities, he's also comfortable talking about the world in a non-probabilistic way.

Your post is mainly talking about the world in a non-probabilistic way. Given that's the case, the professor with whom you are talking gets confused.

To me it looks like the problem is belief in belief of logical positivism.

My intuitive (!) position is that I'm aware I can't prove (even probabilistically) that I'm not a Boltzmann brain

The fact that your intuition is that you can't prove that you are not a Boltzmann brain doesn't change that your intuition is that you aren't a Boltzmann brain.

My intuition is that P!=NP, but at the same time I'm certain that I don't have the mathematical skills to prove P!=NP.

The fact that you don't have an intuitive mental distinction between "X is true" and "I can prove X is true" is a problem.

Comment author: iarwain1 02 December 2015 06:51:23PM 0 points

The point isn't that you don't do either.

Sorry, don't know what you mean to say here. Could you rephrase?

Your post is mainly talking about the world in a non-probabilistic way.

Could you elaborate on what you mean?

To me it looks like the problem is belief in belief of logical positivism.

Again, could you elaborate? I don't see any reason to associate anything I've said with logical positivism.

The fact that your intuition is that you can't prove that you are not a Boltzmann brain doesn't change that your intuition is that you aren't a Boltzmann brain.

Of course I intuit that I'm not a Boltzmann brain, and of course I act as if I'm not. Not sure where I indicated otherwise. Again, my issue is with taking intuitions far beyond these fundamental we-need-to-start-somewhere levels and using them as strong evidence of truth.
