Comment author: skeptical_lurker 02 December 2015 05:43:18PM 4 points [-]

Using personal preference or personal intuitions as priors instead of some objective measure along the lines of Solomonoff Induction

Solomonoff Induction is uncomputable, and even a computable approximation can't be calculated in practice, because no one has written a program to do it AFAIK.

So if you are trying to work out which hypothesis is simpler, how do you do that? You use your personal intuition.
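To make that concrete, here's a toy sketch (my own illustration, not a real approximation of Solomonoff Induction) that uses zlib compression as a crude, computable stand-in for description length:

```python
import zlib

def complexity_proxy(hypothesis: str) -> int:
    # Compressed size in bytes as a rough stand-in for description length.
    # True Kolmogorov complexity (the length of the shortest program that
    # outputs the string) is uncomputable; this is only a toy proxy.
    return len(zlib.compress(hypothesis.encode("utf-8")))

def occam_weight(hypothesis: str) -> float:
    # Echoes the 2^-length form of the universal prior (unnormalised).
    return 2.0 ** -complexity_proxy(hypothesis)

simple = "all ravens are black"
baroque = "all ravens are black except on alternate Tuesdays, when they are mauve"
print(complexity_proxy(simple), complexity_proxy(baroque))
```

Even a crude proxy like this disagrees with human judgements often enough that, in practice, "which hypothesis is simpler?" still gets settled by intuition.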

Mathematical Platonism

I actually think this is plausible. The argument goes: can you imagine 2+2 equalling 3? Maybe this is a personal intuition thing, but it does feel like maths is discovered, not invented. If I decided that the derivative of sin(x) is x^5 and used this maths to design an airplane, it wouldn't fly. The maths exists whether I want it to or not. In physics, the equation for the electron (Dirac's) produced two solutions, and one was thrown away until the positron was discovered - the existence of the positron, a real, physical thing, could have been predicted by the mathematics.
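The airplane example can be checked numerically in a few lines (a throwaway sketch; the function names are mine):

```python
import math

def numeric_derivative(f, x, h=1e-6):
    # Central-difference approximation of f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

x = 1.0
# The derivative of sin really is cos, whether we want it to be or not...
print(abs(numeric_derivative(math.sin, x) - math.cos(x)))  # tiny
# ...and it is nowhere near x**5.
print(abs(numeric_derivative(math.sin, x) - x**5))  # large
```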

This is the 'unreasonable effectiveness of mathematics'. If maths describes physics perfectly, and electrons exist, then why don't the equations for the electron exist to the same extent?

Now, if you buy this argument, and if chairs, tables and morality can be described in terms of maths, then maybe the platonic form of a chair exists, and maybe moral realism is true? Admittedly, this generalisation is a lot more dubious: for one thing, there are probably a very large number of moral systems and chairs that can be mathematically described, so this argument is less 'there exists a perfect platonic form of a chair' and more 'there are an infinite number of platonic chairs'.

The existence of non-physical minds

One argument is to go for broke and argue that the physical world does not exist at all. We know the mental world exists, because we have experiences, so the simplest explanation is that only the mental world exists and the physical world is an illusion. This then leads to libertarian free will.

(I don't actually buy this argument, I'm just explaining it.)

Not looking at the world in a probabilistic way

Because it's higher status to believe in something 110%, even if this is gibberish? Because having unreasonable faith is psychologically comforting?

You underestimate the power of Dark Side Epistemology.

Comment author: iarwain1 02 December 2015 06:15:41PM 0 points [-]

So if you are trying to work out which hypothesis is simpler, how do you do that? You use your personal intuition.

I was using Solomonoff Induction as an example of a system that uses Occamian priors. My question was about those who assert that they don't use Occamian priors at all, or for that matter any other type of objective prior. This usually seems to lead either to rejecting Bayesian epistemology in general or to asserting that any arbitrary prior works. I actually have no problem (in theory) with rejecting Bayesian epistemology, as long as you still use some sort of probability-based reasoning.

When I referred to "personal intuitions" I meant controversial or arbitrary-sounding personal intuitions, such as "I feel there's a god" or "I feel abortion is immoral" and then using those intuitions not as some sort of evidence but as priors. I get why someone would perhaps use universal intuitions as priors, along the lines of "there exists an external material world", but why use an intuition where you know the next person over likely has a different intuition?
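A minimal sketch of why the choice of prior matters (the numbers are illustrative): the same evidence, fed through Bayes' rule, leaves different starting priors at very different posteriors.

```python
def update(prior, p_e_given_h, p_e_given_not_h):
    # One application of Bayes' rule: returns P(H|E).
    num = prior * p_e_given_h
    return num / (num + (1 - prior) * p_e_given_not_h)

# Five independent pieces of 2:1 evidence for H, from three different priors:
for prior in (0.01, 0.5, 0.99):
    p = prior
    for _ in range(5):
        p = update(p, 0.8, 0.4)
    print(f"prior {prior:.2f} -> posterior {p:.3f}")
```

After any finite amount of evidence, the arbitrary-sounding prior is still doing real work in the posterior, which is why "any prior works" isn't an innocuous position.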

Comment author: ChristianKl 02 December 2015 04:19:14PM 2 points [-]

Not looking at the world in a probabilistic way

Your post has only one instance of naming a probability and that's not 100%.

You say things like "The only thing I can think of is that people who support using intuitions like this say". There you speak about people having an identity that consists of using an intuition a certain way - not that they use the intuition in a certain way 80% of the time, but that they generally use it in a certain way.

But this just seems totally ludicrous to me. If we trust cognitive science, evolutionary psychology, etc., and if those fields give us perfectly plausible reasons for why we might intuitively feel this way / talk this way, even if it didn't reflect the truth

In a similar way you can argue that you don't have any evidence that you aren't a Boltzmann brain and therefore shouldn't act as if you are sure that you aren't. You always have to use thinking tools that aren't perfect.

I think you might make progress if you try to understand the epistemology and ontology you are actually using, instead of focusing on the epistemology and ontology you think one should use.

Comment author: iarwain1 02 December 2015 06:00:21PM *  0 points [-]

Your post has only one instance of naming a probability and that's not 100%.

I meant when philosophers themselves claim they aren't looking at things in a probabilistic way. I actually had this conversation with my philosophy professor. He claimed that although he's comfortable talking about credences and probabilities, he's also comfortable talking about the world in a non-probabilistic way. This was one of those discussions where he didn't understand why I was so confused.

In a similar way you can argue that you don't have any evidence that you aren't a Boltzmann brain and therefore shouldn't act as if you are sure that you aren't. You always have to use thinking tools that aren't perfect.

Understood (I think). My intuitive (!) position is that I'm aware I can't prove (even probabilistically) that I'm not a Boltzmann brain, and I can't prove a bunch of other things. Which leads me either to accept certain very basic things without justification (along the lines of EY's "where recursive justification hits bottom"), or to just go with a pragmatic view of truth. Personally I'm fine with either of those.

I understand that you have to start somewhere (or else accept that you can't get anywhere in finding objective non-pragmatic truth), but what I have a hard time understanding is when people continue using intuitions far beyond the starting point to make grand metaphysical assertions.

Comment author: MrMind 24 November 2015 08:17:32AM *  1 point [-]

Which is weird because, if you take seriously the ethnicity-IQ correlation (which I don't), Asians show a higher average IQ than Westerners.

Comment author: iarwain1 24 November 2015 02:59:18PM 7 points [-]

Nothing to do with IQ, but with modes of thinking. According to Nisbett, Eastern thinking is more holistic and concrete vs. the Western formal and abstract approach. He says that Easterners often make fewer thinking mistakes when dealing with other people, where a more holistic approach is needed (for example, Easterners are much less prone to the Fundamental Attribution Error). But at the same time they tend to make more thinking mistakes when it comes to scientific questions, as those often require formal, abstract thinking. Nisbett also speculates that this is why science developed only in the West even though China was way ahead of the West in (concrete-thinking-based) technological progress.

In general there's very little if any correlation between IQ and rationality. A lot of Keith Stanovich's work is on this.

Comment author: username2 23 November 2015 07:17:06PM 2 points [-]

Why are there many LWers from, say, Europe, but not China?

Comment author: iarwain1 23 November 2015 08:06:56PM 2 points [-]

I'm going to guess it's based on some of the East-West thinking differences outlined by Richard Nisbett in The Geography of Thought (I very highly recommend that book, BTW). I don't remember everything in the book, but I remember he had some stuff in there about why easterners are often less interested in, and have a harder time with, the sort of logical/scientific thinking that LW advocates.

Comment author: [deleted] 25 October 2015 09:41:12PM *  2 points [-]

In defense of Luke, when I've spent the time to read through philosophy books by strong-naturalist academic philosophers, they've often devoted page-counts easily equivalent in length to "Philosophy: a diseased discipline" to carefully, charitably, academically, verbosely tearing non-naturalist philosophy a new asshole. Luke's post has tended to be a breath of fresh air that I reread after reading any philosophy paper that doesn't come from a strongly naturalist perspective.

It sincerely worries me that the academics in philosophy who do really excellent work, work that does apply to the real world-that-is-made-of-atoms, work that does map-the-territory, have to spend large amounts of effort just beating down obviously bad beliefs over and over again. You should be able to shoot down a bad idea once, preferably in the peer-review phase, and not have to fight it again and again like a bad zombie.

(Examples of obviously bad ideas: p-zombies, Platonism, Bayesian epistemology (the latter two may require explanation).)

Now, to signal fairness even where I'm blatantly opinionated, plenty of people on LW are indeed irritatingly "men of one idea", that usually being some variation on AIXI. And in fact, plenty of people on LW hold philosophical opinions I consider obviously bad, like mathematical Platonism.

But the answer to those bad things hasn't usually been "more philosophy", as if any philosophy is good philosophy, but instead more naturalism, investing more effort to accommodate conceptual theorizing to the world-that-is-made-of-atoms.

Since significant portions of academic philosophy (for instance, Thomas Nagel) are instead devoted to the view - one that I once expected to be contrarian but which I now find depressingly common - that science and naturalism are wrong, or unjustified, or necessarily incapable of answering some-or-another important question, having one page on a contrarian intellectual-hipsters' website devoted to ragging on these ought-to-be-contrarian views is a bit of a relief.

In response to comment by [deleted] on Deliberate Grad School
Comment author: iarwain1 26 October 2015 03:49:39PM 4 points [-]

Examples of obviously bad ideas: p-zombies, Platonism, Bayesian epistemology (the latter two may require explanation).

Could you provide that explanation?

Comment author: James_Miller 20 October 2015 02:28:29AM 2 points [-]

What's the probability that this is caused by aliens?


Comment author: iarwain1 20 October 2015 05:14:29PM *  3 points [-]

Can you put in an "I'd just like to see the results" option?

Comment author: iarwain1 19 October 2015 02:05:42PM *  7 points [-]

What makes a good primary care physician and how do I go about finding one?

Comment author: shminux 14 October 2015 04:20:40PM 1 point [-]

That's probably the most useful and least controversial part of it.

Comment author: iarwain1 15 October 2015 03:41:22PM 0 points [-]

Is it still somewhat controversial? Meaning, are there respected physicists who think that conscious observers do magically cause things to happen?

Comment author: IlyaShpitser 14 October 2015 11:05:12PM 2 points [-]

I suppose modal logics of belief.

Comment author: iarwain1 14 October 2015 11:31:15PM *  0 points [-]

Thanks! Ok, so now a more detailed question:

As I said, I'd like to do formal epistemology. I'm an undergrad right now, and I need to decide on my major. If that's about all the formal stuff I'll need then there are a bunch of different majors that include that, and the question becomes which additional courses could help with formal epistemology or related disciplines.

Here's what I've come up with so far:

  • Choice 1: Applied Statistics. This allows several electives in other subjects, so I could do e.g. a minor in CS with only one or two extra course requirements.
  • Choice 2: Mathematical Statistics. Fewer electives in other subjects, more electives in math/stats. I could still probably do a CS minor along with it if I wanted.
  • Choice 3: Math degree, possibly with a stats focus.
  • Choice 4: Some other degree (e.g., CS, economics) and just make sure to get the probability theory in at some point.

I'm doing a minor in philosophy anyway, which includes at least some logic.

Comment author: IlyaShpitser 14 October 2015 09:31:26PM *  6 points [-]

Linear algebra, function optimization, probability theory.

:)

Comment author: iarwain1 14 October 2015 10:38:38PM 0 points [-]

That's it?
