Comment author: antigonus 06 November 2013 03:12:01AM * 0 points

I agree with vallinder's point, and would also like to add that arguments for moral realism which aren't theistic or contractarian in nature typically appeal to moral intuitions. Thus, instead of providing positive arguments for realism, they at best merely show that arguments for the unreliability of realists' intuitions are unsound. (For example, IIRC, Russ Shafer-Landau in this book tries to use a parity argument between moral and logical intuitions, so that arguments against the former would have to also apply to the latter.) But clearly this is an essentially defensive maneuver which poses no threat to the orthogonality thesis (even if motivational judgment internalism is true), because the latter works just as well when you substitute "moral intuition" for "goal."

Comment author: buybuydandavis 18 March 2012 09:41:47PM 0 points

If I scratch my nose, that action has no truth value. No color either.

The proposition "I scratched my nose" does have a truth value.

See the distinction. Don't hand-wave it away with "it's all the same," "that's just semantics," etc. You started by saying that this is more of a question; I've tried to clarify the answer for you.

Comment author: antigonus 18 March 2012 09:49:34PM * 0 points

> If I scratch my nose, that action has no truth value. No color either.
>
> The proposition "I scratched my nose" does have a truth value.

Bayesian epistemology maintains that probability is degree of belief. Assertions of probabilities are therefore assertions of degrees of belief, which are psychological claims and therefore obviously have or can have truth-value. Of course, Bayesians can be more nuanced and take some probability claims to be about degrees of belief in the minds of some idealized reasoner; but "the degree of belief of an idealized reasoner would be X given such-and-such" is still truth-evaluable.
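A toy sketch of that reading (the agent, priors, and likelihoods below are hypothetical, chosen only for illustration): if an agent's "degree of belief" is just its stored posterior, then the assertion "my probability of H given E is 3/4" is a checkable, truth-evaluable claim about the agent's state.

```python
from fractions import Fraction

# Toy illustration: the agent's "degree of belief" is just a stored posterior.
# Names and numbers are hypothetical.

prior_h = Fraction(1, 2)            # prior P(H)
p_e_given_h = Fraction(3, 4)        # likelihood P(E | H)
p_e_given_not_h = Fraction(1, 4)    # likelihood P(E | ~H)

# Bayes' rule: P(H | E) = P(H) * P(E | H) / P(E)
p_e = prior_h * p_e_given_h + (1 - prior_h) * p_e_given_not_h
posterior_h = prior_h * p_e_given_h / p_e

# The assertion "my probability of H given E is 3/4" is now a
# truth-evaluable claim about the agent's state:
claim = Fraction(3, 4)
print(posterior_h, posterior_h == claim)  # → 3/4 True
```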

> See the distinction. Don't hand-wave it away with "it's all the same," "that's just semantics," etc. You started by saying that this is more of a question; I've tried to clarify the answer for you.

The question was primarily about the role of probability in Pearl's account of causality, not the basic meaning of probability in Bayesian epistemology.

Comment author: breckes 18 March 2012 06:57:01PM 1 point

Do you know Jon Williamson's work? It seems to give an answer to your question (but I've not read it yet). Here's the first paragraph of Section 9.1 “Mental yet Objective” of his book “Bayesian Nets and Causality”:

Epistemic causality embodies the following position. The causal relation is mental rather than physical: a causal structure is part of an agent’s representation of the world, just as a belief function is, and causal claims do not directly supervene on mind-independent features of the world. But causality is objective rather than subjective: some causal structures are more warranted than others on the basis of the agent’s background knowledge, so if two people disagree about what causes what, one may be right and the other wrong. Thus epistemic causality sits between a wholly subjective mental account and a physical account of causality, just as objective Bayesianism sits between strict subjectivism and physical probability.

Here's a link to his papers on causality. At least the fifth, “Causality”, contains an introduction to epistemic causality.

Comment author: antigonus 18 March 2012 09:23:49PM 0 points

Nope, I wasn't familiar. Very interesting, thanks!

Comment author: buybuydandavis 18 March 2012 08:52:54PM 0 points

There may be an objectively correct way to throw globs of paint at the wall if I wish to do it in a way that is consistent with certain desired properties given my state of knowledge. That would not make that correct way of throwing globs of paint "true".

A la Jaynes, there is a correct way to assign degrees of belief based on your state of knowledge if you want your degrees of belief to be consistent with certain constraints, but that doesn't make any particular probability assignment "true". Probability assignments don't have truth value, they assign degrees of belief to propositions that do have truth value. It is a category error, under Jaynes perspective, to assert that a probability assignment is "true", or purple, or hairy, or smelly.
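A minimal sketch of that point, using Jaynes's own Brandeis dice example (the constraint value 4.5 is from that example; the implementation details here are illustrative): given only a constraint on a die's expected face value, maximum entropy fixes a unique consistent assignment of degrees of belief, but that assignment is a commitment consistent with the constraints, not a "true" fact about the die.

```python
import math

def maxent_die(target_mean, faces=6, tol=1e-10):
    """Maximum-entropy distribution over die faces 1..faces subject to a
    constraint on the expected face value. The solution has exponential
    form p_i ∝ exp(lam * i); lam is found by bisection, since the mean
    is monotonically increasing in lam."""
    def mean(lam):
        w = [math.exp(lam * i) for i in range(1, faces + 1)]
        z = sum(w)
        return sum(i * wi for i, wi in zip(range(1, faces + 1), w)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * i) for i in range(1, faces + 1)]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes's Brandeis dice problem: observed mean 4.5 instead of 3.5.
p = maxent_die(4.5)
print([round(x, 4) for x in p])
```

The resulting assignment (roughly 0.054, 0.079, 0.114, 0.165, 0.240, 0.347) is uniquely determined by the constraints, yet nothing about it is "true" of the die itself in the way "the die landed 6" is.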

Comment author: antigonus 18 March 2012 09:21:38PM * 0 points

> Probability assignments don't have truth value,

Sure they do. If you're a Bayesian, an agent truly asserts that the (or, better, his) probability of a claim is X iff his degree of belief in the claim is X, however you want to cash out "degree of belief". Of course, there are other questions about the "normatively correct" degrees of belief that anyone in the agent's position should possess, and maybe those lack determinate truth-value.

Comment author: buybuydandavis 18 March 2012 08:09:59PM 0 points

I was pointing out that your original statement characterizing "most people here" as asserting that "probability claims are true ..." is antithetical to Jaynes's approach, which I take as the canonical, if not universal, view on this list.

Comment author: antigonus 18 March 2012 08:39:04PM 0 points

I don't see the relation between the two. It seems like you're pointing out that Jaynes/people here don't believe there are "objectively correct" probability distributions that rationality compels us to adopt. But this is compatible with there being true probability claims, given one's own probability distribution - which is all that's required.

Comment author: buybuydandavis 18 March 2012 09:09:24AM * 0 points
> 1. Probability is "in the mind," i.e., probability claims are true only in relation to some prior distribution and set of information to be conditionalized on;

That statement is too imprecise to capture Jaynes's view of probability. He demonstrates (YMMV) that there is a unique way to assign probability to represent your degree of belief in propositions in a way that is consistent with certain desired properties of degrees of belief. That doesn't make the probability assignment "true"; it just makes it consistent with your knowledge and the desired properties. In particular, it won't make the probability distribution you assign match some ill-defined long-term frequency of some event occurring.

Comment author: antigonus 18 March 2012 05:21:29PM * 0 points

> That statement is too imprecise to capture Jaynes's view of probability.

Of course; it wasn't intended to capture the difference between so-called objective Bayesianism vs. subjective Bayesianism. The tension, if it arises at all, arises from any sort of Bayesianism. That the rules prescribed by Jaynes don't pick out the "true" probability distributions on a certain question is compatible with probability claims like "It will probably rain tomorrow" having a truth-value.

Comment author: Jayson_Virissimo 18 March 2012 07:20:56AM 2 points

I don't understand where the tension is supposed to come in. The idea that causation is in the mind, not in the world is part of the Humean tradition and has been a respected (although minority) position in philosophy for centuries. If anything, it seems to mesh particularly well with empiricist leaning philosophies (especially those with an anti-metaphysical stance).

Comment author: antigonus 18 March 2012 07:57:37AM 2 points

> I don't understand where the tension is supposed to come in.

It just seems really weird to be able to correctly say that A caused B when, in fact, A had nothing to do with B. If that doesn't seem weird to you, then O.K.

> The idea that causation is in the mind, not in the world is part of the Humean tradition

I think that's unclear; I side with those who think Hume was arguing for causal skepticism rather than some sort of subjectivism.

Causation, Probability and Objectivity

7 antigonus 18 March 2012 06:54AM

Most people here seem to endorse the following two claims:

1. Probability is "in the mind," i.e., probability claims are true only in relation to some prior distribution and set of information to be conditionalized on;
2. Causality is to be cashed out in terms of probability distributions à la Judea Pearl or something.

However, these two claims feel in tension to me, since they appear to have the consequence that causality is also "in the mind" - whether something caused something else depends on various probability distributions, which in turn depend on how much we know about the situation. Worse, they have the consequence that ideal Bayesian reasoners can never be wrong about causal relations, since such reasoners always have perfect knowledge of their own probabilities.
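One concrete way to see what is at stake (a minimal sketch with made-up structural models, not Pearl's own examples): two different causal structures can generate exactly the same observational joint distribution while disagreeing about interventions, so in Pearl's framework a causal claim carries information beyond the probability distribution alone.

```python
# Two structural causal models with the SAME observational joint
# distribution over (A, B), but DIFFERENT interventional behaviour.

def joint_confounded():
    """C -> A, C -> B: A and B are both copies of a hidden fair coin C."""
    dist = {}
    for c in (0, 1):
        a, b = c, c
        dist[(a, b)] = dist.get((a, b), 0) + 0.5
    return dist

def joint_causal():
    """A -> B: A is a fair coin and B is a copy of A."""
    dist = {}
    for a in (0, 1):
        b = a
        dist[(a, b)] = dist.get((a, b), 0) + 0.5
    return dist

def p_b1_do_a1_confounded():
    # do(A=1) cuts the C -> A edge; B still just follows C.
    return sum(0.5 for c in (0, 1) if c == 1)  # P(B=1) = P(C=1) = 0.5

def p_b1_do_a1_causal():
    # do(A=1) sets A = 1; B copies A, so B = 1 with certainty.
    return 1.0

assert joint_confounded() == joint_causal()          # same observations
print(p_b1_do_a1_confounded(), p_b1_do_a1_causal())  # 0.5 vs 1.0
```

Observationally, P(B=1 | A=1) = 1 in both models; only the interventional quantity P(B=1 | do(A=1)) distinguishes them.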

Since I don't understand Pearl's model of causality very well, I may be missing something fundamental, so this is more of a question than an argument.

Comment author: antigonus 18 March 2012 12:10:34AM 3 points

> No considerations are given for the strength of the advantage

I wish this were stressed more often. It's really easy to think up selective pressures on any trait and really hard to pin down their magnitude. This means that most armchair EP explanations have very low prior probabilities by default, even if they seem intuitively reasonable.

Comment author: antigonus 15 March 2012 07:31:31AM * 6 points

The word "cult" never makes discussions like these easier. When people call LW cultish, they are mostly just expressing that they're creeped out by various aspects of the community - some perceived groupthink, say. Rather than trying to decide whether LW satisfies some normative definition of the word "cult," it may be more productive to simply inquire as to why these people are getting creeped out. (As other commenters have already been doing.)
