
Comment author: Cyan 27 December 2009 10:48:42PM 1 point [-]

I'm surprised that a post that basically does nothing but acknowledge inductive bias is presently at -2.

Comment author: AndrewKemendo 29 December 2009 10:11:12AM 1 point [-]

I had not read that part. Thanks.

I do not see any difference between inductive bias as it is described there and the dictionary and Wikipedia definitions of faith:

"Something that is believed especially with strong conviction" (http://www.merriam-webster.com/dictionary/faith)

Faith is to commit oneself to act based on sufficient experience to warrant belief, but without absolute proof.

Comment author: Psychohistorian 28 December 2009 05:58:40PM *  3 points [-]

Intuition and induction are in my view very similar to what is understood as faith.

I don't see how this works. Induction is, basically, the principle of inferring the future from the past (or the past from the present), which basically requires the universe to consistently obey the same laws. The problem with this, of course, is that the only evidence we have that the future will be like the past is the fact that it always has been, so there's a necessary circularity. You can't provide evidence for induction without assuming induction is correct; indeed, the very concept of "evidence" assumes induction is correct.

Intuition, on the other hand, is entirely susceptible to being analyzed on its merits. If our intuition tends to be right, we are justified in relying on it, even if we don't understand precisely how it works. If it isn't typically right for certain things, or if it contradicts other, better evidence, we're wrong to rely on it, even though believing contrary to our intuition can be difficult.

I don't see how either of these concepts can be equated with a conventional use of "faith."

Edited in response to EY's comment below: I'm not meaning to compare faith in induction to faith in religion at all. The "leap" involved differs extraordinarily, as one runs against the evidence and the other is what the very notion of evidence rests on. Not to mention that every religious person also believes in induction, so the faith required for religion is necessarily in addition to that required by everyone to not get hit by a bus.

Comment author: AndrewKemendo 29 December 2009 10:06:14AM 0 points [-]

I think you, EY, and most people use the term faith in its historical, religious context rather than its definitional context, which concerns epistemological trust in an idea or claim.

The best definition I have found so far for faith is this:

Faith is to commit oneself to act based on sufficient experience to warrant belief, but without absolute proof.

So I have no problem using faith and induction interchangeably, because faith is used just as you describe induction:

inferring the future from the past (or the past from the present), which basically requires the universe to consistently obey the same laws.

Religions claim that they do this. Of course they don't, because they do not apply a consistent standard across all events in their worldview. It is not because of their faith that they are wrong; it is because they are inconsistent in which claims they accept and which evidence they ignore.

The point of the system is to deconstruct why you see their claims of evidence as faith and vice versa. Hence the incorruptible example.

Comment author: Vladimir_Nesov 27 December 2009 02:04:43PM 3 points [-]

We know that heuristics “fill the gaps” in knowledge of recognizable scenarios such that we can comfortably demand less evidence and still come to a reasonable conclusion about our surroundings in most cases.

Intuition (what you call "faith") is evidence. Like any evidence, it comes with uncertainty about its implications, and its reliability depends on other known factors. You can't cut it some slack as a special case; rather, you already know something about your mind and its heuristics that boosts the probability of them computing the right answers.

You already know that your lawful intelligence does a lot of work, considers a lot of evidence, much more than it theoretically needs, presenting its conclusions for you to feel as intuition. Even though you can't see the machinery, can't name the specific pieces of evidence that drive your intuition, you know that it's there. Perhaps the greatest strength the intuitive mind gives you is the ability to locate hypotheses, something other tools just can't do.

At the same time, you know that you run on corrupted hardware, that the answers given by intuition are unreliable and may be systematically biased towards stupidity or against your values; but so are the answers of any other tool. Conditioned on the presence of features known to evoke standard biases, your intuition can be considered either more or less likely to give the correct answers. Sometimes you have nothing except intuition, and intuition is known to be compromised; in that case you just have to acknowledge significant uncertainty in what intuition insists is an inevitable conclusion. In other cases, you have data to update intuition's estimates to something very unlike what intuition says by itself. Often, intuition and other sources of evidence agree.

Comment author: AndrewKemendo 28 December 2009 07:07:17AM 4 points [-]

Intuition (what you call "faith") is evidence.

If you will, please define intuition as you understand it.

From how I understand intuition, it is knowledge for which the origin cannot be determined. I have certainly experienced the "I know I read something about that somewhere but I just can't remember" feeling before and been right about it. However, just as often I have been wrong about conclusions I came to through this means.

I think your entire post reads like the same visceral description someone would give of having "felt the holy spirit" or some other such nonsense.

I honestly think that the issue of intuition is a MAJOR hurdle for rationality. I tend to err on the side of intuition being false evidence, which is why I indicated that our heuristics fill in the blanks. That is why I categorize intuition and faith similarly.

Comment author: JRMayne 27 December 2009 05:17:02PM 9 points [-]

Perhaps I misunderstand, but I think what you're calling "faith" is simply confidence level. Renaming it "faith," I think, is going to muddle the terminology more than it helps in arguing the point.

Let's take your view of evolution: You're saying you accept it based 1% on faith. I would say that I am 99% confident of it (assuming your view), which has the advantage that doubts that enter into the analysis would reduce my confidence level, but not increase my faith level, which would remain at zero.

Nor do I believe that this renaming helps us bridge the divergent views between the faithful and the evidence-based. To give credence to faith as a necessary filler of doubt doesn't help move people toward evidence, in my view. Further, it is exactly the sort of logic the faithful cite: science is just faith in something else.

Comment author: AndrewKemendo 28 December 2009 06:59:43AM 2 points [-]

confidence level.

Most people do not understand what a confidence interval or a confidence level is, at least in my interactions. Unless you have had some sort of statistics course (even a basic one), you probably haven't heard of them.

Comment author: Psychohistorian 27 December 2009 09:47:56PM 13 points [-]

This creates the mistaken implication that there is some need for affirmative beliefs to sum to 100%, and I think it improperly relabels "uncertainty" as "faith."

A perfectly rational being would assign some percentage chance to evolution being true, some percentage chance to each religion's creation story being true, and some chance for any number of other theories that have not yet occurred to us. It would not feel bound to a binary, "Evolution right; creationism wrong!" that the human mind naturally gravitates to. It would be perfectly happy to think, "Evolution P=.999, Creationism P=1.2e-45" or whatever values it determined were appropriate.

Similarly, there is no red gap that needs to be bridged by faith. The only thing one truly must have faith in (and please correct me if you can; I'd love to be wrong) is induction, and if you truly lacked faith in induction, you'd literally go insane.

Rather, the religious claim of "Well, you need to have faith to believe X" is a fundamental misunderstanding of what constitutes a proper degree of certainty. It's like the DirecTV ad where the "competitor" says, "DirecTV has 1080p. We don't. But neither we nor DirecTV broadcast in 1 million p! Look who just leveled the playing field!" You can "win" almost any argument by raising the bar to the certainty standard, because if it's an argument about reality, both sides will be infinitely far from meeting it. Implying that some degree of "faith" is necessary to fill the bar to 100% wrongly gives this argument credibility. The red part is occupied by likely alternatives, not by faith.

You need faith to believe that a fair coin flipped fifty times will always land heads. You do not need faith to believe that a fair coin flipped fifty times will not always land heads, even though it is possible. I think this gives a more accurate understanding of what is meant by faith.
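
For concreteness (assuming a fair coin and independent flips), the probability in question is P(all 50 flips land heads) = (1/2)^50 ≈ 8.9 × 10^-16, so no "faith" is required to bet against it.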

Comment author: AndrewKemendo 28 December 2009 06:58:08AM 0 points [-]

I think it improperly relabels "uncertainty" as "faith."

Perhaps. The way I see uncertainty, as it pertains to one claim or another, is that there will almost always be a reasonable counterclaim, and dismissing the counterclaim in order to accept the premise is faith in the same sense.

The only thing one truly must have faith in (and please correct me if you can; I'd love to be wrong) is induction, and if you truly lacked faith in induction, you'd literally go insane.

Intuition and induction are in my view very similar to what is understood as faith. I failed to make that clear; however, I would use the terms interchangeably.

I recognize that faith is a touchy issue because it is so dramatically irrational and essentially leads to a slippery slope. I view the issue similarly to how the case was made for selecting the correct contrarian views: we are drawing approximate conclusions about what we do not know, or about counterclaims.

Comment author: cabalamat 24 December 2009 04:02:13PM 0 points [-]

So can rationality work on a large scale? Arguably, it always does work. I rarely hear political or social arguments that are obviously (to everyone) pure hokum. If you look at how the last 4 U.S. presidents campaigned, it was always on "save you money" talking points and "less waste, more justice" platform. All rational things in the mind of the average person.

Sure. What's not rational is to believe that politicians will deliver on the promise of reducing waste. All politicians say they will do it, and have done so for a long time, but governments are not noticeably less wasteful than they were 50 or so years ago.

It's therefore irrational to believe a politician when they say they will cut waste, unless they say in detail how they will do so (which they usually don't).

Comment author: AndrewKemendo 25 December 2009 02:10:15PM 2 points [-]

Sure. What's not rational is to believe ... politicians

I think that is likely the best approach.

Comment author: AndrewKemendo 23 December 2009 12:32:56PM 1 point [-]

Your argument seems to conclude that:

It is impossible to reason with unreasonable people

Agreed. Now what?

Ostensibly your post is about how to swing the ethos of a large group of people towards behaving differently. I would argue that has never been necessary and still is not.

A good hard look at any large political or social movement reveals a small group of very dedicated and motivated people, and a very large group of passive marginally interested people who agree with whatever sounds like it is in their best interest without them really doing too much work.

So can rationality work on a large scale? Arguably, it always does work. I rarely hear political or social arguments that are obviously (to everyone) pure hokum. If you look at how the last 4 U.S. presidents campaigned, it was always on "save you money" talking points and "less waste, more justice" platform. All rational things in the mind of the average person.

I think, however, your implication is that rationality is not always obviously rational. Well, friend, that is why you have to completely understand the implications of rational decision making, in terms the majority can agree on, in order to describe why those decisions are better. You often have to connect the dots for people so that they can see how to get from some contrarian or "non-intuitive" idea to their goal of raising a happy family.

This is the essence of "selling." Of course spinners and politicians sell lots of crap to people by telling half-truths, making overcomplicated arguments, or simply lying outright. These are obviously disingenuous. If you need to lie to sell your ethos, it is probably wrong. That, or you just aren't wise enough to make it comprehensible.

Comment author: AndrewKemendo 17 December 2009 11:54:08AM *  4 points [-]

I am generally not a fan of internet currency in all its forms, because it draws attention away from the argument.

Reddit, which this system is based on, moved to disabling its subtractive karma rule for all submissions and comments. Submissions with more downvotes than upvotes just don't go anywhere, while negatively voted comments get buried, similar to how they do here. That seems like a good way to organize the system.

Was the point system implemented to serve as signaling for other users, or is it just an artifact of the Reddit API? Would disabling the actual display of the "points" simultaneously disable the comment ranking? What would be the most rational way to organize the comments? The least biased way would be to order them by time. The current way, and the way Reddit works, is direct democracy, and that of course is the tyranny of the majority. The current way may be the most efficient if readers value their time so highly that they only read the most popular comments and skip the rest. However, even if that is efficient, it is not necessarily optimized to elucidate the best discussion points, as users typically vote up things they agree with rather than strong arguments.
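
To make the two orderings concrete, here is a minimal sketch of time-based versus score-based sorting in Python (the comment records below are hypothetical and not the actual Reddit or LW data model):

from datetime import datetime

# Hypothetical comment records: (author, time posted, upvotes, downvotes).
comments = [
    ("Cyan",      datetime(2009, 12, 27, 22, 48), 3, 2),
    ("JRMayne",   datetime(2009, 12, 27, 17, 17), 10, 1),
    ("cabalamat", datetime(2009, 12, 24, 16, 2),  1, 1),
]

def score(comment):
    # Net points: upvotes minus downvotes.
    return comment[2] - comment[3]

# "Least biased" ordering: purely chronological, votes ignored entirely.
by_time = sorted(comments, key=lambda c: c[1])

# "Direct democracy" ordering: highest net score first, as Reddit and LW do.
by_score = sorted(comments, key=score, reverse=True)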

I personally submit fewer responses and posts because of the karma system. As I have seen heavily on Reddit, there is karma momentum: people tend to vote the way others have already voted (as human nature would dictate). Based on that, I know that people will reference the total points of submitters and decide how to take their comments and suggestions in light of that primed information, when the arguments should be evaluated independently.

Maybe I'm missing something though.

Comment author: AndrewKemendo 13 December 2009 06:43:20AM *  1 point [-]

The most important of which is: if you only do what feels epistemically "natural" all the time, you're going to be, well, wrong.

Then why do I see the term "intuitive" used around here so much?

I say this by way of preamble: be very wary of trusting in the rationality of your fellow humans, when you have serious reasons to doubt their conclusions.

Hmm, I was told here by another LW user that the closest thing humans have to truth is consensus.

Somewhere there is a disconnect between your post and much of the consensus, at least in practice, of LW users.

Comment author: AndrewKemendo 13 December 2009 03:27:41AM 2 points [-]

From my understanding, Mr. Yudkowsky has two separate but linked interests: rationality, which predominates in his writings and blog posts, and designing AI, which is his interaction with SIAI. While I disagree with their particular approach (or lack thereof), I can see how it is rational to pursue both simultaneously toward similar ends.

I would argue that rationality and AI are really the same project at different levels and different stated outcomes. Even if an AI never develops, increasing rationality is a good enough goal in and of itself.
