AndrewKemendo


I had not read that part. Thanks.

I do not see any difference between inductive bias as it is written there and the dictionary and Wikipedia definitions of faith:

Something that is believed especially with strong conviction (http://www.merriam-webster.com/dictionary/faith)

Faith is to commit oneself to act based on sufficient experience to warrant belief, but without absolute proof.

I think you, EY, and most people here use the term faith in its historical context related to religion, rather than in its definitional context, which concerns the epistemological question of trust in an idea or claim.

The best definition I have found so far for faith is thus:

Faith is to commit oneself to act based on sufficient experience to warrant belief, but without absolute proof.

So I have no problem using faith and induction interchangeably, because faith in that sense works just as you say induction does:

inferring the future from the past (or the past from the present), which basically requires the universe to consistently obey the same laws.

Religions claim that they do this. Of course they don't, because they do not apply their worldview's standard consistently to all events. It is not their faith that makes them wrong; it is their inconsistent practice of accepting some claims while ignoring contrary evidence.

The point of the system is to deconstruct why you see their claims of evidence as faith and vice versa. Hence the incorruptible example.

Intuition (what you call "faith") is evidence.

If you will, please define intuition as you understand it.

As I understand intuition, it is knowledge for which the origin cannot be determined. I have certainly had the "I know I read something about that somewhere, but I just can't remember where" feeling before and been right about it. Just as often, however, I have been wrong about conclusions I reached by that means.

I think your entire post gives the same kind of visceral description someone would give of having "felt the holy spirit" or some other such nonsense.

I honestly think that the issue of intuition is a MAJOR hurdle for rationality. I tend to err on the side of treating intuition as false evidence, which is why I indicated that our heuristics fill in the blanks. That is also why I categorize intuition and faith similarly.

confidence level.

Most people do not understand what a confidence interval or a confidence level is, at least in my interactions. Unless you have had some statistics training (even basic), you probably have not heard of them.
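
As an aside, and purely as my own illustration (not from the original exchange): here is a minimal sketch of a 95% confidence interval for a sample mean, using the normal approximation and made-up numbers.

    import math

    # Illustrative only: a 95% confidence interval for a sample mean,
    # using the normal approximation (made-up data).
    sample = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8, 2.3, 2.1]
    n = len(sample)
    mean = sum(sample) / n
    variance = sum((x - mean) ** 2 for x in sample) / (n - 1)  # sample variance
    stderr = math.sqrt(variance / n)                           # standard error of the mean

    z = 1.96  # z-score for 95% confidence under the normal approximation
    low, high = mean - z * stderr, mean + z * stderr
    print(f"95% CI for the mean: ({low:.2f}, {high:.2f})")

Roughly speaking, if you repeated the sampling many times, about 95% of intervals constructed this way would contain the true mean; that proportion is the "confidence level."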

I think it improperly relabels "uncertainty" as "faith."

Perhaps. The way I see uncertainty, as it pertains to any given claim, is that there will almost always be a reasonable counterclaim; dismissing the counterclaim and accepting the premise is faith in the same sense.

The only thing one truly must have faith in (and please correct me if you can; I'd love to be wrong) is induction, and if you truly lacked faith in induction, you'd literally go insane.

Intuition and induction are, in my view, very similar to what is understood as faith. I failed to make that clear; I would use those terms interchangeably.

I recognize that faith is a touchy issue because it is so dramatically irrational and essentially leads down a slippery slope. I view the issue much as the case was made for selecting the correct contrarian views: we are drawing approximate conclusions about what we do not know, and about the counterclaims.

Sure. What's not rational is to believe ... politicians

I think that is likely the best approach.

Your argument seems to conclude that:

It is impossible to reason with unreasonable people

Agreed. Now what?

Ostensibly your post is about how to swing the ethos of a large group of people towards behaving differently. I would argue that has never been necessary and still is not.

A good hard look at any large political or social movement reveals a small group of very dedicated and motivated people, and a very large group of passive, marginally interested people who agree with whatever sounds like it is in their best interest without really doing much work.

So can rationality work on a large scale? Arguably, it always does. I rarely hear political or social arguments that are obviously (to everyone) pure hokum. If you look at how the last four U.S. presidents campaigned, it was always on "save you money" talking points and a "less waste, more justice" platform. All of those sound rational to the average person.

I think, however, your implication is that rational choices are not always obviously rational. Well friend, that is why you have to understand the implications of rational decision-making thoroughly enough to explain, in terms the majority can agree on, why those decisions are better. You often have to connect the dots for people so that they can see how to get from some contrarian or "non-intuitive" idea to their goal of raising a happy family.

This is the essence of "selling." Of course, spinners and politicians sell lots of crap to people by telling half-truths, making overcomplicated arguments, or simply lying outright. These tactics are obviously disingenuous. If you need to lie to sell your ethos, it is probably wrong. That, or you just aren't wise enough to make it comprehensible.

I am generally not a fan of internet currency in any of its forms, because it draws attention away from the argument.

Reddit, which this system is based on, moved to disabling the subtractive karma rule for all submissions and comments. Submissions with more downvotes than upvotes simply don't go anywhere, while downvoted comments get buried much as they do here. That seems like a good way to organize the system.

Was the points display implemented as signaling for other users, or is it just an artifact of the Reddit API? Would disabling the display of the "points" simultaneously disable the comment ranking?

What would be the most rational way to organize the comments? The least biased way would be to order them by time. The current way, which is also how Reddit works, is direct democracy, and that of course is the tyranny of the majority. The current way may be the most efficient if readers value their time so highly that they only read the most popular comments and skip the rest. Even if that is efficient, however, it is not necessarily optimized to elucidate the best discussion points, since users typically vote up things they agree with rather than strong arguments.
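
To make those two orderings concrete, here is a minimal sketch (my own illustration; the Comment fields are hypothetical, not the actual LW or Reddit data model):

    from dataclasses import dataclass

    @dataclass
    class Comment:
        text: str
        timestamp: float  # seconds since epoch
        upvotes: int
        downvotes: int

    def rank_by_time(comments):
        # The least biased ordering: oldest first, ignoring votes entirely.
        return sorted(comments, key=lambda c: c.timestamp)

    def rank_by_score(comments):
        # The "direct democracy" ordering: highest net score first,
        # so heavily downvoted comments sink to the bottom.
        return sorted(comments, key=lambda c: c.upvotes - c.downvotes, reverse=True)

The second function is the tyranny-of-the-majority ordering I am describing: it surfaces whatever most readers agreed with, not necessarily the strongest arguments.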

I personally do not submit more responses and posts because of the karma system. As I have seen repeatedly on Reddit, there is karma momentum: people tend to vote the way others have already voted (as human nature would dictate). Because of that, I know people will look at a submitter's total points and decide how to take their comments and suggestions in light of that primed information, when the arguments should be evaluated independently.

Maybe I'm missing something though.

The most important of which is: if you only do what feels epistemically "natural" all the time, you're going to be, well, wrong.

Then why do I see the term "intuitive" used around here so much?

I say this by way of preamble: be very wary of trusting in the rationality of your fellow humans, when you have serious reasons to doubt their conclusions.

Hmm, I was told here by another LW user that the closest thing humans have to truth is consensus.

Somewhere there is a disconnect between your post and much of the consensus, at least in practice, among LW users.

From my understanding, Mr. Yudkowsky has two separate but linked interests: rationality, which predominates in his writings and blog posts, and designing AI, which is his work with SIAI. While I disagree with their particular approach (or lack thereof), I can see how it is rational to pursue both simultaneously toward similar ends.

I would argue that rationality and AI are really the same project at different levels and with different stated outcomes. Even if an AI never develops, increasing rationality is a good enough goal in and of itself.
