
G0W51 comments on My Skepticism - Less Wrong Discussion

2 Post author: G0W51 31 January 2015 02:00AM


Comment author: G0W51 31 January 2015 06:05:26PM *  -1 points [-]

I suppose the question then is why we should make the necessary assumptions.

Comment author: dxu 31 January 2015 07:24:32PM *  1 point [-]

There is no "why". If there were, then the assumptions wouldn't be called "assumptions". If you want to have a basis for believing anything, you have to start from your foundations and build up. If those foundations are themselves supported, then by definition they are not foundations, and the "real" foundations must necessarily lie further down the chain. Your only choice is to pick suitable axioms on which to base your epistemology, or to become trapped in a cycle of infinite regression, moving further and further down the chain of implications to try to find where it stops, which in practice means you'll sit there and think forever, becoming like unto a rock.

The chain won't stop. Not unless you artificially terminate it.

Comment author: TheAncientGeek 31 January 2015 09:09:03PM 0 points [-]

So it's Ok to use non rationalist assumptions?

Comment author: dxu 31 January 2015 11:02:40PM *  -1 points [-]

I haven't the slightest idea what you mean by "non rationalist" (or "Ok" for that matter), but I'm going to tentatively go with "yes", if we're taking "non rationalist" to mean "not in accordance with the approach generally advocated on LessWrong and related blogs" and "Ok" to mean "technically allowed". If you mean something different by "non rationalist" you're going to have to specify it, and if by "Ok" you mean "advisable to do so in everyday life", then heck no. All in all, I'm not really sure what your point is, here.

Comment author: TheAncientGeek 01 February 2015 11:46:49AM *  0 points [-]

Your guesses are about right.

The significance is that if rationalists respond to sceptical challenges by assuming what they can't prove, then they are in the same position as reformed epistemology. That is, they can't say why their axioms are rational, and they can't say why theists are irrational, because theists who follow reformed epistemology are likewise taking the existence of God as something they assume because they can't prove it: "rationalism" becomes a label with little meaning.

Comment author: dxu 01 February 2015 05:44:44PM *  -1 points [-]

So you're saying that taking a few background axioms that are pretty much required to reason... is equivalent to theism.

I think you may benefit from reading The Fallacy of Grey, as well as The Relativity of Wrong.

Comment author: TheAncientGeek 01 February 2015 07:49:15PM *  0 points [-]

The axioms of rationality are required to reason towards positive conclusions about a real world. They are not a minimal set, because sceptics have a smaller set, which can do less.

Comment author: dxu 01 February 2015 07:50:30PM *  0 points [-]

which can do less.

Most people probably aren't satisfied with the sort of "less" that universal skepticism can do.

Also, some axioms are required to reason, period. Let's say I refuse to take ~(A ∧ ~A) as an axiom. What now? (And don't bring up paraconsistent logic, please--it's silly.)
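(For concreteness, ¬(A ∧ ¬A) is so basic that in a proof assistant it falls out of the meaning of negation itself. A minimal sketch in Lean 4, where ¬P is defined as P → False:

```lean
-- The law of non-contradiction, ¬(A ∧ ¬A), machine-checked for an
-- arbitrary proposition A. A hypothesis h : A ∧ ¬A refutes itself:
-- apply its second component (¬A, i.e. A → False) to its first (A).
theorem non_contradiction (A : Prop) : ¬(A ∧ ¬A) :=
  fun h => h.2 h.1
```

Refusing the axiom doesn't make the refutation go away; it just leaves you with nothing to reason with.)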

Comment author: TheAncientGeek 01 February 2015 07:53:48PM *  -1 points [-]

Rational axioms do less than theistic axioms, and a lot of people aren't happy with that "less" either.

Comment author: dxu 01 February 2015 07:55:10PM *  0 points [-]

Rational axioms do less than theistic axioms

Not in terms of reasoning "towards positive conclusions about a real world", they don't.

a lot of people aren't happy with that "less" either.

Most of whom are theists trying to advance an agenda. "Rational" axioms, on the other hand, are required to have an agenda.

Comment author: G0W51 31 January 2015 10:02:52PM -1 points [-]

If there is no why, is any set of axioms better than any other? Could one be just as justified believing that, say, what actually happened is the opposite of what one's memories say?

Comment author: dxu 31 January 2015 11:03:17PM *  0 points [-]

(Note: I'm going to address your questions in reverse order, as the second one is easier to answer by far. I'll go into more detail on why the first one is so hard to answer below.)

Could one be just as justified believing that, say, what actually happened is the opposite of what one's memories say?

Certainly, if you decide to ignore probability theory, Occam's Razor, and a whole host of other things. It's not advisable, but it's possible if you choose your axioms that way. If you decide to live your life under such an assumption, be sure to tell me how it turns out.

If there is no why, is any set of axioms better than any other?

At this point, I'd say you're maybe a bit confused about the meaning of the word "better". For something to be "better" requires a criterion by which to judge that something; you can't just use the word "better" in a vacuum and expect the other person to be able to immediately answer you. In most contexts, this isn't a problem because both participants generally understand and have a single accepted definition of "better", but since you're advocating throwing out pretty much everything, you're going to need to define (or better yet, Taboo) "better" before I can answer your main question about a certain set of axioms being better than any other.

Comment author: G0W51 01 February 2015 02:17:25AM -1 points [-]

Certainly, if you decide to ignore probability theory, Occam's Razor, and a whole host of other things. It's not advisable, but it's possible if you choose your axioms that way. If you decide to live your life under such an assumption, be sure to tell me how it turns out.

Why would one need to ignore probability theory and Occam's Razor? Believing that the world is stagnant and that the memories one is currently thinking of are false, and that the memory of having more memories is false, seems to be a simple explanation of the universe.

At this point, I'd say you're maybe a bit confused about the meaning of the word "better". For something to be "better" requires a criterion by which to judge that something; you can't just use the word "better" in a vacuum and expect the other person to be able to immediately answer you. In most contexts, this isn't a problem because both participants generally understand and have a single accepted definition of "better", but since you're advocating throwing out pretty much everything, you're going to need to define (or better yet, Taboo) "better" before I can answer your main question about a certain set of axioms being better than any other.

By better, I mean "more likely to result in true beliefs." Or if you want to taboo true, "more likely to result in beliefs that accurately predict percepts."

Comment author: ike 01 February 2015 02:56:05AM *  0 points [-]

Or if you want to taboo true, "more likely to result in beliefs that accurately predict percepts."

If I were to point out that my memories say that making some assumptions tends to lead to better perception predictions (and presumably yours do also), would you accept that?

Are you actually proposing a new paradigm that you think results in systematically "better" (using your definition) beliefs? Or are you just saying that you don't see that the paradigm of accepting these assumptions is better at a glance, and would like a more rigorous take on it? (Either is fine, I'd just respond differently depending on what you're actually saying.)

Comment author: G0W51 01 February 2015 04:20:33AM *  -1 points [-]

If I were to point out that my memories say that making some assumptions tends to lead to better perception predictions (and presumably yours do also), would you accept that?

I'd only believe it if you gave evidence to support it.

Are you actually proposing a new paradigm that you think results in systematically "better" (using your definition) beliefs? Or are you just saying that you don't see that the paradigm of accepting these assumptions is better at a glance, and would like a more rigorous take on it? (Either is fine, I'd just respond differently depending on what you're actually saying.)

The latter. What gave you the impression that I was proposing an improved paradigm?

Comment author: ike 01 February 2015 04:30:21AM *  0 points [-]

What gave you the impression that I was proposing an improved paradigm?

You seemed to think that not taking some assumptions could lead to better beliefs, and it wasn't clear to me how strong your "could" was.

You seem to accept induction, so I'll refer you to http://lesswrong.com/lw/gyf/you_only_need_faith_in_two_things/

Comment author: G0W51 01 February 2015 05:38:02PM 0 points [-]

The linked article stated that one only needs to believe that induction has a non-super-exponentially small chance of working and that a single large ordinal is well-ordered, but it didn't really justify this. It also said nothing about why belief in one's percepts and reasoning skills is needed.

Comment author: dxu 01 February 2015 06:24:13AM -1 points [-]

Believing that the world is stagnant and that the memories one is currently thinking of are false, and that the memory of having more memories is false, seems to be a simple explanation of the universe.

Not in the sense that I have in mind.

"more likely to result in true beliefs."

Unfortunately, this still doesn't solve the problem. You're trying to doubt everything, even logic itself. What makes you think the concept of "truth" is even meaningful?