Comment author: non-expert 06 February 2013 08:25:04PM 2 points [-]

How has Rationality, as a universal (or near-universal) theory of decision making, confronted its most painful weaknesses? What are rationality's weak points? The broader a theory is claimed to be, the more important it seems to really test its weaknesses -- that is why I assume you bring up religion, but the same standard should apply to rationality. This is not a cute question from a religious person, but an intellectual inquiry from a person hoping to learn. In honor of the grand-daddy of cognitive biases, confirmation bias, doesn't rational choice theory need to be vetted?

HungryTurtle makes an attempt to get at this question, but he gets too far into the weeds -- this allowed LW to simply compare the "cons" of religion with the "cons" of rationality, which is a silly inquiry. I don't care how the weaknesses of rationality compare to the weaknesses of Judaism, because rational theory, if claimed to be universally applicable with no weaknesses, should be tested on the basis of that claim alone, and not on its weaknesses relative to some other theory.

NOTE: re-posting without the offending language in the hopes I don't need to create a new name. Looks like I lost on my instrumental rationality point, and got downvoted enough to get restricted. On the bright side, I am learning to admit I'm wrong (I was wrong to misread whether I'd offend LW, which prevented me from engaging with others on the substantive points I'm trying to learn more about).

Comment author: non-expert 06 February 2013 07:58:11PM -4 points [-]

This is a response to TheOtherDave -- I can't respond to threads anymore! You guys win! Crush dissent based on superficial factors that "automatically result in downvotes" and thus ignore criticism! Foolproof!

Is that [understanding reality] the goal? I'm not sure it is. As above, I neither agree that understanding reality is a singularly important terminal goal, nor that finding the "best theory" for achieving my goals is a particularly high-priority instrumental goal.

Ok, sorry to put words in your mouth -- what is your goal then? Is it not fair to say the goals are "understand reality" and "achieve your goals"? I'm ignoring the second because it's personal -- the first goes to a normative understanding of reality, which presumably applies equally to each of us.

Perhaps your definition is different, but my understanding is that epistemic rationality is focused on understanding reality, and that it uses rational choice theory as a means to understand that reality.

Comment author: Eliezer_Yudkowsky 06 February 2013 06:56:03PM 7 points [-]

On the distant chance that you're actually attempting to be reasonable and are just messing it up, I downvoted this post because I automatically downvote everything that tries to Poison the Well against being downvoted. Being preemptively accused of confirmation bias is itself sufficient reason to downvote.

Comment author: non-expert 06 February 2013 07:44:46PM -1 points [-]

Thanks, EY. I am asking a real question, in that I want to know what people think of the question.

As a person who does not think rationality is as useful or as universal as people on this site do, I am at a disadvantage in that I'm in the minority here; however, I'm still posting/reading to question myself through engaging with those I disagree with. I seek the community's perspective, not necessarily to believe it or label it correct/wrong, but simply to understand it. My personal experience (with this name and old ones) has been that people generally do not respond to viewpoints that are contrary to conventional thought -- this is problematic because this community is best positioned to defend against weaknesses (claimed or real) regarding rationality. Looking at it another way, if I believe that rationality has serious flaws, I need to be able to defend myself against YOUR best arguments, but I can only do that if someone engages with me so I understand those arguments first.

The point of my post was to ask a serious question and poke you guys with a stick, hoping the poke elicits a response to the question -- frankly it worked, and now I will swallow, learn from, and hopefully respond to the various comments. So long as negative points don't prevent me from reading and posting, I couldn't care less about what points I have -- I also note that I was clear about my intention of wanting to goad an answer.

Perhaps you disagree with my methods, but since my goal was to hear multiple perspectives, and I got more than I usually do, I see it as a win for instrumental rationality. And, if my follow-ups suggest I'm not a troll/general a**, perhaps I won't have to use dirty tricks going forward!

Comment author: TheOtherDave 06 February 2013 06:22:23PM 3 points [-]

If by "Rationality, as a universal theory (or near-universal) on decision making" you mean using Bayes' Theorem as a way of determining the likelihood of various potential events and consequently estimating the expected value of various courses of action, which is something that "rationality" sometimes gets used to mean on this site, I'd say (as many have said before me) that one big weakness is the lack of reliable priors. A mechanism for precisely calculating how much I should update P(x) from an existing made-up value based on things I don't necessarily know doesn't provide me with much guidance in my day-to-day life. Another big weakness is computational intractability.

If you mean more broadly making decisions based on the estimated expected value of various courses of action, I suppose the biggest weakness is again computational intractability. Which in turn potentially leads to sloppiness like making simplifying assumptions that are so radically at odds with my real environment that my estimates of value are just completely wrong.

If you mean something else, it might be useful if you said what you mean more precisely.

It's worth noting explicitly that these weaknesses are not themselves legitimate grounds for choosing some other approach that shares the same weaknesses. For example, simply making shit up typically results in estimates of value which are even more wrong. But you explicitly asked about weaknesses in isolation, rather than reasons to pick one decision theory over another.
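TheOtherDave's "lack of reliable priors" point can be made concrete with a small sketch. All the probabilities and payoffs below are invented for illustration -- nothing in the thread supplies real numbers, which is exactly the problem being described:

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

def expected_value(outcomes):
    """Expected value of an action: sum of P(outcome) * value(outcome)."""
    return sum(p * v for p, v in outcomes)

# The update rule is exact, but a prior of 0.3 is a made-up number,
# and the posterior -- and any expected value computed from it --
# inherits that arbitrariness.
p = posterior(prior=0.3, p_e_given_h=0.8, p_e_given_not_h=0.2)
ev_act = expected_value([(p, 100), (1 - p, -50)])
```

The arithmetic is precise; the guidance it gives is only as good as the guessed inputs.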

Comment author: non-expert 06 February 2013 07:08:21PM 0 points [-]

Thanks. I don't mean any weaknesses in particular; the idea laid out by EY was to confront your greatest weaknesses, so that is something for those who follow the theory to look into -- I'm just exploring :).

I guess what I'm not following is this idea of "choosing" an approach. Implicit in your answer, I think, is the idea that there is a "best" approach to be discovered among the various theories on living life -- why is the existence of a "best" theory indicative that it is universally applicable? The goal is to "understand reality," not to choose a methodology that is the "best" under the assumption that the "best" theory can then be followed universally.

Put differently, to choose rationality as a universal theory notwithstanding its flaws, you're saying more than "it's the best of all the available theories" -- I think you must also believe that the idea of having a set theory to guide life, notwithstanding its flaws, is the best way to go about understanding reality. What is the basis for the belief in that second prong?

Saying "well, I have to make a decision, so I need to find the best theory" doesn't cut it. It is clear there are times we must make a decision, but you are left with a similar question -- why are humans entitled to know what to do simply because they need to make a decision? Perhaps in "reality" there is no answer (or no answer within the limits of human comprehension) -- it is true you're stuck not knowing what to do, but you surely have a better view of reality (if that is the reality).

The implications of this are important. If you agree that rational choice theory is the "best" of all theories, but also agree that there is (or may be) a distinction between "choosing/applying a set theory" and "understanding reality" to the greatest extent humanly possible, it suggests one would need more than rationality to truly understand reality.

Comment author: non-expert 06 February 2013 06:02:49PM -16 points [-]

How has Rationality, as a universal (or near-universal) theory of decision making, confronted its most painful weaknesses? What are rationality's weak points? The broader a theory is claimed to be, the more important it seems to really test its weaknesses -- that is why I assume you bring up religion, but the same standard should apply to rationality. This is not a cute question from a religious person, but an intellectual inquiry from a person hoping to learn. In honor of the grand-daddy of cognitive biases, confirmation bias, doesn't rational choice theory need to be vetted?

HungryTurtle makes an attempt to get at this question, but he gets too far into the weeds -- this allowed LW to simply compare the "cons" of religion with the "cons" of rationality, which is a silly inquiry. I don't care how the weaknesses of rationality compare to the weaknesses of Judaism, because rational theory, if claimed to be universally applicable with no weaknesses, should be tested on the basis of that claim alone, and not on its weaknesses relative to some other theory.

Please note that negative points on this post, or a failure to respond, will only provide further evidence that LW is guilty of confirmation bias. It's sweet when you get to use cognitive biases against those who try to weed them out. (Yes, I'm trying to goad someone into answering, but only because I really want to know your answer, not because I'm trying to troll.)

Comment author: Qiaochu_Yuan 05 February 2013 11:21:45PM 1 point [-]

I'm happy to agree that emotion hacking is important to epistemic rationality.

Comment author: non-expert 05 February 2013 11:30:17PM 1 point [-]

Ok, I wasn't trying to play "gotcha," just answering your question. Good chat, thanks for engaging with me.

Comment author: Qiaochu_Yuan 05 February 2013 11:07:51PM 0 points [-]

I agree that this is problematic but don't see what it has to do with what I've been saying.

Comment author: non-expert 05 February 2013 11:19:45PM 1 point [-]

You suggested that emotion hacking is more of an issue for instrumental rationality and not so much for epistemic rationality. To the extent that is wrong, you're excluding emotion hacking (a subjective factor) from your application of epistemic rationality.

Comment author: Qiaochu_Yuan 05 February 2013 09:29:49PM 0 points [-]

I worry that rationality, to the extent it must value subjective considerations, tends to minimize the importance of those considerations to yield a more clear inquiry.

Can you clarify what you mean by this?

Comment author: non-expert 05 February 2013 10:46:08PM *  0 points [-]

Sure. Note that I don't offer this as conclusive or correct, just as something I'm thinking about. Also, let's assume rational choice theory is universally applicable for decision making.

Rational choice theory gives you an equation to use: all we have to do is fill that equation with the proper inputs, value them correctly, and we get an answer. Obviously this is more difficult in practice, particularly where inputs (as is to be expected) are not easily convertible to probabilities/numbers -- I'm worried this is actually more problematic than we think. Once we have an objective equation as a tool, we may be biased to assume objectivity and truth regarding our answers, even though that belief often rests on the strength of the starting equation and not on our ability to accurately value and include the appropriate subjective factors. To the extent answering a question becomes difficult, we manufacture "certainty" by ignoring subjectivity or assuming it is not as relevant as it is.

Simply put, the belief that we have a good and objective starting point biases us to believe we can and will derive an objectively correct answer, which affects the accuracy with which we fill in the equation.
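The worry above can be sketched concretely. In the toy expected-utility calculation below (all numbers invented for illustration), the formula is objective, but nudging a subjective probability estimate within a plausible margin of error flips the sign of the answer -- and hence the recommended decision:

```python
def expected_utility(p_success, gain, loss):
    """Expected value of acting, given a subjective success probability."""
    return p_success * gain + (1 - p_success) * loss

# Same formula, three defensible guesses for the same subjective input:
# at p=0.55 the action looks bad, at p=0.65 it looks good.
results = {p: expected_utility(p, gain=100, loss=-150)
           for p in (0.55, 0.60, 0.65)}
```

The precision of the output conceals that the decisive quantity was a guess all along.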

Comment author: Qiaochu_Yuan 05 February 2013 08:51:58PM *  2 points [-]

isn't this the ONLY kind of emotion-hacking out there? what emotions are expressed irrespective of external stimuli? seems like a small or insignificant subset.

Let me make some more precise definitions: by "emotional responses to my thoughts" I mean "what I feel when I think a given thought," e.g. I feel a mild negative emotion when I think about calling people. By "emotional responses to my behavior" I mean "what I feel when I perform a given action," e.g. I feel a mild negative emotion when I call people. By "emotional responses to external stimuli" I mean "what I feel when a given thing happens in the world around me," e.g. I feel a mild negative emotion when people call me. The distinction I'm trying to make between my behavior and external stimuli is analogous to the distinction between operant and classical conditioning.

I thought you were questioning the value of considering/responding to others' thoughts, because you are arguing that even if you could, you would need to rely on their words and expressions, which may not be correlated with their "true" state of mind.

No, I'm just making the point that for the purposes of classifying different kinds of emotion-hacking I don't find it useful to have a category for other people's thoughts separate from other people's behaviors (in contrast to how I find it useful to have a category for my thoughts separate from my behaviors), and the reason is that I don't have direct access to other people's thoughts.

Interestingly, the "problem" you have

What problem?

Comment author: non-expert 05 February 2013 09:15:50PM 0 points [-]

Thanks for the clarification, now i understand.

Going back to the original comment i commented on:

emotion-hacking is mostly an instrumental technique (although it is also epistemically valuable to notice and then stop your brain from flinching away from certain thoughts).

Particularly with your third type of emotion hacking ("hacking your emotional responses to external stimuli"), it seems emotion hacking is vital for epistemic rationality -- I guess that relates to my original point: emotion hacking is at least as important for epistemic rationality as it is for instrumental rationality.

I raised the issue originally because I worry that rationality, to the extent it must value subjective considerations, tends to minimize the importance of those considerations to yield a more clear inquiry.

Comment author: Qiaochu_Yuan 05 February 2013 08:15:27PM 1 point [-]

I don't understand what point you're trying to make.

Comment author: non-expert 05 February 2013 08:38:31PM 1 point [-]

I suppose there is a third kind of emotion-hacking, namely hacking your emotional responses to external stimuli.

isn't this the ONLY kind of emotion-hacking out there? what emotions are expressed irrespective of external stimuli? seems like a small or insignificant subset.

But it's not as if I can respond to other people's thoughts, even in principle: all I have access to are sounds or images which purport to be correlated to those thoughts in some mysterious way.

The second two paragraphs above are responding to this -- sorry to throw it back at you, but perhaps I'm misunderstanding the point you were trying to make here? I thought you were questioning the value of considering/responding to others' thoughts, because you are arguing that even if you could, you would need to rely on their words and expressions, which may not be correlated with their "true" state of mind.
