Comment author: mark_spottswood 06 March 2009 05:18:04PM 1 point [-]

It depends how much relative value you assign to the following things:

  1. Increasing your well-being and life satisfaction.
  2. Your reputation (drug users have low status, mostly).
  3. Not having unpleasant contacts with the criminal justice system.
  4. Viewing the world through your current set of perceptive and affective filters, rather than through a slightly different set of filters.
In response to comment by [deleted] on Is it rational to take psilocybin?
Comment author: pwno 06 March 2009 06:20:24AM 0 points [-]

If you don't want to see it, because you're worried the new perspective will screw you up, that's a legitimate fear.

But I feel like any drastic perspective change will screw me up, whether it be positive or negative. Why would someone make the choice to change their preferences (as opposed to optimize them)?

Comment author: mark_spottswood 06 March 2009 05:03:43PM 4 points [-]

Because we can have preferences over our preferences. For instance, I would prefer it if I preferred to eat healthier foods because that preference would clash less with my desire to stay fit and maintain my health. There is nothing irrational about wishing for more consistent (and thus more achievable) preferences.

In response to Define Rationality
Comment author: mark_spottswood 06 March 2009 04:46:38PM *  1 point [-]

Arguing over definitions is pointless, and somewhat dangerous. If we define the word "rational" in some sort of site-specific way, we risk confusing outsiders who come here and who haven't read the prior threads.

Use the word "rational" or "rationality" whenever the difference between its possible senses does not matter. When the difference matters, just use more specific terminology.

General rule: When terms are confusing, it is better to use different terms than to have fights over meanings. Indeed, your impulse to fight for the word-you-want should be deeply suspect; wanting to affiliate our ideas with pleasant-sounding words is very similar to our desire to affiliate with high-status others; it makes us (or our ideas) appealing for reasons that are unrelated to the correctness or usefulness of what we are saying.

Comment author: Andy_McKenzie 05 March 2009 05:10:45PM 6 points [-]

Jack: The idea of having citations everywhere is nice but unpragmatic. It would slow down conversation and dialogue tremendously.

One possible alternative is to have nested dialogues. Each sentence that makes some sort of claim links to another which explains the idea more thoroughly if that is what you disagree with. If you do not disagree with that point, then you can continue reading the main chain. This is similar to the idea of hypertext dialogue: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.40.3246 , and it is similar to what Eliezer has done at OB by being so self-referential.

Comment author: mark_spottswood 05 March 2009 10:55:28PM 2 points [-]

I think the idea of a nested dialogue is a great one. You could also incorporate reader voting, so that weak arguments get voted off of the dialogue while stronger ones remain, thus winnowing down the argument to its essence over time.
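The winnowing mechanism described above can be sketched in code. This is a minimal illustration under assumed details (the `Node` class, the vote threshold, and the sample claims are all hypothetical, not anything the commenters specified):

```python
# Hypothetical sketch: a nested dialogue tree where low-scoring replies
# are winnowed out by reader votes, leaving the stronger chain of argument.

class Node:
    def __init__(self, claim, votes=0):
        self.claim = claim
        self.votes = votes
        self.replies = []

    def reply(self, claim, votes=0):
        child = Node(claim, votes)
        self.replies.append(child)
        return child

def winnow(node, threshold=0):
    """Recursively drop replies whose vote score falls below the threshold."""
    node.replies = [r for r in node.replies if r.votes >= threshold]
    for r in node.replies:
        winnow(r, threshold)
    return node

root = Node("Citations everywhere would slow conversation.")
root.reply("Nested replies let readers drill down only on disputed claims.", votes=5)
root.reply("Just ban all citations.", votes=-3)
winnow(root, threshold=0)
print([r.claim for r in root.replies])
```

After winnowing, only the positively scored branch survives, which is the "argument reduced to its essence" effect the comment describes.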

I wonder if our hosts, or any contributors, would be interested in trying out such a procedure as a way of exploring a future disagreement?

Comment author: mark_spottswood 05 March 2009 07:35:47PM 5 points [-]

Useful practice: Systematize credibility assessments. Find ways to track the sincerity and accuracy of what people have said in the past, and make such information widely available. (An example from the legal domain would be a database of expert witnesses, which includes the number of times courts have qualified them as experts on a particular subject, and the number of times courts adopted or rejected their conclusions.) To the extent such info is widely available, it both helps to "sterilize" the information coming from untrustworthy sources and to promote the contributions that are most likely to be helpful. It also helps improve the incentive structure of truth-seeking discussions.
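A system like the expert-witness database above could be sketched as a simple ledger of past confirmations and rejections. Everything here (the class name, the Laplace-smoothed score, the sample sources) is an illustrative assumption, not part of any real legal database:

```python
# Hypothetical sketch: track how often a source's past claims held up,
# and expose a simple credibility score for weighting new contributions.

from collections import defaultdict

class CredibilityLedger:
    def __init__(self):
        # source -> [times confirmed, times rejected]
        self.record = defaultdict(lambda: [0, 0])

    def log(self, source, confirmed):
        self.record[source][0 if confirmed else 1] += 1

    def score(self, source):
        confirmed, rejected = self.record[source]
        total = confirmed + rejected
        if total == 0:
            return 0.5  # no track record: neutral prior
        # Laplace smoothing so a single data point can't pin the score at 0 or 1
        return (confirmed + 1) / (total + 2)

ledger = CredibilityLedger()
ledger.log("expert_a", confirmed=True)
ledger.log("expert_a", confirmed=True)
ledger.log("expert_b", confirmed=False)
print(round(ledger.score("expert_a"), 2))  # 0.75
print(round(ledger.score("expert_b"), 2))  # 0.33
```

Making such scores widely visible is what "sterilizes" untrustworthy sources: readers can discount a contribution in proportion to its author's track record.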

Comment author: Eliezer_Yudkowsky 05 March 2009 06:57:51PM 4 points [-]

Maybe "truth-seeking" versus "winning", if there's a direct appeal to one and not the other. But I am generally willing to rescue the word "rationality".

Comment author: mark_spottswood 05 March 2009 07:26:35PM *  4 points [-]

Sorry -- I meant, but did not make clear, that the word "rationality" should be avoided only when the conversation involves the clash between "winning" and "truth seeking." Otherwise, things tend to bog down in arguments about the map, when we should be talking about the territory.

Comment author: Eliezer_Yudkowsky 03 March 2009 07:29:13PM *  10 points [-]

My definition differs from the one in Wikipedia because I require that your goals not call for any particular ritual of cognition. When you care more about winning than about any particular way of thinking - and "winning" is not defined in such a way as to require in advance any particular method of thinking - then you are pursuing rationality.

This, in turn, ends up implying epistemic rationality: if the definition of "winning" doesn't require believing false things, then you can generally expect to do better (on average) by believing true things than false things - certainly in real life, despite various elaborate philosophical thought experiments designed from omniscient truth-believing third-person standpoints.

Conversely you can start with the definition of rational belief as accuracy-seeking, and get to pragmatics via "That which can be destroyed by the truth should be" and the notion of rational policies as those which you would retain even given an epistemically rational prediction of their consequences.

Comment author: mark_spottswood 05 March 2009 06:22:58PM 4 points [-]

Eliezer said: This, in turn, ends up implying epistemic rationality: if the definition of "winning" doesn't require believing false things, then you can generally expect to do better (on average) by believing true things than false things - certainly in real life, despite various elaborate philosophical thought experiments designed from omniscient truth-believing third-person standpoints.

--

I think this is overstated. Why should we only care what works "generally," rather than what works well in specific subdomains? If rationality means whatever helps you win, then overconfidence will often be rational. (Examples: placebo effect, dating, job interviews, etc.) I think you need to either decide that your definition of rationality does not always require a preference for true beliefs, or else revise the definition.

It also might be worthwhile, for the sake of clarity, to just avoid the word "rationality" altogether in future conversations. It seems to be at risk of becoming an essentially contested concept, particularly because everyone wants to be able to claim that their own preferred cognitive procedures are "rational." Why not just talk about whether a particular cognitive ritual is "goal-optimizing" when we want to talk about Eliezer-rationality, while saving the term "truth-optimizing" (or some variant) for epistemic-rationality?

Comment author: pwno 04 March 2009 08:26:07AM *  8 points [-]

I always made a distinction between rationality and truth-seeking. Rationality is only intelligible in the context of a goal (whether that goal is rational or irrational). Someone who acts rationally, given their information set, will choose the best plan of action for achieving that goal. Part of being rational is knowing which goals will maximize one's utility function.

My definition of truth-seeking is basically Robin's definition of "rational." I find it hard to imagine a time where truth-seeking is incompatible with acting rationally (the way I defined it). Can anyone think of an example?

Comment author: mark_spottswood 05 March 2009 06:08:05PM *  7 points [-]

Pwno said: I find it hard to imagine a time where truth-seeking is incompatible with acting rationally (the way I defined it). Can anyone think of an example?


The classic example would invoke the placebo effect. Believing that medical care is likely to be successful can actually make it more successful; believing that it is likely to fail might vitiate the placebo effect. So, if you are taking a treatment with the goal of getting better, and that treatment is not very good (but it is the best available option), then it is better from a rationalist goal-seeking perspective to have an incorrectly high assessment of the treatment's possibility of success.

This generalizes more broadly to other areas of life where confidence is key. When dating, or going to a job interview, confidence can sometimes make the difference between success and failure. So it can pay, in such scenarios, to be wrong (so long as you are wrong in the right way).
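The placebo argument can be made concrete with toy numbers. The figures below are invented purely for illustration; the only assumption is the one the comment makes, that the placebo contribution scales with the patient's confidence in the treatment:

```python
# Toy numbers (hypothetical) for why an overconfident belief can "win":
# the placebo effect adds to the treatment's true success rate in
# proportion to how effective the patient believes it is.

true_rate = 0.30          # treatment's base success probability
placebo_boost = 0.10      # extra probability from a fully confident belief

def success_prob(believed_rate):
    # Placebo contribution scales with the believed effectiveness.
    return true_rate + placebo_boost * believed_rate

accurate = success_prob(0.30)        # calibrated belief
overconfident = success_prob(0.90)   # wrong, but wrong in the "right way"
print(round(accurate, 2), round(overconfident, 2))
```

Under these numbers the overconfident patient does strictly better (0.39 vs. 0.33), which is exactly the wedge between truth-seeking and goal-seeking the comment is pointing at.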

It turns out that we are, in fact, generally optimized to make precisely this mistake. Far more people think they are above average in most domains than hold the opposite view. Likewise, people regularly place a high degree of trust in treatments with a very low probability of success, and we have many social mechanisms that try to encourage such behavior. It might be "irrational" under your usage to try to help these people form more accurate beliefs.
