
Comment author: Rossin 08 September 2017 06:31:40PM 1 point [-]

I think that's a fair assessment. I have an image of myself as the sort of person who would value saving lives over beer, and my alarm came from noticing a discrepancy between my self-image and my actions. I am trying to bring the two in line, because that self-image seems like something I want to actually be rather than merely think I am.

Comment author: Voltairina 08 September 2017 08:36:49PM 0 points [-]

I think our maps of these scenarios can be a bit limited. You have to model yourself in a world where you are also a person with needs that have to be advocated for and accounted for. In particular you have to think: I have access to or control over these resources, which I can turn to these needs, and my sphere of control depends on things like my psychological state, how well rested I am, how much I know, what skills I have, what tools I have, etc., all of which I can also sometimes spend those resources learning, buying, and so on. And all of that is true of everyone else too, of course: they are in a world where they may have to advocate for themselves to an extent, and where some may be impaired, or better able than most, to do so. If you're waited on hand and foot, you may be able to afford to pour more of your 'all' into benevolent behavior, since other people are making sure you sleep and feeding you on time and everything.

In response to comment by Regex on What is Rational?
Comment author: Yosarian2 25 August 2017 07:24:57PM 1 point [-]

Maybe in the specific example of randomness, but I don't think you can say the general case of 'it feels so' is indefensible. This same mechanism is used for the really complicated black-box intuitive reasoning that underpins any trained skill. So in areas where one has a lot of experience, or areas which are evolutionarily keyed in, such as social interactions or the natural world, this isn't an absurd belief.

Eh. Maybe, but I think that any idea which seriously underpins your actions and other belief systems in an important way should be something you can justify in a rational way. That doesn't mean you always need to think about it that way; some things become "second nature" over time. But you should be able to explain the rational underpinnings if asked.

If you're talking about a trained skill, "I've been fixing cars for 20 years, and in my experience, when you do x you tend to get better results than when you do y" is a perfectly rational reason to have a belief. So is "That's what it said in my medical school textbook," etc.

But, in my experience, people who put too much faith in their "black boxes" and don't ever think through the basis of their beliefs tend to behave in systematically irrational ways that probably harm them.

Comment author: Voltairina 25 August 2017 10:32:04PM *  2 points [-]

It's funny: I think this is probably always true as a guideline (that you should try to justify all your ideas) but may always break down in practice (all your ideas probably can't ever be fully justified, because of Agrippa's trilemma: they're either justified in terms of other ideas or not justified at all, and if they are justified in terms of other ideas, the chain eventually becomes circular, continues into infinite regress, or terminates in things that are themselves unjustified). We might gain some ground by separating ideas from evidence, and say we accept as axiomatic anything that is evidenced by inference until we gain additional facts that lend context and resituate our model so that it can include previous observations... something like that. Or it might be that we just have to grandfather in some rules to avoid that Gödelian stuff. Thoughts?

Comment author: ChristianKl 21 August 2017 07:13:41AM 4 points [-]

The ability to express basic nonsurprising facts is useful.

When discussing whether or not to allow abortion of a fetus, it matters whether you believe that real human consciousness needs a certain number of neurons to emerge.

Plenty of people believe in some form of soul that's a unit that creates consciousness. Saying that it's emergent means that you disagree.

According to Scott's latest post about EA Global, there are people at the Foundational Research Institute who do ask themselves whether particles can be conscious.

There are plenty of cases where people try to find reductionist ways of thinking about a domain. Calories in, calories out is a common paradigm that drives a lot of thinking about diet. If you instead have a paradigm centered around a cybernetic system with an emergent set point that's managed by a complex net of neurons, that paradigm gives a different perspective on what to do about weight loss.

Comment author: Voltairina 22 August 2017 06:50:20PM *  0 points [-]

I think you're right. I also think saying 'x is emergent' may sound more magical than it is, if I am understanding emergence right, depending on your understanding of it. It doesn't mean that the higher-scale phenomenon isn't /made up of/ lower-level phenomena, but that it isn't (like a homunculus) itself present at any scale smaller than that level. A robot hopping-kangaroo toy needs both a body and legs. The hopping behavior isn't contained in the body; that just rotates a joint. The hopping behavior isn't contained in the legs; those just have a joint that can connect to the body's joint. It's only when the two bits are plugged into each other that the 'hopping' behavior 'emerges' from the torso-legs system. It's not coming from any essential 'hoppiness' in the legs or the torso.

I think it can seem a bit magical because it can sound like the behavior just 'appears' at a certain point, but it does so no more than a picture of a tiger 'appears' from a bunch of pixels. Only here we're talking about names for systems of functions (hopping is made of the leg and torso behaviors and their interaction with the ground and so on) rather than names for systems of objects (the tiger picture is made up of lines and corners, which are made of pixels, and so on). In some sense 'tigers' and 'hopping' don't really exist, just pixels (or atoms or whatever) and particle interactions. But we have names for systems of objects, and systems of functions, because those names are useful.
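The kangaroo-toy point can be sketched in code (the class and method names here are invented for illustration): neither part's behavior mentions hopping; 'hopping' only exists as a property of the coupled system.

```python
# Toy illustration of emergence: neither component "contains" hopping.
class Torso:
    """Can only rotate its joint back and forth."""
    def rotate_joint(self):
        return "rotate"

class Legs:
    """Can only turn a joint rotation into a push against the ground."""
    def push_off(self, joint_motion):
        return "push" if joint_motion == "rotate" else None

def hop(torso, legs):
    """'Hopping' only appears once the two parts are plugged together."""
    return legs.push_off(torso.rotate_joint()) == "push"

print(hop(Torso(), Legs()))  # True: the composed torso-legs system hops
```

Neither `Torso` nor `Legs` has any 'hoppiness' on its own; the name 'hop' labels a pattern in their interaction, just as 'tiger' labels a pattern in pixels.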

Tabooing Science + an xkcd comic about the eclipse - "Honestly, it's not that scientific."

4 Voltairina 16 August 2017 03:19PM

It occurred to me when I was reading xkcd a moment ago that there exists a strain of suspicion of anything 'science' among a certain crowd in this country (fundamentalists, creationists, etc.), and a kind of mystique among another crowd (of the "it was in a study, so it must be true" variety). Given that doing science is more or less systematizing critical thinking and checking things, so as to be as certain as one can about an idea, it might be helpful to pay attention to, and to an extent 'play taboo' with, that something-is-a-special-kind-of-thing-because-it-is-a-science-thing attitude whenever it comes up.

A good example being the xkcd comic I got it from:

Comment author: Voltairina 13 October 2015 08:57:07PM *  1 point [-]

From what I've read, the proposed mechanism behind literary fiction enhancing empathy is that it describes the emotions of the characters in a vague or indirect way, and working out their actual psychological character becomes plot-relevant. This was distinct from genre fiction, where the results were less obvious. So the 'good guys are always rewarded' bit, which is prevalent in genre fiction, doesn't seem like the best explanation for the effect. It could be compared to an extended story problem about empathy - at least as far as predicting motives and emotions.

In response to Test Driven Thinking
Comment author: Voltairina 26 July 2015 05:18:05AM 0 points [-]

That seems like a job for an expert system - using formal reasoning from premises (as long as you can translate them comfortably into symbols), identifying whether a new fact contradicts any old fact...
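A minimal sketch of the contradiction check an expert system like that would need (the representation and names are invented for illustration; real systems use richer logic than bare propositions):

```python
# Minimal fact base that rejects new facts contradicting old ones.
class FactBase:
    def __init__(self):
        self.facts = {}  # proposition (str) -> truth value (bool)

    def assert_fact(self, proposition, value):
        """Record a fact; raise if it contradicts a previously stored one."""
        if proposition in self.facts and self.facts[proposition] != value:
            raise ValueError(f"contradiction: {proposition!r}")
        self.facts[proposition] = value

kb = FactBase()
kb.assert_fact("the battery is dead", True)
kb.assert_fact("the battery is dead", True)   # consistent restatement: fine
try:
    kb.assert_fact("the battery is dead", False)
except ValueError as e:
    print(e)  # the contradiction is caught
```

Translating natural-language premises into symbols like these is, as the comment notes, the hard part; the checking itself is mechanical.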

Comment author: Voltairina 13 February 2015 06:22:48AM 2 points [-]

Not to mention that tampering with it, or allowing it to tamper with itself, might have all kinds of unforeseen consequences. To me it's like: here is a whole lot of evolutionary software that does all this elegant stuff a lot of the time... but has never been unit tested.

Comment author: CCC 14 October 2014 02:34:32PM 2 points [-]

While that is a world without rationality, it seems a fairly extreme case.

Another example of a world without rationality is a world in which, the more you work towards achieving a goal, the longer it takes to reach that goal; so an elderly man might wander distractedly up Mount Everest to look for his false teeth with no trouble, but a team of experienced mountaineers won't be able to climb a small hill. Even if they try to follow the old man looking for his teeth, the universe notices their intent and conspires against them. And anyone who notices this tendency and tries to take advantage of it gets struck by lightning (even if they're in a submarine at the time) and killed instantly.

Comment author: Voltairina 15 October 2014 12:46:08AM 4 points [-]

That reminds me of Hofstadter's Law: "It always takes longer than you expect, even when you take into account Hofstadter's Law."

Comment author: faul_sname 12 November 2012 11:59:25PM 12 points [-]

if they could accurately visualize that hypothetical world in which there was no rationality and they themselves have become irrational?

I just attempted to visualize such a world, and my mind ran into a brick wall. I can easily imagine a world in which I am not perfectly rational (and in fact am barely rational at all), and that world looks a lot like this world. But I can't imagine a world in which rationality doesn't exist, except as a world in which no decision-making entities exist. Because in any world in which there exist better and worse options and an entity that can model those options and choose between them with better than random chance, there exists a certain amount of rationality.

Comment author: Voltairina 14 October 2014 02:13:04PM *  1 point [-]

Well, a world that lacked rationality might be one in which all events were a sequence of non sequiturs. A car drives down the street. Then disappears. We are in a movie theater with a tyrannosaurus. Now we are a snail on the moon. Then there's just this poster of rocks. Then I can't remember what sight was like, but there's jazz music. Now I fondly remember fighting in World War II while evading the Empire with Han Solo. Oh! I think I might be boiling water, but with a sense of smell, somehow... That's a poor job of describing it (too much familiar stuff) but you get the idea. If there were no connection between one state of affairs and the next, talking about what strategy to take might be impossible, or a brief possibility that then disappears when you forget what you are doing and you're back in the movie theater again with the tyrannosaurus. That's if 'you' is even a meaningful way to describe a brief moment of awareness bubbling into being in that universe. Then again, if at any moment 'you' happen to exist and 'you' happen to understand what rationality means... I guess, now that I think about it, if there is any situation where you can understand what the word 'rationality' means, it's probably one in which rationality exists (however briefly) and is potentially helpful to you: even if there is little useful to do about whatever situation you are in, there might be some useful thing to do about the troubling thoughts in your mind.

Comment author: Voltairina 23 September 2014 05:31:05AM 3 points [-]

Thank you for letting us know. Don't tell me your idea. :)
