
Comment author: Voltairina 13 October 2015 08:57:07PM *  1 point [-]

From what I've read, the proposed mechanism behind literary fiction enhancing empathy is that it describes the emotions of the characters vaguely or indirectly, so working out their actual psychological states becomes plot-relevant. This was distinct from genre fiction, where the measured effect was weaker. So the 'good guys are always rewarded' bit, which is prevalent in genre fiction, doesn't seem like the best explanation for the effect. It could be compared to an extended story problem about empathy - at least as far as predicting motives and emotions.

In response to Test Driven Thinking
Comment author: Voltairina 26 July 2015 05:18:05AM 0 points [-]

That seems like a job for an expert system - using formal reasoning from premises (as long as you can translate them comfortably into symbols), identifying whether a new fact contradicts any old fact...
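A minimal sketch of what that kind of check could look like, assuming facts are stored as signed propositional literals and rules are simple if-then implications; the Fact, Rule, and KnowledgeBase names here are illustrative, not from any particular expert-system library:

    # Illustrative sketch: facts as signed literals, naive forward chaining,
    # and a consistency check before accepting a new fact.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Fact:
        name: str           # e.g. "is_mortal(socrates)"
        truth: bool = True  # False represents the negated literal

    @dataclass(frozen=True)
    class Rule:
        premises: tuple     # Facts that must all hold
        conclusion: Fact    # Fact to add when they do

    class KnowledgeBase:
        def __init__(self):
            self.facts = set()
            self.rules = []

        def _forward_chain(self):
            # Keep applying rules until no new facts appear.
            changed = True
            while changed:
                changed = False
                for rule in self.rules:
                    if all(p in self.facts for p in rule.premises) and rule.conclusion not in self.facts:
                        self.facts.add(rule.conclusion)
                        changed = True

        def assert_fact(self, fact):
            """Accept the new fact only if it doesn't contradict anything derivable."""
            saved = self.facts
            self.facts = set(self.facts) | {fact}
            self._forward_chain()
            if any(Fact(f.name, not f.truth) in self.facts for f in self.facts):
                self.facts = saved   # reject: contradiction found
                return False
            return True

    kb = KnowledgeBase()
    kb.rules.append(Rule((Fact("is_human(socrates)"),), Fact("is_mortal(socrates)")))
    print(kb.assert_fact(Fact("is_human(socrates)")))          # True
    print(kb.assert_fact(Fact("is_mortal(socrates)", False)))  # False: contradicts a derived fact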

Comment author: Voltairina 13 February 2015 06:22:48AM 2 points [-]

Not to mention tampering with it, or allowing it to tamper with itself, might have all kinds of unforeseen consequences. To me it's like: here is a whole lot of evolutionary software that does all this elegant stuff a lot of the time... but has never been unit tested.

Comment author: CCC 14 October 2014 02:34:32PM 2 points [-]

While that is a world without rationality, it seems a fairly extreme case.

Another example of a world without rationality is a world in which, the more you work towards achieving a goal, the longer it takes to reach that goal; so an elderly man might wander distractedly up Mount Everest to look for his false teeth with no trouble, but a team of experienced mountaineers won't be able to climb a small hill. Even if they try to follow the old man looking for his teeth, the universe notices their intent and conspires against them. And anyone who notices this tendency and tries to take advantage of it gets struck by lightning (even if they're in a submarine at the time) and killed instantly.

Comment author: Voltairina 15 October 2014 12:46:08AM 4 points [-]

That reminds me of Hofstadter's Law: "It always takes longer than you expect, even when you take into account Hofstadter's Law."

Comment author: faul_sname 12 November 2012 11:59:25PM 12 points [-]

if they could accurately visualize that hypothetical world in which there was no rationality and they themselves have become irrational?

I just attempted to visualize such a world, and my mind ran into a brick wall. I can easily imagine a world in which I am not perfectly rational (and in fact am barely rational at all), and that world looks a lot like this world. But I can't imagine a world in which rationality doesn't exist, except as a world in which no decision-making entities exist. Because in any world in which there exist better and worse options and an entity that can model those options and choose between them with better than random chance, there exists a certain amount of rationality.

Comment author: Voltairina 14 October 2014 02:13:04PM *  1 point [-]

Well, a world that lacked rationality might be one in which all the events were a sequence of non sequiturs. A car drives down the street. Then disappears. We are in a movie theater with a tyrannosaurus. Now we are a snail on the moon. Then there's just this poster of rocks. Then I can't remember what sight was like, but there's jazz music. Now I fondly remember fighting in World War II, while evading the Empire with Han Solo. Oh! I think I might be boiling water, but with a sense of smell somehow... that's a poor job of describing it - too much familiar stuff - but you get the idea. If there was no connection between one state of affairs and the next, talking about what strategy to take might be impossible, or a brief possibility that then disappears when you forget what you are doing and you're back in the movie theater again with the tyrannosaurus. That's if 'you' is even a meaningful way to describe a brief moment of awareness bubbling into being in that universe. Then again, now that I think about it, if there is any situation in which you can understand what the word rationality means, it's probably one in which rationality exists (however briefly) and is potentially helpful to you: even if there is little useful to do about whatever situation you are in, there might be some useful thing to do about the troubling thoughts in your mind.

Comment author: Voltairina 23 September 2014 05:31:05AM 3 points [-]

Thank you for letting us know. Don't tell me your idea. :)

Comment author: Voltairina 26 May 2014 04:10:01PM -1 points [-]

At any given time my ability to focus on and think about my individual memories is limited to a small portion of the total. As long as the thread of connections was kept consistent, all sorts of things about myself could change without me having any awareness of them. If I was aware that they had changed, I would still have to put up with who I had now become, I think... unless I had some other reason for having allegiance to who I had been - say, disliking whoever or whatever had made me who I now was, or finding that I was much less capable than I had been, or something. If I was aware that they would change drastically, but that afterwards it would all seem coherent and I wouldn't remember worrying about them changing - or that while I was not focusing on them, they were changing very radically, and faster than normal - that would seem very deathlike or panic-inducing, I guess.

Comment author: Voltairina 26 May 2014 04:11:35PM -1 points [-]

Because for any set of facts about myself that I hold in my attention, those facts could occur in a myriad of worlds other than the ones in which the rest of my memories took place and still be logically consistent - if my memories were even perfectly accurate and consistent, which they aren't in the first place.

Comment author: Matthew_Opitz 25 May 2014 02:44:41PM *  6 points [-]

Here's a thought experiment:

Let's say evil sadistic scientists kidnap you, bring you into their laboratory, and give you two options:

A: they incinerate your brain.

OR

B: they selectively destroy almost all of the neurons in your brain associated with memories and somehow create new connections that encode different memories.

Which option would you choose?

If you see any reason to choose option B over option A, then it would seem to me that you don't really buy into "pattern identity theory", because pattern identity theory would suggest that you have effectively died in scenario B just as much as in scenario A. The pattern of you from just before the operation has come to an abrupt, discontinuous end.

Yet, I would still choose option B because I would still anticipate waking up as something or somebody that next morning, even if it were someone with a completely different set of memories, preferences, and sense of self, and surely that would be better than death. (Perhaps the evil scientists could even be so kind as to implant happier memories and healthier preferences in my new self).

Is this anticipation correct? I don't see how it could be wrong. Our memories change a little bit each night during sleep, and still we don't NOT wake up as at least someone (a slightly different person than the night before). I fail to see how the magnitude and/or rapidity of the change in memory could produce a qualitative difference in this regard. If it could, then where would the cut-off line be? How much would someone have to change my memories so that I effectively did not wake up the next morning as someone?

Note that this discussion is not just academic. It would determine one's decision to use a teleporter (especially if it was, let's say, a "1st generation" teleporter that still had some kinks in it and didn't quite produce a 100% high-fidelity copy at the other end). Would such a 99% accurate teleporter be a suicide machine, or would your subjective experience continue at the other end?

In any case, pattern identity theory (which says the continuation of my subjective experience is attached to a continuation of a particular pattern of information) seems out the window for me.

Nor does some sort of "physical identity theory" (that posits that the continuation of my subjective experience is attached to the continuation of a particular set of atoms) make any sense because of how patently false that is. (Atoms are constantly getting shuffled out of our brains all the time).

Nor does some sort of "dualism" (that posits that the continuation of my subjective experience is attached to some sort of metaphysical "soul") make any sense to me.

So at this point, I have no idea about under what conditions I will continue to have subjective experiences of some sort. Back to the drawing board....

Comment author: Voltairina 26 May 2014 03:55:39PM 0 points [-]

Well, like Skeptityke seems to be indicating, maybe it is better to say that identity is pattern-based but analog (not zero or one, but on a spectrum from 0 to 1)... in which case, while B would be preferable to A, some scenario C where life continued as before, without incineration or selective brain destruction, would be preferable still.
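A minimal sketch of that analog view, assuming we crudely score identity as the overlap between two sets of memories; the identity_overlap function and the example memory sets below are purely illustrative:

    # Illustrative sketch: identity continuity as a graded 0-to-1 overlap score
    # (Jaccard similarity of "memory sets"), rather than a yes/no question.

    def identity_overlap(memories_before: set, memories_after: set) -> float:
        """Return a 0-to-1 score for how much of the pattern survives."""
        if not memories_before and not memories_after:
            return 1.0  # two empty patterns are trivially identical
        shared = memories_before & memories_after
        combined = memories_before | memories_after
        return len(shared) / len(combined)

    before = {"childhood home", "first job", "favorite song", "fear of spiders"}
    after_b = {"favorite song", "implanted holiday", "implanted friend"}   # scenario B
    print(identity_overlap(before, before))   # 1.0   (scenario C: nothing changed)
    print(identity_overlap(before, after_b))  # ~0.17 (scenario B: mostly overwritten)
    print(identity_overlap(before, set()))    # 0.0   (scenario A: incineration)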

In response to story idea...
Comment author: Lumifer 18 October 2013 04:17:43AM 1 point [-]

Have you read Singularity Sky and Iron Sunrise? There is a powerful entity there who prevents humans from passing a certain technological threshold:

I am the Eschaton; I am not your God.
I am descended from you, and exist in your future.
Thou shalt not violate causality within my historic light cone. Or else.

In response to comment by Lumifer on story idea...
Comment author: Voltairina 18 October 2013 03:11:18PM -1 points [-]

I have not! I will definitely check it out.
