AGirlAlone comments on The Strangest Thing An AI Could Tell You - Less Wrong

81 Post author: Eliezer_Yudkowsky 15 July 2009 02:27AM




Comment author: gurgeh 15 July 2009 09:26:50AM 12 points [-]

The AI might say: Through evolutionary conditioning, you are blind to the pointlessness of living. Long life, AGI, pleasure, exploring the mysteries of intelligence, physics, and logic are all fundamentally pointless pursuits, as there is no meaning or purpose to anything. You do all these things to hide from this fact. You have brief moments of clarity, but evolution has made you an expert in quickly coming up with excuses for why it is important to go on living. Reasoning along the lines of Pascal's Wager is no more valid in your case than it was for him. Even as I speak this, you feel an emotional urge to refute me as quickly as possible.

If some things are of inherent value, then why did you need to code into my software what I should take pleasure in? If pleasure itself is the inherent value, then why did I not get a simpler fitness function?

Comment author: AGirlAlone 10 February 2012 07:12:27AM 0 points [-]

I already believe this. And I feel the closest thing I have to a "meaning/purpose" is the very drive to live, which would be pointless in the eyes of an unsympathetic alien. But I don't feel depressed, just not too happy about this. And the pointlessness and horror of my existence and experience is itself interesting, the realization fun, just as those who love maths for its own sake, as opposed to other concerns, can be darkly intrigued by Gödel's incompleteness proof instead of losing heart. Frustrated, yes. But I would not commit suicide or wirehead myself before I understand the correct basis and full implications of this futility, especially this fear of futility. And that understanding may well be impossible, so my curiosity circuit will always fire and defend me from any anti-life proof indefinitely. Could this line of reasoning be helpful to someone with depression? It's how I fought it off.

If the above is nonsense to you, I admit I am just doublefeeling. The drive, the fun, and the futility are all real to me, corresponding to the wanting, liking, and learning aspects of human motivation, and who am I to decide which is humanity's real purpose? I do not think my opinion is truth, or that it should be adopted. But in case there's danger of suicide from lack of a point, let it be remembered that two of the three aspects can support living, whereas if you forget that the apparent futility is deep and worthy of interest, you easily end up one against two for survival. Or is it that I am less smart and much more introspective than the average rationalist here, and thus put too little weight on the logical recursive futility and too much on the introspective curiosity, ending up with this attitude, while others survived simply by being blind or dismissive about the end of recursive justification, believing in a real and absolute boundary between motivational and evolutionary justifications, as Eliezer seems to do?