Eliezer_Yudkowsky comments on Post ridiculous munchkin ideas! - Less Wrong

55 Post author: D_Malik 15 May 2013 10:27PM


Comments (1240)


Comment author: Eliezer_Yudkowsky 12 May 2013 05:55:19AM 55 points

...just to be clear on this, you have a persistent hallucination who follows you around and offers you rationality advice and points out fallacies in your thinking?

If I ever go insane, I hope it's like this.

Comment author: NancyLebovitz 12 May 2013 12:14:01PM 25 points

Would what's considered a normal sense of self count as a persistent hallucination?

Comment author: shminux 13 May 2013 10:09:29PM 8 points

See "free will".

Comment author: Jayson_Virissimo 15 May 2013 11:19:37PM 13 points

...just to be clear on this, you have a persistent hallucination who follows you around and offers you rationality advice and points out fallacies in your thinking?

This is strikingly similar to Epictetus' version of Stoic meditation whereby you imagine a sage to be following you around throughout the day and critiquing your thought patterns and motives while encouraging you towards greater virtue.

Comment author: SaidAchmiz 15 May 2013 11:46:05PM 13 points

Related:

I mean, if 10 years from now, when you are doing something quick and dirty, you suddenly visualize that I am looking over your shoulders and say to yourself "Dijkstra would not have liked this", well, that would be enough immortality for me.

Edsger W. Dijkstra

Comment author: hylleddin 25 May 2013 10:10:53PM 0 points

That does sound similar, though I'm afraid I've had difficulty finding anything about this while researching Epictetus.

Comment author: hylleddin 13 May 2013 07:37:42PM 7 points

The hallucination doesn't have auditory or visual components, but does have a sense of presence component that varies in strength.

Comment author: komponisto 12 May 2013 07:04:39AM 9 points

Indeed, this style of insanity might beat sanity.

Comment author: SaidAchmiz 13 May 2013 10:20:42PM 14 points

Tulpas, especially as construed in this subthread, remind me of daimones in Walter Jon Williams' Aristoi. I've always thought that having (or being able to create) such mental entities would be super-cool; but I do worry about the detrimental effects on mental health of following the methods described in the tulpa community.

Comment author: SilasBarta 16 May 2013 04:31:39AM 7 points

You are obligated by law to phrase those insights in the form "If X is Y, I don't want to be not-Y."

Comment author: Armok_GoB 12 May 2013 03:17:54PM 3 points

From the sound of it, it seems you could make that happen deliberately, and without needing to go insane. No need for hope.

Comment author: Viliam_Bur 12 May 2013 06:30:04PM 1 point

We also have internet self-reports from people who tried it that they are not insane.

Comment author: jaibot 12 May 2013 11:38:46PM 9 points

One rarely reads self-reports of insanity.

Comment author: TobyBartels 29 May 2013 07:11:06PM 0 points

Yes, their attorney usually reports this on their behalf.

Comment author: ialdabaoth 16 May 2013 04:57:57AM 0 points

If you're interested in experimenting...

Well, wait. Is there some way of flagging "potentially damaging information that people who do not understand risk-analysis should NOT have access to" on this site? Because I'd rather not start posting ways to hack your wetware without validating whether my audience can recover from the mental equivalent of a SEGFAULT.

Comment author: Eliezer_Yudkowsky 16 May 2013 05:47:41AM 4 points

In my position, I should experiment with very few things that might be unsafe over the course of my total lifetime. This will probably not be one of them, unless I see very impressive results from elsewhere.

Comment author: ialdabaoth 16 May 2013 05:57:29AM 4 points

*nod* That's probably the most sensible response.

To help others understand the potential risks, the creation of a 'tulpa' appears to involve hacking the way your sense-of-self (what current neuroscience identifies as a function of the right inferior parietal cortex) interacts with your ability to empathize and emulate other people (the so-called mirror neuron / "put yourself in others' shoes" modules). Failure modes involve symptoms that mimic dissociative identity disorder, social anxiety disorder, and schizophrenia.

Comment author: RichardKennaway 16 May 2013 10:38:44AM 1 point

If you're interested in experimenting...

I am absolutely fascinated, although given the lack of effect that any sort of meditation, guided visualisation, or community ritual has ever had on me, I doubt I would get anywhere. On the other hand, since I am not engaged in saving the world and its future, I don't have quite as much at risk as Eliezer.

A MEMETIC HAZARD warning at the top might be appropriate, as is requested for basilisk discussion.