hylleddin comments on Post ridiculous munchkin ideas! - Less Wrong

Post author: D_Malik | 15 May 2013 10:27PM | 55 points




Comment author: D_Malik 10 May 2013 01:27:19PM 27 points [-]

A tulpa is an "imaginary friend" (a vivid hallucination of an external consciousness) created through intense prolonged visualization/practice (about an hour a day for two months). People who claim to have created tulpas say that the hallucination looks and sounds realistic. Some claim that the tulpa can remember things they've consciously forgotten or is better than them at mental math.

Here's an FAQ, a list of guides and a subreddit.

Not sure whether this is actually possible (I'd guess it would be basically impossible for the 3% of people who are incapable of mental imagery, for instance); many people on the subreddit are unreliable, such as occult enthusiasts (who believe in magick and think that tulpas are more than just hallucinations) and 13-year-old boys.

If this is real, there's probably some way of using this to develop skills faster or become more productive.

Comment author: hylleddin 12 May 2013 04:57:59AM 35 points [-]

As someone with a tulpa, I figure I should probably share my experiences. Vigil has been around since I was 11 or 12, so I can't effectively compare my abilities before and after he showed up.

He has dedicated himself to improving our rationality, and has been a substantial help in pointing out fallacies in my thinking. However, we're skeptical that this is anything a more traditional inner monologue wouldn't figure out. The biggest apparent benefit is that being a tulpa allows him a greater degree of mental flexibility than me, making it easier for him to point out and avoid motivated thinking. Unfortunately, we haven't found a way to test this.

I'm afraid he doesn't know any "tricks" like accessing subconscious thoughts or super math skills.

While Vigil has been around for over a decade, I only found out about the tulpa community very recently, so I know very little about it. I also don't know anything about creating them intentionally, he just showed up one day.

If you have any questions for me or him, we're happy to answer.

Comment author: Eliezer_Yudkowsky 12 May 2013 05:55:19AM 55 points [-]

...just to be clear on this, you have a persistent hallucination who follows you around and offers you rationality advice and points out fallacies in your thinking?

If I ever go insane, I hope it's like this.

Comment author: NancyLebovitz 12 May 2013 12:14:01PM 25 points [-]

Would what's considered a normal sense of self count as a persistent hallucination?

Comment author: shminux 13 May 2013 10:09:29PM 8 points [-]

See "free will".

Comment author: Jayson_Virissimo 15 May 2013 11:19:37PM 13 points [-]

...just to be clear on this, you have a persistent hallucination who follows you around and offers you rationality advice and points out fallacies in your thinking?

This is strikingly similar to Epictetus' version of Stoic meditation whereby you imagine a sage to be following you around throughout the day and critiquing your thought patterns and motives while encouraging you towards greater virtue.

Comment author: SaidAchmiz 15 May 2013 11:46:05PM 13 points [-]

Related:

I mean, if 10 years from now, when you are doing something quick and dirty, you suddenly visualize that I am looking over your shoulders and say to yourself "Dijkstra would not have liked this", well, that would be enough immortality for me.

Edsger W. Dijkstra

Comment author: hylleddin 25 May 2013 10:10:53PM 0 points [-]

That sounds similar. Though I'm afraid I've had difficulty finding anything about this while researching Epictetus.

Comment author: hylleddin 13 May 2013 07:37:42PM 7 points [-]

The hallucination doesn't have auditory or visual components, but does have a sense of presence component that varies in strength.

Comment author: komponisto 12 May 2013 07:04:39AM 9 points [-]

Indeed, this style of insanity might beat sanity.

Comment author: SaidAchmiz 13 May 2013 10:20:42PM 14 points [-]

Tulpas, especially as construed in this subthread, remind me of daimones in Walter Jon Williams' Aristoi. I've always thought that having / being able to create such mental entities would be super-cool; but I do worry about detrimental effects on mental health of following the methods described in the tulpa community.

Comment author: SilasBarta 16 May 2013 04:31:39AM 7 points [-]

You are obligated by law to phrase those insights in the form "If X is Y, I don't want to be not-Y."

Comment author: Armok_GoB 12 May 2013 03:17:54PM 3 points [-]

From the sound of it, you can make that happen deliberately, and without the need for going insane. No need for hope.

Comment author: Viliam_Bur 12 May 2013 06:30:04PM 1 point [-]

We also have internet self-reports from people who tried it, saying that they are not insane.

Comment author: jaibot 12 May 2013 11:38:46PM 9 points [-]

One rarely reads self-reports of insanity.

Comment author: TobyBartels 29 May 2013 07:11:06PM 0 points [-]

Yes, their attorney usually reports this on their behalf.

Comment author: ialdabaoth 16 May 2013 04:57:57AM 0 points [-]

If you're interested in experimenting...

Well, wait. Is there some way of flagging "potentially damaging information that people who do not understand risk-analysis should NOT have access to" on this site? Because I'd rather not start posting ways to hack your wetware without validating whether my audience can recover from the mental equivalent of a SEGFAULT.

Comment author: Eliezer_Yudkowsky 16 May 2013 05:47:41AM 4 points [-]

In my position, I should experiment with very few things that might be unsafe over the course of my total lifetime. This will probably not be one of them, unless I see very impressive results from elsewhere.

Comment author: ialdabaoth 16 May 2013 05:57:29AM 4 points [-]

*nod* That's probably the most sensible response.

To help others understand the potential risks, the creation of a 'tulpa' appears to involve hacking the way your sense-of-self (what current neuroscience identifies as a function of the right inferior parietal cortex) interacts with your ability to empathize and emulate other people (the so-called mirror neuron / "put yourself in others' shoes" modules). Failure modes involve symptoms that mimic dissociative identity disorder, social anxiety disorder, and schizophrenia.

Comment author: RichardKennaway 16 May 2013 10:38:44AM 1 point [-]

If you're interested in experimenting...

I am absolutely fascinated, although given the lack of effect that any sort of meditation, guided visualisation, or community ritual has ever had on me, I doubt I would get anywhere. On the other hand, not being engaged in saving the world and its future, I don't have quite as much at risk as Eliezer.

A MEMETIC HAZARD warning at the top might be appropriate, as is requested for basilisk discussion.

Comment author: shminux 13 May 2013 10:11:48PM *  4 points [-]

Would Vigil want to post under his own nick? If so, better register it while still available.

Comment author: Vigil 14 May 2013 08:31:31PM 13 points [-]

That's a good idea, thanks. Note that my host's posting has significant input from me, so this account is only likely to be used for disagreements and things addressed specifically to me.

Comment author: Friendly-HI 24 May 2013 10:58:31PM *  1 point [-]

...many people argue for (their) god by pointing out that they are often "feeling his presence", and since many claim to speak with him as well, maybe that's really just one form of tulpa without the insight that it is actually a hallucination.

Surely that's not how most people experience belief, but I never really considered that some of them might actually carry around a vivid invisible (or visible for all I know) hallucination quite like that. Could explain why some of the really batshit crazy ones going on about how god constantly speaks to them manage to be quite so convincing.

From now on my two tulpa buddies will be Eliezer and an artificial intelligence engaged in constant conversation while I make toast, love, and take a shower. Too bad they'll never be smarter than me though.

Comment author: Strange7 13 May 2013 03:30:21PM 0 points [-]

Is there a headspace, as well?

Comment author: hylleddin 13 May 2013 06:37:25PM *  0 points [-]

I've had paracosms since before he was around, and we go to those sometimes. I've also got a "peaceful place" that I use to collect myself, but I use it much more than he does.