AdeleneDawner comments on Meditation, insight, and rationality. (Part 1 of 3) - Less Wrong

35 Post author: DavidM 28 April 2011 08:26PM

Comment author: Armok_GoB 09 May 2011 12:36:31PM 0 points

Stuff like observing that things inside my brain and things outside my brain are the same kind of thing, and not having any sense of "self" in the way most people describe it. Seeing myself as an algorithm that this brain is approximating, and a bunch of related notions like that, are intuitively obvious in retrospect. Actually, the retrospect part is just an assumption; having always known such things sounds extremely unreasonable, but I don't remember ever having not known them and can't imagine what it'd possibly be like. ... ugh, this explanation sucks and sounds way more preposterous than what I actually mean by it, but it's the closest I can get with words.

That's the biggest one, at least; a bunch of other minor things seem consistent with the experience of being enlightened that you describe as well. The only strange thing is that I don't seem to perceive any vibrations, but then again I've never actually looked for them, and I do seem to instantly understand what exactly you're talking about and what it is that causes me not to see them individually, and their being there seems to be something I obviously know even if I can't see them...

I'm still sceptical, though; all of these experiences and memories come flagged as suspect and might have been fabricated/altered/distorted by some psychological phenomenon to fit your descriptions better. It wouldn't be the first time my brain did something like that.

I've read part 2; I liked it a lot less than part 1 and was a bit creeped out by some of the descriptions, especially of stage 3... It made me a lot more wary of trying this whole meditation thing. (It also set off my absurdity heuristic big time, but we all know that one isn't reliable, so I'm trying to ignore that...)

Comment author: AdeleneDawner 09 May 2011 12:44:11PM 0 points

Here's a potentially-useful cue: How does your mind handle the question "what do you want"?

Anyone reading along might want to answer this question for themselves before continuing, of course.

Comment author: Armok_GoB 09 May 2011 01:01:55PM 0 points

"ERROR; further context required. Guess at context/translation: 'What utility function does the brain controlling the account User:Armok_GoB implement?' RETURNS: Unknown; currently working under the heuristic of acting as if it's indistinguishable from the CEV of humanity."

Comment author: AdeleneDawner 09 May 2011 01:05:40PM 0 points

Yes, exactly. Mine returns a similar 'insufficient data' error, though my default translation is slightly different.

Comment author: AdeleneDawner 09 May 2011 02:29:12PM 2 points

To clarify this a bit, the interesting bits are that Armok:

  • Found the question confusing

  • Noticed the confusion and stopped rather than generating a plausible-sounding answer (though the framing of the question makes this much less remarkable than it would otherwise be)

  • Rephrased the question in a way that avoids the usual ways of thinking about both 'me' and 'wanting'

It's also somewhat interesting that his response to the question refers primarily to other people's desires, though that's very plausibly (sub-)cultural.

Comment author: Armok_GoB 13 May 2011 09:37:09PM 0 points

Thanks! :)