SoullessAutomaton comments on Escaping Your Past - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Credible only in so far as "one can consistently induce change in a human being without strenuous effort or lengthy struggle" is, and I don't think the latter is anything like obviously right. On the face of it, it seems obviously wrong: people often do require effort and struggle to change, and evolutionarily speaking that seems like what one should expect. (You don't want random other people to be able to change your behaviour too easily, and easy self-modification is liable to make for too-easy modification by others.)
You remind us frequently about what miraculous techniques you have. So it seems like by now you should be a walking miracle, a paragon of well-adjusted Winning. And yet, it doesn't seem all that rare for you to post something saying "I just discovered another idiotic bug in my mental functioning. So I bypassed the gibson using my self-transcending transcendence-transmogrification method, and I'm better now." To my cynical eye, there seems to be some tension here.
OK, so there are some ways, commonly harmful but maybe sometimes exploitable for good, in which our mental states can be messed with non-rationally for a shortish period. Remind me, please, how that is supposed to be good evidence that we can consistently change our behaviours, motivations, etc., in ways we actually want to, with lasting effect?
As a programmer, I will charitably note that it's not uncommon for a more serious bug to mask other more subtle ones; fixing the big one is still good, even if the program may look just as badly broken afterwards. Judging from his blog, he's doing well enough for himself, and if he was in a pretty bad state to begin with his claims may be justified. There's a difference between "I fixed the emotional hang-up that was making this chore hard to do" and "I've fixed a crippling, self-reinforcing terror of failure that kept me from doing anything with my life".
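The bug-masking pattern above can be sketched in a few lines of Python. Everything here is hypothetical and purely illustrative, not from any real codebase: a loud crash on short inputs hides a quieter off-by-one, and fixing the crash is still progress even though the program's output looks just as wrong afterwards.

```python
def average(values):
    # Subtle bug, still unfixed: the divisor is off by one,
    # so the "mean" is wrong whenever there are two or more values.
    return sum(values) / (len(values) - 1)  # should be len(values)

def report(values):
    # The serious bug used to live here: without this guard, a
    # single-item list made average() divide by zero and crash.
    # Fixing the crash is clearly good -- but now the quieter
    # off-by-one surfaces: report([2, 4, 6]) returns 6.0 rather
    # than the true mean, 4.0. The program looks just as broken,
    # yet it is genuinely one bug healthier.
    if len(values) < 2:
        return float(values[0]) if values else 0.0
    return average(values)
```

The point of the sketch: each fix is real progress even when the observable behaviour stays bad, because the deeper fault only becomes visible once the shallower one stops crashing everything first.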
That said, there is a lack of solid evidence, and the grandiosity of the claims suggests brilliant insight or crackpottery in some mixture--but then, the same could be said of Eliezer, and he's clearly won many people over with his ideas.
And one of the unfortunate things about the human architecture is that the more global a belief/process is, the more invisible it is... which is rather the opposite of what happens in normal computer programming. That makes high-level errors much harder to spot than low-level ones.
For the first year or so, I spent way too much time dealing with "hangups making this chore hard to do", not realizing that the more important hangups concern why you think you need to do the chores in the first place. So it has been taking a while to climb the abstraction tree.
For another thing, certain processes are difficult to spot because they're cyclical over a longer time period. I recently realized that I was addicted to getting insight into problems, when it wasn't really necessary to understand them in order to fix them, even at the relatively shallow level of understanding I usually worked with. In effect, insight was just a way of convincing myself to "lower the anti-persuasion shields".
The really crazy/annoying thing is I keep finding evidence that other people have figured ALL of this stuff out before, but either couldn't explain it or convince anybody else to take it seriously. (That doesn't make me question the validity of what I've found, but it does make me question whether I'll be able to explain/convince any more successfully than the rest did.)
Heh, you think mine are grandiose, you should hear the claims that other people make for what are basically the same techniques! I'm actually quite modest. ;-)
"That said, there is a lack of solid evidence, and the grandiosity of the claims suggests brilliant insight or crackpottery in some mixture--but then, the same could be said of Eliezer, and he's clearly won many people over with his ideas."
Precisely the point. We're not interested in how to attract people to doctrines (or at least I'm not), but in determining what is true and finding ever-better ways to determine what is true.
The popularity of some idea is absolutely irrelevant in itself. We need evidence of coherence and accuracy, not prestige, in order to reach intelligent conclusions.
Compelling, but false. Ideas' popularity not only contributes network effects to their usefulness (which might be irrelevant by your criteria), but also provides evidence that they're worth considering.