shminux comments on Dark Arts of Rationality - Less Wrong
OK, I upvoted it before reading, and now that I have read it, I wish there were a karma transfer feature so I could upvote it a dozen times more :) Besides the excellent content, it is exemplarily written (engaging multi-level state-explain-summarize style, with quality examples throughout).
By the way, speaking of karma transfer, here is one specification of such a feature: anyone with, say, 1000+ karma should be able to specify the number of upvotes to give, up to 10% of their total karma (diluted 10x for Main posts, since currently each Main upvote gives 10 karma points to OP). The minimum and transfer thresholds are there to prevent misuse of the feature with sock puppets.
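The proposed rule can be sketched in a few lines of Python. This is only an illustration of the commenter's numbers (1000-karma minimum, 10% budget, 10x Main dilution); the function name and structure are my own, not any real LW API.

```python
# Hedged sketch of the karma-transfer proposal above.
# Thresholds come from the comment; everything else is illustrative.

MIN_KARMA = 1000          # minimum karma required to use the feature
MAX_FRACTION = 0.10       # may give away at most 10% of total karma
MAIN_DILUTION = 10        # each Main upvote grants 10 karma points

def max_transferable_upvotes(user_karma, is_main_post):
    """Upvotes a user may grant to one post under the proposal."""
    if user_karma < MIN_KARMA:
        return 0                              # threshold blocks sock puppets
    budget = int(user_karma * MAX_FRACTION)   # karma points available to spend
    if is_main_post:
        return budget // MAIN_DILUTION        # diluted: each vote is worth 10
    return budget

print(max_transferable_upvotes(10_000, False))  # 1000
print(max_transferable_upvotes(10_000, True))   # 100
print(max_transferable_upvotes(500, False))     # 0
```

The dilution keeps a Main-post vote and a comment vote costing the same amount of the giver's karma budget, even though they credit the recipient differently.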
Now, back to the subject at hand. What you call compartmentalization and what jimmy calls attention shifting I imagine in terms of the abstract data type "stack": in your current context you create an instance of yourself with a desired set of goals, then push your meta-self on the stack and run the new instance. It is, of course, essential that the instance you create actually pops the stack and yields control at the right time (and does not go on creating and running more instances until you get a stack overflow and require medical attention to snap back to reality). Maybe it's an alarm that goes off internally or externally after a certain time (a standard feature in hypnosis), and/or after a certain goal is achieved (e.g. 10 pages of a novel).
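The stack metaphor above can be made concrete with a toy Python sketch. All names here (SelfInstance, run_instance, the page-count "alarm") are purely illustrative, chosen to mirror the comment's imagery, with a depth guard standing in for the "stack overflow" failure mode.

```python
# Toy sketch of the "self as a stack" metaphor: push the meta-self,
# run an instance with new goals, pop when the alarm condition fires.

MAX_DEPTH = 3  # guard so nested instances cannot recurse forever

class SelfInstance:
    def __init__(self, name, goal_pages):
        self.name = name
        self.goal_pages = goal_pages
        self.pages_written = 0

def run_instance(stack, meta_self, goal_pages):
    """Push meta_self, run a fresh instance until its internal
    'alarm' (goal reached) fires, then pop and yield control back."""
    if len(stack) >= MAX_DEPTH:
        raise RuntimeError("stack overflow: snap back to reality")
    stack.append(meta_self)                    # push current context
    instance = SelfInstance("writer-self", goal_pages)
    while instance.pages_written < instance.goal_pages:
        instance.pages_written += 1            # placeholder for real work
    return stack.pop()                         # pop: meta-self resumes control

stack = []
me = SelfInstance("meta-self", 0)
restored = run_instance(stack, me, 10)         # alarm: 10 pages of a novel
print(restored.name, stack)                    # meta-self restored, stack empty
```

The essential property, as the comment notes, is that every push is matched by a pop: the sub-instance terminates on its alarm condition rather than spawning further instances.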
Also, as others pointed out, your ideas might be a better sell if they are not packaged as Dark Arts, which they are not, but rather as being meta-rational. For example, "local irrationality may be globally rational", or, somewhat more mathematically, "the first approximation to rationality need not appear rational", or even more nerdy "the rationality function is nonlinear, its power series has a finite convergence radius", or something else, depending on the audience. Then again, calling it Dark Arts has a lot of shock value in this forum, which attracts interest.
To clarify, what he calls compartmentalization I call compartmentalization. I'd just recommend doing something else, which, if forced to name, I'd call something like "having genuine instrumental goals instead of telling yourself you have instrumental goals".
When I say "attention shifting" I'm talking about the "mental bit banging" level thing. When you give yourself the (perhaps compartmentalized) belief that "this water will make me feel better because it has homeopathic morphine in it", that leads to "so when I drink this water, I will feel better", which leads to anticipating feeling better, which leads to pointing your attention to good feelings to the exclusion of bad things.
However, you can get to the same place by deciding "I'm going to drink this water and feel good" - or, when you get more practiced at it, just doing the "feeling good" by directing attention to goodness without all the justifications.
(That bit of) my point is that knowing where the attention is screens off how it got there in terms of its effects, so you might as well get there through a way that does not have bad side effects.
LW already feels uncomfortably polarized with a clique of ridiculously high karma users at the top. I don't think giving additional power to the high-karma users is a good idea for the long term.
Huh. Never noticed that. A clique? What an interesting perspective. How much karma do you mean? Or is it some subset of high-karma users? For example, I happen to have just over 10k karma; does that make me a clique member? What about TheOtherDave, or Nancy? How do you tell if someone is in this clique? How does someone in the clique tell if she is?
Presumably you joined a while ago, when there weren't so many intimidating high-karma users around.
Yes on all counts. You're clearly the cool kids here.
You see them talk like they know each other. You see them using specialized terms without giving any context because everybody knows that stuff already. You see their enormous, impossible karma totals and wonder if they've hacked the system somehow.
Dunno. It probably looks completely different from the other side. I'm just saying that's what it feels like (and this is bad for attracting new members), not that's what it's really like.
(nods) True enough.
That said, you're absolutely right that it looks completely different from "the other side."
What does it look like?
Well, for example, I'm as aware of the differences between me and shminux as I ever was. From my perspective (and from theirs, I suspect), we aren't nearly as homogenous as we apparently are from lmm's perspective.
I always assumed I'd get a black cloak and a silver mask in the mail once I break 10K and take the Mark of Bayes. Isn't that what happens?
We aren't supposed to talk about it.
For the record, I have also noticed this subset of LW users - I tend to think of them as "Big Names" - and:
You could ask the same of any clique;
It seems like these high-profile members are actually more diverse in their opinions than mere "regulars".
Of course, this is just my vague impression. And I doubt it's unique to LessWrong, or particularly worrying; it's just, y'know, some people are more active members of the community or however you want to phrase it.
(I've noticed similar "core" groups on other websites, it's probably either universal or a hallucination I project onto everything.)
Relevant link: The Tyranny of Structurelessness (which is mostly talking about real-life political groups, but still, much of it is relevant).
What if going beyond 1 vote cost karma - so you'd actually need to spend, not just apply, that karma?
I can't see people using it to the point where their karma flow went negative, so I don't think it really helps. It's a less bad idea, but not, I think, a good one.