Comment author: h-H 30 October 2010 05:10:00PM 1 point [-]

Regardless of dis/agreement, the guy has a really cool voice: http://www.youtube.com/watch?v=wS6DKeGvBW8&feature=related

Comment author: PhilGoetz 29 October 2010 02:18:01AM 2 points [-]

Should I read Luminosity if I disliked Twilight? Does it matter why I dislike Twilight? Can I read it if I never finish Twilight?

(I tried reading Twilight this week. Got halfway through it. The writing style is very workmanlike - good at describing the surroundings, and just enough other important details to move quickly from point A to point B in the plot. Descriptions of Edward are limited to monotonous repetitions of "perfect" and "beautiful", and descriptions of Bella are absent. None of the dialogue is clever. All the boys fall for Bella, who is not interesting; and her reaction (only wants what she can't have) is repellent. The central love story is unconvincing, but is a little interesting because Edward is unpredictable. So far, it is an efficient vehicle for romance-novel cliches. I may finish it, but will feel guilty if I do.)

Comment author: h-H 29 October 2010 03:41:28AM 0 points [-]

I've sometimes read romance novels, more a function of my reading appetite at the time (plus no books remained in the house except those). I've also read a couple of video-game tie-in stories, including some vampire ones, to be relevant to your example. I agree that they have mildly interesting twists, enough for guilty-pleasure level.

I can't put a name to it, but it doesn't require much of a leap to see the relation between reading things like TVTropes and, to an extent, Twilight. On that note, what do you read for fiction generally?

Comment author: Perplexed 29 October 2010 02:35:31AM 1 point [-]

I'm a bit uncomfortable with calling this a "dark art" (perhaps because teaching dark arts seems to be such a dangerous occupation). But there is a "rainbow art" consisting of equal parts of attention-grabbing and persuasion; an art which is necessary even if it is a good argument that you are trying to propagate. I would like to learn something about that art. Ideally, by means of an online class.

I can think of at least 5 different persuasion media that I would like to become skillful at.

  • Stand-up lectures, like the TED lectures, for example
  • PowerPoint-style presentations with voiceovers
  • Blog postings (and sequences of blog postings)
  • Publishable academic-style papers on technical topics
  • Works of fiction with a didactic subtext, like HPMOR and Luminosity

I'd bet lots of other people would like to become skillful at these things too.

I'd bet we have people here who are good enough at these things that they could lead a kind of online study group focused on learning and/or improving skills like these.

Comment author: h-H 29 October 2010 03:11:56AM 0 points [-]

Sounds like a good idea (though I'm not giving up on the Dark Arts class/sequence yet). Given that the OP does "encourage you to post your skills here anyway," I think bringing this up in the open thread, or as a general call for candidates, should be worthwhile. Depending on the instructions, this could effectively make short work of most barriers to publishing an LW top-level post, given relevant and interesting topics of course.

We have been experiencing a slump of late; I think this could help in overcoming the slow stagnation that happens in all forums after the early 'glory days' are over.

Comment author: Perplexed 29 October 2010 01:20:28AM 3 points [-]

I am somewhat afraid of the fact that convincing can be taught separately from reasoned arguing; that not the best reason wins, but the most enthusiastic speaker, the one who can best make his point in the eyes of the people.

Fear of a fact is not a good reason to ignore a fact.

... but I do not wish to persuade without reason.

A good argument, like a good novel, can work on several levels.

Comment author: h-H 29 October 2010 01:51:15AM *  0 points [-]

OK, so I'm thinking that at least a discussion post should be made. Any thoughts?

It could potentially be part of the Sequences; although Eliezer and others do cover the Dark Arts, I don't recall a dedicated thread. I found some good examples from a quick googling, like Yvain's "Defense Against The Dark Arts: Case Study #1" or "The Power of Positivist Thinking".

What makes an irrational argument convincing is human biases, but what I think is lacking is a more focused treatment of things like good writing or effective signaling. I haven't read all of LW, though, so it might just be a simple task of collecting articles; but I don't feel that's the case. Or is it?

Comment author: jimrandomh 29 October 2010 01:06:52AM 1 point [-]

What about wireheading that's one step further removed from the brain, i.e., which just attaches to the sensory and motor nerves and provides a simulated universe? That doesn't seem to require modifying the utility function, but I include it under the term wireheading if it's irreversible and the simulated universe doesn't interact with the real one.

Comment author: h-H 29 October 2010 01:30:56AM *  0 points [-]

I believe it does require modifying the utility function, given technological constraints. Consider, for example, if the simulated person's physical body were threatened and they were not able to respond appropriately. This is one of the main reasons I included suicide next to lobotomy; I wasn't really clear on that, but you make a much more interesting point.

Now that I think about it (for a few minutes), I generally agree with you: attaching to the sensory and motor nerves for a simulated universe is in fact a form of wireheading. I wonder if my definition of full wireheading needs to be changed, though? I don't specify any time constraints, because the utility function can in fact change 'for the better' under the right circumstances, not to mention the general case of human terminal values having natural or provoked drift, as Rain eloquently put it.

Actually, I'm interested in why you include jacking into a simulation as wireheading, for reasons other than not interacting with the real world. Does it apply if we include a defense mechanism, yet the person remains engrossed in the simulation?

Comment author: MartinB 29 October 2010 12:44:57AM 1 point [-]

I am somewhat afraid of the fact that convincing can be taught separately from reasoned arguing; that not the best reason wins, but the most enthusiastic speaker, the one who can best make his point in the eyes of the people. I am surprised at the spread of public debates, and at how many people change opinions during a debate. I still want to learn it, but I do not wish to persuade without reason.

Notice the effect in charismatic leaders: how they become unable to get good criticism of their ideas.

Comment author: h-H 29 October 2010 01:11:42AM 1 point [-]

Precisely the reasons why we want better dark-arts skills, if only for the sake of countering them. I'm half tempted to start a thread on this, but I can't write as clearly as most here.

Comment author: Eugine_Nier 28 October 2010 11:53:30PM 2 points [-]

There's also the TDT idea that people who did evil things should be punished.

Comment author: h-H 29 October 2010 12:18:52AM *  1 point [-]

Yeah, punishing agents for doing 'bad things' as a deterrent against other agents acting similarly is quite rational.

Comment author: MartinB 28 October 2010 11:35:18PM 0 points [-]

I never founded a group myself, but maybe I can give a few pointers. First, make sure you actually want to do that: going to other groups as a guest is free and easy, and you can check whether the structure of the organization is something you like. Then you need a few like-minded people; there is a minimum number for new groups. Keep in mind that it is a training group for speaking in general: one does not practice convincing on a specific topic, but the speaking itself, which would classify it as a dark art, and may not even be what you're looking for. For the actual founding, the easiest way is to contact another local group and ask for assistance. There is some paperwork, and then holding the regular meetings. Having a few experienced members is a really good way to get a new group going.

Comment author: h-H 29 October 2010 12:10:52AM *  1 point [-]

Rationality might at some point have evolved from the dark arts themselves; i.e., the human propensity to make up reasons on the fly might have led to there being better minds at making up reasons and arguments. I read that in an article somewhere, but can't remember where exactly.

The dark arts get too much mud slung at them and IMO warrant further study; careful dissection by wise men wearing all the necessary charms and offering appropriate sacrifices should be sufficient. :)

Complete Wire Heading as Suicide and other things

0 h-H 28 October 2010 11:57PM

I came to this idea after a previous LessWrong topic discussing nihilism, and its several comments on depression and suicide. My argument is that wireheading in its extreme or complete/full form can easily be modeled as suicide, or less strongly as volitional intelligence reduction, at least given current human brain structure and the technology being underdeveloped, hence poorly understood and more likely to lead to such end states.

I define Full Wireheading as that which a person would not want to reverse after it 'activates', and which deletes their previous utility function, or most of it. A weak definition, yes, but it should be enough for the preliminary purposes of this post. A full wirehead is extremely constrained, much like an infant, e.g.; and although the new utility function could involve a wide range of actions, the activation of a few brain regions would be the main goal, so they are extremely limited.

If one takes this position seriously, it follows that only one's moral standpoint on suicide, or say lobotomy, should govern judgments about full wireheading. This is trivially obvious, of course, but to take this position as true we need to understand more about wireheading, as data is extremely lacking, especially with regard to human-like brains. My other question, then, is: to what extent could such an experiment help in answering the first question?

In response to comment by h-H on Slava!
Comment author: Will_Newsome 16 October 2010 11:56:56AM *  -1 points [-]

As a rationalist (not a technophile/libertarian etc., but one who seeks to be more rational), do you seriously believe in what Buddhism preaches? All of it?

The fact that you ask this question is strong evidence you are being careless. You assume stupidity and are self-satisfied. You will never be a strong rationalist this way. You need to cultivate a sense that much more is possible.

If you're going to cherry-pick, then why call it Buddhism and praise it so?

Did not praise. I know that you know that your assumptions are mostly rhetorical. Still dangerous. Carelessness. Not moving in harmony with the Bayes. Begging for confirmation, is this disposition of assumption. You will be pulled off course by this. These simple skills of rationality must be perfected if one is to build very strong rationality, with very complex skills. Necessary if you are to use all of your cognitive aspects and limitations to achieve all that is possible. Only possible to use limitation of affective thoughts for good after one is very consistently strong rationalist. Must be able to hold a very steady course in mindspace, in conceptspace, in identityspace, before one can try to use powerful attractors like affect to accelerate along that course. Less Wrong folk cannot do this consistently. Almost no one can. Enlightened people, mostly; maybe others from other disciplines that I know not. I cannot yet do so. Perhaps not far, though.

I would greatly appreciate a simple and rational explanation of why I (or Eliezer, or anyone else) should take "good" Buddhism seriously in our pursuit of rationality.

Less Wrong is not really worth my time, except as providing a motivation to write. The epistemological gap between Less Wrong and me is growing too wide. Eliezer I may talk to next time he's around, I guess. The epistemological gap between Eliezer and me is growing narrower. Still many levels above me is Eliezer, but I think only 2.2 levels or so. Easily surmountable with recursive self-improvement.

My problem is mainly the attachment of Buddhist teaching to 'meditation' (which seems to be universal, and not only a Buddhist practice); of some value, but not more than, say, studying human bias or generally reading the average LW top post.

We do not live in Gautama's time. Almost all of Theravada is true, but most is not relevant for rationalists of my caliber.

  • Virtues that he preached, we mostly have now. Smart people are cultured enough to have these virtues and understand their motivations. Evolutionary psychology and cultivated compassion. So virtue part of Buddhism, not as important, I think.
  • Community part of Buddhism, sangha, very important; but having a peer group of strong rationalists intent on leveling up, is its own sangha, and one better than any that could have arisen almost anywhere in the past.
  • Mindfulness, insight, concentration, self-control; these are the third branch of Buddhism, and the part Less Wrong needs. Thus I focus on that part. I know about human bias. I have read nearly every Less Wrong post. But understanding the algorithms behind human bias from the inside, feeling the qualia of cognitive subsystems, building those qualia, being mindful of attractors in mindspace; these are important skills for a rationalist. Knowledge of a bias is a knowledge. Important one, but so very limited. Having the disposition of feeling biases as pulls on cognition, at all levels of complexity of cognitive algorithms: this is much stronger skill. Necessary to become superintelligence. Which is everyone's desire, no? It should be, I think. Would be their desire if they knew more, thought faster, were wiser and more compassionate. Meditation, not only way of building this skill. Just oldest and most studied one. Many have tread this path and passed on knowledge. The skills of good computational cognitive scientist, also strong. But no one writes about these skills. I think becoming superintelligent probably not possible without them. But meditation builds similar skills, and is close. Both are metaskills. Epistemological bootstrapping mechanisms. There are meta-metaskills for this. Well, there aren't yet, but I am building one, and others are building some. We sense that more is possible. Buddhism is silly. Meditation, less silly, but not important of itself. Just one metaskill. Soon more will be possible. Not sure if it will trickle down to Less Wrong. Probably not. The gap is too wide.

In response to comment by Will_Newsome on Slava!
Comment author: h-H 28 October 2010 11:20:38PM *  0 points [-]

OK, I was careless; I apologize. Still, the argument remains without a satisfactory answer.

My (and others') main argument against meditation as a rationality-increasing tool is that the less-than-perfect brains we have are not sufficient at dealing with biases and so forth. I can see that you've said pretty much the same, or close to it, in your reply above, so that's that.

P.S. Disjointed sentences?
