All of learnmethis's Comments + Replies

Keep in mind that people who apply serious life-changing ideas after reading about them in fiction are the exception rather than the norm. Most people who aren't exceptionally intellect-oriented need to personally encounter someone who "has something" that they themselves wish they had, and then have some reason to think that they can imitate them in that respect. Fiction just isn't it, except possibly in some indirect ways. Rationalist communities competing in the "real-world" arena of people living lives that other people want to and can emulate are a radically more effective angle for people who don't identify strongly with their intellectual characteristics.

It seems at best fairly confused to say that an L-zombie is wrong because of something it would do if it were run, simply because we evaluated what it would say or do against the situation in which it wasn't run. Where you keep saying "is", "concludes", and "being" you should be saying "would", "would conclude", and "would be", all of which are glosses for "would X if it were run", and in the (counter-factual) world where the L-zombie "would" do those things it "would be runnin... (read more)

Once you know about affective death spirals, you can use them in tricky ways. Consider, for example, that you got into an affective death spiral about capital "R" Rationality which caused you to start entertaining delusions (like that teaching Rationality to your evil stepmother would finally make her love you, or whatever). If you know that this is an affective death spiral, you can do an "affective death spiral transfer" that helps you avoid the negative outcome without needing to go to war with your own positive feelings: in ... (read more)

Ah well, I had to ask. I know religion is usually the "other team" for us, so I hope I didn't push any buttons by asking--definitely not my intention.

This article is awesome! I've been doing this kind of stuff for years with regards to motivation, attitudes, and even religious belief. I've used the terminology of "virtualisation" to talk about my thought-processes/thought-rituals in carefully defined compartments that give me access to emotions, attitudes, skills, etc. that I would otherwise find difficult. I even have a mental framework I call "metaphor ascendance" to convert false beliefs into virtualised compartments so that they can be carefully dismantled without loss of existing ... (read more)

So8res
To address your other question: I was raised religious, and I learned about compartmentalization by self-observation (my religion was compartmentalized for a couple years before I noticed what I was doing). That said, since becoming an atheist I have never held a compartmentalized religious belief for motivational purposes or otherwise.
So8res
To address your postscript: "Dark Arts" was not supposed to mean "bad" or "irrational", it was supposed to mean "counter-intuitive, surface-level irrational, perhaps costly, but worth the price". Strategically manipulating terminal goals and intentionally cultivating false beliefs (with cognitive dissonance as the price) seem to fall pretty squarely in this category. I'm honestly not sure what else people were expecting. Perhaps you could give me an idea of things that squarely qualify as "dark arts" under your definition? (At a guess, I suppose heavily leveraging taboo tradeoffs and consequentialism may seem "darker" to the layman.)

You seem to be making a mistake in treating bridge rules/hypotheses as necessary--perhaps to set up a later article?

I, like Cai, tend to frame my hypotheses in terms of a world-out-there model combined with bridging rules to my actual sense experience; but this is merely an optimisation strategy to take advantage of all my brain's dedicated hardware for modelling specific world components, preprocessing of senses, etc. The bridging rules certainly aren't logically required. In practice there is an infinite family of equivalent models over my mental expe... (read more)

I’ve got kind of a fun rationalist origin story because I was raised in a hyper-religious setting and pretty much invented rationalism for use in proselytisation. This placed me on a path of great transformation in my own personal beliefs, but one that has never been marked by a “loss of faith” scenario, which in my experience seems atypical. I’m happy to type it up if anyone’s interested, but so far the lack of action on comments I make to old posts has me thinking that could be a spectacularly wasted effort. Vote, comment, or pm to show interest.

Causal knowledge is required to ensure success, but not to stumble across it. Over time, noticing (or stumbling across if you prefer) relationships between the successes stumbled upon can quickly coalesce into a model of how to intervene. Isn't this essentially how we believe causal reasoning originated? In a sense, all DNA is information about how to intervene that, once stumbled across, persisted due to its efficacy.

All these conclusions seem to require simultaneity of causation. If earthquakes almost always caused recessions, but not until one year after the earthquake; and if recessions drastically increase the number of burglars, but not until one year after the recession; then drawing any of the conclusions you made from a survey taken at a single point in time would be entirely unwarranted. Doesn't that mean you’re essentially measuring entailment rather than causation via a series of physical events which take time to occur?
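A toy simulation (my own sketch, with made-up rates, not taken from the post under discussion) can make the lag problem concrete: if an earthquake causes a recession only one year later, a survey of many regions at a single moment will show no same-year association at all, while the lagged association is perfect.

```python
import random

random.seed(0)
N = 100_000  # hypothetical regions, all surveyed at the same single moment

same_year = 0         # quake this year AND recession this year
lagged = 0            # quake last year followed by recession this year
quakes_last_year = 0

for _ in range(N):
    quake_prev = random.random() < 0.1   # earthquake last year (10% rate, assumed)
    quake_now = random.random() < 0.1    # earthquake this year, independent
    recession_now = quake_prev           # recession arrives one year after a quake

    if quake_now and recession_now:
        same_year += 1
    if quake_prev:
        quakes_last_year += 1
        if recession_now:
            lagged += 1

# Same-year co-occurrence is just chance: ~0.1 * 0.1 = 0.01
print(same_year / N)
# Lagged relationship is deterministic by construction: 1.0
print(lagged / quakes_last_year)
```

So a single-point-in-time survey would "see" no earthquake-recession link at all, despite causation being perfect, which is the worry raised above.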

Also, the virtue theory of metabolis... (read more)

TheOtherDave
Some of the inferred subtext is being extracted from earlier posts that refer to diet while ostensibly discussing other issues.

Great post! If this is the beginning of a trend to make Less Wrong posts more accessible to a general audience, then I'm definitely a fan. There are a lot of people I'd love to share posts with who give up when they see a wall of text.

There are two key things here I think can be improved. I think they were probably skipped over for mostly narrative purposes and can be fixed with brief mentions or slight rephrasings:

You won't get a direct collision between belief and reality - or between someone else's beliefs and reality - by sitting in your living-room w

... (read more)

Good quote, but what about the reality that I believe something? ;) The fact that beliefs themselves are real things complicates this slightly.

Normal_Anomaly
It's possible to stop believing that you believe something while continuing to believe it. It's rare, and you won't notice you did so, but it can happen.

Also known as the "people can't remember things without distinctive features" phenomenon. Still interesting to note their behaviours in the situation though.

I understand the point Eliezer's trying to make here. However, you (whoever's reading this) could not convince me that ss0 + ss0 = sss0 in Peano arithmetic (I define the scenario in which my mind is directly manipulated so that I happen to believe this not to constitute "convincing me"). Here's why I believe this position to be rational:

A) In order for me to make this argument, I have to presume communication of it. It's not that I believe the probability of that communication to be 1. Certainly many people might read this comment and not know ... (read more)

somejan
Extrapolating from Eliezer's line of reasoning, you would probably find that although you remember ss0 + ss0 = ssss0, if you try to derive ss0 + ss0 from the Peano axioms, you also discover it ends up as sss0, and starting with ss0 + ss0 = ssss0 quickly leads you to a contradiction.
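For the curious, the successor-notation arithmetic in this thread can be sketched directly (a minimal illustration of my own, not part of the original exchange): represent numerals as strings of "s" applied to "0", and define addition by the usual Peano recursion.

```python
# Peano numerals as strings: "0", "s0", "ss0", ...

def s(n):
    """Successor: apply one more 's' to a numeral term."""
    return "s" + n

def add(a, b):
    """Peano addition by recursion on the second argument:
    a + 0 = a;  a + s(b) = s(a + b)."""
    if b == "0":
        return a
    return s(add(a, b[1:]))  # peel one 's' off b, wrap the result in s(...)

two = s(s("0"))        # ss0
print(add(two, two))   # ssss0
```

Under this definition ss0 + ss0 mechanically derives to ssss0, never sss0, which is why (as the reply notes) assuming sss0 would lead straight to contradiction.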