I am worried that I have it too easy. I recently discovered LessWrong for myself, and it feels very exciting and very important and I am learning a lot, but how do I know that I am really on the right track? I have some achievements to show, but there are some worrisome signs too.

I need some background to explain what I mean. I was raised in an atheist/agnostic family, but sometime in my early teens I gave a mysterious answer to a mysterious question, and... And long story short, influenced by everything I was reading at the time, I became a theist. I wasn't religious in the sense that I never followed any established religion, but I had my own "theological model" (heavily influenced by theosophy and other western interpretations of eastern religions). I believed in god, and it was a very important part of my life (around the end of high school and the beginning of college I started talking about it with my friends, and I was quite open and proud about it).

Snip 15-20 years. This summer I started lurking on LessWrong, reading mostly Eliezer's sequences. One morning, walking to the train station, thinking about something I had read, my thoughts wandered to how all this affects my faith. And I noticed myself flinching away, and thought: isn't this what Eliezer calls "flinching away"? I didn't resolve my doubts there and then, but there was no turning back, and a couple of days later I was an atheist. This is my first "achievement". The second is: when I got to the "free will" sequence, I stopped before reading any spoilers, gave myself a weekend, and I figured it out! (Not perfectly, but at least one part I figured out very clearly, and I got important insights into the other part.) I would never have thought I would be able to do this, because, as it happens, this was the original mysterious question on which I got so confused as a teenager. (Another, smaller "achievement" I documented here.)

Maybe these are not too impressive, but they are not completely trivial either (actually, I am a bit proud of myself :)). But I get a distinct feeling that something is off. Take the atheism: I think one of the reasons I so easily let go of my precious belief was that I had something to replace it with. And this is the very, very scary part: I sometimes get the same feeling of amazing discovery reading Eliezer as when I was 13, and my mind just accepts it all unconditionally! I have to constantly remind myself that this is not what I should be doing with it!

Do not misunderstand, I am not afraid of becoming part of some cult. (I have had some experience with more or less strongly cultish groups, and I didn't have a hard time seeing through them and not falling for them. So, I am not afraid. Maybe foolishly.) What I am afraid of is that I will make the same mistake on a different level: I won't actually change my mind, won't learn what really matters. Because even if everything I read here turns out to be 100% accurate, it would be a mistake to "believe in it": as soon as I get to a real-world problem, I will just go astray again.

This comment is the closest I have seen here on LessWrong to my concerns. It also sheds some light on why this is happening. Eliezer describes the experience vividly enough that afterwards my mind behaves as if I had had the experience too. Which is, of course, the whole point, but also one source of this problem. Because I didn't have the experience, it wasn't me who thought it through, so I don't have it in my bones. I will need much more to make the technique or conclusion a part of myself (and a lot of critical thinking, or else I am worse off, not better). And no, Eliezer, I don't know how to make it less dark either. Other than what is already quite clear: we have to be tested on our rationality. The skills have to be tested, or one won't be able to use them properly. The "free will" challenge is very good, but only if one takes it. (I took it, because it was a crucial question for me.) But not everything can be tested like this, and it's not enough.

 

So, my question to more experienced LessWrongers: how did you cope with this (if you had such worries)? Or am I even right about this (am I "worrying" in the right direction)?

 

(Oh, and also, is this content appropriate for a "Main" post? Now that I have enough precious karma. :))


Have you read No Safe Defense, Not Even Science?

I'm afraid your concerns have already been covered. That doesn't mean they're not legitimate. Quite the opposite, in fact. The answer to your initial question, "Do we have it too easy?" is yes. But the answer to your closing question, "is this content appropriate for a 'Main' post?" is no.

It's not so simple as "take everything with a grain of salt." Even though that in itself is monumentally difficult, it's not enough. The universe is not calibrated to our skill set. Rationality is impossible.

(I bet you figured out what comes next.)

Now shut up and do the impossible.

(If it matters that much to you.)

As well as reading No Safe Defense, acknowledge that striving to live by your own strength is a good goal. However, by my own strength I would be at approximately the level of the proto-hominids at the beginning of 2001: A Space Odyssey; so I'll cheat where I can until I have more time to devote to recreating rationality, the iPhone, and Belgian waffles from scratch.

Yes, I saw it coming. :) Thanks! It does matter to me.


"Reading about rationality is a very effective way of training the verbal part of one's brain to be rational. On the other hand, the influence on other parts of the brain may be less impressive. The translation of a rationality concept into words may also be imperfect, rendering it less helpful than expected when applied to novel circumstances."

That's how I understand your post. Reading about rationality doesn't sculpt your brain in the same way as does learning over many years to overcome problems through the virtues of precise thinking. I agree - and the only solution is to read widely, use your brain all the time, and try to become more perspicacious over time!

In the meantime, use Yudkowsky's insights and teachings to the extent that you feel you can trust them at this point. The same goes for any other sage.

Yes, I think this is a pretty good reading of my post. And it makes the issue seem less pressing and more manageable.

To avoid cult mode, try to avoid the local jargon. That will help you keep some distance by not turning on your slogan-loyalty loop. I've avoided this because I remember when this topic space was young: none of these sites existed, and this sort of thing was still the stuff of excited conversation among college students. It's nice to see it all laid out in various places, but it will never appear to me as the work of monumental genius it does to some people.

To avoid cult mode, try to avoid the local jargon.

Are there any particularly nasty bits you can think of?

One of the things Yudkowsky has done very well is coin memorable phrases for questions about cognitive bias or poor reasoning. Terms like "true rejection" are great mnemonics. But there is also a danger--which Yudkowsky himself would recognize--of forming a group identity around this patois.

That's the jargon I'm talking about. You should think twice before adopting these terms when writing or speaking.

This sounds like a very good piece of advice. A slight problem is that some of the jargon is very useful for expressing things that would otherwise be hard to express. But I'll try to be conscious of it.

A litany I repeat to myself when learning new topics is: "One must be able to teach the class to learn the lesson."

By being able to explain the science of heuristics and biases in your own terms, citing LessWrong only when absolutely necessary, you internalize concepts and relationships that make the research more tangible for you.

Yes, this should work. With (hard) sciency stuff I actually do this. For example, after finishing the Quantum Physics sequence (and some reading of my own afterwards), I gave a series of lectures about "the Intuitive Quantum World" here in the office.

I need to find an audience interested in the more general topics that I learn about here on LessWrong. And of course, I would need to read a lot to gain a really deep understanding. But yes, this is a very good answer to my question!

One morning, walking to the train station, thinking about something I had read, my thoughts wandered to how all this affects my faith. And I noticed myself flinching away, and thought: isn't this what Eliezer calls "flinching away"? I didn't resolve my doubts there and then, but there was no turning back, and a couple of days later I was an atheist.

I recall a Gom Jabbar spell cast on a hapless teacher in a similar circumstance.

Jokes aside, some of what EY preaches here IS WRONG, since there is absolutely no way he is right about everything. If someone tells you otherwise, they are treating EY as a cult leader, not a teacher. So, ask yourself: what if the idea you just thought over and internalized is wrong? Because, chances are, at least one of them is. If there is a topic in the sequences you consider yourself an expert in, start there. It might be his approach to free will, or to quantum mechanics, or to the fun theory, or to dark arts, or...

Until you have proven EY wrong at least once on this forum, you are not ready for rationality.

(Hope this is not too dark for you.)

Jokes aside, some of what EY preaches here IS WRONG, since there is absolutely no way he is right about everything. If someone tells you otherwise, they are treating EY as a cult leader, not a teacher.

I have a not at all short list of things I think Eliezer is wrong on, but this seems incorrect. I agree that there's absolutely no way that Eliezer is right about everything. But everything in the Sequences is a (small) proper subset of everything Eliezer believes. So the notion that everything he has said here is correct isn't as unreasonable. (That said, there are quite a few issues with things Eliezer has said here, including things in the Sequences.)

Until you have proven EY wrong at least once on this forum, you are not ready for rationality.

This sounds disturbingly like the apprentice beating the master and then leaving. This sort of notion always annoys me. The first time the apprentice can beat the master is not the point at which the apprentice has nothing left to learn from the master. It simply indicates that marginal returns are likely to start diminishing. For similar reasons, we don't give a PhD in math to someone as soon as they can prove something new that their adviser tried and failed to prove. They need to do a lot more than that.

But everything in the Sequences is a (small) proper subset of everything Eliezer believes. So the notion that everything he has said here is correct isn't as unreasonable.

The Sequences have been estimated at about 1 million words. I daresay the notion that everything is "correct" there is... unrealistic.

I can even corroborate that notion by pointing out that Eliezer is genetically human; and no human being is immune to the various cognitive biases and other failure modes of rationality; ergo even the best of us will, on occasion, be incorrect on a topic we have established expertise in. Even if we assume it happens less frequently in Eliezer than in any other expert in any other topic, I find the notion that there are no errors in a body of work more than twice the length of The Lord of the Rings is one that I assign a vanishingly small probability of being accurate to.
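
A rough back-of-the-envelope sketch makes the point concrete (the post count and the per-post error rate below are purely hypothetical, illustrative numbers):

    # Probability of at least one error across many posts, assuming
    # (hypothetically) independent errors with a fixed per-post rate.
    n_posts = 1000      # assumed rough size of the Sequences, in posts
    p_error = 0.01      # hypothetical chance that any single post contains an error

    p_at_least_one = 1 - (1 - p_error) ** n_posts
    print(f"P(at least one error) = {p_at_least_one:.5f}")  # ~0.99996

Even with a per-post error rate as low as 1%, the chance of at least one error across roughly a thousand posts is effectively certain.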

I find the notion that there are no errors in a body of work more than twice the length of The Lord of the Rings is one that I assign a vanishingly small probability of being accurate to.

Especially since the Sequences were written as an exercise in "just getting a post out of the door" instead of spending a long time thinking about and revising each post.

I find the notion that there are no errors in a body of work more than twice the length of The Lord of the Rings is one that I assign a vanishingly small probability of being accurate to.

This is a really evocative phrasing that helps the point a lot. I'm updating my position accordingly. There's an extremely high probability that something of the length of the Sequences has at least a few things wrong with it. That this probability is less than the probability that there's a mistake in at least one of someone's beliefs shouldn't be that relevant, because the underlying probability is still extremely high.


I have a not at all short list of things I think Eliezer is wrong on, but this seems incorrect.

Mind posting it?

Summary of the major stuff I think Eliezer is wrong on:

Everything Eliezer wrote about blue tentacles is wrong.

Eliezer's use of phlogiston as an example of a bad hypothesis shows lack of historical knowledge about what was actually believed. Phlogiston was rejected by and large because it had been falsified. So claiming that it was unfalsifiable is incorrect. It is true that some people (especially Joseph Priestly) tried to add additional ad hoc hypotheses to prevent its falsification but they were a tiny minority.

Eliezer drastically overestimates the difference between "traditional rationalism" and "extreme rationalism".

Eliezer underestimates how many physicists take MWI seriously, yet at the same time ignores that many people who have thought about the same issues as he has and know a lot more about it than he does have not accepted it.

Eliezer's negative opinion about academia is by and large inaccurate and unjustified, and to some extent seems to stem from stereotypes of it that aren't really accurate and from his own lack of experience with it.

Eliezer has massive nostalgia for the science and the attitudes about science from the 1960s or so that are deeply unjustified.

Eliezer massively overestimates the chance of an intelligence explosion occurring, primarily because he doesn't take into account how difficult software optimization is and how much theoretical compsci puts limits on it, and he underestimates how much technical difficulty is involved in serious nanotech.


Eliezer has massive nostalgia for the science and the attitudes about science from the 1960s or so that are deeply unjustified.

Can you expand on this? Which posts are you referring to?

Not posts but comments he has made. All such comments are actually pretty recent, so they may be functions of recent viewpoints. There seem to be hints of this in some older remarks, but 1 and 2 are recent extreme examples. Curiously, judging from the karma, a lot of the community disagreed with Eliezer on the first claim but a lot agreed with him on the second.

Ok, I give. Where does Eliezer talk about blue tentacles?


In Some Claims Are Just Too Extraordinary. I'm not sure if he still believes this, though, since it seems to contradict the spirit of How To Convince Me That 2 + 2 = 3.

Note that the original whole "blue tentacle" thing is from A Technical Explanation of Technical Explanation.


Thanks, I'd forgotten about that.

I agree that there's absolutely no way that Eliezer is right about everything. But everything in the Sequences is a (small) proper subset of everything Eliezer believes. So the notion that everything he has said here is correct isn't as unreasonable.

There's nothing wrong with this discussion as such, but it is of no practical relevance. Regardless of whether or not errors were written into the sequences, errors are read out of them. I've been surprised and frustrated by people's stupidity, such as the mass misunderstanding of the Allais paradox or dust specks, but I probably am misinterpreting something that would be obvious to someone smarter. This might even be on an issue where I am right and only wrongly think I disagree with others.

So the notion that everything he has said here is correct isn't as unreasonable.

At least once Eliezer has posted a factual error and been corrected and said "oops".

Sure. My argument wasn't that there aren't things wrong with the Sequences, but that it wasn't completely unreasonable to think that there would be no mistakes in a work of that length. I already stated in my post that I thought there were problems with what Eliezer has said. But in any event, see my reply to Logos, where he convinced me that it is still extremely unlikely for something of this length to not have mistakes.

I have a not at all short list of things I think Eliezer is wrong on, but this seems incorrect. I agree that there's absolutely no way that Eliezer is right about everything. But everything in the Sequences is a (small) proper subset of everything Eliezer believes. So the notion that everything he has said here is correct isn't as unreasonable. (That said, there are quite a few issues with things Eliezer has said here, including things in the Sequences.)

I think it really is rather unreasonable. Take a human, no matter how smart or rational, have them write one blog post per day for several years, much of it on debated topics, and I would be shocked if nothing they said turned out to be false. Even given how small a subset of EY's beliefs in general the Sequences represent, it's still rather unlikely. Every one of his mistakes would have had to be about something other than what he posted about, which is a bit much for my inner skeptic.

Even given how small a subset of EY's beliefs in general the Sequences represent, it's still rather unlikely.

It is more plausible that there are no errors in a smaller sample of your beliefs than in a larger one, but the probability of there being at least one error increases with the number of beliefs being sampled.

I don't know that you and JoshuaZ are really disagreeing with one another so much as you are taking alternate perspectives on the same set of data.

Until you have proven EY wrong at least once on this forum, you are not ready for rationality.

I'm having trouble parsing the phrase "ready for rationality."

Cheesy pathos, I agree. (And an obscure reference to Babylon 5.)

(Note that whole math textbooks can be essentially correct. Minor errors can usually be corrected without affecting anything else.)

While this is true, most math [1] textbooks generally don't provide verbose treatments of controversial, unresolved, possibly untestable meta-problems [2] (where the validity of the conclusions crucially depends on previous controversial, unresolved, possibly untestable meta-problems).

[1] String theory textbooks provide a possible anti-example.
[2] Metaphysics, metacognition, metaprogramming.

[1] String theory textbooks provide a possible anti-example.

I can assure you that the maths in a string theory textbook will still be essentially correct.

No, it's not too dark; it is useful to see an even stronger expression of caution. But it misses the point a bit. It's not very helpful to know that Eliezer is probably wrong about some things. Neither is finding a mistake here or there. It just doesn't help.

You see, my goal is to accept and learn fully that which is accurate, and reject (and maybe fix and improve) that which is wrong. Neither one is enough by itself.

How about accepting that some things are neither, but you still have to make a choice? (E.g. inevitability of (u)FAI is untestable, and relies on a number of disputed assumptions and extrapolations. Same with the viability of cryonics.) How do you construct your priors to make a decision you can live with, and how do you deal with the situation where, despite your best priors, you end up being proven wrong?

Now, this is a much better question! And yes, I am thinking a lot about these. But in some sense this kind of thing bothers me much less: because it is so clear that the issue is unclear, my mind doesn't try to unconditionally commit it to the belief pool just because I read something exciting about it. And then I know I have to think about it, look for independent sources, etc. (For these two specific problems, I am in different states of confusion. Cryonics: quite confused; AGI: a bit better, at least I know what my next steps are.)

How do you deal with this?

Out of curiosity, what do you disagree with him on?

I commented on MWI once or twice or a dozen times here, a subject dear to my heart, with little interest from the regulars. There are some other topics I mentioned in passing, but not worth getting into here.

Jokes aside, some of what EY preaches here IS WRONG, since there is absolutely no way he is right about everything. If someone tells you otherwise, they are treating EY as a cult leader, not a teacher.

This seems to be a straw-man. Has anyone ever asserted the infallibility of everything Eliezer has posted? Not even the Pope has that going (he only has an infallible hat he can put on), and it seems to be contradicted many times in the posts themselves with notes of edits made. Everything Eliezer has posted being right is substantially less probable than the least likely thing he has posted.

But everything Eliezer has posted doesn't have to be right for there to be much of value, or to rely with some confidence on a particular assertion being right (particularly after you have read the arguments behind it) - and some of what is written here is fairly uncontroversial.

This seems to be a straw-man. Has anyone ever asserted the infallibility of everything Eliezer has posted?

A straw man is a component of an argument and is an informal fallacy based on misrepresentation of an opponent's position, twisting his words, or relying on [false] assumptions. Please point out where I have misrepresented Klao's position. If anything, you are misrepresenting mine, as I never made any of the claims you refuted.

First of all, congratulations on your deconversion :)

Other commenters have addressed your concerns directly, so I'll just suggest that you pay attention to the psychological needs that your version of theosophy satisfied.

Thanks!

I think, first and foremost, these psychological needs were "to understand how things are". And that's, in short, why I am here now. :)

Ease is not necessarily a bad thing.

One of the best parts of modern society is the efficiencies we realize from specialization. You don't need a PhD in molecular chemistry to work at a pharmacy. You don't need to be able to write machine code to create an iPhone app. A few extremely educated and extremely intelligent people can make the advances and then teach the process needed to recreate what they've discovered to a much larger class of technicians, who don't need to invest so much of their lives and energy and can begin to produce good things on a large scale.

This is a good thing: it would be very wasteful for everyone to train up to SEAL Team standards to provide national defense. We can let a small group of specialists work on that so the rest of the population can do other things. Sure, everyone should know something about self-defense: best tactics for common situations, how to call for help, maybe even a few dozen hours of physical combat training if they think it'd be useful. Likewise, it's good for everyone to have some exposure to the methods of rationality and to integrate them into their lives. Certainly we need more than we have right now; there's far too much low-lying insanity that could quickly be washed away.

But not everyone needs to be a Green Beret-level rationalist of Yudkowsky's strength. Sometimes it's OK to simply accept what the specialists are doing. You don't need to reinvent the wheel to buy a car, and you don't need to re-derive all of physics from first principles to use the GPS network.

Obviously there are failure modes with this method. If the specialist caste becomes too small, it becomes too easy to lose them all in a single accident and collapse the system, or to corrupt them. You can't give up completely and just follow blindly; you still need to apply your rationality and accept that the leadership is provisional.

But once you have enough strength as a rationalist to make such allocation decisions; and you have a large network of other rationalists who are around to check your work, provide criticism, and point out if you're trusting the wrong people; you can put a fair amount of trust in the most capable and talented to lead the way and learn what they say without having to do it all independently as well. Even if everyone had the ability to do that (and many don't, I know I sure can't) the sheer amount of time and effort required would make it wasteful for everyone to do that every time.

So I've coped with your worries by accepting I can't do everything, and realizing that Eliezer is enough like me that I trust that almost any major action he takes is one that will also further my interests even though it's never his direct goal to further my interests. I really couldn't ask for a better representative, and so I don't worry anymore that I have to test and accept every single thing myself. Sometimes it's good to delegate. Sometimes it's not bad if something is made easier.