Comment author: Jayson_Virissimo 14 November 2011 09:03:17AM *  5 points [-]

I once considered doing a mini-sequence on marksmanship along similar lines to your drawing sequence, but decided it was too tangential to the Less Wrong project to be of much value. Was I wrong?

Comment author: Klao 14 November 2011 02:53:27PM 2 points [-]

I would be interested in a sequence like that. Of course, if it only touches rationality tangentially then maybe LessWrong is not the best place for it. But again, I personally would be very interested in it.

Comment author: Karmakaiser 08 November 2011 08:18:02PM 3 points [-]

A litany I repeat to myself when learning new topics is: "One must be able to teach the class to learn the lesson."

By explaining the science of heuristics and biases in your own terms, citing LessWrong only when absolutely necessary, you internalize concepts and relationships that make the research more tangible for you.

Comment author: Klao 09 November 2011 11:17:19AM 1 point [-]

Yes, this should work. With (hard) science-y stuff I actually do this. For example, after finishing the Quantum Physics sequence (and some reading of my own afterwards) I gave a series of lectures about "the Intuitive Quantum World" here in the office.

I need to find an audience who would be interested in the more general topics that I learn here on LessWrong. And of course, I would need to read a lot to gain a really deep understanding. But yes, this is a very good answer to my question!

Comment author: Klao 08 November 2011 01:08:50PM 0 points [-]

Sorry to ask, but is this still on?

Wakefield, could you post some details here if it is? (I've sent you a private message, but maybe you didn't notice it.)

Comment author: lukeprog 03 November 2011 07:40:25AM -1 points [-]

I'm curious about why this was downvoted. Is there something in this comment that is incorrect? Do people disagree with my statement that 'explanation' has two meanings, or something?

Comment author: Klao 04 November 2011 11:21:53AM *  4 points [-]

I didn't downvote that comment, but might have, if I had followed the conversation live. My thinking when I read it was: "He can't possibly really think that it is a homonym! So, for the sake of the argument he arrogantly (because that all-caps spelling does show off some arrogance) distorts reality and expects us to accept it?!"

But, now I see that this is too much of a correspondence bias. You probably just wanted to show that "explanation" has two different meanings, but in the heat of the discussion just found a very bad example for your argument. Because "explanation" does have two slightly different meanings, and this is relevant here. But let's be clear: these two meanings are close and in no way homonyms (as opposed to what you stated and what you clearly tried to present with the "fluke" example).

So, I think this comment of yours is bad and the downvotes were valid.

Edit: I didn't read the Wikipedia article you linked when I wrote the above. I have only ever heard/seen "homonym" used in the sense of "two identically spelled and pronounced words with different meanings and of unrelated origin", what Wikipedia calls a "true homonym". In the more general sense the two "explanations" might qualify as homonyms (I am definitely not a linguist). But your "fluke" example strongly indicated the narrower (and, I think, more common) meaning. So, my reasons still stand.

Comment author: Nisan 03 November 2011 06:12:48AM 1 point [-]

First of all, congratulations on your deconversion :)

Other commenters have addressed your concerns directly, so I'll just suggest that you pay attention to the psychological needs that your version of theosophy satisfied.

Comment author: Klao 03 November 2011 12:32:19PM 0 points [-]

Thanks!

I think, first and foremost these psychological needs were "to understand how things are". And that's in short why I am here now. :)

Comment author: [deleted] 02 November 2011 07:09:46PM *  5 points [-]

"Reading about rationality is a very effective way of training the verbal part of one's brain to be rational. On the other hand, the influence on other parts of the brain may be less impressive. The translation of a rationality concept into words may also be imperfect, rendering it less helpful than expected when applied to novel circumstances."

That's how I understand your post. Reading about rationality doesn't sculpt your brain in the same way as does learning over many years to overcome problems through the virtues of precise thinking. I agree - and the only solution is to read widely, use your brain all the time, and try to become more perspicacious over time!

In the meantime, use Yudkowsky's insights and teachings to the extent that you feel you can trust them at this point. The same goes for any other sage.

In response to comment by [deleted] on Do we have it too easy?
Comment author: Klao 02 November 2011 10:33:22PM 1 point [-]

Yes, I think this is a pretty good reading of my post. And it makes the issue seem less pressing and more manageable.

Comment author: shminux 02 November 2011 03:53:04PM *  1 point [-]

How about accepting that some things are neither, but you still have to make a choice? (E.g. inevitability of (u)FAI is untestable, and relies on a number of disputed assumptions and extrapolations. Same with the viability of cryonics.) How do you construct your priors to make a decision you can live with, and how do you deal with the situation where, despite your best priors, you end up being proven wrong?

Comment author: Klao 02 November 2011 04:38:58PM 1 point [-]

Now, this is a much better question! And yes, I am thinking a lot about these. But, in some sense this kind of thing bothers me much less: because it is so clear that the issue is unclear, my mind doesn't try to unconditionally commit it to the belief pool just because I read something exciting about it. And then I know I have to think about it, look for independent sources, etc. (For these two specific problems, I am in different states of confusion. Cryonics: quite confused; AGI: a bit better, at least I know what my next steps are.)

How do you deal with this?

Comment author: Grognor 02 November 2011 08:25:56AM *  6 points [-]

Have you read No Safe Defense, Not Even Science?

I'm afraid your concerns have already been covered. That doesn't mean they're not legitimate. Quite the opposite, in fact. The answer to your initial question, "Do we have it too easy?" is yes. But the answer to your closing question, "is this content appropriate for a 'Main' post?" is no.

It's not so simple as "take everything with a grain of salt." Even though that in itself is monumentally difficult, it's not enough. The universe is not calibrated to our skill set. Rationality is impossible.

(I bet you figured out what comes next.)

Now shut up and do the impossible.

(If it matters that much to you.)

Comment author: Klao 02 November 2011 12:35:59PM -1 points [-]

Yes, I saw it coming. :) Thanks! It does matter to me.

Comment author: Hyena 02 November 2011 04:24:33AM 6 points [-]

To avoid cult mode, try to avoid the local jargon. That will help you keep some distance by not turning on your slogan-loyalty loop. I've avoided this because I remember when this topic space was young, none of these sites existed and this sort of thing was still the stuff of excited conversation among college students. It's nice to see it all laid out in various places, but it will never appear to me as the work of monumental genius it does to some people.

Comment author: Klao 02 November 2011 12:34:47PM 2 points [-]

This sounds like a very good piece of advice. A slight problem is that some of the jargon is very useful for expressing things that would otherwise be hard to express. But, I'll try to be conscious of it.

Comment author: shminux 02 November 2011 03:20:27AM 6 points [-]

One morning, walking to the train station, thinking about something I had read, my thoughts wandered to how this all affects my faith. And I noticed myself flinching away, and thought, “Isn't this what Eliezer calls "flinching away"?” I didn't resolve my doubts there and then, but there was no turning back, and a couple of days later I was an atheist.

I recall a Gom Jabbar spell cast on a hapless teacher in a similar circumstance.

Jokes aside, some of what EY preaches here IS WRONG, since there is absolutely no way he is right about everything. If someone tells you otherwise, they are treating EY as a cult leader, not a teacher. So, ask yourself: what if the idea you just thought over and internalized is wrong? Because, chances are, at least one of them is. If there is a topic in the sequences you consider yourself an expert in, start there. It might be his approach to free will, or to quantum mechanics, or to the fun theory, or to dark arts, or...

Until you have proven EY wrong at least once on this forum, you are not ready for rationality.

(Hope this is not too dark for you.)

Comment author: Klao 02 November 2011 12:31:13PM 1 point [-]

No, it's not too dark, it is useful to see an even stronger expression of caution. But, it misses the point a bit. It's not very helpful to know that Eliezer is probably wrong on some things. Neither is finding a mistake here or there. It just doesn't help.

You see, my goal is to accept and learn fully that which is accurate, and reject (and maybe fix and improve) that which is wrong. Neither one is enough by itself.
