Then what makes Peterson so special?

This is what the whole discussion is about. You are setting boundaries that are convenient for you and refusing to think further. But some people in that reference class you are now denigrating as a whole are different from others. Some actually know their stuff and are not charlatans. Throwing a tantrum about it doesn't change that.

I did in fact have something between those two in mind, and was even ready to defend it, but then I basically remembered that LW is status-crazy and gave up on fighting that uphill battle. Kudos to alkjash for the fighting spirit.

They explicitly said that he's not wrong-on-many-things in the T framework, the same way Eliezer is T.correct.

Frustrating, that's not what I said! Rule 10: be precise in your speech; Rule 10b: be precise in your reading and listening :P My wording was quite purposeful:

I don't think you can safely say Peterson is "technically wrong" about anything

I think Raemon read my comments the way I intended them. I hoped to push on a frame that people seem to be (according to my private, unjustified, wanton opinion) obviously too stuck in. See also my reply below.

I'm sorry if my phrasing seemed conflict-y to you. I think the fact that Eliezer has high status in the community and Peterson has low status is making people stupid about this issue, and this makes me write in a certain style in which I sort of intend to push on status because that's what I think is actually stopping people from thinking here.

Cool examples, thanks! Yeah, these are issues outside of his cognitive expertise and it's quite clear that he's getting them wrong.

Note that I never said that Peterson isn't making mistakes (I'm quite careful with my wording!). I said that his truth-seeking power is in the same weight class, but obviously it's a different kind of power than the LW style. E.g. he's less able to deal with cognitive biases.

But if you are doing "fact-checking" in LW style, you are mostly accusing him of getting things wrong about which he never cared in the first place.

Like when Eliezer uses phlogiston as an example in the Sequences and gets the historical facts wrong. But that doesn't make Eliezer wrong in any meaningful sense, because that's not what he was talking about.

There's some basic courtesy in listening to someone's message, not their words.

This story is trash and so am I.
If people don't want to see this on LW I can delete it.

You are showcasing a certain unproductive mental pattern, for which there's a simple cure. Repeat after me:

This is my mud pile

I show it with a smile

And this is my face

It also has its place

For increased effect, repeat 5 times in rap style.

[Please delete this thread if you think this is getting out of hand. Because it might :)]

I'm not really going to change my mind on the basis of just your own authority backing Peterson's authority.

See, right here you haven't listened. What I'm saying is that there is some fairly objective quality, which I called "truth-seeking juice", about people like Peterson, Eliezer and Scott, which you can evaluate by yourself. But you have just dug yourself into the same trap a little bit more. From what you write, your heuristics for evaluating sources seem to be a combination of authority and fact-checking isolated pieces (regardless of how much you understand the whole picture). Those are really bad heuristics!

The only reason Eliezer and Scott seem trustworthy to you is that their big picture is similar to your default, so what they say is automatically parsed as true/sensible. They make tons of mistakes and might fairly be called "technically wrong on many things". And yet you don't care, because when you feel their big picture is right, those mistakes feel to you like not-really-mistakes.

Here's an example of someone who doesn't automatically get Eliezer's big picture, and thinks very sensibly from their own perspective:

On a charitable interpretation of pop Bayesianism, its message is:
Everyone needs to understand basic probability theory!
That is a sentiment I agree with violently. I think most people could understand probability, and it should be taught in high school. It’s not really difficult, and it’s incredibly valuable. For instance, many public policy issues can’t properly be understood without probability theory.
Unfortunately, if this is the pop Bayesians’ agenda, they aren’t going at it right. They preach almost exclusively a formula called Bayes’ Rule. (The start of Julia Galef’s video features it in neon.) That is not a good way to teach probability.

What if you go read that and try to mentally swap places? The degree to which Chapman doesn't get Eliezer's big picture is probably similar to the degree to which you don't get Peterson's big picture, with similar results.
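For context, the formula Chapman is talking about, Bayes' Rule, is just the standard identity relating a hypothesis $H$ and evidence $E$:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$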

[Note: somewhat taking you up on the Crocker's rules]

Peterson's truth-seeking and data-processing juice is in the super-heavyweight class, comparable to Eliezer etc. Please don't make the mistake of lightly saying he's "wrong on many things".

At the level of analysis in your post and the linked Medium article, I don't think you can safely say Peterson is "technically wrong" about anything; it's overwhelmingly more likely you just didn't understand what he means. [it's possible to make more case-specific arguments here but I think the outside view meta-rationality should be enough...]

4) The skill to produce great math and skill to produce great philosophy are secretly the same thing. Many people in either field do not have this skill and are not interested in the other field, but the people who shape the fields do.

FWIW I have reasonably strong but not-easily-transferable evidence for this, based on observation of how people manipulate abstract concepts in various disciplines. Using this lens, math, philosophy, theoretical computer science, theoretical physics, all meta disciplines, epistemic rationality, etc. form a cluster in which math is a central node, and philosophy is unusually close to math even when considered in the context of the cluster.

Note that this is (by far) the least incentive-skewing of all (publicly advertised) funding channels that I know of.

Apply especially if all of 1), 2) and 3) hold:

1) you want to solve AI alignment

2) you think your cognition is pwned by Moloch

3) but you wish it wasn't

tl;dr: your brain hallucinates sensory experiences that have no correspondence to reality. Noticing and articulating these “felt senses” gives you access to the deep wisdom of your soul.

I think this snark makes it clear that you lack gears in your model of how focusing works. There are actual muscles in your actual body that get tense as a result of stuff going on with your nervous system, and many people can feel that even if they don't know exactly what they are feeling.
