Comments

There's something about reading the new style that makes me uncomfortable, and prompts me to skim some posts that I would have read more carefully on the old site. I'm not too clear on what causes that effect. I'm guessing that some of it is the excessive amount of white space, causing modest sensory overload.

Some of it could be the fact that less of a post fits on a single screenful: I probably form initial guesses about a post's value based on the first screenful, and putting less substance on that first screenful leads me to guess that the post has less substance. Or maybe I associate large fonts with click-baity sites, and small fonts with more intellectual sites.

The editor used for writing comments is really annoying. E.g. links expand to include unrelated text, or unexpectedly stop being links.

I want a way to enter HTML and/or Markdown that I can cut and paste after writing it in an editor I'm more comfortable with. Without that, I'll probably give up on writing comments that are more thoughtful than Facebook comments.

I presume the new karma system will be an important improvement. I'm unhappy that it's bundled with such large changes to aspects of the UI that were working adequately.

Most of your post is good, but you're too eager to describe trends as mysterious.

Also, your link to "a previous post" is broken.

Moore's law appears to be a special case of Wright's Law. I.e. it seems well explained by experience curve effects (or possibly economies of scale).

Secondly, we have strong reasons to suspect that there won't be any explanation that ties together things like the early evolution of life on Earth, human brain evolution, the agricultural revolution, the industrial revolution, and future technology development. These phenomena have decent local explanations that we already roughly understand.

I don't see these strong reasons.

Age of Em gives some hints (page 14) that the last three transitions may have been caused by changes in how innovation diffused, maybe related to population densities enabling better division of labor.

I think Henrich's The Secret of our Success gives a good theory of human evolution which supports Robin's intuitions there.

For the industrial revolution, there are too many theories, with inadequate evidence to test them. But it does seem possible that the printing press played a role that's pretty similar to Henrich's explanations for early human evolution.

I don't know much about the causes of the agricultural revolution.

I'm sometimes able to distinguish different types of feeling tired, based on what my system 1 wants me to do differently: sleep more, use specific muscles less, exercise more slowly, do less of a specific type of work, etc.

Tool-boxism implies that there is no underlying theory that describes the mechanisms of intelligence.

If I try to apply this to protein folding instead of intelligence, it sounds really strange.

Most people who make useful progress at protein folding appear to use a relatively tool-boxy approach. And they all appear to believe that quantum mechanics provides a very good theory of protein folding. Or at least it would be, given unbounded computing power.

Why is something similar not true for intelligence?

I agree with most of what you said. But in addition to changing the community atmosphere, we can also change how guarded we feel in reaction to a given environment.

CFAR has helped me be more aware of when I'm feeling guarded (againstness), and has helped me understand that those feelings are often unnecessary and fixable.

Authentic relating events (e.g. Aletheia) have helped train my subconscious to feel safer about being less guarded in contexts such as LW meetups.

There's probably some sense in which I've lowered my standards, but that's mostly been a fairly narrow sense of that term: some key parts of my system 1 have become more willing to bring ideas to my conscious attention. That has enabled me to be less guarded, with essentially no change in the intellectual standards that I use at a system 2 level.

It isn't designed to describe the orthodox view. I think the ideas it describes are moderately popular among mainstream experts, but probably some experts dispute them.

I enjoyed Shadow Syndromes, which is moderately close to what you asked for.

Henrich's The Secret of our Success isn't exactly about storytelling, but it provides a good enough understanding of human evolution that it would feel surprising to me if humans didn't tell stories.

I'd guess the same fraction of people reacted disrespectfully to Gleb in each community (i.e. most but not all). The difference was more that in an EA context, people worried that he would shift money away from EA-aligned charities, but on LW he only wasted people's time.

Some of what a CFAR workshop does is convince our system 1's that it's socially safe to be honest about having some unflattering motives.

Most attempts at doing that in written form would at most only convince our system 2. The benefits of CFAR workshops depend heavily on changing system 1.

Your question about prepping for CFAR sounds focused on preparing system 2. CFAR usually gives advice on preparing for workshops that focuses more on preparing system 1 - minimize outside distractions, and have a list of problems with your life that you might want to solve at the workshop. That's different from "you don't have to do anything".

Most of the difficulties I've had with applying CFAR techniques involve my mind refusing to come up with ideas about where in my life I can apply them. E.g. I had felt some "learned helplessness" about my writing style. The CFAR workshop somehow got me to re-examine that attitude, and to learn how to improve it. That probably required some influence on my mood that I've only experienced in reaction to observing people around me being in appropriate moods.

Sorry if this is too vague to help, but much of the relevant stuff happens at subconscious levels where introspection works poorly.
