
Examples of growth mindset or practice in fiction

12 Swimmer963 28 September 2015 09:47PM

As people who care about rationality and winning, it's pretty important to care about training. Repeated practice is how humans acquire skills, and skills are what we use for winning.

Unfortunately, it's sometimes hard to get System 1 fully on board with the fact that repeated, difficult, sometimes tedious practice is how we become awesome. I find fiction to be one of the most useful ways of communicating things like this to my S1. It would be great to have a repository of fiction that shows characters practicing skills, mastering them, and becoming awesome, to help this really sink in.

However, in fiction the following tropes are a lot more common:

  1. hero is born to greatness and only needs to discover that greatness to win [I don't think I actually need to give examples of this?]
  2. like (1), only the author mentions the skill development or the work in passing… but in a way that leaves the reader's attention (and System 1 reinforcement?) on the "already be awesome" part, rather than the "practice to become awesome" part [HPMOR; the Dresden Files, where most of the implied practice takes place between books.]
  3. training montage, where again the reader's attention isn't on the training long enough to reinforce the "practice to become awesome" part, but skips to the "wouldn't it be great to already be awesome" part [TVtropes examples].
  4. The hero starts out ineffectual and becomes great over the course of the book, but this comes from personal revelations and insights, rather than sitting down and practicing [Nice Dragons Finish Last is an example of this].

Example of exactly the wrong thing:
The Hunger Games - Katniss is explicitly up against the Careers, who have trained their whole lives for this one thing, but she has … something special that causes her to win. Also, archery is her greatest skill, and she's already awesome at it from the beginning of the story and never spends time practicing.

Close-but-not-perfect examples of the right thing:
The Pillars of the Earth - Jack pretty explicitly has to travel around Europe to acquire the skills he needs to become great. Much of the practice is off-screen, but it's at least a pretty significant part of the journey.
The Honor Harrington series: the books depict Honor, as well as the people around her, rising through the ranks of the military and gradually levelling up, with emphasis on dedication to training, and that training is often depicted onscreen – but the skills she's training in herself and her subordinates aren't nearly as relevant as the "tactical genius" that she seems to have been born with.

I'd like to put out a request for fiction that has this quality. I'll also take examples of fiction that fails badly at this quality, to add to the list of examples, or of TVTropes keywords that would be useful to mine. Internet hivemind, help?

Ideas on growth of the community

3 Lu93 12 August 2015 06:45PM

TLDR: I had an idea to apply some tools I learned on Coursera to our community in order to grow it better. I wanted to start some organized thinking about the goals our community has, and to offer some materials for people who are eager to work on it but are maybe lost or need ideas.

 

Yesterday I did a course on coursera.org called "Grow to Greatness: Smart Growth for Private Businesses, Part I". (I often play lectures at 2.5x speed, so I can do a five-week course in one day.)

Though this course seems obvious, I'd say it's well worth the three hours, so look it up. (It's hard to say how much is hindsight and how much is actually too easy and basic.) I got some ideas sorted, and I saw the tools. I'm not an expert now, obviously, but at least I can see when things are done in an unprofessional manner, and that can help you understand what follows.

When growing anything (a company, a community, ...) you have different options. You should not opt for everything, because you will be spread thin. You should grow with measure, so that people can follow, and so that you can do it right. This is the essence of the course. The rest is focused on ways of growing.

That was the informative part of this article. The rest is some thoughts that just came to mind that I would like to share. Hopefully I'll inspire some of you and start some organized thinking about this community.

 

This community is some kind of organization, and it has a goal. To be precise, it probably has two goals, as I see it:

  1. to make existing members more rational
  2. to get more members.

Note that the second goal is to grow.

I will just plainly write down some claims this course made:

 

In order to grow:

  1. your people need to grow (as persons, to get more skills, to learn).
  2. you need to create more processes regarding customers, in order to preserve good service
  3. you often need better organization (to regulate processes inside the company)
  4. you need to focus
  5. you need a plan
  6. if you need to stop the fire, stop the fire which has the greatest impact, and make a process out of it, so that people can do it on their own afterwards

1. I guess no-one is against this. After all, we are all here to grow.

2. My guess is that our customers could be defined as new members. So, the first steps someone makes here are the responsibility of this organization. Afterwards, when they get into rationality more and start working on themselves, they become employees. That's at least how it works in my head. A book of the Sequences is a good step here, since it helps to have it all organized in one PDF.

3. This is actually where it all started. We are just a bunch of people with a common drive to be more rational. There are meetups, but that's it. I guess some people see EY as some kind of leader, but even if he were one, that's not an organization. My first idea is to create some kind of separation of topics, reddit-style. (With or without moderators; we can change that at any point if one option does not work.)

For example, I'm fed up with AI topics. When I see AI, I literally stop reading. I don't even think it's rational to push that idea so much. I understand the core of this community is in that business, but:

  1. One of the first lessons in finance is "don't put all your eggs in one basket". If there is something more important than AI, we are fucked if no-one sees it. I guess "non-rational" people will see it (since they were not active on this forum, and therefore not focused on AI), but then the people of this forum lose the attribute "rational", since "non-rationals" outperformed them simply by doing random stuff.
  2. It may stop people from visiting the forum. They may disagree, they may feel "it's not right" but be unable to formulate it as "don't put all your eggs in one basket" (my example, kind of). The remaining choice is to stop visiting the site.

So, I would STRONGLY encourage new topics, and I would like to see some kind of classification. If I want to find out about AI, I want to know where to look, and if I don't want to read about it, I want to know how to avoid it. If I want to read about self-improvement, I want to know where to find it. Who knows - after some rough classification, people may start to do finer ones, and discuss how to improve memory without being spammed with procrastination threads. I think this could help the first goal (to make existing members more rational), since it would give them some overview.

I also think this would reduce cult-ism, since it would add diversity and loosen the "meta".

4. An understatement. Anyone who has worked, or read anything about work, knows how important a plan is. It is OBLIGATORY. Essential. (See the course https://www.coursera.org/learn/work-smarter-not-harder/outline .)

5. I think this is not very important for us. There are lots of people here. Many enthusiasts. However, this should be some kind of guideline for making a good plan, and for telling us how many resources to devote to each problem.

 

In conclusion, I understand these things are big. But growth means change. (There is an EY quote on this, I think: not every change is an improvement, but every improvement is a change; correct me if I'm wrong.) Humans did not get this far by being individually better, but by socializing and cooperating. So I think we should move from a herd to an organization.

 

 

Call For Agreement: Should LessWrong have better protection against cultural collapse?

3 Epiphany 03 September 2012 05:35AM

As you are probably already aware, many internet forums experience a phenomenon known as "eternal September".  Named after a temporary effect where the influx of college freshmen would throw off a group's culture every September, eternal September is essentially what happens when standards of discourse and behavior degrade in a group to the point where the group loses its original culture.  I began focusing on solving this problem and offered to volunteer my professional web services to get it done because:

- When I explained that LessWrong could grow a lot and volunteered to help with growth, various users expressed concerns about growth not always being good because having too many new users at once can degrade the culture.

- There has been concern from Eliezer about the site "going to hell" because of trolling.

- Eliezer has documented a phenomenon that subcultures know as infiltration by "poseurs" happening in the rationalist community.  He explains that rationalists are beginning to be inundated by "undiscriminating skeptics" and has stated that it's bad enough that he needed to change his method of determining who is a rationalist.  The appearance of poseurs doesn't guarantee that a culture will be washed away by main-streamers, but may signal that a culture is headed in that direction, and it does confirm that a loss of culture is a possibility - especially if there got to be so many undiscriminating skeptics as to form their own culture and become the new majority at LessWrong.

  My plan to prevent eternal September sparked a debate about whether eternal September protection is warranted.  Lukeprog, being the decision maker whose decision is needed for me to be allowed to do this as a volunteer, requested that I debate this with him because he was not convinced but might change his mind.

 

Here are some theories about why eternal September happens:

1. New to old user ratio imbalance:

  New users need time to adjust to a forum's culture.  Getting too many new users too fast will throw off the ratio of new to old users, meaning that most new users will interact with each other rather than with older users, changing the culture permanently.

2. Groups tend to trend toward the mainstream:

  Imagine some people want to start a group.  Why are they breaking away from the mainstream?  Because their needs are served there?  Probably not.  They most likely have some kind of difference that makes them want to start their own group.  Of course, not everyone fits neatly into "different" or "mainstream", no matter what type of difference you look at.  So, as a forum grows, instead of attracting only people who fit neatly into the "different" category, you attract people who are merely similar to those in the different category.  People way out on the mainstream end of the spectrum generally are not attracted to things that are very different.  But imagine how this progresses over time.

  I'll create a scale between green and purple.  We'll say the green people are different and the purple people are mainstream.  Some of the most green folks make a green forum.  Now, people who are green-and-similar - those with an extra tinge of red or blue or yellow - join.  People in the mainstream still aren't attracted.  However, since there are more in-between people than solid green or purple people, the most greenish in-between people begin to dominate.  They and the original green people still enjoy conversation - they're similar enough to share the culture and enjoy mutual activities.  But the greenish in-between people start to attract in-between people who are neither more purple nor more green.  There are more in-between people than greenish in-between or green people, because purple people dominate in the larger culture, so in-between people quickly outnumber the green people.  This may still be fine, because they may adjust to the culture and enjoy it, finding it a refreshing alternative to purple culture.  But the in-between people attract people who are more purplish in-betweeners than greenish in-betweeners.  There are more of those than of the in-between people, so the culture now shifts to be closer to mainstream purple than to different green.

  At this point, it begins to attract the attention of the solid purple main-streamers.  "Oh!  Our culture, but with a twist!" they think.  Now droves of purple mainstream people deluge the place looking for "something a little different".  Instead of valuing the culture and wanting to assimilate, they just want to enjoy novelty.  So they demand changes to the things they don't like, to make the place suit them better.  They justify this by saying that they're the majority.  At that point, they are.
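The drift described above can be sketched as a toy simulation. To be clear, this is my own illustrative construction, not anything from the original argument: tastes live on a 0-1 green-to-purple scale, the wider population skews purple, and a newcomer joins only when the group's current average taste already feels familiar to them.

```python
import random

random.seed(0)

def simulate(rounds=3000, tolerance=0.15):
    # Founders cluster near the "green" end (0.0) of a 0-1 taste scale.
    group = [random.uniform(0.0, 0.1) for _ in range(20)]
    for _ in range(rounds):
        # The wider population skews "purple": beta(2, 1) has mean 2/3.
        candidate = random.betavariate(2, 1)
        mean = sum(group) / len(group)
        # A candidate joins only if the group's average taste feels familiar.
        if abs(candidate - mean) < tolerance:
            group.append(candidate)
    return sum(group) / len(group)

final_mean = simulate()
print(round(final_mean, 2))  # drifts well above the founders' ~0.05 average
```

This is only a cartoon of the dynamic, but it shows the mechanism: even though every individual admission looks "close enough" to the existing group, each one nudges the average purple-ward, which widens the pool of acceptable purplish candidates, and the group's mean taste ratchets toward the mainstream.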

3.  Too many trolls scare away good people and throw off the balance.

 

Which theory is right?


  All of them likely play a role.

 

  I've seen for myself that trolls can scare the best people out of a forum, ruining the culture. 

  I've heard time and time again that subculture movements have problems with being watered down by mainstream folks until their cultures die and no longer feel worth it to the original participants.  A lot of you have probably heard the term "poseurs".  With poseurs in a subculture, it's not that too many new people joined at once, but that the wrong sort of people joined.  The view is that there are people who are different enough to "get" the movement, and people who are not.  Those who aren't similar decide to try to appear like members even though they're not like them on the inside.  Essentially, a large number of people much nearer to the mainstream got involved, so the group was no longer a haven for people with their differences.

  And I think it's a no-brainer that if a group gets enough newbies at once, old members can't help them adjust to the culture, and the newbies will form a new culture and become a new majority.

  Also, I think all of these can combine together, create feedback loops, and multiply the others.

 

Theory about cause and effect interactions that lead to endless September:

 1.  A group of people who are very different break away from the mainstream and form a group.
 2.  People who are similarly different but not AS different join the group.
 3.  People who are similar to the similarly different people, but even less similar to the different people join the group.
 4.  It goes on this way for a while.  Since there are necessarily more people who are mainstream than different, new generations of new users may be less and less like the core group.
 5.  The group of different people begins to feel alienated with the new people who are joining.
 6.  The group of different people begin to ignore the new people.
 7.  The new people form their own culture with one another, excluding old people, because the old people are ignoring them.
 8.  Old people begin to anticipate alienation and start to see new users through tinted lenses, expecting annoyance.
 9.  New people feel alienated by the insulting misinterpretations that are caused by the expectation that they're going to be annoying. 
10.  The unwelcoming environment selects for thick-skinned people.  A higher proportion of thick-skinned types - trolls, leaders, spammers, debate junkies, etc. - are active.
11.  Enough new people who were ignored and failed to acculturate accumulate, resulting in a new majority.  If trolls are kept under control, the new culture will be a watered-down version of the original culture, possibly not much different from mainstream culture.  If not, see the final possibility.
12.  If a critical mass of trolls, spammers and other alienating thick-skinned types is reached, due to an imbalance or inadequate methods of dealing with them, they might ward off old users - exacerbating, in a feedback loop, the imbalance that draws in a disproportionate number of thick-skinned types - and then take over the forum.  (This is why 4chan's /b/ isn't known for having sweet little girls and old ladies.)

 

Is LessWrong at risk?

  1.  Eliezer has written about rationalists being infiltrated by main-streamers who don't get it, aka "poseurs".

  Eliezer explains in Undiscriminating Skeptics that he can no longer determine who is a rationalist based on how they react to the prospect of religious debates, and now he has to determine who is a rationalist based on who is thinking for themselves.  This is the exact same problem other subcultures have - they say the new people aren't thinking for themselves.  We might argue "but we want to spread the wonderful gift of rational thought to the mainstream!" and I would agree with that.  However, if all they're able to take away from joining is that there are certain things skeptics always believe, all they'll be taking away from us is an appeal to skepticism.  That's the kind of thing that happens when subcultures are over-run by mainstream folks.  They do not adopt the core values.  Instead, they run roughshod over them.  If we want undiscriminating skeptics to get benefits from refining the art of rationality, we have to do something more than hang out in the same place.  Telling them that they are poseurs doesn't work for subcultures, and I don't think Eliezer telling them that they're undiscriminating skeptics will solve the problem.  Getting people to think for themselves is a challenge that should not be undertaken lightly.  To really get it, and actually base your life on rationality, you've either got to be the right type, a "natural" who "just gets it" (like Eliezer who showed signs as a child when he found a tarnished silver amulet inscribed with Bayes's Theorem) or you have to be really dedicated to self-improvement.

  2. I have witnessed a fast-growing forum actually go exponential.  Nothing special was being done to advertise the forum. 

  Obviously, this risks deluging old members in a sea of newbies that would be large enough to create a newbie culture and form a new majority.

  3. LessWrong is growing fast, and it's much bigger than most people seem to realize.

  I made a LessWrong growth bar graph showing how LessWrong has gained over 13,000 members in under 3 years (Nov 2009 - Aug 2012).  LessWrong had over 3 million visits in the last year.  The most popular post has gotten over 200,000 views.  Yes, I mean there are posts on here that are over 1/5 of the way to a million views; I did not mistype.  This is not a tiny community website anymore, but I see signs that people are still acting as if it were, like when people post their email addresses on the forum.  People don't seem to realize how big LessWrong has gotten.  Since this happened in a short time, we should be wondering how much further it will go, and planning for the contingency that it could become huge.

  4. LessWrong has experienced at least one wild spike in membership.  Spikes can happen again.

  We can't control the ups and downs in visitors to the site.  A spike could happen again.  It could last for longer than a month.  According to Vladimir (using wget), we've got something like 600-1000 active users posting per month, and the registration statistics show about 300 users joining per month.  What would happen if 900 joined each month for a few months in a row?  A random spike could conceivably overwhelm the members.

  5. Considering how many readers it has, LessWrong could get Slashdotted by somebody big.

  If you've ever read about the Slashdot effect, you'll know that all it might take to get a deluge bigger than we can handle is to be linked to by somebody big.  What if Slashdot links to LessWrong?  Or somebody even bigger?  We have at least one article on LessWrong that got about half as many visits as a hall-of-fame-level Slashdot article: "Scientologists Force Comment Off Slashdot" got 383,692 visits on Slashdot, compared with LessWrong's most popular article at 211,000 visits.  (Cite: Slashdot hall of fame.)  LessWrong is gaining popularity fast.  It's not a small site anymore.  And there are a lot of places that could Slashdot us.  It may be just a matter of time before somebody pays attention, does an article on LessWrong, and it gets flooded.

  6. We all want to grow LessWrong, and people may cause rapid growth before thinking about the consequences.

  What if people start growing LessWrong and wildly succeed?  I would like to be helping LessWrong grow but I don't want to do it until I feel the culture is well-protected.

  7. Some combination of these things might happen and deluge old people with new people.

 

Does LessWrong need additional eternal September protection?

  Lukeprog's main argument is that we don't have to worry about eternal September because we have vote downs. Here's why vote downs are not going to protect LessWrong:

  1.  If the new to old user ratio becomes unbalanced, or the site is filled with main streamers who take over the culture, who is going to get voted down most?  The new users, or the old ones?  The old members will be outnumbered, so it will likely be old members.

  2. This doesn't prevent new users from interacting primarily with new users.  If enough people join, there may not be enough old users doing vote downs to discourage them anymore.  That means if the new to old user ratio were to become unbalanced, new users may still interact primarily with new users and form their own, larger culture, a new majority.

  3.  Let's say 4chan's /b/ decides to visit.  A hundred trolls descend upon LessWrong.  The trolls, like everybody else, have the ability to vote down anything they want.  The trolls will of course enjoy harassing us endlessly with vote downs.  They will especially enjoy the fact that it only takes three of them to censor somebody.  They will find it a really, really special treat that we've made it so that anybody who responds to a censored person gets points deducted.  From a security perspective, this is probably one of the worst things that you could do.  I came up with an idea for a much-improved vote down plan.

 

Possibly more important: What happens if we DO prevent an eternal September?

  What we are deciding here is not simply "do we want to protect this specific website from cultural collapse?" but "How do we want to introduce the art of refining rationality to the mainstream public?"

  Why do main streamers deluge new cultures and what happens after that?  What do they get out of it?  How does it affect them in the long-term?  Might being deluged by main streamers make it more likely for main streamers to become better at rational thought, like a first taste makes you want more?

  If we kept them from doing that, what would happen, then? 

  Say we don't have a plan.  LessWrong is hit by more users than it can handle.  Undiscriminating skeptics are voting down every worthwhile disagreement.  So, as an emergency measure, registrations are shut off; the number of visits to the website grows and then falls.  We succeed in keeping out people who don't get it.  After it has peaked, the fad is over.  Worse, we've put them off and they're offended.  Or, we don't shut off registrations, we're deluged, and now everyone thinks that a "rationalist" is an "undiscriminating skeptic".  We've lost the opportunity to get through to them, possibly for good.  Will they ever become more rational?  LessWrong wants to make the world a more rational place.  An opportunity to accomplish that goal could happen.  Eliezer figured out a way to make rationality popular.  Millions of people have read his work.  This could go even bigger.

  This is why I suggested two discussion areas - then we get to keep this culture and also have an opportunity to experiment with ways for the people who are not naturals at it to learn faster.  If we succeed in figuring out how to get through to them, we will know that the deluge will be constructive, if one happens.  Then, we can even invite one on purpose.  We can even advertise for that and I'd be happy to help.  But if we don't start with eternal September protection, we could lose all this progress, lose our chance to get through to the mainstream, and pass like a fad.

  For that reason, even if eternal September doesn't look likely to you after everything that I've explained above, I say it is still worthwhile to develop a tested technique to preserve LessWrong culture against a deluge and get through to those who are not naturals.  Not doing so takes a risk with something important.

 

Please critique.

  Your honest assessments of my ideas are welcome, always.

 

Preventing discussion from being watered down by an "endless September" user influx.

14 Epiphany 02 September 2012 03:46AM

  In the thread "LessWrong could grow a lot, but we're doing it wrong.", I explained why, in my opinion, LessWrong has the potential to grow quite a lot faster, and volunteered to help LessWrong grow.  Of course, a lot of people were concerned that a large quantity of new members will not directly translate into higher-quality contributions or beneficial learning and social experiences in discussions, so I realized it would be better to help protect LessWrong first.  I do not assume that fast growth has to cause a lowering of standards.  I think fast growth can be good if the right people are joining and all goes well (specifics herein).

  However, if LessWrong grows carelessly, we could be inviting an "Endless September", a term used to describe a never-ending deluge of newbies that "degraded standards of discourse and behavior on Usenet and the wider Internet" (named after a phenomenon caused by an influx of college freshmen).  My perspective is that it could happen at any time, regardless of whether any of us does anything.  Why do I think that?  LessWrong is growing very fast and could snowball on its own.  I've seen that happen: I saw it ruin a forum that, as far as I'm aware, wasn't doing anything special to advertise itself.  The forum was just popular, and growth went exponential.

  For this reason, I asked for a complete list of LessWrong registration dates in order to make a growth chart.  I received it on 08-23-2012.  The data shows that LessWrong has 13,727 total users, not including spammers and accounts that were deleted.  From these, I have created a LessWrong growth bar graph:

 

 

 

  Each bar represents a one-month total of registration dates (the last bar is a little short, since it only goes up to the 23rd).  The number of pixels in each bar is equal to the number of registrations that month.  The first (leftmost) bar that hits the top of the picture (it actually goes waaaay off the page) mostly represents the transfer of over 2000 accounts from Overcoming Bias.  The right bar that goes off the page is so far unexplained to me - 921 users joined in September 2011, more than three times the number in the months before and after it.  If you happen to know what caused that, I would be very interested in finding out.  (No, September 2010 does not stand out, if you were wondering the same thing.)  If anyone wants to do different kinds of analysis, I can generate more numbers and graphs fairly easily.
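For concreteness, the binning behind a graph like this can be sketched in a few lines (the dates below are made-up stand-ins, not the real registration data):

```python
from collections import Counter
from datetime import date

# Stand-in dates; the real input would be the full list of 13,727 registrations.
registrations = [
    date(2009, 11, 3), date(2009, 11, 17),
    date(2011, 9, 2), date(2011, 9, 21), date(2011, 9, 30),
    date(2012, 8, 23),
]

# One bucket per calendar month; each bar's height is that bucket's count.
per_month = Counter((d.year, d.month) for d in registrations)

for (year, month), count in sorted(per_month.items()):
    print(f"{year}-{month:02d}: {count}")
```

Running this on the stand-in list prints one line per month (e.g. `2011-09: 3`); drawing each count as a bar of that many pixels gives exactly the kind of chart shown here.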

  As you can see, LessWrong has experienced pretty rapid growth.

  Growth is in a downward trend at the moment, but as you can see from the wild spikes everywhere, this could change at any time.  In addition to LessWrong growing on its own, other events that could trigger an "endless September" effect are:

  LessWrong could be linked to by somebody really big (see: Slashdot effect on Wikipedia).

  LessWrong could end up on the news after somebody does something newsworthy, or because a reporter discovers LessWrong culture and finds it interesting or weird.

  (A more detailed explanation is located here.)

  For these reasons, I feel it is a good idea to begin constructing endless September protection, so I have volunteered some of my professional web services to get it done.  This has to be done carefully, because if it is not done right, various unwanted things may happen.  I am asking for any ideas, or links to ideas, that you think were good, and I am laying out my solutions and the pitfalls I have planned for below, in order to seek your critiques and suggestions.

 

Cliff Notes Version:

  I really thought this out quite a bit, because I think it's going to be tricky and because it's important.  So I wrote a cliff notes version of the solution ideas below, with pros and cons for each, which is about a tenth the size.

 

The most difficult challenge and my solution:

  People want the site to be enriching for those who want to learn better reasoning but haven't gotten very far yet.

  People also want an environment where they can get a good challenge, where they are encouraged to grow, where they can get exposed to new ideas and viewpoints, and where they can get useful, constructive criticism. 

  The problem is that a basic desire all humans seem to share is the desire to avoid boredom.  There is possibly a survival reason for this:  there is no way to know everything, but missing even one piece of information can spell disaster.  This may be why the brain appears to have evolved built-in motivators to prod you to learn constantly.  From the mild ecstasy of flow state (cite: Flow: The Psychology of Optimal Experience) to tedium, we are constantly being punished and rewarded based on whether we're receiving the optimal challenge for our level of ability.

  This means that those who are here for a challenge aren't going to spend their time being teachers for everybody who wants to learn.  Not everyone has a teacher's personality and skill set to begin with, and some people who teach do it as writers, explaining to many thousands, rather than by explaining it one-to-one.  If everyone feels expected to teach by hand-holding, most will be punished by their brains for not learning more themselves, and will be forced to seek a new learning environment.  If beginners are locked out, we'll fail at spreading rationality.  The ideal is to create an environment where everyone gets to experience flow, and no one has to sacrifice optimal challenge.

  To make this challenge a bit more complicated, American culture (yes, a majority of the visits - 51.12% - come from the USA; I have access to the Google Analytics) can get pretty touchy about elitism, and has a habit of anti-intellectualism.  Even though the spirit of LessWrong - wanting to promote rational thought - is not elitist but actually inherently the opposite (increasing good decision-making in the world "spreads the wealth", rather than hoarding it or demanding privileges for being capable of good decisions), there is a risk that people will see this place as elitist.  And even though self-improvement is inherently non-pretentious (by choosing to do self-improvement, you're admitting that you've got flaws), undoubtedly there will be a large number of people who might really benefit from learning here but who instead insta-judge the place as "pretentious".  Interpreting everything intellectual as pretentious and elitist is an unfortunate habit in our culture.  I think, with the right wording on the most prominent pages (about us, register, home page, etc.), LessWrong might be presented as a uniquely non-elitist, non-pretentious place.

  For these reasons, I am suggesting multiple discussion areas that are separated by difficulty levels.  Presenting them as "Easy and Hard" will do three things:

  1. Serve as a reminder to those who attend that it's a place of learning where the objective is to get an optimal challenge and improve as far as possible.  This would help keep it from coming across as pretentious or elitist.

  2. Create a learning environment that's open to all levels, rather than a closed, elitist environment or one that's too daunting.  The LessWrong discussion area is a bit daunting to users, so it might be really desirable for people to have an "easy" discussion area where they can learn in an environment that is not intimidating.

  3. Give us an opportunity to experiment with approaches that help willing people learn faster.

 

Endless September protection should be designed to avoid causing these side-effects:

 

  Creating an imbalance in the proportion of thick-skinned individuals to normal individuals.

  Anything that annoys, alienates or discourages users is going to deter a lot of people while retaining thick-skinned individuals.  Some thick-skinned individuals are leaders, but many are trolls, and thick-skinned individuals may be more likely to resist acculturation or try to change the culture (though it could be argued the other way - that their thick skin allows them to take more honest feedback).  For example: anonymous, unexplained down votes create a gauntlet for new users to endure which selects for a high tolerance to negative feedback.  This may be the reason it has been reported that there are a lot of "annoying debater types".

 

  People that we do want fail to join because the method of protection puts them off.

  There are two pitfalls that I think are going to be particularly attractive, but we should really avoid them:

  1.) Filtering into hard/easy based on anything other than knowledge about rational thinking.  There are various ways this could go very wrong.

    - Filtering in any other way will keep out advanced folks who may have a lot to teach.

    If a person has already learned good reasoning skills some other way, do we want them at the site?  There may be logic professors, Zen masters, debate competition champs, geniuses, self-improvement professionals, hard-core bookworms and other people who are already advanced and are interested in teaching others, in finding a good challenge, or in contributing articles, but who have already learned much of the material the sequences cover.  Imagine that a retired logic professor comes by hoping to be challenged by similarly advanced minds and perhaps to do a little volunteer teaching about logic as a pastime.  Now imagine requiring them to read 2,000 pages of "how to think rationally" in order to gain access to all the discussion areas.  That all but guarantees they go elsewhere.

    - Filtering based on the sequences or other cultural similarities would promote conformity and repel the true thinkers.

    If true rationalists think for themselves, some of them will think differently, and some of them will disagree.  Eliezer wrote in "Undiscriminating Skepticism": "I do propose that before you give anyone credit for being a smart, rational skeptic, that you ask them to defend some non-mainstream belief."  He defines this as follows: "It has to be something that most of their social circle doesn't believe, or something that most of their social circle does believe which they think is wrong."  If we want people in the "hard" social group who are likely to hold and defend non-mainstream beliefs, we have to filter out people unable to defend their beliefs without scaring off those whose beliefs differ from the group's.

  2.) Discouraging people with unusually flawed English from participating at all levels.  Doing that would stop two important sources of new perspectives from flowing in:

    - People with cultural differences, who may bring in fresh perspectives.

    If you're from China, you may want to share perspectives that could be new and important to a Westerner, but may be less likely to meet the technical standards of a perfectionist when it comes to writing in English.

    - People with learning differences, whose brains work differently and may offer unique insight.

    A lot of gifted people have learning disorders, and even gifted people who don't tend to have large gaps between their skill levels.  It is not uncommon to find a gifted person whose ability in one skill is as much as 40% behind (or ahead of) their abilities in other areas - a phenomenon called "asynchronous development".  We associate spelling and grammar with intelligence, but the truth is that those who have a high verbal IQ may not have correspondingly intelligent things to say, while people who word things crudely due to asynchronous development (engineers, for instance, are not known for their communication skills but can be brilliant at engineering) may be ignored even though they could have important things to say.  Dyslexics, who have all kinds of trouble from spelling to vocabulary to arranging sentences oddly, may be ignored despite the fact that "children and adults who are dyslexic usually excel at problem solving, reasoning, seeing the big picture, and thinking out of the box" (Yale).

   Everyone understands the importance of publishing the serious articles in good English, but in intellectual circles the attitude is frequently that if you aren't a perfectionist about spelling and grammar, you're not worth listening to at all.  Getting articles polished when they are written by dyslexics, or by people for whom English is a second language, should be pretty easy - they can simply seek a volunteer editor.  The ratio of articles being published to the number of users at the site encourages me to believe these folks will be able to find someone to polish their work.  Since it would be so easy to accommodate these differences, using an attitude that puts form over function as a filter would not serve us well.  If dyslexics and people from cultures different from the majority's feel that we're snobby about technicalities, they could be put off.  This could already be happening, and we could be missing out on the most creative and most different perspectives this way.

 

People who qualify under the "letter" of the standards do not meet the spirit of the standards.

  For instance:  They claim to be rationalists because they agree with a list of things that rationalists agree with, but don't think for themselves, as Eliezer cautions in "Undiscriminating Skepticism".  Asking them questions like "Are you an atheist?" and "Do you think signing up for cryo makes sense?" would only draw large numbers of people who agree but do not think for themselves.  Worse, it would send a strong message: "If you don't agree with us about everything, you aren't welcome here."

 

The right people join, but acculturate slowly or for some reason do not acculturate. 

  - Large numbers of users, even desirable ones, will be frustrating if newbie materials are not prominently posted.

  I was very confused and disoriented as a new user, and I think there's a need for an orientation page.  I wrote about my experiences as a new user here, which I think might make a good starting point for such a page.  I think LessWrong also needs a written list of guidelines and rules positioned to be "in your face", the way the rest of the internet does it (because if users don't see it where they expect to find it, they will assume there isn't one).  If new users adjust quickly, both old and new users will be less annoyed if/when lots of new users join at once.

 

The filtering mechanism gives LessWrong a bad name.

  For instance, if we were to use an IQ test to filter users, the world may feel that LessWrong is an elitist organization.  Sparking an anti-intellectual backlash would do nothing to further the cause of promoting rationality, and it doesn't truly reflect the spirit of bringing everyone up, which is what this is supposed to do.  Similarly, asking questions that may trigger racial, political or religious feelings could be a bad idea - not because they aren't sources of bias, but because they'll scare away people who may have been open to questioning and growing but are not open to being forced to choose a different option immediately.  The filters should be a test about reasoning, not a test about beliefs.

 

Proposed Filtering Mechanisms:

 

  Principle One:  A small number of questions can deter a lot of activity.

  As a web professional, I have watched a ten-question registration form slash the number of files submitted through a file-upload feature that used to be public.  The ten questions were not hard - just name, location, password and the like.  Asking questions deters people from signing up.  Period.  That, I suspect, is why a lot of big websites have moved to minimal registration info: email address and password only.  Years ago that was not common; it seemed everyone wanted to give you ten or twenty questions.  For this reason, I think the registration form should stay simple, but if we create extra hoops to jump through to use the hard discussion area, only those who are seriously interested will join in there.  Specific examples of questions that meet the other criteria are located in the proposed acculturation methods section under: A test won't deter ignorant cheaters, but it can force them to educate themselves.

 

  Principle Two:  A rigorous environment will deter those who are not serious about doing it right.

  The ideal is to fill the hard discussion area with the sort of rationalists who want to keep improving, who are not afraid to disagree with each other, who think for themselves.  How do you guarantee they're interested in improving?  Require them to sacrifice for improvement.  Getting honest feedback is necessary to improve, but it's not pleasant.  That's the perfect sacrifice requirement:

  Add a check box that they have to click where it says "By entering the hard discussion area, I'm inviting everyone's honest criticisms of my ideas.  I agree to take responsibility for my own emotional reactions to feedback and to treat feedback as valuable.  In return for their valuable feedback, which is a privilege and service to me, I will state my honest criticisms of their ideas as well, regardless of whether the truth could upset them."

  I think it's common to assume that in order to give honest feedback one has to throw manners out the window.  I disagree with that.  I think there's a difference between pointing out a brutal reality, and making the statement of reality itself brutal.  Sticking to certain guidelines like attacking the idea, not the person and being objective instead of ridiculing makes a big difference.  

  There are other ways, also, for less bold people, like the one that I use in IRL environments: Hint first (sensitive people get it, and you spare their dignity) then be clear (most people get it) then be brutally honest (slightly dense people get it). If I have to resort to the 2x4, then I really have to decide whether enlightening this person is going to be one of those battles I choose or one of those battles I do not choose.  (I usually choose against those battles.)

  How do you guarantee they're capable of disagreeing with others?  Making it clear that they're going to experience disagreements by requiring them to invite disagreements will not appeal to conformists.  Those who are not yet thinking for themselves will find it impossible to defend their ideas if they do join, so most of them will become frustrated and go back to the easy discussion area.  People who don't want intellectual rigor will be put off and leave.

  It's important that the wording for the check box has some actual bite to it, and that the same message about the hard discussion area is echoed in any pages that advise on the rules, guidelines, etiquette, etc.  To explain why, I'll tell a little story about an anonymous friend:

  I have a friend who worked at Microsoft.  He said the culture there was not open to new ideas and that management was not open to hearing criticism.  He interviewed with various companies and chose Amazon.  According to this friend, Amazon actually does a good job of fulfilling values like inviting honest feedback and creating an environment conducive to innovation.  He showed me the written values of each company.  I didn't expect much, because most corporate values are boring and read like empty marketing copy, but Amazon.com has the most incredible written values page I've ever seen - it does more than sit there as static text.  It gives you permission.  Instead of saying something fluffy like "We value integrity and honesty and our managers are happy to hear your criticisms," it first creates expectations for management: "Leaders are sincerely open-minded, genuinely listen, and are willing to examine their strongest convictions with humility."  Then it gives employees permission to give honest feedback to decision-makers: "Leaders (all employees are referred to as "leaders") are obligated to respectfully challenge decisions when they disagree, even when doing so is uncomfortable or exhausting.  Leaders have conviction and are tenacious.  They do not compromise for the sake of social cohesion."  The Amazon values page gives their employees permission to innovate as well: "As we do new things, we accept that we may be misunderstood for long periods of time."  Microsoft's written values, by contrast, have no bite to them.  What do I mean by bite?

  Imagine you're an employee at Amazon.  Your boss does something stupid.  The cultural expectation is that you're not supposed to say anything - offending the boss is bad news, right?  So you're inhibited.  But the thing they've done is stupid.  So you remember back to the values page and go bring it up on your computer.  It says explicitly that your boss is expected to be humble and that you are expected to sacrifice social cohesion in this case and disagree.  Now, if your boss gets irritated with you for disagreeing, you can point back to that page and say "Look, it's in writing, I have permission to tell you."

  Similarly, there is what I consider to be a very unfortunate social rule that more or less says: if you don't have something nice to say, don't say anything at all.  Many people feel obligated to keep constructive criticism to themselves; a lot of us are intentionally trained to be non-confrontational.  If people are going to overcome a lifetime of training to squelch constructive criticism, they need an excuse to ignore that social training.  Not just any excuse - the wording needs to explicitly require them to give honest feedback, and to do so despite the consequences.

 

  Principle Three:  If we want innovation, we have to make innovators feel welcome.

  That brings me to another point.  If you want innovation, you can't deter the sort of person who will bring it to you: the "people who will be misunderstood for long periods of time", as Amazon puts it.  If you give a misunderstood person specific constructive criticism, you help them figure out how to communicate - how else will they navigate the jungle of perception and context differences between themselves and others?  If you simply vote them down, silently and anonymously, they get no opportunity to learn how to communicate with you; worse, they'll be censored after three votes.  This ability for three people to censor somebody with no accountability, and without even needing a reason, encourages posters to keep quiet rather than take the sort of risk an innovator must take in presenting new ideas, and it robs misunderstood innovators of the feedback they need in order to explain their ideas.  Here is an example of how feedback can transform an innovator's description of a new idea from something that seems incomprehensible into something that shows obvious value:

  On the "Let's start an important start-up" thread, KrisC posts a description of an innovative phone app idea.  I read it and I cannot even figure out what it's about.  My instinct is to write it off as "gibberish" and go do something else.  Instead, I provide feedback, constructive criticism and questions.  It turns out that the idea KrisC has is actually pretty awesome.  All it took was for KrisC to be listened to and to get some feedback, and the next description that KrisC wrote made pretty good sense.  It's hard to explain new ideas but with detailed feedback, innovation may start to show through.  Link to KrisC and I discussing the phone app idea.

 

Proposed Acculturation Methods:

 

   Send them to Center for Modern Rationality

   Now that I have discovered the post on the Center for Modern Rationality and have seen that they're targeting the general population and beginners - with material for local meetups, high schools and colleges, plus plans for some web apps to help with rationality training - I see that referring people over to them might be a great suggestion.  Saturn suggested sending them to appliedrationality.org before I found this, but I'm not sure that would be adequate, since I don't see much for people to do on their website.

 

    Highlight the culture.

    A database of cultural glossary terms can be created and used to highlight those terms on the forum.  The terms are already on the page, so what good would this do?  Well, first, they can be automatically linked to the relevant sequence or wiki page.  If old users do not have to look for the link, this speeds up the process of pointing new users to them quite a lot.  Second, it would make the core cultural items stand out from all the other information, which will likely cause new users to prioritize them.  Third, there will be a visual effect on the page.  You'll be able to see that this place has its own vocabulary, its own personality, its own memes.  It's one thing to tell a new user "LessWrong has been influenced by the sequences" when they haven't seen all those references on all those pages (and, even if they have, don't know where they're from); it's another to make the influence immediately obvious with a visual that illustrates the point.
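    As a rough sketch of how the highlighting could work (the glossary entries and URL scheme below are invented for illustration), a term-to-wiki-page dictionary plus one regular expression is enough:

```python
import re

# Hypothetical glossary: the terms and wiki URLs here are invented examples.
GLOSSARY = {
    "akrasia": "/wiki/akrasia",
    "steelman": "/wiki/steelman",
    "inferential distance": "/wiki/inferential_distance",
}

def highlight_terms(text, glossary=GLOSSARY):
    """Wrap each known cultural term in a link to its sequence/wiki page."""
    # Longest terms first, so "inferential distance" wins over any sub-term.
    terms = sorted(glossary, key=len, reverse=True)
    pattern = re.compile(
        r"\b(" + "|".join(re.escape(t) for t in terms) + r")\b",
        re.IGNORECASE,
    )
    return pattern.sub(
        lambda m: '<a class="glossary" href="%s">%s</a>'
                  % (glossary[m.group(1).lower()], m.group(1)),
        text,
    )
```

    The "glossary" CSS class would then give the terms their distinctive visual treatment, producing the vocabulary-on-display effect described above.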

 

    Provide new users with real feedback instead of mysterious anonymous down votes:

    We have karma vote buttons, but they do not provide useful feedback for new users.  Without a specific reason, I have no way to tell whether I'm being down voted by trolls, and I may see ten different possible reasons for being voted down with no way to know which applies.  This annoyance selects for thick-skinned individuals like trolls and fails to avoid the "imbalance in the proportion of thick-skinned individuals to normal individuals" side-effect.

    If good new users are to be preserved, and the normal-people-to-troll ratio is to be maintained, we need to add a "vote to ban" button that's used only for blatant misbehavior; and if an anonymous feedback system is to be used for voting down, it needs to prompt for more detailed feedback - either a selection from categories, or at least one or two words of explanation.  Also, comments should show both up votes and down votes.  If you can't tell when you've said something controversial, and are encouraged to view everything you say as black-and-white good-or-bad, that promotes conformity.

 

     A test won't deter ignorant cheaters, but it can force them to educate themselves.

    Questions can be worded in such a way that they serve as a crash course in reasoning even if someone posts a cheat sheet or registrants look up all the answers on the internet.  Assuming the answer options are randomly ordered, so that you have to actually read them, the test should at the very least familiarize people with the various biases, logical fallacies, etc.  Examples:

    --------------

    Person A in a debate explains a belief but it's not well-supported.  Their opponent, person B, says they're an idiot.  What is this an example of?

    A. Attacking the person, a great way to really nail a debate.

    B. Attacking the person, a great way to totally fail in debate because you're not even attacking their ideas.

    --------------

    You are with person X and person Y.  Person Y says they have been considering some interesting new evidence of what might be an alien space craft and aren't sure what to think yet.  You both see person Y's evidence, and neither of you has seen it before.  Person X says to you that they don't believe in UFOs and don't care about person Y's silly evidence.  Who is the better skeptic?

    A. Person X, because they have the correct belief about UFOs.

    B. Person Y, because they are actually thinking about it, avoiding undiscriminating skepticism.

    --------------

    Note:  These questions are intentionally knowledge-based.  If the purpose is to avoid requiring an IQ test, and to create an obstacle that requires you to learn about reasoning before posting in "hard", that's the only way that these can be done.
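    The random ordering assumed above is trivial to implement.  As a sketch (using an abbreviated version of the first example question), shuffle the options while tracking where the correct answer lands:

```python
import random

def present_question(prompt, options, correct_index, rng=random):
    """Return the prompt, the shuffled options, and the correct answer's new index."""
    order = list(range(len(options)))
    rng.shuffle(order)  # randomize presentation so cheat sheets can't rely on position
    shuffled = [options[i] for i in order]
    return prompt, shuffled, order.index(correct_index)

# Abbreviated form of the first example question above.
prompt, opts, answer = present_question(
    "Person B calls person A an idiot instead of addressing the idea. What is this?",
    ["Attacking the person, a great way to really nail a debate.",
     "Attacking the person, a great way to totally fail in debate."],
    correct_index=1,
)
```

    Since the graded index is recomputed per presentation, a posted cheat sheet of answer letters is useless - the cheater still has to read the options.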

 

    Encouraging users to lurk more. 

   Vaniver contributed this: Another way to cut down on new-new interaction is to limit the number of comments someone can make in a time period - if people can only comment once a day until their karma hits 20, and then once an hour until their karma hits 100, and then they're unrestricted, that will explicitly encourage lurking / paying close attention to karma among new members. (It would be gameable, unless you did something like prevent new members from upvoting the comments of other new members, or algorithmically keeping an eye out for people gaming the system and then cracking down on them.) [edit] The delay being a near-continuous function of the karma - say, 24 hours*exp(-b karma) - might make the incentives better, and not require partitioning users explicitly. No idea if that would be more or less effort on the coding side.

    Cons:  This would deter some new users from becoming active users by causing them to lose steam on their initial motivation to join.  It might be something that would deter the right people.  It might also filter users, selecting for the most persistent ones, or for some other trait that might change the personality of the user base.  This would exacerbate the filtering effect that the current karma system is exerting, which, I theorize, is causing there to be a disproportionate number of thick-skinned individuals like trolls and debate-oriented newbies.  My theory about how the karma system is having a bad influence
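    On the coding side, Vaniver's near-continuous variant is straightforward to make concrete.  In this sketch the decay constant b is an assumed tuning choice (picked so the delay is 24 hours at 0 karma and about 1 hour at 100 karma), not something Vaniver specified:

```python
import math

# Assumed tuning constant: gives a 24-hour delay at 0 karma and
# exactly a 1-hour delay at 100 karma (since 24 * exp(-ln(24)) == 1).
B = math.log(24) / 100

def comment_delay_hours(karma, b=B):
    """Near-continuous comment delay: 24 hours * exp(-b * karma).

    Negative karma is clamped to 0 so the delay never exceeds 24 hours -
    a design choice of this sketch, not part of the original suggestion."""
    return 24 * math.exp(-b * max(karma, 0))
```

    The delay shrinks smoothly as karma accumulates, so there is no hard boundary between "new" and "established" users to partition or game.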

 

    Give older users more voting power. 

    Luke suggested "Maybe this mathematical approach would work. (h/t matt)" on the "Call for Agreement" thread. 

    I question, though, whether changing the karma numbers on the comments and posts in any way would have a significant influence on behavior or a significant influence on who joins and stays. Firstly, votes may reward and punish but they don't instruct very well - unless people are very similar, they won't have accurate assumptions about what they did wrong. I also question whether having a significant influence on behavior would prevent a new majority from forming because these are different problems. The current users who are the right type may be both motivated and able to change, but future users of the wrong type may not care or may be incapable of changing. They may set a new precedent where there are a lot of people doing unpopular things so new people are more likely to ignore popularity. The technique uses math and the author claims that "the tweaks work" but I didn't see anything specific about what the author means by that nor evidence that this is true. So this looks good because it is mathematical, but it's less direct than other options so I'm questioning whether it would work.

  Vladimir_Nesov posted a variation here.
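  I haven't examined the details of the linked approach, but to make the discussion concrete, here is one simple shape karma-weighted voting could take - the logarithmic weighting rule is purely my own assumption, not what Luke, matt or Vladimir_Nesov proposed:

```python
import math

def vote_weight(voter_karma):
    """Assumed rule: weight grows with the log of the voter's karma, so
    established users count more without any single account dominating."""
    return 1 + math.log10(1 + max(voter_karma, 0) / 100)

def comment_score(votes):
    """votes: iterable of (direction, voter_karma), direction being +1 or -1."""
    return sum(d * vote_weight(k) for d, k in votes)
```

  Under this rule a brand-new account's vote counts 1, and an account with 900 karma counts 2 - so old users matter more, but it still takes several of them to outweigh a crowd.  Whether changing the numbers this way actually changes behavior is, as I said above, the open question.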

 

  Make a different discussion area for users with over 1000 karma.

  Posted by Konkvistador here.

 

  Make a Multi Generation Culture.

  Limit the number of new users that join the forum to a certain percentage per month, sending the rest to a new forum.  If that forum grows too fast, create additional forums.  This would be like having different generations.  New people would be able to join an older generation if there is space.  Nobody would be labeled a "beginner".

 

  Temporarily turn off registration or limit the number of users that can join.

  (See the cliff notes version for more.)

 

Should easy discussion participants be able to post articles?

  I think the answer to this is yes, because no filtering mechanism is perfect and the last thing you want to do is filter out people with a different and important point of view.  Unless the site is currently having issues with trolls posting new articles, or with the quality of the articles going down, leaving that freedom intact is best.  I definitely think, though, that written guidelines for posting an article need to be put in the "in your face" places where users expect them.  If a lot of new users join at once, well-meaning but confused people will post the wrong sorts of things there - making sure the guidelines are right in front of them is probably all that's needed to deter that.

 

Testing / measuring results:

  How do we tell if this worked?  Tracking something subjective - like whether we're feeling challenged or inundated with newbies - is not a straightforward matter of looking at numbers.  (Methods to help willing people learn faster deserve their own post.)  But just because it's subjective doesn't mean tracking is impossible, or that we can't work out whether it's made a difference.  I suspect a big difference will be noticed in the hard discussion area right away.  Here are some relevant figures that can be tracked and may give us insight and ways to check our perceptions:

  1.  How many people are joining the hard forum versus the easy forum?  If we've got a percentage, we know how *much* we've filtered, though we won't know exactly *who* we've filtered.

  2.  Survey the users to ask whether the conversations they're reading have increased in quality.

  3.  Survey the users to ask whether they've been learning more since the change.

  4.  See which area has the largest ratio of users with lots of vote downs. 

  (This could be tricky because people who frequently state disagreements might be doing a great service to the group, but might be unpopular because of it, and people who are innovative may be getting voted down due to being misunderstood.  One would think, though, that people who are unpopular due to disagreeing, or being innovative, assuming they're serious about good reasoning, would end up in the hard forum.) 

 

Request for honest feedback:

  Your honest criticisms of this idea and your suggestions will be appreciated, and I will update this idea or write a new one to reflect any good criticisms or ideas you contribute.

 

This is in the public domain:

  This idea is hereby released into the public domain, with acknowledgement from Luke Muehlhauser that those were my terms prior to posting.  My intent is to share this idea to make it impossible to patent and my hope is that it will be free for the whole world to use.

  Preventing discussion from being watered down by an "endless September" user influx. by Epiphany is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.

 

Why are certain trends so precisely exponential?

16 orthonormal 06 August 2011 05:38PM

I was reading a post on the economy from the political statistics blog FiveThirtyEight, and the following graph shocked me:

This, according to Nate Silver, is a log-scaled graph of the GDP of the United States since the Civil War, adjusted for inflation. What amazes me is how nearly perfect the linear approximation is (representing exponential growth of approximately 3.5% per year), despite all the technological and geopolitical changes of the past 134 years. (The Great Depression knocks it off pace, but WWII and the postwar recovery set it neatly back on track.) I would have expected a much more meandering rate of growth.
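The arithmetic behind the straight line is worth making explicit: a fixed 3.5% annual rate compounded over the 134 years Silver plots multiplies GDP roughly a hundredfold, and on a log scale constant-rate compounding is exactly a constant slope.  A quick check (the 3.5% figure is the approximation quoted above):

```python
def growth_factor(rate, years):
    """Total multiplier from compounding a fixed annual growth rate."""
    return (1 + rate) ** years

# ~3.5% per year over the 134 years in question: roughly a 100x increase,
# which on a log-scaled graph is a straight line.
factor = growth_factor(0.035, 134)
```

So the surprise isn't the shape of the line - any constant rate gives a line on a log scale - but that the realized rate stayed so close to constant through depressions and wars.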

It reminds me of Moore's Law, which would be amazing enough as a predicted exponential lower bound of technological advance, but is staggering as an actual approximation:

I don't want to sound like Kurzweil here, but something demands explanation: is there a good reason why processes like these, with so many changing exogenous variables, seem to keep right on a particular pace of exponential growth, as opposed to wandering between phases with different exponents?

EDIT: As I commented below, not all graphs of exponentially growing quantities exhibit this phenomenon- there still seems to be something rather special about these two graphs.

How I applied useful concepts from the personal growth seminar "est" and MBTI

3 suecochran 10 April 2011 11:49PM

I have encountered personally in conversations, and also observed in the media over the past couple of decades, a great deal of skepticism, scorn, and ridicule, if not merely indifference or dismissal, from many people in reaction to the est training, which I completed in 1983, and the Myers-Briggs Type Indicator tool, which I first took in 1993 or 1994. I would like to share some concrete examples from my own life where information and perspective that I gained from these two sources have improved my life, both in my own way of conceptualizing and approaching things, and also in my relationships with others. I do this with the hope and intention of showing that est and MBTI have positive value, and encouraging people to explore these and other tools for personal growth.

One important insight that I gained from the est training is an understanding and the experience that I am not my opinions, and my opinions are not me. Opinions are neutral things, and they may be something I hold, or agree with, but I can separate my self from them, and I can discuss them, and I can change or discard them, but I am still the same "me". I am not more or less "myself" in relation to what I think or believe. Before I did the est training, whenever someone would question an opinion I held, I felt personally attacked. I identified my self with my opinion or belief. My emotional response to attack, like for many other people, is to defend and/or to retreat, so when I perceived of my "self" being "attacked", I gave in to the standard fight or flight response, and therefore I did not get the opportunity to explore the opinion in question to see if the person who questioned me had some important new information or a perspective that I had not previously considered. It is not that I always remember this or that it is my first response, but once I notice myself responding in the old way, I can then take that step back and remember the separation between self and opinion. That choice is now available to me, where it wasn't before. When I find myself in conversations with another person or people who disagree with me, my response now is to draw them out, to ask them about what they believe and why they believe it. I regard myself as if I were a reporter on a fact-finding mission. I step back and I do not feel attacked. I learn sometimes from this, and other times I do not, but I no longer feel attacked, and I find that I can more easily become friends with people even if we have disagreements. That was not the case for me prior to doing est.

Another valuable tool that I got from est and still use in my life is the ability to accept responsibility without attaching blame to it, even if someone is trying to heap blame upon me. This is similar to what I said above about basically not identifying my self with what I think. I do not have to feel or think of myself as a "bad person" because I made a mistake. I have come to the belief that guilt is an emotion that I need not wallow in. If I feel guilt about doing or not doing something, saying or not saying something, I take that feeling of guilt as a sign that I either need to take some action to rectify the situation, and/or I need to apologize to someone about it, and/or I need to learn from the situation so that hopefully I will not repeat it, and then forgive myself, and move on. Hanging on to guilt is something I see many people doing, and it not only holds them up and blocks them off from taking action, they often pull that feeling in and create a scenario or self-definition that involves beating themselves up about it, or they wallow around in feeling guilty in a way that serves as a self-indulgent excuse for not improving things. "I'm so awful, I'm such a screw-up, I can't do anything right." That kind of negative self-esteem can affect a person for their entire life if they allow it to. There are many ways to come to these realizations, and I make no claim that est is some kind of "cure-all". One of the characters on the tv show "SOAP" called est "The McDonald's of Psychiatry". That's amusing, but it denigrates a very useful and powerful experience. I believe in an eclectic approach to life. I look at many things, explore many ideas and experiences, and I take what works and leave the rest. est is only one of many helpful experiences I have had in my 49 years.

I took the Myers-Briggs Type Indicator (MBTI) at a science fiction convention in the early years of my marriage, when I was living in Alexandria, VA, in 1993 and 1994. It was given as part of a panel, and I took it again when I read "Do What You Are", a book about finding employment or a profession based on your MBTI personality type. The basics, if you have not encountered the MBTI before: there are four continuums in how people tend to interact with the world. Most people use both sides of each continuum, but are most comfortable on one side. The pairs are Extrovert/Introvert, Sensing/Intuiting, Thinking/Feeling, and Judging/Perceiving. (The use of these words in the MBTI context is not exactly the same as their dictionary definitions.) I am a strong ENFP. My husband was an ISTP. Understanding the differences in how we approached the world helped me learn why we were so different about socializing with other people and about our communication styles with each other. As an "I", John (as the book put it) "got his batteries charged" mostly by being alone; I, as an "E", got mine charged by being with other people. We went to conventions and parties, but he often wanted to leave well before I felt ready to go. Once we had two cars, we would each take our own to events. Even though I felt it wasted gas, it gave him the opportunity to "flee" once he had had enough of being with others, while I could come home at my leisure, and neither of us had to give up what made us happier and more comfortable. It also explained why he would not always respond immediately to a question: "I" people tend to figure out in their own minds what they want to say before they say anything aloud, while "E" people often start talking right away, and what they think becomes clearer to them as they speak. This is also a very useful data point for teachers. If they know about it, they can realize that the "I" kids need more time to come up with their answers, while the "E" kids put their hands in the air right away. They can then allow the "I" kids the time they need to respond to questions without concluding that they are not good students, or not as intelligent or knowledgeable as the "E" kids.

My boyfriend is an ENTJ. The source of some of the friction in our relationship became clear to me after I asked him to find out his Myers-Briggs type, which he had never done before. Gerry often asks me for a list of what I want to do in the course of my day, and how much time things will take. These are reasonable requests. The rub, however, comes from the fact that as a "J", he is uncomfortable not knowing the answers, while I, as a "P", am uncomfortable stating these things in advance and nailing things down; I prefer to leave things open-ended. He regarded what I said as concrete, whereas I regarded it as a guideline, not a definite plan or promise. In addition, I have always had a hard time judging how long things will take, and as a person with ADD I also get distracted easily, so it upset me when he would come home and ask what I'd gotten done, and it upset him when I hadn't done what I said I wanted to, or when things took longer than I said they would. Understanding the differences in our types has helped me see why this has been an area of friction, which leaves room for us to discuss it without feeling the need to blame each other for our preferred ways of doing things. I am clearer now about stating goals for the day without necessarily promising to do specific things, and I am working on figuring out how to allocate enough time for them. He understands that just because I tell him what I would like to do, it is not necessarily what I will end up doing. It's still a work in progress.

I want to be clear that I am not talking about using the types as excuses to get out of doing things, or for taking what other people feel is "too long" to get things done. The MBTI is merely another tool in my toolbox that helps me process how I and my loved ones function, and figure out how to improve.

I am curious to hear about other people's experiences: if you have done a personal-growth seminar such as est and/or taken the MBTI, did you also take away tools that have had an ongoing positive impact on your life and relationships? I look forward to hearing what people have to say in response to this article.