Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.
This sort of thinking seems bad:
me.INTRINSIC_WORTH = 99999999; // No matter what I do, this fixed property will remain constant.
This sort of thinking seems socially frowned upon, but accurate:
a.impactOnSociety(time) > b.impactOnSociety(time)
a.qualityOfCharacter > b.qualityOfCharacter // determined by things like altruism, grit, courage, self awareness...
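The contrast between these two mindsets can be sketched in code. This is only a toy illustration: the property names come from the snippets above, but the scoring function and its inputs are my invention.

```javascript
// Mindset 1: a fixed property, immune to any evidence.
const me = { INTRINSIC_WORTH: 99999999 };

// Mindset 2: a quantity computed from observations, which changes as
// new evidence comes in (traits like altruism, grit, courage...).
function qualityOfCharacter(observations) {
  if (observations.length === 0) return 0;
  const total = observations.reduce((sum, obs) => sum + obs.score, 0);
  return total / observations.length;
}

// New evidence changes the estimate; nothing changes the constant.
console.log(qualityOfCharacter([{ score: 7 }, { score: 9 }])); // 8
console.log(me.INTRINSIC_WORTH); // 99999999
```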
Similar points could be made by replacing a/b with [group of people]. I think it's terrible to say something like:
This race is inherently better than that race. I refuse to change my mind, regardless of the evidence brought before me.
But to me, it doesn't seem wrong to say something like:
Based on what I've seen, I think that the median member of Group A has a higher qualityOfCharacter than the median member of Group B. I don't think there's anything inherently better about Group A. It's just based on what I've observed. If presented with enough evidence, I will change my mind.
Credit and accountability seem like good things to me, and so I want to live in a world where people/groups receive credit for good qualities, and are held accountable for bad qualities.
I'm not sure though. I could see that there are unintended consequences of such a world. For example, such "score keeping" could lead to contentiousness. And perhaps it's just something that we as a society (to generalize) can't handle, and thus shouldn't keep score.
This is intended to be a pretty broad discussion of sports. I have some thoughts, but feel free to start your own threads.
tl;dr - My impression is that people here aren't very interested in sports. My impression1 is that most people have something to gain from both competitive and recreational sports. With competitive sports, you have to be careful not to overdo it. With recreational sports, the circumstances have to be right for it to be enjoyable. I also think that sports get a bad rep for being simple and dull. In actuality, there's a lot of complexity.
1 - Why does this have to sound bad?! I have two statements I want to make, and for each of them I want to qualify it by saying that it's an impression that I have. What is a better way to say this?
I love sports. Particularly basketball. I was extremely, extremely dedicated to it back in middle/high school. Actually, it was pretty much all I cared about (not an exaggeration). This may or may not be crazy... but I wanted to be the best player who's ever lived. That was what I genuinely aspired to and was working towards (~7th-11th grade).
My thinking: the pros practice, what, 5-6 hours a day? I don't care about anything other than basketball. I'm willing to practice 14 hours a day! I just need time to eat and sleep, but other than that, I value basketball above all else (friends, school...). Plus, I will work so much smarter than they do! The norm is to mindlessly do push ups and eat McDonalds. I will read the scientific literature and figure out what the most effective ways to improve are. I'm short and not too athletic, so I knew I was starting at a disadvantage, but I saw a mismatch between what the norm is and what my rate of improvement could be. I thought I could do it.
In some ways I succeeded, but ultimately I didn't come close to my goal of greatness. In short, I spent too much time on high-level actions such as researching training methods and not enough time on object-level work; and with school and homework, I simply didn't have enough time to put in the 14-hour days I envisioned. I was a solid high school player, but was nowhere near good enough to play college ball.
Intense work. I've gone through some pretty intense physical exercise. Ex. running suicides until you collapse. And then getting up to do more until you collapse again. It takes a lot of willpower to do that. I think willpower is like a muscle, and you have to train yourself to be able to work at such intensities. I haven't experienced anything intellectual that has required such intensity. Knowing that I am capable of working at high intensities has given me confidence that "I could do anything".
Ambition. The culture in athletic circles is often one of "I'm not content being where I am". There's someone above you, and you want to beat them out. I guess that sort of thing exists in academic and career circles as well, but I don't think it's the same (in the average case; there are certainly exceptions). What explains this? Maybe there's something very visceral about lining up across from someone, getting physically and unambiguously beaten, and letting your teammates and yourself down.
Confidence. Oftentimes, confidence is something you learn because you have to: if you're not confident, you won't perform, so you need to learn to be confident. But it's not just that; there's something else about the culture that promotes confidence (perhaps cockiness). Think: "I don't care who the opponent is, no one can stop me!".
Group Bonds. When you spend so much time with a group of people, go through exhausting practices together, and work as a team to experience wins and losses, you develop a certain bond that is enjoyable. It reminds me a bit of putting in long hours on a project and eventually meeting the deadline, but it isn't the same.
Other: There are certainly other things I'm forgetting.
All of that said, there are downsides that correspond with all of these benefits. My overarching opinion is "all things in moderation". Ambition can be poison. So can the habitual productivity that often comes with ambition. Sometimes the atmosphere can backfire and make you less confident. And sometimes teammates can bully and be cruel. I've experienced the good and bad extremes along all of these axes.
Honestly, I'm not quite sure when it's worth it and when it isn't. I think it often depends on the person and the situation, but I think that in moderation, most people have a decent amount to gain (in aggregate) by experiencing these things.
So far I've really only talked about competitive sports. Now I want to talk about recreational sports. With competitive sports, as I mention above, I think there's a somewhat fine line between underdoing it and overdoing it. But I think that line is a lot wider for recreational sports. I think it's wide enough such that recreational sports are very often a good choice.
One huge benefit of recreational sports is that it's a fun way to get exercise. You do/should exercise anyway; why not make a game out of it?
Part of me feels like sports are just inherently fun! I know that calling them inherently fun is too strong a statement, but I think that under the right circumstances, they often are fun (I think the same point can be applied to most other things as well).
In practice, what goes wrong?
- You aren't in shape. You're playing a pick up basketball game where everyone else is running up and down the court and you're too winded to breathe. That's no fun.
- Physical bumps and bruises. You're playing football and get knocked around, or perhaps injured.
- Lack of involvement.
- You're playing baseball. You only get to hit 1/18th of the time. And you are playing right field and no one ever hits it to you (for these reasons, I don't like baseball).
- You're playing soccer with people who don't know how to space the field and move the ball, and you happen to get excluded.
- You're playing basketball where each team has a ball hog who brings up the ball and shoots it every possession.
- Difficulty-skill mismatch. You're playing with people who are way too good for you, so it isn't fun. Alternatively, maybe you're way better than the people you're playing with and aren't being challenged.
- Other. Again, I'm sure there are things I'm not thinking of.
I sense that sports get a bit of an unfair rep for being simple and dull games. Maybe some are, but I think that most aren't.
Perhaps it's because of the way most people experience the game. Take basketball as an example. A lot of people just like to watch to see whether the ball goes in the hoop or not and cheer. Ie. they experience the game in a very binary way. Observing this, it may be tempting to think, "Ugh, what a stupid game." But what happens when you steelman?
I happen to know a lot about basketball, so I experience the game very differently. Here's an example:
Iguodala has the ball and is being guarded by LeBron. LeBron is playing close and is in a staggered stance. He's vulnerable and Iguodala should attack his lead foot. People (even NBA players) don't look at this enough! Actually no, he shouldn't attack: the weak side help defense looks like it's in position, and LeBron is great at recovery. Plus, you have to think about the opportunity cost. Curry has Dellavedova and could definitely take him. Meaning, if Delly plays off, Curry can take a shot, but if Delly plays him more tightly, Curry could penetrate and either score or set someone else up, depending on how the help defense reacts. That approach has a pretty high expected value. But actually, Draymond Green looks like he has JR Smith on him (who is much smaller), which probably has an even higher expected value than Curry taking Delly. But to get Green the ball they'd have to reverse it to the weak side, and they'd have to keep the court spaced such that the Cavs won't have an opportunity to switch a bigger defender on to Green. All of this is in contrast with running a motion offense or some set plays. And you also have to take into account the stamina of the other team. Maybe you want to attack LeBron on defense to make him work, get him tired, and make him less effective on offense (I think this is a great approach to take against Curry and the Warriors, because Curry isn't a good defender and is lethal on offense).
Hopefully you can see that the amount of information there is to process in any given second is extremely high... if you know what to look for. Personally, I've never played organized football. But after playing the video game Madden (and doing some further research), I've learned a good amount about how the game works. Now when I watch football, I know the intricacies of the game and am watching for them. The density of information + the excitement, skill and physicality makes these sports extremely enjoyable for me to watch. Alternatively, I don't know too much about golf and don't enjoy watching it. All I see when I watch golf is, "The ball was hit closer to the hole... the ball was hit closer to the hole... the ball was hit into the hole. This was a par 3, so that must have been an average performance."
So I think programming would not be possible without System I; without the gut. Now, I have absolutely no evidence to support that statement, but my gut tells me it's true, so I believe it.
I don't think he has "absolutely no evidence". In worlds where Douglas Crockford has a gut feeling about something related to programming, how often does that gut feeling end up being correct? Probably a lot more than 50% of the time. So according to Bayes, his gut feeling is definitely evidence.
The problem isn't that he lacks evidence. It's that he lacks communicable evidence. He can't say "I believe A because X, Y and Z." The best he could do is say, "just trust me, I have a feeling about this".
Well, "just trust me, I have a feeling about this" does qualify as evidence if you have a good track record, but my point is that he can't communicate the rest of the evidence his brain used to produce the resulting belief.
How do you handle a situation where you're having a conversation with someone and they say, "I can't explain why I believe X; I just do."
Well, as far as updating beliefs, I think the best you could do is update on the track record of the person. I don't see any way around it. For example, you should update your beliefs when you hear Douglas Crockford say that he has a gut feeling about something related to programming. But I don't see how you could do any further updating of your beliefs. You can't actually see the evidence he used, so you can't use it to update your beliefs. If you do, the Bayes Police will come find you.
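To make "update on the track record" concrete, here's a toy Bayes calculation. The function name and all of the numbers (a 50% prior, an expert who asserts true things 80% of the time and false things 20% of the time) are made up for illustration.

```javascript
// Posterior probability of X given that the person asserted X,
// treating their track record as the likelihoods (Bayes' rule).
function updateOnAssertion(prior, pAssertGivenTrue, pAssertGivenFalse) {
  const pAssert = prior * pAssertGivenTrue + (1 - prior) * pAssertGivenFalse;
  return (prior * pAssertGivenTrue) / pAssert;
}

// A Crockford-like expert with a strong track record: starting from a
// 50/50 prior, his bare assertion moves us to 80% confidence.
console.log(updateOnAssertion(0.5, 0.8, 0.2)); // 0.8
```

That's the sense in which "just trust me, I have a feeling about this" is evidence: the likelihood ratio comes entirely from the track record, not from any communicated reasons.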
Perhaps it's also worth trying to dig the evidence out of the other person's subconscious.
- If the person has a good track record, maybe you could say, "Hmm, you have a good track record so I'm sad to hear that you're struggling to recall why it is you believe what you do. I'd be happy to wait for you to spend some time trying to dig it up."
- Maybe there are some techniques that can be used to "dig evidence out of one's subconscious". I don't know of any, but maybe they exist.
Ok, now let's talk about what you shouldn't do. You shouldn't say, "Well if you can't provide any evidence, you shouldn't believe what you do." The problem with that statement is that it assumes that the person has "no evidence". This was addressed in Section 1. It's akin to saying, "Well Douglas Crockford, you're telling me that you believe X and you have a fantastic track record, but I don't know anything about why you believe it, so I'm not going to update my beliefs at all, and you shouldn't either."
Brains are weird and fantastic thingys. They process information and produce outputs in the form of beliefs (amongst other things). Sometimes they're nice and they say, "Ok Adam - here is what you believe, and here is why you believe it". Other times they're not so nice and the conversation goes like this:
Brain: Ok Adam, here is what you think.
Adam: Awesome, thanks! But wait - why do I think that?
Brain: Fuck you, I'm not telling.
Adam: Fuck me? Fuck you!
Just because brains can be mean doesn't mean they should be discounted.
I'm reading Dan Ariely's book Predictably Irrational. The story of what got him interested in rationality and human biases goes something like this.
He was the victim of a really bad accident, and had terrible burns covering ~70% of his body. The experience was incredibly painful, and so was the treatment. For treatment, he'd have to bathe in some sort of disinfectant, and then have bandages ripped off his exposed flesh afterwards, which was extremely painful for him.
The nurses believed that ripping the bandages off quickly would produce the least amount of pain for the patient. They thought the short, intense bursts of pain were less (in aggregate) than the less intense but longer periods of pain that a slower removal would produce. However, Dan disagreed about what would produce the least pain for patients; he thought that a slower removal would be better. Eventually, he found scientific research that supported his theory.
But he was confused. These nurses were smart people and had a ton of experience giving burn victims baths - shouldn't they have figured out by now what approaches best minimize patient pain? He knew their failure wasn't due to a lack of intelligence, and that it wasn't due to a lack of sympathy. He ultimately concluded that the failure was due to inherent human biases. He then became incredibly interested in this and went on to do a bunch of fantastic research in the area.
In my experience, the overwhelming majority of people are uninterested in rationality, and a lot of them are even put off by it. So I'm curious about how members of this incredibly small minority of the population became who they are.
Part of me thinks that extreme outputs are the result of extreme inputs. Like how Dan's extreme passion for his work has (seemingly) originated from his extreme experiences with pain. With this rule-of-thumb in mind, when I see someone who possesses some extreme character trait, I expect there to be some sort of extreme story or experience behind it.
But another part of me thinks that this doesn't really apply to rationality. I don't have much data, but from the limited experience I've had getting to know people in this community, "I've just always thought this way" seems common, and "extreme experiences that motivated rational thinking" seems rare.
Anyway, I'm interested in hearing people's "rationalist backstories". Personally, I'm interested in reading really long and detailed backstories, but am also interested in reading "just a few paragraphs". I'm also eager to hear people's thoughts on my "extreme input/output" theory.
Here are my thoughts on the "Why don't rationalists win?" thing.
I think it's pretty clear that rationality helps people do a better job of being... less wrong :D
But seriously, I think that rationality does lead to very notable improvements in your ability to have correct beliefs about how the world works. And it helps you to calibrate your confidence. These abilities are useful. And I think rationality deserves credit for being useful in this area.
I'm not really elaborating here because I assume that this is something that we agree on.
However, I should note that rationalists aren't really making new and innovative discoveries (the non-superstar ones, anyway), and that this may feed the "why don't rationalists win?" thing. I think a big reason for this lack of progress is that a) we think about really, really, really difficult things! And b) we beat around the bush a lot. Big topics are often brought up, but I rarely see people say, "Ok, this is a huge topic, so in order to make progress we're going to have to sit down for many hours and be deliberate about this. But I think we can do it!". Instead, these conversations seem to be just people having fun, procrastinating, and never investing enough time to make real progress.
I also think that rationality is doing a great job in helping people to do a better job at being altruistic. Another thing that:
- I'm going to assume that we mostly agree on, and thus not really elaborate.
- I think deserves to be noted and given credit.
- Is useful.
For people with altruistic goals, rationality is helping them to achieve their goals. And I think it's doing a really good job at this. But I also think that it doesn't quite feel like the gains being made here are so big. I think that a major reason for this is that the gains are:
1. High level.
2. Likely to be realized far in the future.
3. The sort of thing that you don't personally experience (think: buying a poor person lunch vs. donating money to people in Africa).
But we all know that (1), (2), and (3) don't actually make the gains smaller; they just make them feel smaller. I get the impression that the fact that the gains feel smaller results in an unjustified increase in the "rationalists don't win" feeling.
I get the impression that lack of success plays a big role in the "why don't rationalists win?" thing.
I guess an operational definition of success for this section could be "professional, financial, personal goals, being awesome...".
I don't know much about this, but I would think and hope that rationality helps people to be notably more successful than they otherwise would be. I don't think rationality is at the point yet where it could make everyone millionaires (metaphorically and/or literally). But I think that a) it could get there, and b) we shouldn't trivialize the fact that it does (I'm assuming) make people notably more successful than they otherwise would be.
But still, I think that there are a lot of other factors that determine success, and given their difficulty/rarity, even with rationality in your toolbox, you won't achieve that much success without these things.
1. Plain old hard work. I'm a huge believer in working smart, but I also think that given a pretty low and relatively sufficient level of smartness in your work, it's mostly a matter of how hard you work. You may ask yourself, "Take someone who studies really hard, but is lacking big time when it comes to rationality - wouldn't they be unsuccessful?". I think an important (and sad) point is that at this point in history, you can be very successful with domain-specific knowledge but no rationality. And so people who work really hard but don't have an ounce of rationality often end up being very good at what they do, and very successful. I think we'll reach a point where things progress enough that rationality does in fact become necessary (the people with domain-specific knowledge but no rationality will fail).
2. Aptitude/starting early. I'm not sure to what extent aptitude is actually a thing. I sense that a big part of it is simply how early you started, when your brain was at that "sponge stage". Regardless, aptitude/starting early seems to be pretty important. Someone who works hard but started too late will certainly be at a disadvantage.
3. Opportunity. In one sense, not much will help you if you have to work 3 jobs to survive (you won't have much time for self-improvement or other necessary investments of time). In another sense, there's the idea that "you are who you surround yourself with". So people who are fortunate enough to grow up around other smart and hard-working people will have had the opportunity to be socially pressured into doing the same. I think this is very underrated, but also very surmountable. And in another sense, some people are extremely fortunate and are born into a situation with a lot of money and connections.
4. Ambition/confidence. Example: imagine a web developer who has rationality + (1) + (2) + (3) but doesn't have (4). He'll probably end up being a good web developer, but he might not end up being a great one. The reason is that he might not have the ambition or confidence to think to pursue certain skills. He may think, "that stuff is for truly smart people; I'm just not one of those people". And he may not have the confidence to pursue the more general and wide-ranging goal of being a great software engineer. He may not have the confidence to learn C and other lower-level material. Note that there's a difference between not having the confidence to try, and not having the confidence to even think to try. I think the latter is a lot more common, and it blends into "ambition territory". On that note, this hypothetical person may not think to pursue innovative ideas, or get into UX, or start a startup and do something bigger.
I get the impression that lack of happiness plays a big role in the "why don't rationalists win?" thing.
Luke talked about the correlates of happiness in How to Be Happy:
Factors that don't correlate much with happiness include: age, gender, parenthood, intelligence, physical attractiveness, and money (as long as you're above the poverty line). Factors that correlate moderately with happiness include: health, social activity, and religiosity. Factors that correlate strongly with happiness include: genetics, love and relationship satisfaction, and work satisfaction.
One thing I want to note is that genetics seem to play a huge role, and that plus the HORRIBLE hedonic adaptation thing makes me think that we don't actually have that much control over our happiness.
Moving forward... and this is what motivated me to write this article... the big determinants of happiness seem like things that are sort of outside rationality's sphere of influence. I don't believe that, and it kills me to say it, but I thought it'd make more sense to say it first and then amend it (a writing technique I'm playing around with and am optimistic about). What I really believe is:
- Things like social and romantic relationships are tremendously important factors in one's happiness. So is work satisfaction (in brief: autonomy, mastery and purpose).
- These are things that you can certainly get without rationality. Non-rationalists have set a somewhat high bar for us to beat.
- Rationality certainly COULD do wonders in this area.
- But the art hasn't progressed to this point yet. Doing so would be difficult. People have been trying to figure out the secrets of happiness for 1000s of years, and though I think we've made some progress, we still have a long way to go.
- Currently, I'm afraid that rationality might be acting as a memetic immune disorder. There's a huge focus on our flaws and how to mitigate them, and this leads to a lot of mental energy being spent thinking about "bad" things. I think (and don't know where the sources are) that a positive/optimistic outlook plays a huge role in happiness. "Focusing on the good." Rationality seems to focus a lot on "the bad". Rationality also seems to make people feel unproductive and wrong for not spending enough time focusing on and fixing this "bad", and I fear that this is overblown and leads to unnecessary unhappiness. At the same time, focusing on "the bad" is important: if you want to fix something, you have to spend a lot of time thinking about it. Personally, I struggle with this, and I'm not sure where the equilibrium point really is.
Socially, LessWrong seems to be a rather large success to me. My understanding is that it started off with Eliezer and Robin just blogging... and now there are thousands of people having meet-ups across the globe. That amazes me. I can't think of any examples of something similar.
Furthermore, the social connections LW has helped create seem pretty valuable to me. There seem to be a lot of us who are incredibly unsatisfied with normal social interaction, or sometimes just plain old don't fit in. But LW has brought us together, and that seems incredible and very valuable to me. So it's not just "it helps you meet some cool people". It's "it's taken people who were previously empty, and has made them fulfilled".
Still though, I think there's a lot more that could be done. Rationalist dating website?* Rationalist pen pals (something that encourages the development of deeper 1-on-1 relationships)? A more general place that "encourages people to let their guard down and confide in each other"? Personal mentorship? This is venturing into a different area, but perhaps there could be some sort of professional networking?
*As someone who constantly thinks about startups, I'm liking the idea of "dating website for social group X that has a hard time relating to the rest of society". It could start off with X = 1, and expand, and the parent business could run all of it.
So, are we a failure? Is everything moot because "rationalists don't win"?
I don't think so. I think that rationality has had a lot of impressive successes so far. And I think that it has a ton of potential. But it certainly hasn't made us superhuman. I get the impression that because rationality has so much promise, we hold it to a crazy high standard and sometimes lose sight of the great things it provides. And then there's also the fact that it's only, what, a few decades old?
(Sorry for the bits of straw manning throughout the post. I do think that it lead to more effective communication at times, but I also don't think it was optimal by any means.)
Programmers do something called Test Driven Development. Basically, they write tests that say "I expect my code to do this", then write more code, and if the subsequent code they write breaks a test they wrote, they'll be notified.
Wouldn't it be cool if there was Test Driven Thinking?
- Write tests: "I expect that this is true."
- Think: "I claim that A is true. I claim that B is true."
- If A or B causes any of your tests to fail, you'd be notified.
- It'd be awesome if you could apply TDT and be notified when your tests fail, but this seems very difficult to implement.
- I'm not sure what a lesser but still useful version would look like.
- Maybe this idea could serve as some sort of intuition pump for intellectual hygiene ("What do you think you know, and why do you think you know it?"). Ie. having understood the idea of TDT, maybe it'd motivate/help people apply intellectual hygiene. Which is sort of like a manual version of TDT, where you're the one constantly running the tests.
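Here's a toy sketch of what that manual version might look like in code. All of the names and the example are my invention, and real beliefs obviously aren't simple predicates; this is just the intuition pump, made executable.

```javascript
// Registered expectations ("tests") about how the world works.
const tests = [];

// Write a test: "I expect that this is true."
function expect(description, predicate) {
  tests.push({ description, predicate });
}

// Make a claim; returns the descriptions of any tests it fails.
function claim(beliefs) {
  return tests
    .filter(t => !t.predicate(beliefs))
    .map(t => t.description);
}

// Test: if I studied hard, my grades should have improved.
expect("studying improves grades", b => !b.studiedHard || b.gradesImproved);

// Claiming "I studied hard but my grades didn't improve" trips the test,
// notifying me that something I think I know is now in question.
console.log(claim({ studiedHard: true, gradesImproved: false }));
// → ["studying improves grades"]
```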
I just finished reading a fantastic Wait But Why post: How Tesla Will Change The World. One of the things that was noted is that the people in the Auto and Oil industries are trying to delay the introduction of Electric Vehicles (EVs) so they could make more money.
The post also explains how important it is that we become less reliant on oil.
- Because we're going to run out relatively soon.
- Because it's causing global warming.
By continuing to delay EVs, these industry people get to:
- Make some more money, which gives them and their families a marginally more comfortable life.
- Not get a sense of purpose out of their careers.
- Probably feel some sort of guilt about what they do.
- Avoid the short-term discomfort of changing jobs/careers.
Taking those one at a time:
- Because of diminishing marginal utility, I doubt that the extra money is making them much happier. I'm sure they're pretty well off to begin with. It could be the case that they're so used to their lifestyle that they really do need the extra money to be happy, but I doubt it.
- Autonomy, mastery and purpose are three of the most important things to get out of your career. There seems to be a huge opportunity cost to not working somewhere that provides you with a sense of purpose.
- To continue that thought, I'm sure they feel some sort of guilt for what they're doing. Or maybe not. But if they are, that seems like a relatively large cost.
- I understand that there's probably a decent amount of social pressure on them to conform. I'm sure that they surround themselves with people who are pro-oil and anti-electric. I'm sure that their companies put pressure on them to perform. I'm sure that they have families and all of that, and starting something new might be difficult. But these don't seem to be large enough costs to make their choices worthwhile. A big reason I get this impression is that these costs are so short-term.
I just saw that Donald Trump is running for president. Which led me to the following thought: would any of the big names in tech have a chance at being elected president of the US? Elon Musk? Sergey Brin? Jeff Bezos? Reid Hoffman? Peter Thiel? Edit: Bill Gates?
Some follow up questions/thoughts:
- As far as maximizing altruistic impact goes, would it be a good idea for them to become president?
- Do these people care about maximizing altruistic impact? To what extent? If so/enough, why not do it?
- What other "sane" people have enough reputation in the public eye to have a chance at acquiring a lot of political power? My first thought was tech people, but I'm sure there are others. Big hedge fund managers? Ray Dalio? Or maybe some famous scientists?
- What does EA have to say about acquiring political power?
Edit: hypothetically, if one of these big-name tech people were to try to gain political power, how should they go about doing so?
This article is something that has been in my head for a while. I hadn't planned on doing a write-up so soon; I wanted to take the time to a) refine my ideas and b) figure out how to express them clearly before posting. But the recent post Less Wrong lacks direction made me change my mind. My thinking now is that I overestimated the downside (wasting people's time with a less-than-fully-thought-out post) and that there's enough value to justify posting a rough draft now.
LessWrong has been one of the most amazing things I've experienced in my life.
- I have learned a ton, and have "leveled up" quite a bit.
- Knowing that there are this many other relatively rational people in the world and being able to interact with them is a truly amazing thing.
With that said, here are some ideas I have for improving LW:
- A way to discuss ideas for the site, vote on them, and incentivize the generation of good ideas. I sense that having this would be huge. a) I sense that there are a lot of good ideas out there in people's heads that haven't been shared. b) I sense that by discussing things, there could be a lot of refinement of current ideas, and a lot of generation of new ones.
- More generally, my impression is that it'd be a good idea to subdivide sections for posts. Right now it's pretty much Main, Discussion or Open Thread. Ex. someone who has an idea to improve LW might not think it's "Discussion worthy" (or even "Open Thread worthy"), but I sense that if there were a section explicitly for "LW Ideas", they'd be a lot less reluctant to post. More generally, it'd justify more "bite sized posts" rather than requiring a full write-up.
- One example of a subsection that I think would be cool is a Personal Advice section. The ability to post anonymously seems like it'd be a useful feature here. Other ideas for subsections: AMA!, Brainstorming/Unrefined Thoughts, I Don't Understand X, Contrarian Thoughts.
- Social coordination:
- Apartments/living together.
- "What are you currently learning? What do you want to learn?". so8res recommends pairing up, and I agree.
- Geographical map of users to facilitate friendships and/or dating. (This already exists. But it seems that a low proportion of LW users added a pin on the map. My impression is that because of network effects, the usefulness of this is very much a function of how many users there are. Also, I sense that there'd need to be some sort of different UI with some sort of organization.)
- Online chat. Like Facebook. I think it'd be a) cool and b) sometimes useful.
- Project ideas. There are a lot of smart, skilled and ambitious people here who want to do good things. If LW made it easier to coordinate and work with people, I could see it having a huge impact.
- Crowdsource the refinement of posts.
- Maybe have an answer wiki for each article that summarizes the main points.
- Maybe let the author award karma to people who submit a diagram of something explained in the article (I'm a big fan of explaining things visually). Along similar lines, maybe do the same thing for people who submit relevant YouTube videos. Ex. I think that this would be a relevant clip to add to an article about expected value (beware: cringeworthy). (Again, I'm really not a big fan of writing as a medium)
- Maybe allow collaboration on drafts. And allow the author to award karma to collaborators.
- Side comments. I really think that for a lot of scenarios, this is a much better UI. But I also think there are use cases for the traditional comments at the bottom of the page, and so there should be both.
- Make use of some sort of debate tool. I think there are a lot of improvements that could be made to the current approach of having nested comments. It might be sufficient for the level of conversation elsewhere on the internet, but not here.
- I should emphasize that this seems like it'd be a large and difficult undertaking.
- But I should also emphasize how important I think it is. Media For Thinking The Unthinkable largely expresses my views here. I think that the mediums we use to write, think and communicate play a large and very underrated role in determining how well we can think. As a society, we don't seem to really recognize this, and we don't seem to have made much progress as far as inventing such tools goes. The importance of such tools goes way beyond LessWrong, but I guess I'm just noting here that LW would benefit greatly from them. I don't think that there are many legitimately deep conversations on LW, and I think that the limitations of nested comments are a big part of the reason why.
- Along similar lines, I think it's pretty important for there to be a way to highlight and take notes on articles (currently, most people don't). I've been using scrible. Come to think of it, scrible actually isn't that bad of a solution, but I think it'd be awesome if there were a better way to do this built into the site. (This is another thing I'd like to see across the internet, but I digress...)
- Have a "LessWrong Ideas" thread. Preferably link to it on the sidebar so we don't get too many repeat ideas, so the good ideas have enough time to be voted up, and so ideas don't get "lost in time".
- Have a "Project Ideas" thread. Preferably link to it on the sidebar, for similar reasons.
- Have a Project Ideas Google Doc. This would be a list of the more serious and vetted ideas, with brief summaries and required skills, and you could add your name to the list of people interested in working on each one.
- Link to the map of community members on the sidebar. Give, say, 50 karma points for adding your data point. I'm not sure how the data would be used for social coordination though. It'd be incredible if there were an API.
- Actually, maybe it'd be a better idea to create a site that allows users to input their data point on the map, and it'd create the API for us. And you could add things like contact info, interested in finding a roommate? dating? friends/fun activities? On second thought, maybe this is getting too far from the idea of a version 1.
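To make the map-with-API idea above a bit more concrete, here is a minimal sketch of what one "pin" record and a naive nearby-users query might look like. Everything here is hypothetical: the field names (`contact_info`, `interests`, etc.) and the `entries_near` helper are made up for illustration, and nothing like this currently exists on the site.

```python
from dataclasses import dataclass, field

@dataclass
class MapEntry:
    """One hypothetical pin on the community map."""
    username: str
    latitude: float
    longitude: float
    contact_info: str = ""  # e.g. an email address, if the user opts in
    # e.g. "roommate", "dating", "friends/fun activities"
    interests: list = field(default_factory=list)

def entries_near(entries, lat, lon, max_degrees):
    """Naive filter: entries within a lat/long bounding box (ignores geodesy)."""
    return [e for e in entries
            if abs(e.latitude - lat) <= max_degrees
            and abs(e.longitude - lon) <= max_degrees]

# Example: two users, one near Berkeley, one in New York.
entries = [
    MapEntry("alice", 37.87, -122.27, interests=["roommate"]),
    MapEntry("bob", 40.71, -74.01, interests=["friends/fun activities"]),
]
nearby = entries_near(entries, 37.80, -122.40, max_degrees=1.0)
print([e.username for e in nearby])  # -> ['alice']
```

A real version would want proper great-circle distance and privacy controls, but even a flat list of records like this, exposed via an API, would be enough to build roommate/dating/meetup tools on top of.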
- Discuss a) the idea of having subsections (ex. Personal Advice, Unrefined Thoughts), and b) which ones we'd like to see. Then create and manage threads based on interest and traction.
- Have a Google Doc to help people learning the same things pair up. Potential information to include: what you want to learn, how much time per week you want to spend, how many months you'd like to spend learning, fields in which you're knowledgeable (ex. math, psychology, genetics...).
There's a lot that I really like about communicating via writing. Communicating in person is sometimes frustrating for me, and communicating via writing addresses a lot of those frustrations:
1) I often want to make a point that depends on the other person knowing X. In person, if I always paused and did the following, it'd add a lot of friction to conversations: "Wait, do you know X? If yes, good, I'll continue. If no, let me think about how to explain it briefly. Or do you want me to explain it in more depth? Or do you want to try to proceed without knowing X and see how it goes?". But if I don't do so, then it risks miscommunication (because the other person may not have the dependency X).
In writing, I could just link to an article. If the other person doesn't have the dependency, they have options. They could try to proceed without knowing X and see how it goes. If it doesn't work out, they could come back and read the link. Or they could read the link right away. And in reading the link, they have their choice of how deeply they want to read. Ie. they could just skim if they want to.
Alternatively, if you don't have something to link to, you could add a footnote. I think that a UI like Medium's side comments is very preferable to putting the footnotes at the bottom of the page. I hope to see this adopted across the internet some time in the next 5 years or so.
2) I think that in general, being precise about what you're saying is actually quite difficult/time consuming*. For example, I don't really mean what I just said. I'm actually not sure how often that it's difficult/time consuming to be precise with what you're saying. And I'm not sure how often it's useful to be precise about what you're saying (or really, more precise...whatever that means...). I guess what I really mean is that it happens often enough where it's a problem. Or maybe just that for me, it happens enough where I find it to be a problem.
Anyway, I find that putting quotes around what I say is a nice way to mitigate this problem.
Ex. It's "in my nature" to be strategic.
The quotes show that the word inside them isn't precisely what I mean, but that it's close enough to what I mean that it should communicate the gist of it. I sense that this communication often happens through empathetic inference.
*I also find that I feel internal and external pressure to be consistent with what I say, even if I know I'm oversimplifying. This is a problem and has negatively affected me. I recently realized what a big problem it is, and will try very hard to address it (or really, I plan on trying very hard but I'm not sure blah blah blah blah blah...).
Note 1: I find internal conversation/thinking as well as interpersonal conversation to be "chaotic". (What follows is rant-y and not precisely what I believe. But being precise would take too long, and I sense that the rant-y tone helps to communicate without detracting from the conversation by being uncivil.) It seems that a lot of other people (much less so on LW) have more "organized" thinking patterns. I can't help but think that that's BS. Well, maybe they do, but I sense that they shouldn't. Reality is complicated. People seem to oversimplify things a lot, and to think in black and white. When you do that, I could see how one's thoughts could be "organized". But when you really try to deal with the complexities of reality... I don't understand how you could simultaneously just go through life with organized thoughts.
Note 2: I sense that this post somewhat successfully communicates my internal thought process and how chaotic it could be. I'm curious how this compares to other people. I should note that I was diagnosed with a mild-moderate case of ADHD when I was younger. But that was largely based off of iffy reporting from my teachers. They didn't realize how much conscious thought motivated my actions. Ie. I often chose to do things that seem impulsive because I judged it to be worth it. But given that my mind is always racing so fast, and that I have a good amount of trouble deciding to pay attention to anything other than the most interesting thing to me, I'd guess that I do have ADHD to some extent. I'm hesitant to make that claim without ever having been inside someone else's mind before though (how incredibly incredibly cool would that be!!!) - appearances could be deceiving.
3) It's easier to model and traverse the structure of a conversation/argument when it's in writing. You could break things into nested sections (which isn't always a perfect way to model the structure, but is often satisfactory). In person, I find that it's often quite difficult for two people (let alone multiple people) to stay in sync with the structure of the conversation. The outcome of this is that people rarely veer away from extremely superficial conversations. Granted, I haven't had the chance to talk to many smart people in real life, and so I don't have much data on how deep a conversation between two smart people could get. My guess is that it could get a lot deeper than what I'm used to, but that it'd be pretty hard to make real progress on a difficult topic without outlining and diagramming things out. (Note: I don't mean "deep as in emotional", I mean "deep as in nodes in a graph")
There are also a lot of other things to say about communicating in writing vs. in person, including:
- The value of the subtle things like nonverbal communication and pauses.
- The value of a conversation being continuous. When it isn't, you have to load the context back into your head over and over again.
- How much time you have to think things through before responding.
- I sense that people are way more careful in writing, especially when there's a record of it (rather than, say PM).
This is a discussion post, so feel free to comment on these things too (or anything else in the ballpark).