Proposal: Don't fear GATTACA. A post where I explain why people are afraid of the dystopia featured in GATTACA, and why these fears are unjustified.
Proposal: You don't need politics. In which I argue that keeping up with the news and political controversies is neither a duty nor effective altruism. Intended to counteract the "Rah political activism!" message I got in school.
I endorse this approach. Ever since high school (so for about 12 years), I have deliberately stayed ignorant of all politics local to my country and of local news. I absolutely never watch or read the news, and I rarely find myself discussing these topics with my friends.
News and politics are designed to generate outrage and promote anti-rationalism and epistemic dark arts. They also strongly select for bad and depressing news, and for non-representative surprising incidents. On the other hand, the value I get from knowing the news is very small (e.g. in terms of changing my behavior).
Sometimes politics steps into your life. For example, you want to teach people rationality, but a religious political party just made religious education mandatory in schools. Or you invented a better way to teach maths to kids, but you can't use it, because in your country all schools must strictly follow the plans written by the government. Etc. The idea is that political power can prevent you from doing the right thing, so unless your plan is just to break the law and go to jail, you must somehow get involved with politics.
Of course, you could also just give up on this specific topic and choose some other topic where there is no direct political opposition. Or you could just write a lot about your idea and hope that someone else will notice it and do the dirty work for you.
I would like to see a post collecting examples from history and great literature of rationality or irrationality that illuminate key LW principles.
A while back, I read "The Little Book of Common Sense Investing" by John Bogle, the founder of Vanguard and creator of the first index fund. It's an analysis of why index funds are a better option than actively managed mutual funds.
I've had some highly-upvoted comments on the merits of index funds in the past, so I've considered doing a writeup to give LessWrong a summary, since it seems that a lot of people around here know that index funds are supposed to be good, but don't really understand all the reasons why. Is there any interest in this?
Edit: Thanks for the positive response. I'll work on it and try to get it out in the next couple weeks. Does anyone have any input on whether it would be appropriate to post in Main?
I would like to read it, as long as there is more to it than just the basic idea of "you can't reliably be better than the market, and index funds copy the market".
For example, there are many index funds. I know they are supposed to be better than all the other options, but how do I compare them against each other? Or, how much of the historical success of index funds is survivorship bias, in that the USA simply was not destroyed in a war while many other countries were? (If you had invested your money in 1900 in Russian or German index funds, how much would you have today? Suppose you had put 1/3 in Russian, 1/3 in German, and 1/3 in American index funds: how much would you have today? Is the advice for your readers to pick a random country, to always pick the USA because that worked in the past, or to diversify internationally?)
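To make the three-way-split question concrete, here is a toy sketch of the calculation I mean. The growth factors are made-up placeholders rather than historical returns; the only real assumption is that pre-revolutionary Russian securities ended up worth roughly nothing, which is why one market going to zero dominates the outcome.

```python
# Toy illustration of the 1/3 Russia, 1/3 Germany, 1/3 USA split since 1900.
# Growth factors are PLACEHOLDERS, not historical data.
initial = 1000.0  # hypothetical amount invested in 1900

growth = {
    "Russia": 0.0,    # assumption: nationalized, investors recovered ~nothing
    "Germany": 0.5,   # placeholder: heavy losses through two world wars
    "USA": 40.0,      # placeholder: "the market did very well"
}

final = sum((initial / 3) * g for g in growth.values())
print(f"Final value of the 1/3-1/3-1/3 portfolio: {final:.0f}")
# Compare with putting everything in the USA, i.e. picking the survivor in hindsight:
print(f"All-in USA: {initial * growth['USA']:.0f}")
```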
Some questions I'd love to see addressed in posts:
How much can we raise the sanity waterline without transhumanism (i.e. assuming current human biology is a constant)?
Is the sanity waterline rising?
What is the best way to introduce rationality to different groups of people/subcultures?
Do LW and other rationality reading materials unnecessarily signal nerdiness so strongly that it limits their effectiveness and ability to spread?
What are the best things someone with very low tech skills can do for the rationality movement, and for the world?
If LW is declining/failing, why is this happening, could this have been prevented, and are other rationality-related communities infected with the same problem?
If LW is declining/failing
I like the idea in general, I just recommend caution in evaluating whether LW is declining. I mean, it's obvious from the context of this thread that many people feel so, however...
There was a time when Eliezer wrote a new article every day, for a year. And I loved reading those articles, but writing them was not how Eliezer wanted to spend the rest of his life, so it is natural that he gradually stopped. This feels like a decline from the "less new cool stuff to read every day" point of view. But on the other hand... all the stuff Eliezer wrote is still there. We are not in the newspaper business; the old copies are not automatically thrown away, and the content doesn't have to be repeated every year. It's being collected into an e-book now (by the way, how's the progress there?). There is CFAR as a separate organization; they do seminars. There are meetups in many countries around the world.
What I'm saying is that the important part is the rationalist movement, not merely its website. If people at meetups actually accomplish something, that is more awesome than debating online. So we shouldn't judge the whole thing only by the daily number of new articles in Discussion. Ironically, the fact that until recently the Discussion page was cluttered with meetup announcements was a signal of success (and of a bad design, which later got fixed). Now, if the number of LW meetups were declining, that would be something to worry about; but I haven't looked at the specific numbers.
Proposal: Quantified risks of gay sex. As a bi-curious man, I have some interest in gay sex, but I'm also worried about STDs. As a nerd, I'd like to weigh my subjective desire to have gay sex against the objective risks of STDs. This has been surprisingly difficult.
The risks of lesbian sex don't need quantification because they're basically zero. The risks of straight sex have been decently-enough quantified here and here. But there's no comparable guide for gay sex.
All of the websites for gay men give vague advice like "wearing a condom is safer than not wearing a condom." Sure, but does wearing a condom make gay sex safe enough to rationally partake in, or is it like wearing a seatbelt while you're drunk driving? I'd like to write a post that tells men how risky gay sex is and how much of that risk can be avoided. It would help men decide not just whether they should have gay sex, but whether they should get circumcised or insist their partners be tested.
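Part of what I have in mind is arithmetic like the following: given a per-act transmission probability, how does risk accumulate over repeated encounters? A minimal sketch; the probabilities here are placeholders, not actual figures, and the real post would need sourced numbers.

```python
# Cumulative risk over n independent encounters:
#   risk(n) = 1 - (1 - p)**n
# where p is the per-act transmission probability.
# The values of p below are PLACEHOLDERS, not real epidemiological figures.
def cumulative_risk(p_per_act: float, n_acts: int) -> float:
    return 1 - (1 - p_per_act) ** n_acts

p_no_condom = 0.01   # placeholder per-act risk without a condom
p_condom = 0.002     # placeholder: condom cuts per-act risk ~5x

for n in (1, 10, 100):
    print(f"{n:>3} acts: no condom {cumulative_risk(p_no_condom, n):.1%}, "
          f"condom {cumulative_risk(p_condom, n):.1%}")
```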
This post could be a hazard if it exposes Less Wrong to legal risk, or if it says something boneheaded and damages the forum's credibility. So I'd probably need some help researching and editing it and I'd want to show it to whoever is in charge of these forums before I post it.
I would be interested in helping with this post. (I am a gay man who does not partake in casual sex, primarily because of the health risks.) From what I recall when I looked into this last, there's huge value in breaking out the various kinds of sex, because of huge risk differences.
Proposal: If you're depressed, maybe your life sucks. A meta-contrarian post where I argue that you can't always "have a positive attitude" towards bad things in your life, and that fixing your life's problems might be a better strategy than learning to cope with them.
How do you give compliments effectively? When do you give them?
What kind of compliments are best? How do you train yourself to give them? When do you give them? What are the effects on the person receiving the compliment?
Review of the literature on the effectiveness of various strategies to mitigate cognitive biases
Spending five minutes thinking about it:
Boredom. What it is, how to notice it, what to do about it. Eliezer wrote some about this in Fun Theory but it could probably be expanded on. (and the topic is amusing to me for...Reasons)
To recurse on the subject: Idea generation. How new ideas work, how to come up with them on demand, how to separate good ones from bad ones.
Dealing with irrationality in others. e.g. if you're part of a group that's mindkilling itself, and you can't just walk away, how can you de-mindkill the group? Successfully, that is. Yvain wrote a bit about this here with regard to individuals, but there's probably room for more.
...I had three more suggestions that I thought of while away from the keyboard for a few minutes, but they all went out of my head before I made it back, even though I specifically tried to note them. If someone knows why this sort of thing happens, it would be nice to write it up, because it happens to me all the time and it drives me insane.
(yes, I know the fix is to carry a notebook everywhere and use it religiously. I still want to know why it happens)
I would suggest the fix is to carry a smartphone at all times rather than a notebook. The phone fits in your pocket and odds are you might need one anyway.
I'm teaching at a summer school intended to introduce philosophy students to science and the scientific method. I'm teaching a course on probability and statistics, and also one on physics. In both courses, I'm emphasizing conceptual issues that philosophy students might find interesting and relevant. Maybe I'll try polishing up some of my lecture notes and posting them here.
I'm broadly interested in the question: what physical limits, if any, will a superintelligence face? What problems will it have to solve, and which ones will it struggle with?
Eliezer Yudkowsky has made the claim "A Bayesian superintelligence, hooked up to a webcam, would invent General Relativity as a hypothesis—perhaps not the dominant hypothesis, compared to Newtonian mechanics, but still a hypothesis under direct consideration—by the time it had seen the third frame of a falling apple. It might guess it from the first frame, if it saw the statics of a bent blade of grass.”
I can't see how this is true. It isn't obvious to me that one could conclude anything from a video like that without substantial prior knowledge of mathematical physics. Seeing a red, vaguely circular object move across a screen tells me nothing unless I already know an enormous amount.
We can put absolute physical limits on the energy cost of a computation, at least in classical physics. How many computations would we expect an AI to need in order to do X or Y? Can we effectively box an AI by only giving it a 50W power supply?
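For the 50W question, the relevant classical bound is Landauer's principle: erasing one bit at temperature T costs at least kT ln 2 of energy. A rough back-of-the-envelope sketch, assuming room temperature:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assume room temperature, K
power = 50.0         # the hypothetical 50 W power budget

landauer_per_bit = k_B * T * math.log(2)        # ~2.87e-21 J per bit erased
max_bit_erasures_per_sec = power / landauer_per_bit

print(f"Landauer cost per bit: {landauer_per_bit:.2e} J")
print(f"Upper bound on irreversible bit operations: "
      f"{max_bit_erasures_per_sec:.2e} per second")
# ~1.7e22 irreversible bit operations per second at the limit -- far beyond
# today's hardware, and reversible computation isn't even bounded by this.
```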
I think there are some interesting questions at the intersection of informati...
One thing I've been thinking about would be posts specifically designed to elicit discussion, rather than teaching about something.
To give an example, someone on /r/LessWrong posted a question about creating a rationalist sport. It's fun and interesting to talk about, and it's a good way of exercising things like "holding off on proposing solutions."
I keep a list, in Workflowy, of titles for posts almost none of which I've turned into posts. (I generally recommend using Workflowy for capture in this way.) Here are the ones where I at least remember what the point of the post was supposed to be:
It was going to be something like a guide to what kind of mathematics it might be good for rationalists to learn, but when I started writing the post I realized it was a gigantic project and I didn't care about it enough to actually give it the time it deserved. Sorry!
Be Impressed by the Status Quo.
It was going to be a post about how the world is actually really complicated and awesome, and how on average one misses a lot by assuming one could do better than the status quo just out of cynicism and worldliness. I had a few examples in mind, too. It has really just been a lack of focused time and a feeling that the post might be poorly received that has prevented me from working on it.
How to talk about politics without getting mindkilled, and how to go beyond simply rehearsing talking points
A topic that I've been working on recently is applying a lot of the rationality and effectiveness lessons from LW, Thinking, Fast and Slow, Getting Things Done, etc. to a model of leadership. Preferably a model I could devise ways to test and tweak.
I think that having accurate beliefs about the world is great, but effectiveness in modifying the world is better, and a big part of the world we're interested in modifying is made up of other people. How do I apply all of this rationality stuff to actually accomplishing things, especially when accomplishing t...
From "Go Forth and Create the Art!":
...Yet there is, I think, more absent than present in this "art of rationality"—defeating akrasia and coordinating groups are two of the deficits I feel most keenly. I've concentrated more heavily on epistemic rationality than instrumental rationality, in general. And then there's training, teaching, verification, and becoming a proper experimental science based on that. And if you generalize a bit further, then building the Art could also be taken to include issues like developing better introducto
How can we share evidence effectively? More generally, how can aspiring human rationalists make group decisions?
I also noticed the low number of highly-voted posts recently. But I didn't jump to the conclusion that it is a decline. Is it? Didn't such periods of low activity occur before? Is there a real problem, or is there more fear of a problem than a real problem? Or is this the slow onset of a (natural?) decline of an online forum as the caravan moves on (to avoid the greener-pastures image I used recently)?
Someone with access to the DB should be able to quickly generate a histogram of the monthly number of posts with >N votes.
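In case it helps whoever has DB access, something along these lines would do it once the (created_at, score) pairs are pulled out; the shape of the data here is a guess, not the actual LW database schema.

```python
from collections import Counter
from datetime import datetime

# `posts` stands in for whatever the real query returns: (created_at, score)
# pairs for every top-level post. Values below are dummy examples.
posts = [
    ("2014-03-02", 25),
    ("2014-03-15", 3),
    ("2014-04-01", 12),
    # ... fetched from the database in practice
]

N = 10  # vote threshold

monthly = Counter(
    datetime.strptime(date, "%Y-%m-%d").strftime("%Y-%m")
    for date, score in posts
    if score > N
)

for month in sorted(monthly):
    print(f"{month}  {'#' * monthly[month]}  ({monthly[month]})")
```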
I'm writing an attempt at a RationalFic. So far, I've posted about it to /r/rational on Reddit and in an Open Thread here, and I've now finished the first story arc. Since there's a new Media Thread tomorrow, I plan on posting about it there.
What I haven't done is make a separate Discussion post. I kind of want to, in order to generate as much useful feedback and commentary as possible, but I've been told in the past that a few of my Discussion posts should have gone in Open Threads instead.
Meta: I've submitted four proposals, and might submit more later. I don't have time to write all these posts and I have no idea what posts people would want to read, so I'd appreciate advice, criticism, or requests.
I think this is a good idea and perhaps should become a recurring thread.
My experience is that I often have ideas about topics/questions/potential discussions that might be of interest to the LW community, but I have a finite amount of time to invest in writing blog posts/preparing my musings for an outside audience. It isn't always clear to me which of these topics are welcome on LW, since they may be only tangentially related to "refining the art of human rationality". A thread like this provides a sounding board for people in a simi...
I suggested recently that part of the problem with LW was a lack of discussion posts, which was caused by people not thinking of much to post about.
When I ask myself "what might be a good topic for a post?", my mind goes blank, but surely not everything that's worth saying that's related to rationality has been said.
So, is there something at the back of your mind which might be interesting? A topic which got some discussion in an open thread that could be worth pursuing?
If you've found anything which helps you generate usable ideas, please comment about it -- or possibly write a post on the subject.