What am I doing?: Working at a regular job as a C++ programmer, and donating as much as possible to SIAI. And sometimes doing other useful things in my spare time.
Why am I doing it?: Because I want to make lots of money to pay for Friendly AI and existential risk research, and programming is what I'm good at.
Why do I want this?: Well, to be honest, the original reason, from several years ago, was "Because Eliezer told me to". Since then I've internalized most of Eliezer's reasons for recommending this, but this process still seems kinda backwards.
I guess the next question is "Why did I originally choose to follow Eliezer?": I started following him back when he still believed in the most basic form of utilitarianism: Maximize pleasure and minimize pain, don't bother keeping track of which entity is experiencing the pleasure or pain. Even back then, Eliezer wasn't certain that this was the value system he really wanted, but for me it seemed to perfectly fit my own values. And even after years of thinking about these topics, I still haven't found any other system that more closely matches what I actually believe. Not even Eliezer's current value system. A...
Nothing at all against SIAI, but...
A couple of times I asked SIAI about the idea of splitting my donations with some other group, and of course they said that donating all of the money to them would still be the most leveraged way for me to reduce existential risks.
If you're in doubt and seeking expert advice, you should pick an expert who lacks really obvious institutional incentives to give one answer over others.
Regarding the rest of the comment: I found it kind of weird, and something about it freaked me out, though I'm not quite sure what. That doesn't mean you're doing anything wrong; I might just have biases or assumptions that make what you're doing seem weird to me. I think it has something to do with your lack of skepticism or cynicism, and with the focus on looking for someone to follow that MatthewB mentioned. I guess your comment pattern-matches with things a very religious person would say. I'm just not sure whether that means you're doing something wrong, or whether I'm having an adverse reaction to a reasonable set of behaviors because I react irrationally to anything that looks religious.
Yeah, I realized that it was silly for me to ask SIAI what they thought about the idea of giving SIAI less money, but I didn't know who else to ask, and I still didn't have enough confidence in my own sanity to try to make this decision on my own. And I was kinda hoping that the people at SIAI were rational enough to give an accurate and reasonably unbiased answer, despite the institutional incentives. SIAI has a very real and very important mission, and I would have hoped that its members would be able to rationally think about what is best for the mission, rather than what is best for the group. And the possibility remains that they did, in fact, give a rational and mostly unbiased answer.
The answer they gave was that donating exclusively to SIAI was the most leveraged way to reduce existential risks. Yes, there are other groups doing important work, but SIAI is more critically underfunded than they are, and the projects that we (yes, I said "we", even though I'm "just" a donor) are working on this year are critical for figuring out the optimal strategies for humanity/transhumanity to maximize its probability of surviving into a...
Well, one reason why I feel that I need someone to follow is... severe underconfidence in my ability to make decisions on my own. I'm still working on that. Choosing a person to follow, and then following them, feels a whole lot easier than forging my own path.
I should mention again that I'm not actually "following" Eliezer in the traditional sense. I used his value system to bootstrap my own value system, which greatly simplified the process of recovering from Christianity. But now that I've mostly finished with that (or maybe I'm still far from finished?), I am, in fact, starting to think independently. It's taking a long time, but I am constantly looking for things that I'm doing or believing just because someone else told me to, and then reconsidering whether those things are a good idea according to my current values and beliefs. And yes, there are some things I disagree with Eliezer about (the "true ending" to TWC, for example), and things that I disagree with SIAI about ("we're the only place worth donating to", for example). I'll probably start writing more about this, now that I'm starting to get over my irrational fear of...
I like the way steven0461 put it:
...promoting less than maximally accurate beliefs is an act of sabotage. Don’t do it to anyone unless you’d also slash their tires, because they’re Nazis or whatever. Specifically, don’t do it to yourself.
In my case, I knew pretty much from the beginning that something was seriously wrong. But since every single person I had ever met was a Christian (with a couple of exceptions I didn't recognize until later), I assumed that the problem was with me. The most obvious problem, at least for me, was that none of the so-called Christians could clearly explain what a Christian is, or what I needed to do in order to not go to hell. And the people who came closest to giving a clear explanation all differed from each other, and their answers changed depending on which questions I asked. So I guess I was... partly brainwashed. I knew that there was something really important I was supposed to do, and that people's souls were at stake (a matter of infinite utility/anti-utility!), but no one could clearly explain what it was that I was supposed to do. But they expected me to do it anyway, and made it sound like there was something wrong with me for not instinctively knowing what it was. There's lots more I could complain about, but I had better stop now.
So it was pretty obvious that I wasn't going to be able to save anyone's sou...
But it took me a few weeks of swinging back and forth before I finally settled on Singularitarianism.
Here's a quote from an old revision of Wikipedia's entry on The True Believer that may be relevant here:
A core principle in the book is Hoffer's insight that mass movements are interchangeable; he notes fanatical Nazis later becoming fanatical Communists, fanatical Communists later becoming fanatical anti-Communists, and Saul, persecutor of Christians, becoming Paul, a fanatical Christian. For the true believer the substance of the mass movement isn't so important as that he or she is part of that movement.
And from the current revision of the same article:
Hoffer quotes extensively from leaders of the Nazi and communist parties in the early part of the 20th Century, to demonstrate, among other things, that they were competing for adherents from the same pool of people predisposed to support mass movements. Despite the two parties' fierce antagonism, they were more likely to gain recruits from their opposing party than from moderates with no affiliation to either.
Can't recommend this book enough, by the way.
Thanks for the link, and the summary. Somehow I don't find that at all surprising... but I still haven't found any other cause that I consider worth converting to.
At the time I converted, Singularitarianism was nowhere near a mass movement. It consisted almost entirely of the few of us on the SL4 mailing list. But maybe the size of the movement doesn't actually matter.
And it's not "being part of a movement" that I value, it's actually accomplishing something important. There is a difference between a general pool of people who want to be fanatical about a cause, just for the emotional high, and the people who are seriously dedicated to the cause itself, even if the emotions they get from their involvement are mostly negative. This second group is capable of seriously examining their own beliefs, and if they realize that they were wrong, they will change their beliefs. Though as you just explained, the first group is also capable of changing their minds, but only if they have another group to switch to, and they do this mostly for social reasons.
Seriously though, the emotions I had towards Christianity were mostly negative. I just didn't fit in with the other Christians. O...
My way of asking these questions:
What is the single most important thing you should be doing?
Are you doing it?
(I'm writing the damn help file. Why? Because nobody on the team has the necessary domain experience and writing skills and English knowledge needed for that. Why is the help file important? Because it's the single biggest chunk of work needed for the final release of our software. Very few users read help, but those who do are important. Why release the software? Because a release of a major new version brings in additional revenue and new customers -- and because abandoning a project at 95% completion is (usually) a stupid idea. Why do we need more revenue? To explore a more mainstream, less nerdy business than our current one. Why explore a more mainstream business? I could go on and on and on, but sorry -- time to write the help file.)
I'm writing some programs to take some numbers from a typical "new renewable energy plant under construction!!" news article and automatically generate summaries of how much it costs compared to other options, what the expected life span will be, and so on. I intend to go on a rampage across the internet, leaving concise summaries in the comment sections of these news articles, so that people reading the comments will be able to see past the press release bullshit.
Why? Because I believe that people will be more rational if the right thing is obvious. A simple table of numbers and a few sentences of commentary can strip away layers of distortions and politics in a matter of seconds, if you do it right.
Essentially, it's a more elaborate and more automated version of what I did in this comment thread on Reddit: give the perspective that lazy journalists don't, and do the simple arithmetic that most journalists can't do.
It's very simple, but maybe it'll be effective. A lot of people respond well to straight talk, if they don't have a strongly-held position already.
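To make the arithmetic concrete, here's a minimal sketch of the kind of calculation the summaries would be built on. Everything below is illustrative: the function, the constants, and the numbers are made up for the example, not taken from my actual programs or from any real press release.

```python
# Minimal sketch: turn press-release numbers into a cost-per-kWh figure
# that can be compared across energy sources. All inputs are placeholders.

HOURS_PER_YEAR = 8766  # 24 * 365.25, averaging over leap years

def lifetime_cost_per_kwh(capital_cost_usd, capacity_mw,
                          capacity_factor, lifespan_years):
    """Capital cost spread over expected lifetime output, in USD/kWh.

    Deliberately crude: it ignores financing, fuel, and maintenance,
    which press releases rarely mention anyway.
    """
    lifetime_kwh = (capacity_mw * 1000 * capacity_factor
                    * HOURS_PER_YEAR * lifespan_years)
    return capital_cost_usd / lifetime_kwh

# Hypothetical "$300 million, 100 MW" solar announcement vs. a coal baseline.
solar = lifetime_cost_per_kwh(300e6, 100, capacity_factor=0.20, lifespan_years=25)
coal = lifetime_cost_per_kwh(2e9, 1000, capacity_factor=0.85, lifespan_years=40)
print(f"New plant: ${solar:.3f}/kWh of capital cost; coal baseline: ${coal:.3f}/kWh")
```

The point isn't precision, it's that even this crude number usually contradicts the press release's framing, and it fits in a two-line comment.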
You forgot one high-leverage component in this kind of inquiry: ask the question "why" not once but five times or more.
Just before reading the above, I was looking at instructions for building a laser show from cheap parts and the Arduino microcontroller I've been playing with lately.
Why - because lately I've been getting interested in programming that affects the physical world.
Why - because, in turn, I believe that will broaden my horizon as a programmer.
Why - because I think learning about programming is one of the more important things anyone interested in the origins and improvement of thinking can learn. (Post about this coming sometime in the next few weeks.)
Why - because I want to improve my thinking in general. Which is also the reason I stopped here after I figured I had collected enough information about the laser stuff.
Why - because my thinking is my highest leverage tool in dealing with the world.
(ETA: to be quite honest, another reason is "because it's fun", but that tends to apply to a lot of the things I do.)
Reading PJ Eby's book chapters on positive vs. negative motivations
because
I think that now is a good time in my life to read up on the skills of real life, of motivation and of success.
Actions speak louder than words. A thousand "I love you"s don't equal one "I do". Perhaps our most important beliefs are expressed by what we do, not what we say. Daniel Dennett's Intentional Stance theory uses an action-oriented definition of belief:
...Here is how it works: first you decide to treat the object whose behavior is to be predicted as a rational agent; then you figure out what beliefs that agent ought to have, given its place in the world and its purpose. Then you figure out what desires it ought to have, on the same consi...
3 priorities, in no particular order: support myself, become more capable, enhance rationality by publishing "seed exoshell" software.
An exoshell, as I understand it, is the software that you think with, in much the same way that you think with a piece of paper or a whiteboard. Current exoshell-ish software might include emacs ("emacs as operating system") or unix shell scripting ("go away or I will replace you with a small shell script"). Piotr Wozniak clearly uses SuperMemo as an exoshell. Mark Hurst's "Bit Literacy"...
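To give a flavor of what I mean, here's a toy sketch of the smallest possible exoshell: an external scratchpad you jot thoughts into and query later. This is purely hypothetical illustration code, not part of the seed exoshell itself or of any existing tool.

```python
# Toy illustration of the "exoshell" idea: a script that holds your
# working thoughts so your head doesn't have to. Purely hypothetical --
# not code from any existing exoshell project.
import datetime
import pathlib

NOTES = pathlib.Path.home() / "exoshell_scratch.txt"

def jot(thought: str) -> None:
    """Append a timestamped thought to the external scratchpad."""
    stamp = datetime.datetime.now().isoformat(timespec="minutes")
    with NOTES.open("a") as f:
        f.write(f"{stamp}  {thought}\n")

def recall(keyword: str) -> list[str]:
    """Pull back every jotted line mentioning a keyword."""
    if not NOTES.exists():
        return []
    return [line.rstrip() for line in NOTES.open() if keyword.lower() in line.lower()]

jot("exoshell = software you think with, like paper or a whiteboard")
print(recall("exoshell"))
```

Everything past this (search, spaced repetition, the "seed" part that grows) is where the real work is, but the core loop of externalize-then-retrieve is that small.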
I'm doing exactly what I would be doing if I had never found Less Wrong, but now I'm telling myself this is provably the best course because it will make me a lot of money which I can donate to the usual worthy projects. This argument raises enough red flags that I'm well aware of how silly it sounds, but I can't find any particular flaws in the logic.
What am I doing?: I am studying to be an electrical engineer at SPSU.
Why am I doing it?: Because I want to make a lot of money to pay for Friendly AI, anti-aging, and existential risk research, while following my own interest in power and similar technology.
Of course, these are my LessWrong whats and whys, and not necessarily the whats and whys of other sectors of my life. If you took a look at the percentage of my time devoted to these activities... let's just say my academic transcript would not flatter me.
(PS: Asking what and why is a good way to get far-mode about what you're doing, which increases motivation. So thanks for getting me kick-started on my studying ^_^)
You should go and donate $10 to SIAI if you haven't, because people who donate any amount of money are much more likely to donate larger amounts later. Anti-akrasia, etc. etc.
What am I doing? Trying to write a few thousand words on legal reasoning about people's dispositions.
Why? To finish my dissertation, and graduate with an Honours degree in law.
Why am I doing that? To increase my status and career opportunities, but also due to inertia. I've almost finished this degree, and a few more weeks of work for this result seem worthwhile. Also, doing otherwise would make me look weird and potentially cut off valuable opportunities.
Why does that matter? Much intellectually interesting and well paid work seems to require signalling a...
What am I doing?: Bug testing a program for my thesis. (A mathematical computation.)
Why?: Because I want it to work perfectly (and it doesn't).
Why?: Because I want to impress my advisor.
Why?: two reasons: a) to have a good working relationship since I will be at the same school for a while. (why?: because it's unpleasant to work with people you don't get along with; terminal value.)
b) to get better recommendations.
Why?: to have better career prospects.
Why?: to make more money.
Why?: a) to make my life more pleasant (terminal value) b) to donate to reducing...
I think these two questions are the basic questions of rationality that you should be asking yourself as much as possible. There is this great quote that I have on my desktop:
Only the real determinants of our beliefs can ever influence our real-world accuracy, only the real determinants of our actions can influence our effectiveness in achieving our goals. -- Eliezer Yudkowsky
Background: two years ago, I dropped out of college with a tremendous amount of debt. I'd failed several classes right before I dropped out, and generally made a big mess of things.
Still alive today, I'm beginning to step free of a lot of social conventions, letting go of shame and the habit of groveling, and learning to really value (and not just know I should value) important things. I am searching for how to make my strongest contribution. In the short term, that probably has to do with making a lot of money, but on the side, I have an inkling that work...
What am I doing? Working for SIAI. For the last hour or so I've been making a mindmap of the effects of 'weird cosmology' on strategies to reduce existential risk: whether or not the simulation hypothesis changes how we should be thinking about the probability of an existential win (conditional on the probability (insofar as probability is a coherent concept here) that something like all possible mathematical/computable structures exist); whether or not we should look more closely at possible inflationary-magnetic-monopole-infinite-universe-creation-horror...
What I am doing:
I'm sorry to do this because I'm sure it's off topic, but Tim Minchin (comedian) just did a 10-minute piece that will make any skeptic who's had to sit through exchanges about auras, and magic, and how science is "just a theory too," just holler.
http://www.youtube.com/watch?v=V0W7Jbc_Vhw
Isn't this enough? Just this world?
Listening to Autechre's new album
Because it contains sufficient aural texture and sophisticated sound modulation, combined with an intermittent hip-hop beat, to sit nicely within my current, self-imposed tolerances of what good electronic music should sound like... which results in a state of favorable brain chemistry.
I do not know if there is any causal connection here, but this piece of art is precisely relevant:
What am I doing?
Finally responding to this post on LessWrong.
Why am I doing it?
I don't quite feel tired yet, and I don't know which book to pick up for pre-sleep reading: Wicked, so I have some context when I see the musical with my girlfriend in June; The Ancestor's Tale, because I find evolution extremely interesting, there's an off chance it will be relevant to my future research, and I'm obligated to read it since it was a gift; or The Theory of Moral Sentiments, because I find the moral sense theorists to be interesting precursors to Eliezer...
A bit off topic, but you've got me thinking about Babylon 5, so I have a few more questions:
I was, before reading this, reading The Selfish Gene by Dawkins.
Why? Because it is well written.
Why? Because it relates to the topic I chose for my IB EE.
Why? Because I enjoy learning about alternative descriptions of functional units of evolution and of organisms, and Dawkins treats genes as the puppet masters of evolution.
Why? Because I am trying to bridge a cultural gap with my father (the reasons for that belong somewhere else).
Why? Because it is refreshingly different from the magic realism I had been reading.
"Define And thus expunge The ought The should ... Truth's to be sought In Does and Doesn't "
-B. F. Skinner (an interesting soundbite from an otherwise misguided disagreement with Chomsky over language acquisition)
I'm doing my best.
"Narns, Humans, Centauri… we all do what we do for the same reason: because it seems like a good idea at the time." -- G’Kar, Babylon 5
I do this: http://critticall.com/SQU_cir.html
In fact, the machine on my left does it; I do something else.
I think that what you do (and why you do it) follow your beliefs, and that's why interrogating beliefs is the more fundamental question.
For example, you might do 'X' because you believe 'X' matters, or, more meta -- and more fundamental -- you might believe that whether you do 'X' or not matters because you believe that what you do matters. This is only true within a particular belief structure.
What are you doing?
Voting this post down.
And why are you doing it?
Because it contributes nothing substantive to the site; we're aware of the problem of grounding actions, and this doesn't help solve it.
ETA: On reflection, I made this point way too harshly, in an attempt to be cute. Sorry about that.
ETA2: Someone seems to be modding down everyone who's replied to this. Just so you know, it's not me. I don't vote on comments in arguments I'm directly involved in, and I've made a big deal about adhering to this in the past.
Nitpicking, but this "doing" question can't possibly be of equal importance to the fundamental question of rationality, because answering the "Why are you doing it?" part obviously depends on your having come to terms with what you believe, and why you believe it.
That said, I think this "doing" question is fundamental as well, second in importance only to the Fundamental Question. Good post.
It has been claimed on this site that the fundamental question of rationality is "What do you believe, and why do you believe it?".
A good question it is, but I claim there is another of equal importance. I ask you, Less Wrong...
What are you doing?
And why are you doing it?