adamzerner comments on On not getting a job as an option - Less Wrong Discussion
Personally, I've done a version of this. I've had jobs, but never a career, choosing to travel and have fun instead. I didn't need anyone to persuade me that this was an acceptable option, but I'm curious if anyone could persuade me that it's not. Redline mentioned giving the least possible effort and receiving maximum utility in return; this is the story of my life, only my "utility" has been fun.
First, I went to Guatemala and taught SAT prep 12 hours/week with 3-day weekends. This gave me the status of having a job, the personal fulfillment of making a positive impact on students' lives, and paid far more than I needed to cover living expenses. I spent my days reading suspense/fantasy novels, trail running, bike riding, volcano climbing, hanging out with friends, exploring, and seeking new experiences. Life was like a full-time vacation!
Now, I'm here in California, "working" as a nanny (still with 3-day weekends). My job consists of taking care of 2 fun, hilarious, well-behaved boys: playing Marco Polo, having Nerf gun wars, buying and cooking whatever I want, and reading bedtime stories, and it gives me the personal fulfillment of feeling super appreciated. In my free time, I play board games, go hiking, play ultimate frisbee, catch up with friends, and do loads of reading. Life is like a full-time vacation!
I love my life, but every now and then, people will look down on me for "wasting my potential" and I'm tempted to agree with them. I'm not the best at anything (or I would probably feel more guilty about my decision), but I am very good at a lot of things (high school valedictorian, top 1% standardized test scores), and I'm genuinely curious:
Can a good argument be made in favor of ambition over hedonism, or does it all just boil down to intrinsic motivation and feelings of personal satisfaction?
That sounds completely awesome! I've always imagined that sort of lifestyle, but it always felt too abstract. Reading your description has helped my understanding become more concrete and vivid. Thank you.
Ok, so the following is the state of my beliefs and understanding. In a way, I feel rather confident in it, because I've done a good amount of reading into other arguments, and after doing so I still think my reasoning makes more sense. But on the other hand, I definitely notice confusion, enough that I wouldn't describe myself as "very confident". I wrote about it a bit more in depth here and here, which you might be interested in. It's about as well as I could articulate it without spending weeks writing and researching.
Summary - Morality is sort of a question asking about what you should do. Someone might say, "you should do X" or "you shouldn't do Y". My response - "should requires an axiom". You can only say, "you should do X... in order to achieve this end". Or "you shouldn't do Y... in order to achieve this end". The way people use the word, they're usually referring to an end implicitly.
Then there's the question of "well, what should the end be?". Which is circular. Consider two things though:
1) Preferences
2) Goals
Your Preferences are what produce the most desirable "mind-states". Imagine a thought experiment where you take a person, stimulate his brain to produce a bunch of different mind-states, and have him rank them according to how preferable they are. This is what I mean by Preferences.
Goals are what you choose to strive towards. For example, you may choose to strive towards being a good mother, even if it doesn't maximize your Preferences.
You could choose whatever Goals you want. Preferences are pretty fixed though (seemingly).
Anyway, I don't think there's really an answer to "what Goals should you choose?". You have to say, "what Goals should you choose... in order to achieve this end". Goals are arbitrary. Rationality is about doing the best job you could at achieving the Goals you choose, but it doesn't help you actually choose them (because they're completely arbitrary). I've heard attempts to side-step this, and I've never been convinced. But like I said, there might be something I'm missing (I really hope there is).
Some people frown on a lack of ambition.
To be practical:
Also, Ambition can be poison (one of my favorite posts). I think it's a very slippery slope. Personally, I've fallen pretty far down the slope and am trying to climb back up a bit.
Thanks so much for sharing! Sorry for the late reply, just got back from vacation.
I have had the exact same thought so many times! But I perform cost-benefit analyses only for small decisions, like your ticket example, with pretty clear cut preference ratios. When it comes to the option of pursuing a life goal, everything gets really fuzzy. I think it's that fuzziness that's keeping me from seriously considering giving up my fun-filled life to do something more ambitious.
I guess my life goals right now are pretty simple: maximizing happiness and avoiding feeling guilty for being so happy. I maximize happiness by having fun and doing nice things for people on the individual level, and I manage to discharge most of my guilt through effective altruism. Despite my natural resource consumption, I think I contribute enough happiness to the world for it to be better off than it would have been without me.
As for caring about what other people think, this actually doesn't come up often in real life. Almost everyone I associate with is also pretty into fun-centric activities and thinks my life is cool, even if they appreciate the status and high income from more prestigious jobs. I think it's from perusing Less Wrong that I finally started to feel self-conscious about my choices. I see such a high percentage of rational people with high intelligence doing ambitious stuff, so I was curious whether there was an objective reason for it. So if being on LW is contributing to a slight increase in guilt, but not enough to make me want to become more ambitious, I should consider deleting my account and reading more fiction instead, haha. Pretty sure the cost-benefit ratio will keep me here though.
I think I agree with you here. But if goals are arbitrary, I might as well continue delighting in my career-less life for now. Maybe when I'm older, I'll have more ambition... It seems like some people do, but my dad is very smart and perfectly content working 10 hours/week as a lawyer and spending his free time reading, disc golfing, and winning poker tournaments/fantasy sports contests.
So personally, with a title like "Morality Doesn't Exist", would you be willing to describe your views as moral relativism? That there's no compelling reason (outside of yourself, depending on your personal preferences) to put yourself behind a veil of ignorance and put societal goals above personal happiness? The title of the website, Less Wrong, almost implies an objective morality, and it seems like many LWers shy away from the term relativism. Although I don't like it either, I still don't totally understand why they do, assuming they're rational enough to have a reason other than discomfort with the idea.
Depending on how far up the slippery slope of ambition you want to climb: I just heard my old boss in Guatemala is looking for a new SAT teacher for next year, starting this summer, if you happen to be interested. But beware, the slope is slippery in both directions!
Very understandable. It makes sense that things that are more clear have a bigger influence on your motivation than things that are less clear.
I think it's a really good sign that you a) know this and b) acknowledge it. Given that it's such an important topic, it seems worth putting proportional thought into it though. And it seems like you are trying to do that. Check out Ugh fields if you haven't already. It's been one of the most practical articles I've found here.
Count me among them! In some not so far away alternate universe, I'm doing the same things you are. Which is why your situation is interesting to me.
I think it's a pretty hard question that most people don't seem to actually take seriously. For the record, my impression is that most people here aren't really too ambitious. Two big reasons seem to be a) "it's too difficult/unlikely that I succeed" and b) akrasia. Perhaps you'd like to investigate this further and more formally. If you do, please let me know what you find. If you don't, I probably will, but it'd be at the end of my current to-do list.
But anyway, you seem to be trying to take the question pretty seriously, and seem to be a pretty self-aware and reasonable person. I shall try to say something useful.
Question: What are your terminal goals? The ends that you seek? Obviously an incredibly difficult question. It may be possible to proceed without a perfect answer to it, though, if you have a rough idea of what your preference ratios are.
Question: How strong an impulse do you feel to do something ambitious? How manageable is this impulse? How do you expect this impulse to change over time? Personally, I have an incredibly strong impulse to do some ambitious things, and I've taken into account that I expect this impulse to remain strong and to make my life unpleasant if I ignored it.
Question: How happy would you be if you weren't to pursue an ambitious life? Seems like you have done a pretty good job so far. It seems that you'll continue to be pretty happy, although you seem to be in your early 20s and I'm not sure how much you could extrapolate from your current experiences.
Question: How big a positive impact would you have on the world if you pursued a non-ambitious path?
Question: What is the probability that you succeed in your ambitious endeavors? My thoughts about this are unconventional. I think that a truly smart and dedicated person would have very very good chances of success. I see a lot of big problems as puzzles that can be solved.
Very very rough calculations on startup success:
Some thoughts on where I see opportunity if I had the resources.
Question: How altruistic are you really? How much do you really care about the billions of people who you never have and probably never will meet? What about the bajillions of people who haven't been born yet? To what extent are you willing to make sacrifices for these people? (I know this is implicitly addressed in some other bullet points, but I thought it'd be worth mentioning explicitly) EDIT: See here for thoughts on EV and ambitiousness.
Question: What are the selfish reasons to be ambitious? How happy would you be if you succeeded in your ambitions? (Note Ambition can be poison and it's easy to never be satisfied, so I don't think it'd make you as happy as you would think.) Could you possibly contribute to/build a better world for yourself?
Some thoughts of mine. I don't think I'm nearly as well read as most people here, am lacking information and thus am of limited confidence, but I plan on reading up in due time. Anyway, it seems to me that we live in a truly special time. Kurzweil's LOAR (Law of Accelerating Returns) makes sense to me (the gist of it, anyway). Compared to previous generations, we have an unprecedented opportunity to do big things: AI, cryonics, anti-aging, the internet, joint-consumption economies, etc. I really do think there's a reasonable chance that you could contribute to/build a better world for yourself.
I know that ambition can be poison. I know there's a reasonable chance I do some "slipping down the slope", but long term I think I'll be able to get over this (to a reasonable extent). Nevertheless, I take it into account in my calculations. However, I think that it'd feel really really really good to have had some big positive impact on the world. It seems like something that really would boost that happiness setpoint up a bit.
Finally, the money/fame/power that result from successfully achieving any ambitions definitely have their value, although I think it's nothing compared to the personal satisfaction.
Confessions: I think these thoughts are about 40% clearer than the previous best analysis I had done. I personally don't think I've thought nearly hard enough about these things given their importance, although part of the reason is intentional - I have a tendency to overthink things, which is stressful, and I've sort of reached a point of diminishing returns... idk. And I unfortunately do fear that this analysis suffers a bit from confirmation bias (ie. biased in favor of ambition). So please take all of this into account.
I'm not particularly well read in philosophy. Probably way more so than the average person, but below average for someone here. I don't know what moral relativism is, but I'll look it up...
It seems to be saying that goals are arbitrary, and if so, then yes - I do think my views could be described that way. Thanks for introducing me to the official viewpoint. You seem to know a bit about it - do you know anything about it that seems inconsistent with what I appear to believe?
After brief reading, it seems that I may not agree with the less strict interpretations of moral relativism. It seems that people use it as an excuse to say "don't judge others for what they believe". It seems to me that a lot of viewpoints really do have a terminal goal of something along the lines of utilitarianism, but these other viewpoints try to invent rules of thumb that promote this end without admitting that the end is actually what they're after.
I wouldn't say so. The way I think of it is "less bad at achieving your ends". If you read HPMOR I would say "Quirrel is (at times) quite rational, even if his goals are sometimes selfish".
My impression is that the community does have some "soft spots", and not wanting to believe in moral relativism sort of seems like it's one of them (based on what I remember when I read through the metaethics stuff). Not wanting to seem naive appears to be another "soft spot" of the community to me.
And I think that "anti-religion" is a bias here too. I had gotten slammed for asking about the possibility of an afterlife here. Regardless of whether I was right or wrong, I don't think I was uncivil or anything, and I think it's a topic worth discussing, at the very least (from their perspective) to help me better understand it. But I sense that it hit a soft spot, hence the downvoting and mild incivility. And I've seen similar things happen elsewhere here also. I figure you should know this given your background. For the record, I'd probably call myself a confused agnostic. I definitely don't believe in the teachings of religion or god in the traditional sense, but I don't pretend to understand the true workings of consciousness or the universe, and I remain open to possibilities that atheists wouldn't. And on some level, I think Louis CK makes a good point (plus it's funny).
Anyway, my point here is that humans are quite flawed. I love LW but people here are far from perfect. And so am I. And even EY is far from perfect (although I think he's astonishingly smart).
No thanks (see above). But I appreciate the thought :)
Thanks for the link. You're right about this being an "ugh field" for me, something I usually flinch from even thinking about. I think my doubts about Christianity used to be an "ugh field" too, but I feel a lot better for having confronted them.
Those seem to apply to me too. I'd never heard of akrasia before, what a great word. If I investigate this further among the LW community, I'll let you know.
Thanks so much for your thorough reply. I really, really appreciate it! Answers to your questions:
You're right. This is an incredibly difficult question. Based on the sample human terminal goals given, I think the biggest for me are health, joy, and curiosity. Can environmentalism be a terminal goal? What about efficiency in general?
My impulse to do ambitious things is about a 2 out of 10, so not very strong at all, and very manageable, currently. It used to be more like a 1 though, so the current trend seems to be that the older I get, the more attractive a life of accomplishment looks.
How happy would I be not pursuing ambition? You're right; I'm super happy right now. I have absolutely no idea if this happiness with a leisurely lifestyle is something I can maintain or not. My dad and his best friend are both super smart and not very ambitious, and seem to be quite happy even as they approach their 50s, which makes me think I could stay very happy. Then again, I might be different. Maybe my lack of ambition was just from the way I was raised (in my family, we all bragged about acing tests with no outside study, about never having homework, about never doing assigned readings, about skipping class to hang out in the rec room, etc. Kinda pathetic, in hindsight).
How big an impact would I have? If I knew this, things would be lots less fuzzy! One goal that I'd love to pursue would be promoting hitchhiking/slugging. This has to do with my other values of environmentalism and efficiency. I also think it would be wonderful if people were less fearful of strangers. I'm not sure how exactly I'd work toward this goal, so it's really hard to gauge potential impact. If it were successful though, traffic would be decongested, carbon emissions would be decreased, and people would save money on transportation and have more opportunities to interact with new people... so yeah, it could potentially have a significant impact.
Probability of success? Good question. No idea; again, this is very fuzzy since I don't even know where I would start; it's just not something I've thought about much. I'd probably have to find someone to team up with who has more concrete skills. All I have is a general idea and a pretty logical mind, no relevant experience or education. I am usually pretty confident, and anything I think I can do, I can do, but I normally don't set my sights too high.
How altruistic am I, really? I don't know. I'm still going through the repercussions of my deconversion. Right now, the amount of caring I have for people in the world is relative to the amount I used to have as a Christian. Now that eternity/an afterlife is out of the picture for me, I'm a little less frantic about saving the world and more content doing my own thing. Still, I think I care enough that if I were to pursue a big goal or career, altruism would be my chief motivation.
The selfish reasons to be ambitious are significant too, I guess. Currently, I'm so happy with my leisurely life, it's hard to remember back to times when I had accomplishments, like academic awards and track and field records. Money I don't care about so much, but accomplishments feel really great; whether it's because of the personal satisfaction or the praise, it's hard to tell. I'm fairly confident that ambition would never be poison for me. The accomplishments I've had in life felt great, but they were really just side benefits of me pursuing other terminal goals, and as great as they were, they didn't give rise to any ambition.
This was a good analysis; thank you! You're right that I really should put proportional thought into this.
Hmm, I'm not particularly well read in philosophy either, but I hear the term "moral relativism" thrown around a lot; mostly as a result of sharing my deconversion story actually, as a lot of people have commented that atheists almost have no choice but to be moral relativists. I think "moral relativism" is pretty simple and just means there is no "right" or "wrong" outside of an individual, and I think it's consistent with your views, but I'm not totally sure.
Haha, okay, I hear you. Actually as soon as I typed that sentence, I realized this would be your response. It's just a different definition of "wrong" than I'm used to, but it makes sense.
Yeah. I think this is another topic that probably deserves more discussion among the community than it currently gets. If our society gets to be extremely rational (which I think most people here strongly desire), it will be really hard to draw the line between individual freedom and what's best for the future of humanity, and I think this is something worth serious thought.
Yeah, I get the same feeling about an anti-religion bias here. Your post about an afterlife is interesting. We really have no reason to believe in one, but without data, I definitely don't think people should assign near-certain probability to the non-existence of an afterlife, either. I guess I'm more of an agnostic, too. I don't believe in the Christian God, but like I said in my very first post, I can't be sure there isn't a good god or gods struggling against an evil god out there somewhere. It is possible, I just have no reason to believe it's true, so I don't really think about it.
Maybe there's a subconscious tendency to go along with the mainstream views on LW just because almost everyone here is so good at thinking and rational people usually tend to agree with other rational people. Personally, I discovered this site and thought, wow! So many people who think SO similarly to me, only they've been thinking much harder and for much longer... the general ideas around here must represent the most rational and least biased opinions on any topic. It's tempting to just trust that the ideas around here are all things I can agree with and understand, just because I've agreed with almost everything I've read so far.... I guess I just have to be cautious, keep putting in the effort of thinking for myself, and remember that LW is a (wonderful) resource, not a bible.
A big part of the reason why I'm ambitious is because I try really hard not to fall victim to scope insensitivity. And regarding ambition, there are some really, really big magnitudes at play. Ex.
Another reason why I'm ambitious is more practical - I want to retire early, really ASAP. Starting a startup, making a lot of money and being able to retire would be great.
Re: afterlives - we have tons of data. Brain damage can cause loss of function in a way which varies depending on what part of the brain is damaged. Everything points towards total brain damage causing total lack of function. We also have evidence that stimulating one part of the brain can turn off consciousness, and some evidence that conscious experience requires many parts of the brain working together.
I posted about the first part of that somewhere, though apparently not in response to the linked post. Probably I did not respond to that one because I'd already made this point, and it gets tiring to see people ignoring it again.
Oh, interesting, thanks for sharing! Data is good; that's cool that we know that, and I think I agree that it makes any afterlife extremely improbable. Sorry, I wasn't ignoring your point, I just noticed this now as a result of having just realized I can click on the little orange envelope and see replies and private messages.
Glad to help (if I am actually helping)! I find this fun.
Of course, anything can be a terminal goal :). But consider how strong a statement it is to say that something is a terminal goal. That it has intrinsic value. As for environmentalism, would the environment matter if there were no one on earth to experience it? If not, it makes me think that environmentalism matters to the extent that it makes people's lives better, and thus would be an instrumental goal.
Some people would respond to what I just said by saying something along the lines of "Of course it wouldn't matter if no one was on earth, but don't be ridiculous - be practical." My response to that is that in discussing things like this, it's important to be very precise with what you say. Because a lot of disagreement comes from arguing over semantics, which comes from bad communication.
The good thing is that a) this is a testable question that you'll get more and more evidence for as time progresses, and b) you can easily adjust the extent to which you pursue ambitions. It's not like you have to decide once and for all now (not to imply you don't know that, just saying).
My guess is that you will be able to maintain your happiness.
a) The happiness set point theory seems rather accurate (my reason for thinking this is mostly based on anecdotal evidence, not on reading much into the research).
b) Anecdotally, it also seems to me that the "need to be ambitious" is also pretty set in stone. Ie. You know if you're one of those people, and you know somewhat early in life. I don't know of many 40 year olds who suddenly develop an irresistible urge to do something ambitious. Note: in HPMOR the distinction between having ambition and being ambitious is made.
That's amazing! I hated school and did a lot of rebellious things out of spite. Back to that alternate universe again... I wish my family was like that. One of my favorite rebellious things was that I refused to do some AP Micro project at the end of the year because I was already in college, getting a zero would only bring my grade from an A to a B, and economics is all about incentives, so it just felt too right to boycott the project on those grounds.
With respect, this seems like an ugh field (a more specific instance of what you said was a broader ugh field). P(success) seems like it plays a big role in whether or not you decide to be ambitious. I'm not sure though - if you thought you had a, say >50% chance of having a big impact on the world, would you then want to be ambitious?
If P(success) does indeed play a big role, I think it'd be a good idea to take an idea and give a real honest effort at seeing if you could "solve the puzzle". Try to break it down into its components. What would have to happen in order for you to succeed? Break those components down further and ask the same question, etc. Honestly, try doing this for 5-10 ideas.
After doing this, you should have a much better sense of what P(success) is. Which has two benefits: 1) increases the chances you make the right decision as to whether or not to be ambitious, 2) will make you feel more confident in your decision, and perhaps more "at peace"/less likely to feel any sort of guilt.
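For example, here's a very rough sketch of what the break-it-down-and-multiply approach could look like. The steps and probabilities below are entirely made up, just to illustrate the mechanics (and the steps aren't truly independent in real life, so treat the output as a gut check, not an answer):

```python
# Placeholder decomposition of a hypothetical startup idea.
# Each step is something that "would have to happen in order to succeed";
# treating them as roughly independent, multiply the estimates together.
steps = {
    "the idea is actually viable": 0.5,
    "I find a capable cofounder": 0.6,
    "we build a working version": 0.7,
    "it gets meaningful adoption": 0.3,
}

p_success = 1.0
for step, p in steps.items():
    p_success *= p
    print(f"{step}: {p:.0%}")

print(f"rough P(success) ~ {p_success:.1%}")  # ~6.3% with these made-up numbers
```

The point isn't the number itself; it's that the exercise forces the fuzziness into named components you can actually think about.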
Very understandable.
WOAH!!! Slow down there :)
I really don't want to die and am really hoping that I won't have to. And I plan on doing what I can to avoid it. A lot of people here think similarly, and there seems to be reason to hope.
A lot of people are hopeful that we might not have to die. There's the possibility of cryonics working out, anti-aging research, AI (<- a very clear introduction to AI if you don't know much about it). And there's even the possibility that we have no clue how consciousness really works and that there is indeed an afterlife. Note that I used the word possible. I don't know how probable these things are. This talks a bit about it.
I think there is. But note the distinctions between types of conformity. Part of it is sensible. The fact that other smart people believe something to be true is evidence that it's true (in that it increases the likelihood that it's true). And so it makes sense to adjust your beliefs accordingly. The real question is "how much should you adjust your beliefs".
As for the bad types of conformity, I think it exists here too. My judgement is that it's moderately less than average.
I can definitely empathize with that. Discovering LW was one of the best things that's ever happened to me. I had that same sense of agreeing with almost everything I was reading, and it was really really nice to hear the thoughts articulated so well.
Always :)
You know it's funny, I've never thought about this before, but I actually would like for the earth to stay beautiful, even if there are no humans around to enjoy it. Feeling such a strong attachment to the earth makes me think that I empathize a bit too much with Kaczynski... which got me thinking about psychopathic tendencies, and after looking them up, I realized I borderline have many of them. I'm not really too worried about myself, but this got me back to morality again. Psychopaths are probably quite rational about pursuing their own personal terminal goals. You can't say they're doing anything wrong, can you? Is there anything you can really say to convince a rational psychopath who is smart enough to get away unpunished for his actions to act in a way that is better for society?
Anyway, yeah, I think you're right that I could maintain my happiness. I'll probably continue in this "phase" of life another 2-3 years at least, enjoy my free time, do a lot of reading, and start thinking harder about what to do with the rest of my life. My back-up plan is to become a cop/detective in the Bay Area, which would be somewhat physically and mentally engaging, offer great hours and benefits, allow me to retire on pension after 25 years, and pay enough that I would probably end up donating more than 10%... a fun, comfortable, guilt-free life, but definitely not something that would change the world or leave me feeling immensely fulfilled. So, I'll take your suggestion and try to work out the pieces of the puzzle for a few more ambitious ideas.
A distinction between being ambitious and having ambition? Wow, I think I'm going to love that book.
Yes!! Haha, I did the same thing. When it came to final exams, some people would calculate the score they needed to bump their grades up one notch, but for me, every year, every class, even in college, my question was "What's the lowest score I can get and still get an A in the class?" If I knew I would get a C without studying, or I could do a quick 15-minute review and get an A, I wouldn't even do it. The effort I put into classes also correlated with how harsh a grader the teacher was. I actually had one teacher who believed in grading students according to their effort rather than according to their ability/final product compared to the rest of the class. I hated it, but in hindsight, I would have gotten a lot more out of school if all teachers had done that.
You're right about the probability of success being an ugh field. I like your idea about solving puzzles and high probabilities of success. I'll try breaking some ideas down, someday. Obviously higher P(success) correlates with a stronger desire to do something ambitious, but even if it were >50%, I would still be selfish enough to consider doing my own thing.
Either that or...well, you know. Maybe this is why it's an ugh field. I'm too happy living my leisurely life and subconsciously fear that if I find a high probability of success, I won't change anything, but will feel a more substantial amount of guilt.
Anyway, I found Scott's post about comparative advantage interesting and relevant. It was the first SSC post I read, which made me read tons of the archived posts, which eventually led me here. I know my strengths and weaknesses, but I really wish I knew my comparative advantage. Even if I did know, though, what if it wasn't nearly as fun as nannying? The post ends:
God is definitely convenient here.
Oh, yeah! Not dying would be cool! I haven't read much about it yet, but I'm encouraged by the fact that people here are so hopeful. Right now, when I think about people dying and turning to dust, I think it sounds great, but really that's only relative to thinking about people dying and going to hell. Edit: I read the AI article you linked, and the part 2 afterwards, and it made the whole AI idea seem a lot more concrete/probable/exciting. Thanks! It makes me wonder, though, is it selfish to work on AI? The people working on it will die if they don't succeed, so personally they have nothing to lose even if they accidentally cause an early extinction of the human race. Then again, if we assume our eventual extinction is inevitable without AI, the overall risk-reward ratio favors research. Maybe I should consider giving some/all of my donations to MIRI. Thanks for bringing this all to my attention. Figuring out what I consider to be the probability of immortality should probably be more urgent than figuring out the probability of success in pursuing ambition, and like you mention, there will probably be some relation between the two.
Yeah, this is intuitive. But I think we should be careful here not to look at an idea and ask, "What % of people who believe this are smart?" because the real question is, "Out of all the smart people who have seriously considered this idea, what % believe it?" It may be that some ideas are considered mainly by smart people, which would explain why a high percentage of people believing them are smart.
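A toy calculation (with entirely made-up numbers) of how the two questions can give opposite impressions, depending on which way you condition:

```python
# Made-up numbers, purely to illustrate the distinction between the two questions.
believers = 100
smart_believers = 90            # so 90% of believers are smart -- sounds impressive
smart_who_considered = 1000     # ...but suppose 1000 smart people seriously considered it

p_smart_given_believes = smart_believers / believers                   # 0.90
p_believes_given_considered = smart_believers / smart_who_considered   # 0.09

print(f"P(smart | believes) = {p_smart_given_believes:.0%}")                    # 90%
print(f"P(believes | smart & considered) = {p_believes_given_considered:.0%}")  # 9%
```

Same idea, same people, but the second question is the one that actually tells you whether smart people are persuaded by it.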
I doubt it. In my experience, the average person is quite stupid. My thought is that the fact that they're a sociopath means that they have different goals, but not necessarily that they're more instrumentally rational (better at achieving one's goals, whatever they are).
No, but I can say that I don't like them :)
Interesting question. If they genuinely prefer to cause harm to people, and if they really are instrumentally rational enough to only do things that help them achieve their goals, then no. But altruistic acts are one of the biggest correlates of happiness in normal people, so perhaps their psychopathy isn't set in stone and they could be convinced that there's a way to achieve more happiness.
You may need a lot less money to retire than you'd think. Depending on how much you spend. The author argues (throughout the site) that a lot of spending is on essentially status-related goods, and that spending money on free time (indirectly) and security is more likely to lead to happiness (if you're the right type of person, but I sense that you are).
My thoughts on this are a bit unconventional. Most people use the term intelligence to refer to things like aptitude, working memory size, and the ability to remember things. I think that those things are overrated and that the ability to break things down like a reductionist is underrated. I started to write about it here, but am having trouble. I welcome any feedback (if you have any thoughts, please use Medium's side comments; it's really useful).
Yes! Well, I think it's an oversimplification, but I very much agree with the direction of the advice. I hate formality. In school they give you all of these rules about how to write, and these rules seem to take you further and further away from how you actually speak. I always thought that these rules were bad, and I rebelled and got only average grades in writing even though I think I'm an amazing writer :)
That seems to be the central point of the article, and also sort of what we're talking about. I actually don't even think there's that much to say. When I dissolve the topic, all I see is:
I could add a lot of qualifiers to 1, 2 and 3, but I think you get what I'm saying so I won't.
It seems to me that people don't apply EV when calculating comparative advantages. Ie. they think about how much output they could generate right now rather than how much output they could be expected to generate over a period of time.
I'm a big believer that the ceiling of peoples' abilities is much higher than they think, and so taking this into account, my calculations of EV tend to be higher. Like, to people who say that they can't contribute to existential risk reduction, I'd say "How much do you think you could contribute if you studied really hard for 20 years?". And in calculating EV's for things like existential risk reduction, I think people fall victim to scope insensitivity. Even if you don't have a great chance at contributing, the magnitude of impact that a contribution would have is soooo great that it probably still leads to a high overall EV. Depending on preference ratios of course.
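To make the scope-insensitivity point concrete, here's a deliberately crude, hypothetical calculation (none of these numbers are real estimates; they're just there to show how magnitude can swamp probability):

```python
# Hypothetical numbers, just to show how magnitude can dominate probability.
p_contribute = 0.001        # say a 0.1% chance of making a real contribution
lives_affected = 10**9      # ...to something that affects a billion lives

ev_ambitious = p_contribute * lives_affected   # 1,000,000 "life-units" in expectation
ev_safe = 1.0 * 100                            # vs. helping ~100 people for sure

print(ev_ambitious, ev_safe)  # the long shot still wins by a factor of 10,000
```

Again, this is only a sketch, and how much weight it carries depends on your preference ratios.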
I'm sorry you thought that. I can't imagine how horrifying that must be.
:) It was somewhat life changing for me. I actually understood it. Before reading that I just read a few things on LW, and didn't really understand it.
Yes! I think that selfishness is a huge component of the benefit of working on AI. After all, you are one of the people who would benefit, and you have a lot to gain/lose. People don't seem to acknowledge this. But you would also be helping billions of currently living people, and bajillions of yet-to-be-born people, so for those reasons it's an incomprehensibly altruistic thing to do.
I'm confused. If you assume that dying is bad, you have a lot to lose (proportional to the badness of dying). Are you considering death to be a neutral event?
To me that seems like a great option. Others seem to think so as well. Personally I don't know nearly enough about AI or the other options to be able to say with even moderate confidence.
Okay, yeah, I should have added the word some. Kaczynski is the only psychopath I've really read much about, so maybe I really did extrapolate his seeming rationality onto other psychopaths, even though we probably never hear about 99% of them. That would have to be some kind of bias; out of curiosity, how would you label it? Maybe survivorship bias? Or availability heuristic? Anchoring? Or maybe even all of the above?
Believe me, I know. Even without trying to save money, I actually end up spending less on myself (excluding having paid for college) than on charity. Free hobbies are great. I didn't mean a pension was a reason to become a detective; it would just be a nice perk. Thanks for the link, though. Lots of good articles on that site!
Well, I'm biased in favor of this idea, since I have an awful memory, but a pretty good ability (sometimes too good for my own good) to break things down like a reductionist and dissolve topics. I'll check out your post tomorrow and try to give some feedback.
I think so too!
Nope, there's really not, but another thing I've realized from reading SSC is that a major component of great writing (and teaching) is the sharing of relevant, interesting, relatable examples to help an idea. If you skillfully parse through an idea, the audience will probably understand it at the time. But if you want the idea to actually sink in and stick with them, great examples are key. This is one reason I like Scott's posts so much; they actually affect my life. Personally, I was borderline cocky when I was younger (but followed social norms and concealed it). Then, I got older and started to read more and more, moved to the Bay Area, and met loads of smart people. Because of this, my self-esteem began to plummet, but I read that article just in time to stabilize it at a healthy, realistic level.
Anyway, Scott allows people to go easy on themselves for contributing less to the world than they might like, relative to their innate ability. Can we also go easy on ourselves relative to innate conscientiousness?
Yeah, this is sooo real. On a logical level, it's easy to recognize my scope insensitivity. On a "feeling" level, I still don't feel like I have to go out and do something about it. But I don't want to admit my preference ratios are that far out of whack; I don't want to be that selfish. Ugh. Now I feel like I should do something ambitious again, I'm so waffley about this. Thanks for all the help thinking through everything. This is BY FAR the best guidance anyone has ever given me in my life.
No... sorry, I was just working through my first thoughts about the idea, not making a meaningful point. Continuing on the selfishness idea, all I meant was that the researchers themselves would surely die eventually without AI, so even if AI made the world end a few years earlier for them, they personally have nothing to lose relative to what they could gain (dying a few years earlier vs. living forever). My first thought was "that's selfish, in a bad way, since they care less than the bajillions of still unborn people would about whether humans go extinct" but then I extrapolated the idea that the researcher would die without AI to the idea that humanity would eventually go extinct without AI and decided it was selfish in a good way.
Anyway, another question for you. You know how you said we care only about our own happiness? Have you read the part of the sequences/rationality book where Eliezer brings up someone being willing to die for someone else? If so, what did you make of it? If not, I'll go back and find exactly where it was.
I don't know too much about him other than the basics ("he argued that his bombings were extreme but necessary to attract attention to the erosion of human freedom necessitated by modern technologies requiring large-scale organization").
I think that his concerns are valid, but I don't see how the bombings help him achieve the goal of bumping humanity off that path. Perhaps he knew he'd get caught and his manifesto would get attention, but a) there's still a better way to achieve his goals, and b) he should have realized that people have a strong bias against serial killers.
The reason I think his concerns are valid is because capitalism tries to optimize for wanting, which is sometimes quite different from liking. And anecdotally, this seems to be a big problem.
I'm not sure what the bias is called :/. I know it exists and there's a formal name for it though. I know because I remember someone calling me out on it in LWSH :)
Yes, I very much agree. At times I think the articles on LW fail to do this. Humans need to have their System 1s massaged in order to understand things intuitively.
Idk. This seems to be a question involving terminal goals. Ie. if you're asking whether our innate conscientiousness makes us "good" or "bad".
When I think of morality, this is one question I think of: "What are the rules we'd ask people to follow in order to promote the happiest society possible?" I'm sure you could nitpick at that, but it should be sufficient for this conversation. Example: the law against killing is good because if we didn't have it, society would be worse off. Similarly, there are norms of certain preference ratios that lead to society being better off.
I don't think we'd be better off if the norm was to have, say, equal preference ratios for everyone in the world. Doing so is very unnatural and would be very difficult, if not impossible. You have to weigh the costs of going against our impulses against the benefits that marginal conscientiousness would bring.
I'm not sure where the "equilibrium" points are. Honestly, I think I'd be lying to myself if I said that a preference ratio of 1,000,000,000:1 for you over another human would be overall beneficial to society. I suspect that subsequent generations will realize this and look at us in a similar way we look at Nazis (maybe not that bad, but still pretty bad). Morality seems to "evolve" from generation to generation.
Personally, my preference ratios are pretty bad. Not as bad as the average person's, because I'm less scope insensitive, but still bad. Ex. I eat out once in a while. You might say "oh well that's reasonable". But I could eat brown rice and frozen vegetables for very cheap and be like 70% as satisfied, and pay for x meals for people who are quite literally starving.
But I continue to eat out once in a while, and honestly, I don't feel (that) bad about it. Because I accept that my preference ratios are where they are (pretty much), and I think it makes sense for me to pursue the goal of achieving my preferences. To be less precise and more blunt, "I accept that I'm selfish".
And so to answer your question:
I think that the answer is yes. Main reason: because it's unreasonable to expect that you change your ratios much.
It's great that you understand it on a logical level. No one has made much progress on the feeling level. As long as you're aware of the bias and make an effort to massage your "feeling level" towards being more accurate, you should be fine.
Why?
I think that exploring and answering that question will be helpful.
Try thinking about it in two ways:
1) A rational analysis of what you genuinely think makes sense. Note that rational does not mean completely logical.
2) An emotional analysis of what you feel, why you feel it, and in the event that your feelings aren't accurate, how can you nudge them to be more accurate.
Wow! Thanks for letting me know. I'm really happy to help. I've been really impressed with your ability to pursue things, even when it's uncomfortable. It's a really important ability and most people don't have it.
I think that not having that ability is often a bottleneck that prevents progress. Ex. an average person with that ability can probably make much more progress than a high IQ person without it (in some ways). It's nice to have a conversation that actually progresses along nicely.
I think I have. I remember it being one of the few instances where it seemed to me that Eliezer was misguided. Although:
1) I remember going through it quickly and not giving it nearly as much thought as I would like. I'm content enough with my current understanding, and busy enough with other stuff that I chose to put it off until later. Although I do notice confusion - I very well may just be procrastinating.
2) I have tremendous respect for Eliezer. And so I definitely take note of his conclusions. The following thoughts are a bit dark and I hesitate to mention them... but:
a) Consider the possibility that he does actually agree with me, but he thinks that what he wrote will have a more positive impact on humanity (by influencing readers)
b) In the case that he really does believe what he writes, consider that it may not be best to convince him otherwise. Ie. he seems to be a very influential person in the field of FAI, and it's very much in humanity's interest for that person to be unselfish.
I haven't thought this through enough to make these points public, so please take note of that. Also, if you wouldn't mind summarizing/linking to where and why he disagrees with me, I'd very much appreciate it.
Edit: Relevant excerpt from HPMOR
Sorry, I feel like I'm linking to too many things which probably feels overwhelming. Don't feel like you have to read anything. Just thought I'd give you the option.
Also, I thought your posts were well-written, so if you recommend any others, I will read them :)
Interesting, thanks for the feedback. I hope you're being honest. I have a bit of a hard time judging the quality of my own writing.