(Edit: I noticed that the Oxford event is already sold out, so this is kind of a moot point for me this year, but I'd be interested in the answer anyway, as this isn't the last event like this.)

You've given examples of how different active EA members have found each other and come up with new projects to do together. This supports the idea that if you're an EA activist, you should attend the conference. But do you think "rank and file" Effective Altruists, whose only aspiration is to donate a portion of their income, should attend as well, even if they aren't interested in becoming more active? Is it still more effective to spend the money on the conference than to simply donate it to an effective cause? If I were to attend, I'd expect to hear many interesting lectures and have a few interesting conversations, but I'd be surprised if I became more involved in the community-building side.

An alternative explanation I can think of is the placebo effect: it's possible that your behaviour Y changed after you changed X because you believed it would, especially since you wanted to change those behaviours in the first place.

Also, even if this wasn't due to the placebo effect, it's only evidence about how your mind works; other people's minds might work differently. (And I suspect it's also quite weak as evidence goes, though I can't seem to articulate why I think so. At the very least, I think you'd need a very big sample of behaviour changes, without forgetting to also count the failed attempts at changing your behaviour.)

Hello!

My name is Tommi, and I'm a 34-year-old Finn currently living in Berlin. I work as a freelance developer focusing on the Unity development environment, making educational games, regular games, virtual art galleries, etc. for an hourly fee (so that's the skill set I bring to the community). I found Less Wrong some years ago via HPMOR (I forget how I found HPMOR). I've read it occasionally, but over the last year or so I've been slowly gravitating towards it, and I've now decided to make the effort to try out this community.

I've always valued reason and science over hearsay and guessing, but so far that has manifested mostly in what I like to read and who I vote for. I also participated in the Green Party of Finland for some years in order to advance scientific decision-making and a long-term, global approach to things (the Greens in Finland have a fairly strong scientific leaning despite hanging onto some dogmas). However, as an introvert, my effect was, as far as I could tell, minimal. Having learned that lesson, and being in a good position both financially and in terms of available time, I'm looking at my life goals again and would like to see if this community could help me reach them.

As I understand them now, my goals are as follows:

1) Live a materially comfortable life. I'm not willing to sacrifice all of life's comforts to serve a higher goal. However, my material desires are modest compared to my ability to earn (I'm a freelance programmer and apparently a pretty good one).

2) Have a fulfilling social life. One reason I've been looking at Less Wrong is its articles on improving social skills. However, I'm not certain whether improving them is worth the effort - perhaps it would be better to settle for the kind of social life I can get with my current skills and focus on other things. (Romance seems particularly hard to achieve - I think because I'm gay and I haven't found many social circles that are simultaneously gay and nerdy enough for me to feel comfortable in.)

3) Have a high net positive impact on the world. Unless I suddenly lose my income, I intend to donate 10% of my income this year to charity. I'll probably go for a GiveWell-approved charity, although I have some reservations about its utilitarian leanings. I believe in a more complex ethics than a simple sum total of utility. For example, I believe that debt exists: if someone loses utility because of me (either they helped me or I did them harm), I'm obligated to compensate them (if they want that) instead of helping some other person. So I tend to think I should become carbon-neutral before contributing to other charities, unless those charities help the same people who are damaged most by carbon emissions (which may well be true). I also believe that the utility of people who harm others is worth less than the utility of those who don't. The application of this second rule isn't as clear, though.

4) Artistic aspirations. I wish to advance the field of interactive storytelling. Basically, I'd like to make games that offer players meaningful choices - meaningful in a storytelling, moral, and strategic sense. Such games already exist, of course, but I want to make the choices more open-ended than in an RPG like Mass Effect, and more real and personal than in a strategy game. Ideally, I'd like the player to feel like they're interacting with and affecting the lives of real people in an imaginary setting. My ambitions are similar to Chris Crawford's (http://www.erasmatazz.com/library/interactive-storytelling/what-is-interactive-storyte.html), but my approach is not as purist as his. My other role models are the people behind the game King of Dragon Pass.

Initially, I was thinking about this in terms of the usual heroic stories that get made into games over and over (just doing it better, of course). Now, however, I'm considering combining this ambition with another: turning one of my old roleplaying campaigns into a novel or series. I wrote a few chapters a couple of years ago, and they were very well received at the creative writing workshop where I presented them. Some of the honour goes to the failed MMO Seed, of which my roleplaying campaign was a fanfiction. Seed, and by extension the campaign, had strong rationalist leanings: it's a science fiction story about a group of colonists on another planet solving various problems via science and technology, and playing political games over which way to steer the colony. The characters tend to be very analytical and take a long view of things.

My campaign predates HPMOR, though, so it didn't go that deep into rationalism. But now I think it might be interesting to combine the writing goal with the interactive story goal, and to strive to deepen the thinking involved as much as I can. Ideally, the game would reward players for thinking rationally while also making them care about the characters and the unfolding story - without turning it into a series of rationality puzzles with only one right answer.

So, I'd like to see if digging deeper into the Less Wrong community would help me with these goals.