Comment author: PeerInfinity 12 December 2010 12:56:18AM *  5 points [-]

I'm surprised that no one has asked Roko where he got these numbers from.

Wikipedia says that there are about 80 billion galaxies in the "observable universe", so that part is pretty straightforward. Though there's still the question of why all of them are being counted, when most of them probably aren't reachable with slower-than-light travel.

But I still haven't found any explanation for the "25 galaxies per second". Is this the rate at which the galaxies burn out? Or the rate at which something else causes them to be unreachable? Is it the number of galaxies, multiplied by the distance to the edge of the observable universe, divided by the speed of light?

calculating...

Wikipedia says that the comoving distance from Earth to the edge of the observable universe is about 14 billion parsecs (46 billion light-years on the short scale, i.e. 4.6 × 10^10 light years) in any direction.

Google Calculator says 80 billion galaxies / 46 billion years (the light-travel time to the edge of the observable universe) ≈ 1.74 galaxies per year, or about 5.5 × 10^-8 galaxies per second

so no, that's not it.

If I'm going to allow my mind to be blown by this number, I would like to know where the number came from.

Comment author: FormallyknownasRoko 12 December 2010 12:58:20AM 2 points [-]

I meant if you divide the number of galaxies by the number of seconds to an event 100 years from now. Yes, not all reachable. Probably need to discount by an order of magnitude for reachability at lightspeed.
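For the record, both candidate derivations are easy to check. A minimal sketch (assuming 80 billion galaxies and a Julian year of 365.25 days; the variable names are purely illustrative):

```python
# Sanity-check the two calculations discussed in this thread.

GALAXIES = 80e9                         # galaxies in the observable universe
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.156e7 seconds

# The parent comment's guess: galaxies divided by the light-travel
# time to the edge of the observable universe (46 billion light-years
# corresponds to 46 billion years at lightspeed).
per_second_guess = GALAXIES / (46e9 * SECONDS_PER_YEAR)
print(f"{per_second_guess:.2e} galaxies/s")  # ~5.5e-08, nowhere near 25

# Roko's actual derivation: galaxies divided by the number of
# seconds until an event 100 years from now.
per_second_roko = GALAXIES / (100 * SECONDS_PER_YEAR)
print(f"{per_second_roko:.0f} galaxies/s")  # ~25 galaxies per second
```

So the "25 galaxies per second" falls out of 8 × 10^10 galaxies spread over the ~3.16 × 10^9 seconds in a century, before any discounting for reachability.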

Comment author: FormallyknownasRoko 10 December 2010 11:55:46PM *  3 points [-]

Suppose that Blackmail is

merely an affective category, a class of situations activating a certain psychological adaptation

-- then we should ask what features of the ancestral environment caused us to evolve it. We might understand it better in that case.

I suspect that the ancestral environment came with a very strong notion of a default outcome for a given human, in the absence of there being any particular negotiation, and also came with a clear notion of negative interaction (stabbing, hitting, kicking) versus positive interaction (giving fish, teaching how to hunt better, etc).

$100 for the best article on efficient charity: the finalists

5 FormallyknownasRoko 07 December 2010 09:15PM

Part of the Efficient Charity Article competition. Several people have written articles on efficient charity --

  • Multifoliaterose has an article entitled "Efficient Charity", which scored 23 on the main site despite not being promoted.

Any comments on the finalists? Who do we think should be the winner?

$100 for the best article on efficient charity -- Submit your articles

5 FormallyknownasRoko 02 December 2010 08:57PM

Several people have written articles on efficient charity -- throwawayaccount_1 has an excellent article hidden away in a comment, as does waitingforgodel. Multifoliaterose promises to write an article "at some point soon" ..., and louie has actually submitted an article to the main LW page.

What I'd like is for throwawayaccount_1, waitingforgodel and multifoliaterose to submit to the main LW articles page. People will read the articles and, hopefully, vote more for the better ones. Articles not submitted to the main LW articles page are not eligible for the prize.

Note that it is hard for me to judge which article(s) will actually have the best effect in terms of causing people to make better decisions, so at least some empiricism is desirable. Yes, it isn't perfect, but if anyone has a better suggestion, I am all ears.

Superintelligent AI mentioned as a possible risk by Bill Gates

7 FormallyknownasRoko 28 November 2010 11:51AM

"There are other potential problems in the future that Mr. Ridley could have addressed but did not. Some would put super-intelligent computers on that list. My own list would include large-scale bioterrorism or a pandemic ... But bioterrorism and pandemics are the only threats I can foresee that could kill over a billion people."

- Bill Gates 

From

Africa Needs Aid, Not Flawed Theories

One wonders where Bill Gates read that superintelligent AI could be (but, in his estimation, in fact isn't) a GCR. It couldn't have been Kurzweil, because Kurzweil doesn't say that. The only realistic possibilities are that the influence came via Nick Bostrom, Stephen Hawking, Martin Rees, or possibly Bill Joy (see comments).

It seems that Bill is also something of a Bayesian with respect to global catastrophic risk:

"Even though we can't compute the odds for threats like bioterrorism or a pandemic, it's important to have the right people worrying about them and taking steps to minimize their likelihood and potential impact. On these issues, I am not impressed right now with the work being done by the U.S. and other governments."

$100 for the best article on efficient charity -- deadline Wednesday 1st December

13 FormallyknownasRoko 24 November 2010 10:31PM

Reposted from a few days ago, noting that jsalvatier (kudos to him for putting up the prize money; very community-spirited) has promised $100 to the winner, and I have decided to set a deadline of Wednesday 1st December for submissions, as my friend has called me and asked me where the article I promised him is. This guy wants his god-damn rationality already, people!

My friend is currently in a potentially lucrative management consultancy career, but is considering getting a job in eco-tourism because he "wants to make the world a better place". We got into a debate about Efficient Charity, Roles vs. Goals, and Optimizing versus Acquiring Warm Fuzzies.

I thought that there would be a good article here that I could send him to, but there isn't. So I've decided to ask people to write such an article. What I am looking for is an article that is less than 1800 words long, and explains the following ideas: 

  1. Charity should be about actually trying to do as much expected good as possible for a given amount of resource (time, $), in a quantified sense. I.e. "5000 lives saved in expectation", not "we made a big difference". 
  2. The norms and framing of our society regarding charity currently get it wrong: people send lots of $ to charities that do a lot less good than other charities. The "inefficiency" here is very large; GiveWell estimates a factor of 1000 at least. Our norm of ranking charities by % spent on overheads is very, very silly. 
  3. It is usually better to work a highly-paid job and donate, because if you work for a charity you merely replace the person who would have been hired had you not applied. 
  4. Our instincts will tend to tempt us to optimize for signalling; this is to be resisted unless (or to the extent that) it is what you actually want to do. Our instincts will also tend to want to optimize for "Warm Fuzzies". These should be purchased separately from actual good outcomes. 
  5. Our human intuition about how to allocate resources is extremely bad. Moreover, since charity is typically for the so-called benefit of someone else, you, the donor, usually don't get to see the result. Lacking this feedback from experience, one tends to make all kinds of gigantic mistakes. 

but without using any unexplained LW jargon (Utilons, Warm Fuzzies, optimizing). Linking to posts explaining jargon is NOT OK; just don't use any LW jargon at all. I will judge the winner based upon these criteria and the score that the article gets on LW. Maybe the winning article will not rigidly meet all the criteria: there is some flexibility. The point of the article is to persuade people who are at least somewhat charitable and who are smart (university-educated at a top university, or equivalent) to seriously consider investing more time in rationality when they want to do charitable things. 

Competition to write the best stand-alone article on efficient charity

15 FormallyknownasRoko 21 November 2010 04:57PM

I have a friend who is currently in a lucrative management consultancy career, but is considering getting a job in eco-tourism because he "wants to make the world a better place". We got into a debate about Efficient Charity, Roles vs. Goals, and Optimizing versus Acquiring Warm Fuzzies.

I thought that there would be a good article here that I could send him to, but there isn't. So I've decided to ask people to write such an article. What I am looking for is an article that is less than 1800 words long, and explains the following ideas: 

  1. Charity should be about actually trying to do as much expected good as possible for a given amount of resource (time, $), in a quantified sense. I.e. "5000 lives saved in expectation", not "we made a big difference". 
  2. The norms and framing of our society regarding charity currently get it wrong: people send lots of $ to charities that do a lot less good than other charities. The "inefficiency" here is very large; GWWC estimates a factor of 10,000 at least. Therefore most money donated to charity is almost entirely wasted.
  3. It is usually better to work a highly-paid job and donate, because if you work for a charity you merely replace the person who would have been hired had you not applied.
  4. Our instincts will tend to tempt us to optimize for signalling; this is to be resisted unless (or to the extent that) it is what you actually want to do.
  5. Our motivational centre will tend to want to optimize for "Warm Fuzzies". These should be purchased separately from utilons. 

but without using any unexplained LW jargon (Utilons, Warm Fuzzies, optimizing). Linking to posts explaining jargon is NOT OK. I will judge the winner based upon these criteria and the score that the article gets on LW. I may present a small prize to the winner, if (s)he desires it! 

Happy Writing

Roko

EDIT: As well as saying that he will pay $100 to the winner, Jsalvatier makes two additional points that I feel should be included in the specification of the article:

6.  Your intuition about what counts as a cause worth giving money to is extremely bad. This is completely natural: everyone's intuition about this is bad. Why? Because your brain was not optimized by evolution to be good at thinking clearly about large problems involving millions of people and how to allocate resources. 

7. Not only is your intuition about this naturally very bad (as well as cultural memes surrounding how to donate to charity being utterly awful), you don't realize that your intuition is bad. This is a deceptively hard problem. 

And I would also like to add:

8. Explicitly make the point that our current norm of ranking charities based upon how much (or little) they spend on overheads is utterly insane. Yes, the entire world of charities is stupid with respect to the problem of how to prioritize their own efforts. 

9. Mention the point that other groups are slowly edging their way towards the same conclusion, e.g. Giving What We Can (GWWC), Copenhagen Consensus, GiveWell. 

 

Comment author: FormallyknownasRoko 21 November 2010 03:59:42PM *  20 points [-]

This is why I have left the LW community for a year. I think that there is a lot to be learned from LW, but I also think that LW is currently 95% distracting by volume of text and by time-you'll-actually-spend-on-it.

I'd like to make the additional point that LW is not only a time-wise distraction, but it is also motivationally toxic, or at least has been to me.

More specifically, I think that investing emotionally too much in big-picture issues like efficient charity or high-technology risks and futurism tends to remove healthy, positive motivations from one's everyday life. You, as a human being, have to care about what you're going to do tomorrow and in the next week, and you have to be in a frame where most of the time, things are looking good and you're "winning". I think that a lot of the frames that LW encourages people to adopt (e.g. the frame that the entire future of the human race is likely doomed) contribute strongly to psychological depression and motivational exhaustion. That these frames and memes are based upon careful analysis is beside the point: there are some life-frames that you simply cannot live with, truth be damned.

What to do? I think that Patri's idea of more activity-focused posting is a good one. In online social-dynamics communities, people are expected to post "field reports" of something that they actually achieved (e.g. starting a conversation and getting somebody's number with the intention of seeing them again).

In LW terms, I'd like to see a sub-forum dedicated to people applying for highly paid jobs. And another one dedicated to people gaining more intangible forms of power and influence, e.g. social skills, networking, etc. And perhaps another dedicated to making an LW version of GiveWell.

These are all concrete, non-depressing things that we can do now and some of us will actually succeed at.

(I came back today to look for a specific post to give to a friend but saw this and couldn't help but comment)
