Comment author: topynate 12 December 2010 08:20:55PM 11 points [-]

"Do you want to know?" whispered the guide; a whisper nearly as loud as an ordinary voice, but not revealing the slightest hint of gender.

Brennan paused. The answer to the question seemed suspiciously, indeed extraordinarily obvious, even for ritual.

"Yes, provided that * * * ** * * * * * * * * ** * * ** * * * * ** * * * * **," Brennan said finally.

"Who told you to say that?" hissed the guide.

Comment author: FormallyknownasRoko 12 December 2010 08:44:26PM *  6 points [-]

Brennan is a fucking retard. No, you don't want to know. You want to signal affiliation with desirable groups, to send hard-to-fake signals of desirable personality traits such as loyalty, intelligence, power and the presence of informed allies. You want to say everything bad you possibly can about the outgroup and everything good about the ingroup. You want to preach altruism and then make a plausible but unlikely reasoning error which conveniently stops you from having to give away anything costly.

All the other humans do all of these things. This is the true way of our kind. You will be punished if you deviate from the way, or even if you try to overtly mention that this is the way.

Comment author: steven0461 12 December 2010 06:39:25PM *  1 point [-]

Establishing a norm of giving away prizes creates very bad incentives and will tend to decrease the degree to which prizes actually motivate people in the future.

On the other hand, it decreases the degree to which prizes are spent on ice-cream or movie tickets rather than charity. Evaluating a course of action means weighing the upsides against the downsides, not just listing a downside.

Comment author: FormallyknownasRoko 12 December 2010 07:10:11PM 1 point [-]

Yes, but in reality the amounts concerned are good value for what they get.

Comment author: multifoliaterose 12 December 2010 05:51:35PM 6 points [-]

I'm flattered :-). Thanks again for taking the initiative to put the contest together. I agree with the suggestion that prizes for this sort of thing not be given away (and will not give my share away).

I submitted my article to jsalvatier for suggestions and he made some. I'll edit my article in response to some of these shortly.

Does anybody have suggestions for websites/newspapers/magazines where we might submit these articles to publicize the points made therein more broadly?

Comment author: FormallyknownasRoko 12 December 2010 05:59:33PM 0 points [-]

Yes, I will message you with details.

Comment author: cousin_it 12 December 2010 05:49:28PM *  0 points [-]

Uh, spending effort on hurting people is negative-sum and most likely lose-lose, while teaching someone to hunt is positive-sum lose-win. Or maybe you see some deeper mystery here that I'm not seeing?

Comment author: FormallyknownasRoko 12 December 2010 05:51:58PM 1 point [-]

The problem with "lose-lose" is that it relies upon there being a "default outcome given no interaction". Vladimir is trying to taboo this concept, at least in general. So I am going to focus on a relevant special case, namely specific interactions available in the ancestral environment.

$100 for the best article on efficient charity - the winner is ...

19 FormallyknownasRoko 12 December 2010 03:02PM

Part of the Efficient Charity Article competition. Several people have written articles on efficient charity. The entries were:

The original criteria for the competition are listed here, but basically the idea is to introduce the concept to a relatively smart newcomer without using jargon.

Various people gave opinions about which articles were best. For me, two articles in particular stood out as being excellent for a newcomer. Those articles were:

Throwawayaccount_1's

and

Multifoliaterose's

articles.
 

I therefore declare them joint winners, and implore our kind sponsor Jsalvatier to split the prize between them evenly. Throwawayaccount_1 should also unmask his/her identity.

[I would also ask the winners to kindly not offer to donate the money to charity, but to actually take the prize money and spend it on something that they selfishly-want, such as ice-cream or movie tickets or some other luxury item. Establishing a norm of giving away prizes creates very bad incentives and will tend to decrease the degree to which prizes actually motivate people in the future]

Comment author: Alicorn 10 December 2010 04:21:52PM 27 points [-]

I mean, seriously. I never want to know what it was and I significantly resent the OP for continuing to stir the shit and (no matter how marginally) increasing the likelihood of the information being reposted and me accidentally seeing it.

I award you +1 sanity point.

(I note that the Langford Basilisk in question is the only information that I know and wish I did not know. People acquainted with me and my attitude towards secrecy and not-knowing-things in general may make all appropriate inferences about how unpleasant I must find it to know the information, to state that I would prefer not to.)

Comment author: FormallyknownasRoko 12 December 2010 12:51:34PM *  0 points [-]

the only information that I know and wish I did not know.

I don't think it's quite that extreme. For example, I wish I wasn't as intelligent as I am, wish I was more normal mentally and had more innate ability at socializing and less at math, wish I didn't suffer from smart sincere syndrome. I think these are all in roughly the same league as the banned material.

Comment author: PeerInfinity 12 December 2010 12:56:18AM *  5 points [-]

I'm surprised that no one has asked Roko where he got these numbers from.

Wikipedia says that there are about 80 billion galaxies in the "observable universe", so that part is pretty straightforward. Though there's still the question of why all of them are being counted, when most of them probably aren't reachable with slower-than-light travel.

But I still haven't found any explanation for the "25 galaxies per second". Is this the rate at which the galaxies burn out? Or the rate at which something else causes them to be unreachable? Is it the number of galaxies, multiplied by the distance to the edge of the observable universe, divided by the speed of light?

calculating...

Wikipedia says that the comoving distance from Earth to the edge of the observable universe is about 14 billion parsecs (46 billion light-years short scale, i.e. 4.6 × 10^10 light years) in any direction.

Google Calculator says 80 billion galaxies / 46 billion light-years ≈ 1.74 galaxies per light-year, i.e. about 1.74 galaxies per year at light speed, or 5.5 × 10^-8 galaxies per second

so no, that's not it.

If I'm going to allow my mind to be blown by this number, I would like to know where the number came from.

Comment author: FormallyknownasRoko 12 December 2010 12:58:20AM 2 points [-]

I meant that if you divide the number of galaxies by the number of seconds until an event 100 years from now, you get roughly 25 galaxies per second. Yes, not all of them are reachable; you probably need to discount by an order of magnitude for reachability at lightspeed.
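This arithmetic can be checked with a short sketch (assuming the 80-billion-galaxy figure and the 46-billion-light-year radius quoted above; variable names are mine):

```python
# Sanity check of the "25 galaxies per second" figure: divide the number
# of galaxies in the observable universe by the number of seconds until
# an event 100 years from now.

SECONDS_PER_YEAR = 365.25 * 24 * 3600   # about 3.156e7 seconds

galaxies = 80e9                         # ~80 billion galaxies (figure quoted above)
horizon = 100 * SECONDS_PER_YEAR        # seconds until an event 100 years away

rate = galaxies / horizon
print(f"{rate:.0f} galaxies per second")    # ~25

# PeerInfinity's rejected guess: galaxies divided by the light-travel time
# to the edge of the observable universe (46 billion light-years).
edge_lightyears = 46e9
guess = galaxies / (edge_lightyears * SECONDS_PER_YEAR)
print(f"{guess:.2e} galaxies per second")   # ~5.5e-08
```

So the figure comes from the 100-year horizon, not from the size of the observable universe.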

$100 for the best article on efficient charity: the finalists

5 FormallyknownasRoko 07 December 2010 09:15PM

Part of the Efficient Charity Article competition. Several people have written articles on efficient charity --


  • Multifoliaterose has an article entitled "Efficient Charity" which scored 23 on the main site despite not being promoted.


Any comments on the finalists? Who do we think should be the winner?

$100 for the best article on efficient charity -- Submit your articles

5 FormallyknownasRoko 02 December 2010 08:57PM

Several people have written articles on efficient charity -- throwawayaccount_1 has an excellent article hidden away in a comment, as does waitingforgodel. Multifoliaterose promises to write an article "at some point soon" ..., and louie has actually submitted an article to the main LW page.

What I'd like is for throwawayaccount_1, waitingforgodel and multifoliaterose to submit to the main LW articles page. People will read the articles, and hopefully vote more for better articles. Articles not submitted to the main LW articles page are not eligible for the prize.

Note that it is hard for me to judge which article(s) will actually have the best effect in terms of causing people to make better decisions, so at least some empiricism is desirable. Yes, it isn't perfect, but if anyone has a better suggestion, I am all ears.

Superintelligent AI mentioned as a possible risk by Bill Gates

7 FormallyknownasRoko 28 November 2010 11:51AM

"There are other potential problems in the future that Mr. Ridley could have addressed but did not. Some would put super-intelligent computers on that list. My own list would include large-scale bioterrorism or a pandemic ... But bioterrorism and pandemics are the only threats I can foresee that could kill over a billion people."

- Bill Gates 

From

Africa Needs Aid, Not Flawed Theories

One wonders where Bill Gates read that superintelligent AI could be (but in his estimation, in fact isn't) a GCR. It couldn't have been Kurzweil, because Kurzweil doesn't say that. The only realistic possibilities are that the influence came via Nick Bostrom, Stephen Hawking, or Martin Rees, or possibly Bill Joy (see comments).

It seems that Bill is also something of a Bayesian with respect to global catastrophic risk:

"Even though we can't compute the odds for threats like bioterrorism or a pandemic, it's important to have the right people worrying about them and taking steps to minimize their likelihood and potential impact. On these issues, I am not impressed right now with the work being done by the U.S. and other governments."
