Comment author: blacktrance 12 January 2014 05:41:23AM *  2 points [-]

In the unlikely event that anyone is interested, sure, ask me anything.

Edit: Ethics are a particular interest of mine.

Comment author: Tuxedage 12 January 2014 08:28:13AM 15 points [-]

Would you rather fight one horse sized duck, or a hundred duck sized horses?

Comment author: Tuxedage 06 January 2014 02:20:37AM 1 point [-]

I mean this in the least hostile way possible -- this was an awful post. It was just a complicated way of saying "historically speaking, bitcoin has gone up". Of course it has! We already know that! And for obvious reasons, prices grow on a log scale. But it's also a well-known rule of markets that "past performance does not predict future results".

Of course, I am personally supportive of and bullish on bitcoin (as people in IRC can attest). All I'm saying is that your argument is an unnecessarily complex way of arguing that bitcoin is likely to increase in price in the future because it has increased in price in the past.

Comment author: timujin 04 January 2014 08:03:50PM 2 points [-]

I want to play as a Gatekeeper, where can I enroll? I don't expect any particular outcome, I just think that both winning and losing the game will provide me with utility. Especially losing, but only if I genuinely try to win.

Comment author: Tuxedage 04 January 2014 09:32:11PM 0 points [-]

Generally speaking, there's a long list of gatekeepers -- about 20 gatekeepers for every AI that wants to play. Your best option is to post "I'm a gatekeeper. Please play me" in every AI box thread and hope that someone messages you back. You may have to wait months for a reply, if you get one at all. If you're willing to offer a monetary incentive, your chances might improve.

Comment author: Tuxedage 29 December 2013 05:47:03PM *  2 points [-]

You may feel that way because many of your online conversations are with us at the LessWrong IRC, which is known for its high level of intellectual rigor. The great majority of online conversations are not as rigorous as ours. I suspect that IRL conversations with other LessWrongers rely just as heavily on citations and references, for example.

Comment author: Tuxedage 23 December 2013 08:26:52PM 8 points [-]

I posted this in the last open thread, but I should post it here too for relevance:

I have donated $5,000 to the MIRI 2013 Winter Fundraiser. Since I'm a "new large donor", this donation will be matched 3:1, netting a cool $20,000 for MIRI.

I have decided to post this because of "Why Our Kind Can't Cooperate". I have been convinced that donors should publicly brag about their donations to attract other donors, instead of remaining silent, which gives a false impression of how much support MIRI has.

Comment author: Brillyant 10 December 2013 10:42:35PM 2 points [-]

Interesting.

I have been convinced that people donating should publicly brag about it to attract other donors

It certainly seems to make sense for the sake of the cause for (especially large, well-informed) donors to make their donations public. The only downside seems to be a potentially conflicting signal on behalf of the giver.

instead of remaining silent about their donation which leads to a false impression of the amount of support MIRI has.

I'm not sure this is true. Doesn't MIRI publish its total receipts? Don't most organizations that ask for donations?

Growing up Evangelical, I was taught that we should give to charities secretly (including, mostly, the church).

I wonder why? The official Sunday School answer is that it keeps the giver humble, etc. I wonder if there is some other mechanism by which it made sense for Christians to propagate that concept (secret giving) among followers?

Comment author: Tuxedage 10 December 2013 10:53:15PM 6 points [-]

I'm not sure this is true. Doesn't MIRI publish its total receipts? Don't most organizations that ask for donations?

Total receipts may not be representative. There's a difference between MIRI getting funding from one person with a lot of money and large numbers of people donating smaller amounts. I was hoping this post would serve as a reminder that many of us on LW care about donating, rather than just a few very wealthy people like Peter Thiel or Jaan Tallinn.

Also, I suspect scope neglect may be at play -- on an emotional level, it's difficult to tell the difference between $1 million in donations, ten million, or a hundred million. Seeing each individual donation that adds up to that total may help.

Comment author: Tuxedage 10 December 2013 07:14:32PM *  55 points [-]

At risk of attracting the wrong kind of attention, I will publicly state that I have donated $5,000 to the MIRI 2013 Winter Fundraiser. Since I'm a "new large donor", this donation will be matched 3:1, netting a cool $20,000 for MIRI.

I have decided to post this because of "Why Our Kind Can't Cooperate". I have been convinced that donors should publicly brag about their donations to attract other donors, instead of remaining silent, which gives a false impression of how much support MIRI has.

Comment author: Tuxedage 23 November 2013 09:35:24PM *  27 points [-]

I have taken the survey, as I have done for the last two years! Free karma now?

Also, I chose to cooperate rather than defect because, even though the money technically stays within the community, I am willing to give up a very small amount of expected value to help ensure that LW has a reputation for cooperation. I don't expect to lose more than a few cents of expected value, since I expect 1000+ people to take the survey.

Comment author: gwern 23 October 2013 02:00:22AM *  18 points [-]

Some LWers may be interested in a little bet/investment opportunity I'm setting up. I have become increasingly disgusted with what I've learned about the currently active Bitcoin+Tor black markets post-Silk-Road -- specifically, with BlackMarket Reloaded & Sheep. I am also frustrated that customers are flocking to them, and they all seem absurdly optimistic. So, I am preparing to make a large public four-part escrowed bet with all comers on the demise of BMR & Sheep in the coming year, in the hopes that by putting my money where my mouth is, I may shock at least a few of them into sanity and perhaps even profit off the more deluded ones.

The problem is, I feel I can afford to risk ฿1 ($200), but I'm not sure that this will be enough to impress anyone when split over 4 bets ($50 apiece). So I am willing to accept up to ฿1 in investments from anyone, to increase the amount I can wager. The terms are simple: whatever fraction of the bankroll you send, that's your share of any winnings. If we bet ฿2 and you sent ฿1, then you get half the winnings, if any. (I am not interested in taking any cut here.)
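The pro-rata payout terms above are simple enough to sketch in a few lines. This is purely illustrative (the function name and the amounts are mine, not from gwern's writeup):

```python
# Sketch of the proportional payout scheme described above.
# Amounts are in BTC; the figures are illustrative examples only.

def payout_shares(investments, winnings):
    """Split `winnings` among investors in proportion to what each put in."""
    bankroll = sum(investments.values())
    return {name: winnings * amount / bankroll
            for name, amount in investments.items()}

# The example from the comment: gwern stakes 1 BTC, one investor adds 1 BTC,
# so a 2 BTC bankroll splits any winnings 50/50.
shares = payout_shares({"gwern": 1.0, "investor": 1.0}, winnings=0.5)
# shares == {"gwern": 0.25, "investor": 0.25}
```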

My full writeup of the bet, with some statistics helping motivate the death probabilities I am betting based on: http://pastebin.com/bEuryTuF

If you are interested, you can reply here, or contact me at gwern@gwern.net, or we can chat on Freenode (as gwern or just visit #lesswrong). I am currently ignoring private messages on LW, so don't do that.

Also, please don't express interest unless you are genuinely fine with potentially losing your investment: given my best estimate of the probabilities & their correlations, there's somewhere over a 10% chance that we would lose all 4 bets, with both BMR & Sheep surviving the full year.
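A back-of-the-envelope version of that "lose all 4 bets" estimate can be sketched as follows. The per-market survival probabilities here are hypothetical placeholders, not gwern's actual numbers (his linked writeup has those):

```python
# Rough check of the worst-case "both markets survive the year" scenario.
# These survival probabilities are assumptions for illustration only.

p_bmr_survives = 0.35    # assumed P(BlackMarket Reloaded survives the year)
p_sheep_survives = 0.35  # assumed P(Sheep survives the year)

# If the two markets' fates were independent, the chance of losing all
# four bets (both survive) would simply be the product of the two:
p_lose_all_independent = p_bmr_survives * p_sheep_survives  # roughly 0.12
```

Positive correlation between the two markets' fates (e.g. no crackdown hits either) pushes the joint probability above this independent-case product, which is consistent with quoting ">10%" as a floor rather than a point estimate.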

EDIT: if you really want to get in, I'll still take your bitcoins, but I think I have enough investors now, thanks everyone.

Comment author: Tuxedage 24 October 2013 02:32:59AM 3 points [-]

I will be matching whatever gwern personally puts in.

Comment author: Tuxedage 14 October 2013 10:14:11AM *  12 points [-]

AI Box Experiment Update

I recently played and won an additional game of AI Box with DEA7TH. Obviously, I played as the AI. This game was conducted over Skype.

I'm posting this in the open thread because, unlike my last few AI Box Experiments, I won't be providing a proper writeup (and I didn't think that just posting "I won!" warrants starting a new thread). I've been told (and convinced) by many that I was far too leaky with strategy and seriously compromised the future winning chances of both myself and future AIs. The fact that one of my gatekeepers guessed my tactic(s) was the final straw. I think that I've already provided enough hints for aspiring AIs to win, so I'll stop giving out information.

Sorry, folks.

This puts my current AI Box Experiment record at 2 wins and 3 losses.
