Comment author: DeevGrape 22 November 2013 04:42:45PM 34 points

I took the survey.

I realized while answering one of the questions that the comments that I make for free karma are one of my main interactions with the LW website.

Comment author: DeevGrape 27 December 2012 07:29:45AM 3 points

The moral tension of what Celestia was doing was well done. I found myself getting excited at how awesome everything was, and then muttering aloud about the "creeping horror" that was coming. I also loved the parallels to the AI-boxing experiments, except the Princess can argue people into boxing themselves.

Thanks very much for writing this! In lieu of anything specifically useful, I leave you with a small proportion of the warm fuzzies you provided to me :)

Comment author: DeevGrape 14 December 2012 06:42:16PM 23 points

Just donated $500 (with the Singularity credit card, so it's really more like $505 ^_^).

Most Likely Cause of an Apocalypse on December 21

0 DeevGrape 03 December 2012 05:10PM

Note: This post is almost completely tongue-in-cheek. Obviously the chances of December 21, 2012 heralding an apocalypse, defined, say, as an event causing billions of deaths and/or global catastrophic infrastructure damage, are slim to none.

But they aren't actually none.

Let's say there's a 5% chance of a superhuman General AI being developed in the next 10 years and ushering in the singularity. Let's say 4/5 of those scenarios would lead to a Bad End which could reasonably be called an apocalypse (or an "AI-pocalypse", perhaps). And let's say the distribution of probability isn't linear, but is exponentially skewed somehow so that given the emergence of AI in the next 120 months, the chances of it happening in this very month are 1 in 100,000. Then let's divide that by 30 so we can get our apocalypse rolling on the right day.

5/100 * 4/5 * 1/100000 * 1/30 = 1 in 75,000,000

Admittedly, low odds. This comports with the fact that a lot of unseen development progress would already have to be underway for an intelligence explosion to be anywhere on the horizon.
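For anyone who wants to fiddle with the factors, the back-of-the-envelope product above can be sketched in a few lines. Every number here is the post's own playful assumption, not an established figure:

```python
# A sketch of the tongue-in-cheek Fermi estimate above.
# Each factor is an assumption from the post, not a measured quantity.
p_agi_10yr = 5 / 100        # superhuman General AI within 10 years
p_bad_end = 4 / 5           # fraction of those scenarios that are apocalyptic
p_this_month = 1 / 100_000  # skewed odds of it happening in this very month
p_right_day = 1 / 30        # landing on December 21 specifically

p_apocalypse = p_agi_10yr * p_bad_end * p_this_month * p_right_day
print(f"1 in {round(1 / p_apocalypse):,}")  # → 1 in 75,000,000
```

Swapping in your own priors for any factor updates the headline odds directly.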

But compared to some of the scenarios debunked by NASA (http://www.space.com/18678-2012-mayan-apocalypse-fears-nasa.html), such as a collision with the rogue planet Nibiru, or Earth being sucked into the supermassive black hole at the center of the Milky Way 30,000 light years away, the AI-doomsday scenario starts to seem relatively plausible.

I think the only other (relatively) plausible contenders would be the release of a pandemic-causing biological weapon, or the start of an international nuclear war. I haven't done any Fermi calculations on those, but I'm sure their probability exceeds that of solar flares scourging the surface of the Earth.

Comment author: DeevGrape 12 July 2012 07:37:51AM 0 points

Hi! I'm in town from Tucson for the week, and I'm super interested in meeting some new LessWrongers. Unfortunately, I'm busy right at 7; if I drop in later, at 8:30 or 9:00, will people still be there?

Comment author: DeevGrape 09 July 2012 05:22:54PM *  0 points

Note, this one will go from 7 until 9 before I have to leave.

Also note, if you're trying to find us, I'll have an evolution textbook (red cover like this: http://www.amazon.com/Evolution-Douglas-J-Futuyma/dp/0878931872) and a stripey fedora.

Meetup : Tucson: Fundamental Questions

1 DeevGrape 30 June 2012 12:51AM

Discussion article for the meetup : Tucson: Fundamental Questions

WHEN: 09 July 2012 07:00:00PM (-0700)

WHERE: 2443 North Campbell Avenue

After the moderate success of our last meetup (during which much fun and conversation was had!), there is now a next meetup. The nominal topic we'll be discussing is "the fundamental question of rationality": "What do you believe and why do you believe it?" And maybe we'll also talk about "What are you doing and why are you doing it?" Of course, discussion will veer. Hope to see some new faces!


Comment author: DeevGrape 26 June 2012 03:03:58AM 0 points

I'm 550 pages into a genetics textbook that I'm reading on my own, which is something I never would have imagined doing two months ago. I've been experiencing success with TDT for habit formation (implementing flossing, fluoride rinsing, sunscreen, running, and taking pills) and using the pomodoro method to chunk my time. I also achieved my goal of creating a piece of fanart for HPMoR (can be watched/read on YouTube). Tomorrow or Wednesday I'm going to receive my piracetam + CDP-choline + pyritinol (the nootropics I found recommended as a solid beginner stack). Tomorrow I'm meeting with a researcher to discuss a volunteer position in her genetics lab.

My success with all this has been crucially dependent on finding and recognizing cached beliefs and identities, like "I can't get a cameo in HPMoR, my poetry isn't good enough!" or "I'm a linguist!" or "Drugs are bad!" It's intimidating how many of these I've found and how many more I probably still have to get to.

Comment author: Will_Newsome 14 June 2012 08:03:45AM 1 point

Are you guys super awesome? Any super skilled programmers or summat?

I ask 'cuz I'm considering moving back to Tucson for a semester or so. Conditional on that, might attend UofA. I'm chiefly worried about losing my Player Character status. But Berkeley doesn't seem to be all that happenin' lately, and most of the people I love live in Tucson. Subplots galore.

Comment author: DeevGrape 14 June 2012 05:42:56PM 0 points

I think "super awesome" is a pretty high bar, honestly. Also, not sure who you're asking about, but personally I'm more epistemologically skilled than all get out (though I'm no programmer).

I have no idea whether Berkeley or Tucson is better for being a PC.

Comment author: Jayson_Virissimo 05 June 2012 08:26:31AM *  1 point

Would it make it significantly easier for you two to attend if we slightly altered the start time?

Comment author: DeevGrape 06 June 2012 02:26:42PM 0 points

I'm pretty sure this time works fine. I'll let you know if that changes.
