In response to Whither OB?
Comment author: Michael_G.R. 17 November 2008 09:28:36PM 0 points

"Why is daily posting a shibboleth ?

I would still read the site if EY posted once a week"

I second that. Even if OB were updated only 1-3 times a week with posts of the current quality, it would still be one of my favorite sites. In fact, I'm having a hard time keeping up with the current volume of content, and I often need to set time aside to clear my OB backlog.

A better software platform would be good, but I doubt that user-generated content could ever become central to the site. Maybe as a sidebar, with a few rare posts getting promoted to the front page.

"I'm not finished, but I'm way over schedule and need to move on soon."

Is the next thing on your schedule writing the books you've talked about in the past? Are you still planning to do the 'popular' book?

"Our most popular post ever, still getting hits to this day, was not written by Robin or myself or any of the recurring editors. It's "My Favorite Liar" by Kai Chang, about the professor who inserted one false statement into each lecture."

Back when it was first published, I submitted it to reddit and it got 1050 votes (which is a lot for that site). Glad it's still getting traffic!

Comment author: Michael_G.R. 10 November 2008 06:33:42AM 0 points

"Eliezer must be disappointed. He makes a thread to help a recovering irrationalist deal with the social consequences of deconversion, and half the posts he gets are from religious believers and believers in religious belief."

I think this post has been linked from some social media sites, which would explain the influx of non-regular OB readers.

Comment author: Michael_G.R. 09 November 2008 10:26:49PM 0 points

Maybe this is fodder for another post:

A few people here said: "If that person was really special, there would be no problem with you telling him."

But are things really that simple? Not so long ago, Jo would probably have reacted badly if her special person had told her that he no longer believed in what she believed. Loving someone and getting along well with them doesn't mean you will accept everything they do without issue, and vice versa.

Think about the people you find "special" in your life, and imagine telling them that your beliefs have changed about something very important that you both believe in (you used to be libertarian but now hold strong authoritarian views; you used to be vegan but now eat ribs every night; you were strongly partisan for one political party but switched to another; etc.). How would they react? Does that make them not "special" anymore?

Comment author: Michael_G.R. 06 November 2008 05:18:06AM 0 points

"Jack, I've spoken on many occasions previously but I was never in Toastmasters."

If you're planning to speak for money, Toastmasters might be a good investment. I would recommend at least checking it out to see what you could get out of it. Since you're not a beginner, you should look into 'advanced' clubs.

With public speaking, there's nothing like experience. TM lets you practice in a friendly environment where you can try new approaches (it doesn't matter if they fail) and benefit from the knowledge of a group of people who have been doing this for a while and should be able to give you more useful feedback than most other groups.

You can also use the club to practice for media appearances (TV interviews, radio, etc.).

Comment author: Michael_G.R. 04 November 2008 04:14:00PM 0 points

"A very impressive interview - I have gained much respect for Eliezer's patience."

In a way, I think the most important part of this interview is what didn't happen. Eliezer does indeed seem to possess superhuman patience.

Comment author: Michael_G.R. 03 November 2008 06:10:24PM 2 points

"Jaron's laughter seems largely the laughter of frustrated politesse. This comes out in his speech when he repeats to EY "I've been having this discussion for decades.""

I think that's BS. If Jaron didn't want to discuss AI, why did he agree to a Bloggingheads.tv episode with Eliezer, a research fellow at the Singularity Institute for Artificial Intelligence?

Eliezer tried to understand what Jaron was saying and asked questions to get him to explain his positions more clearly. Jaron pretty much never tried to make himself clear (probably because there wasn't much to explain in the first place), and he never really explained what he disliked about Eliezer's position.

How long he's been having this conversation ("for decades" or whatever) only means that he's been having it for a long time, not that he has convincing arguments or that there's any value to what he says.

Comment author: Michael_G.R. 02 November 2008 09:19:26PM 0 points

I'm 20 minutes in and wish Lanier's connection would just cut off and Eliezer would talk by himself.

"Eliezer occasionally looked like he was having trouble following Lanier's reasoning. I certainly did. My guess is that this is because, on those occasions, Lanier didn't have a reasoning."

That's my feeling too. He seemed to love calling anyone who disagreed with him an "idiot" or a "religious nut" without ever really explaining why.

I'm going to keep watching because I expect Eliezer to say some interesting stuff.

Comment author: Michael_G.R. 09 October 2008 01:49:56AM 4 points

Here's my theory on *this particular* AI-Box experiment:

First you explain to the gatekeeper the potential dangers of AI: general stuff about how large mind design space is, and how easy it is to screw up and destroy the world with AI.

Then you try to convince him that the solution to that problem is building an AI very carefully, and that a theory of Friendly AI is essential to increasing our chances of a future we would find "nice" (and the stakes are so high that even increasing those chances a tiny bit is very valuable).

THEN

You explain to the gatekeeper that since this AI-Box experiment is public, it will be looked back on by all kinds of people involved in making AIs. If he lets the AI out of the box (without anyone knowing why), it will send those people a very strong message that Friendly AI theory must be taken seriously, because this very scenario could happen to them (not being able to keep the AI in a box) with an AI that hasn't been proven to stay Friendly and that is more intelligent than Eliezer.

So that's my theory. But then, I've only just thought of it; maybe if I made a desperate or extraordinary effort I'd come up with something more clever :)

Comment author: Michael_G.R. 07 October 2008 08:02:01PM 1 point

Small Typo Alert: The second quote should be attributed to "Mastering Eishin-Ryu Swordsmanship"

"Ryu", not "Ruy".

In response to Ban the Bear
Comment author: Michael_G.R. 20 September 2008 02:39:58PM 0 points

"The capitalists are trying to save capitalism from the capitalists!"

Actually, if we didn't have fiat money that can be printed at will, and if interest rates were set by market forces rather than by a bunch of white men picked by the government, we wouldn't be in this mess.
