Comment author: Frank_Hirsch 02 June 2008 10:13:26PM 0 points

The Taxi anecdote is ultra-geeky - I like that! ;-)

Also, once again I accidentally commented on Eliezer's last entry - silly me!

Comment author: Frank_Hirsch 02 June 2008 09:35:53PM 0 points

[Unknown wrote:] [...] you should update your opinion [to] a greater probability [...] that the person holds an unreasonable opinion in the matter. But [also to] a greater probability [...] that you are wrong.

In principle, yes. But I see exceptions.

[Unknown wrote:] For example, since Eliezer was surprised to hear of Dennett's opinion, he should assign a greater probability than before to the possibility that human level AI will not be developed with[in] the foreseeable future. Likewise, to take the more extreme case, assuming that he was surprised at Aumann's religion, he should assign a greater probability to the Jewish religion, even if only to a slight degree.

Well, admittedly, the Dennett quote depresses me a bit. If I were in Eliezer's shoes, I'd probably also choose to defend my stance - you can't dedicate your life to something with just half a heart!

About Aumann's religion: That's one of the cases where I refuse to adjust my assigned probability one iota. His belief about religion is the result of his prior alone. So is mine, but it is my considered opinion that my prior is better! =)

Also, if I may digress a bit, I am sceptical about Robin's hypothesis that humans in general update too little from other people's beliefs. My first intuition was that the opposite is the case (because of premature convergence and resistance to paradigm shifts). On second thoughts, I believe the amount is probably just about right. Why? 1) Taking other people's beliefs as evidence is an evolved trait, and so the approximate amount is probably evolved too. 2) Evolution is smarter than I am (and than Robin, I presume).
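The kind of update being debated in this thread can be sketched as a toy Bayes calculation. All the numbers below are invented purely for illustration - the point is only that a respected person's disagreement is evidence, and how much it should move you depends on how likely that disagreement is under each hypothesis:

```python
# Toy sketch (all probabilities invented): how much should hearing a
# respected expert disagree shift your confidence in a belief?

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E),
    where P(E) = P(E|H) P(H) + P(E|~H) P(~H)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# H = "my belief is correct"; E = "an expert I respect disagrees".
prior = 0.95            # confidence before hearing the disagreement
p_e_given_h = 0.10      # experts sometimes disagree even when I'm right
p_e_given_not_h = 0.60  # disagreement is much likelier if I'm wrong

print(posterior(prior, p_e_given_h, p_e_given_not_h))  # 0.76
```

Even a strongly held belief (0.95) drops noticeably (to 0.76) under these made-up likelihoods - unless, as in the Aumann case, you judge the disagreement to be driven entirely by the other person's prior, i.e. you set P(E|H) ≈ P(E|~H), in which case the evidence cancels and no update follows.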

Comment author: Frank_Hirsch 02 June 2008 06:03:17AM 3 points

Unknown: Well, maybe, but so what? It's practically impossible to completely re-evaluate every belief you hold whenever someone asserts that it is wrong. That has nothing at all to do with "overconfidence", and everything to do with sanity. The time to re-evaluate a belief is when someone gives a plausible argument against the belief itself, not just an assertion that it is wrong. For example, whenever someone argues anything and the argument rests on the assumption of a personal god, I dismiss it out of hand without thinking twice - sometimes I don't even take the time to hear them out! Why should I, when I know it's going to be a waste of time? Overconfidence? No, sanity!

Comment author: Frank_Hirsch 01 June 2008 10:38:00PM 0 points

Nick:
I thought the assumption was that the SI is too S to get any ideas about world domination?

Comment author: Frank_Hirsch 01 June 2008 09:48:00PM 0 points

Makes me think:
Wouldn't it be rather recommendable if, instead of heading straight for a (risky) AGI, we worked on (safe) SIs and then had them solve the problem of Friendly AGI?

Comment author: Frank_Hirsch 10 April 2008 12:11:00PM 2 points

botogol:

Eliezer (and Robin) this series is very interesting and all, but.... aren't you writing this on the wrong blog?

I have the impression Eliezer writes blog entries in much the same way I read Wikipedia: Slowly working from A to B in a grandiose excess of detours... =)

In response to Quantum Explanations
Comment author: Frank_Hirsch 09 April 2008 11:11:43PM 1 point

Wow, good teaser for sure! /me is quivering with anticipation ^_^

Comment author: Frank_Hirsch 06 April 2008 05:56:52PM 0 points

Caledonian:

One of the very many problems with today's world is that, instead of confronting the root issues that underlie disagreement, people simply split into groups and sustain themselves on intragroup consensus. [...] That is an extraordinarily bad way to overcome bias.

I disagree. What do we have to gain from bringing everyone in line with our own beliefs? While it is arguably a good thing to exchange our points of view, and how we rationalise them, there will always be issues where the agreed evidence is just not strong enough to refute all but one way of looking at things. I believe that sometimes you really do have to agree to disagree (unless all participants espouse Bayesianism, that is) and move on to more fertile pastures. And even if all participants in a discussion claim to be rationalists, sometimes you'll either have to agree that someone is wrong (without agreeing on who it is, naturally) or waste time you could have spent on more promising endeavours.

Comment author: Frank_Hirsch 06 April 2008 02:24:01PM 0 points

Will Pearson [about tiny robots replacing neurons]: "I find this physically implausible."

Um, well, I can see it would be quite hard. But that doesn't really matter for a thought experiment. To ask "What would it be like to ride on a light beam?" is about as physically implausible as it gets, but it seems to have produced a few rather interesting insights.

Comment author: Frank_Hirsch 06 April 2008 01:42:22AM 4 points

[Warning: Here be sarcasm] No! Please let's spend more time discussing dubious non-disprovable hypotheses! There's only a gazillion more to go, then we'll have convinced everyone!
