Comment author: komponisto2 05 October 2008 08:08:25PM 2 points [-]

Also do we really want to assign a prior probability of 0 that the mathematician is a liar! :)

That's not the point I was making.

I'm not attacking unrealistic idealization. I'm willing to stipulate that the mathematician tells the truth. What I'm questioning is the "naturalness" of Eliezer's interpretation. The interpretation that I find "common-sensical" would be the following:

Let A = both boys, B = at least one boy. The prior P(B) is 3/4, while P(A) = 1/4. The mathematician's statement instructs us to find P(A|B), which by Bayes is P(A and B)/P(B) = (1/4)/(3/4) = 1/3, since A implies B.

Under Eliezer's interpretation, however, the question is to find P(A|C), where C = *the mathematician says* at least one boy (*as opposed to saying* at least one girl).

So if anyone is attacking the premises of the question, it is Eliezer, by introducing the quantity P(C) (which strikes me as contrived) and assigning it a value less than 1.
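The gap between the two interpretations can be checked numerically. Below is a minimal Monte Carlo sketch (mine, not from the thread; the function name and counts are illustrative) that conditions once on B, "at least one boy" being *true*, and once on C, the mathematician *saying* "at least one boy", where a one-boy-one-girl parent mentions either sex with probability 1/2:

```python
import random

def simulate(trials=200_000, seed=0):
    """Estimate P(both boys | B) and P(both boys | C) by simulation."""
    rng = random.Random(seed)
    b_count = a_and_b = 0  # conditioning on B: "at least one boy" is true
    c_count = a_and_c = 0  # conditioning on C: the mathematician says it
    for _ in range(trials):
        kids = [rng.choice("BG"), rng.choice("BG")]
        both_boys = (kids == ["B", "B"])
        if "B" in kids:
            b_count += 1
            a_and_b += both_boys
        # Eliezer's reading: the mathematician always makes a true
        # "at least one ..." statement; with one of each, she picks
        # which sex to mention uniformly at random.
        if both_boys:
            says_boy = True
        elif "B" in kids:
            says_boy = rng.random() < 0.5
        else:
            says_boy = False
        if says_boy:
            c_count += 1
            a_and_c += both_boys
    return a_and_b / b_count, a_and_c / c_count

p_given_b, p_given_c = simulate()
print(f"P(A|B) is approximately {p_given_b:.3f}")  # near 1/3
print(f"P(A|C) is approximately {p_given_c:.3f}")  # near 1/2
```

The two estimates land near 1/3 and 1/2 respectively, which is exactly the disagreement at issue: the answer depends on whether you condition on the event or on the utterance.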

Comment author: komponisto2 05 October 2008 06:18:02PM 0 points [-]

No, wait -- my question stands!

Do we really want to assign a prior of 0 to the mathematician saying "I have two children, one boy and one girl"?

Comment author: komponisto2 05 October 2008 06:11:23PM 0 points [-]

Never mind -- missed the "If" clause. (Sorry!)

Comment author: komponisto2 05 October 2008 06:07:02PM 0 points [-]

If the mathematician has one boy and one girl, then my prior probability for her saying 'at least one of them is a boy' is 1/2 and my prior probability for her saying 'at least one of them is a girl' is 1/2

Why isn't it 3/4 for both? Why are these scenarios mutually exclusive?

In response to The Level Above Mine
Comment author: komponisto2 27 September 2008 07:33:00PM 1 point [-]

Lara, I don't think they value it "for its own sake" as opposed to as a means to an end; rather, they see it as a necessary condition for achieving their ends, and are worried they don't have what it takes. Nothing but an anxiety trip.

And of course, there's also the ego thing -- when people build superiority over others into their self-image. This is counterproductive, of course. When someone else demonstrates that they're "smarter" than you by offering unexpected insight, you don't fatalistically wallow in jealous misery; you listen to the content of what they say, in the hope of becoming as smart as they are.

Eliezer of all people ought to realize this (actually I suspect he does).

FWIW, I've met both Eliezer and John Conway, and have spent approximately the same total amount of time with both of them (on the order of 10 hours). I don't know which of them is smarter. Yet I suspect neither is too far above my own level for me to be able to e.g. benefit from listening to a conversation between them.

Comment author: komponisto2 21 September 2008 12:29:13AM 1 point [-]

At the risk of asking the obvious:

Does the fact that no one has yet succeeded in constructing transhuman AI imply that doing so would necessarily wipe out humanity?

Comment author: komponisto2 04 September 2008 12:05:36AM 0 points [-]

Robin, the underlying point of the soldier quote (and others like it) is that the liberal society we enjoy comes at a (military) cost. Freedom, as the saying goes, isn't free. If we really want freedom of speech and the like, we had better be prepared to enforce it (ironic though that may seem).

Comment author: komponisto2 24 August 2008 01:07:18AM 3 points [-]

Eliezer, while I think that Caledonian (and perhaps also Richard Hollerith) has apparently missed the whole point of a number of your posts (in particular the recent ones on Löb's theorem, etc), I'm not sure why you are so concerned about people being "fooled". These are comments, which happen to be clearly labeled as not being authored by you. Would anyone really assume that a particular commenter, be it Caledonian or anyone else, has necessarily summarized your views accurately?

Furthermore, for every one such "misrepresentative" comment, there are undoubtedly several lurkers suffering from an honest misunderstanding similar to the one being articulated (whether in good faith or not) by the commenter. It may be worthwhile to simply correct these misunderstandings as often as possible, even at the risk of repetition. These are, after all, subtle points, and it may take time (and reinforcement) for people to understand them.

In response to Dumb Deplaning
Comment author: komponisto2 19 August 2008 03:36:44AM 1 point [-]

The flipside of this is the inanity of Southwest Airlines employees with respect to boarding the plane:

As is well known, Southwest doesn't have assigned seats, so the choice of seating is determined by boarding order, with earlier people getting more choices. People want to avoid middle seats, so the natural tendency of later boarders on crowded flights is to keep walking as far as necessary toward the back of the cabin in the hope of finding an empty aisle or window seat. For some inexplicable reason, however, Southwest flight attendants and gate managers actively discourage this, wanting people instead to take the first middle seat they find. The all-too-predictable result is a traffic jam in the aisle and the jetway, as the line continually stops to wait for the leading person to stow their luggage and take their seat.

It should be obvious that, regardless of how crowded the flight is, boarding efficiency is maximized by having each passenger go as far to the back of the cabin as possible, to allow the line to keep moving forward. Is this goal somehow less important than that of teaching people not to vainly expect aisle or window seats?

Comment author: komponisto2 15 August 2008 02:24:12AM 0 points [-]

Clarification: in the first paragraph of the above comment, when I wrote "The whole point of 'morality' is..." what I meant was "The whole point of non-relativist 'morality' is...".