All of Phil_Goetz4's Comments + Replies

Of course, if you never assume conditional independence, then the only right network is the fully-connected one.

I want to know if my being killed by Eliezer's AI hinges on how often observables of interest tend to be conditionally dependent.
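To make the fully-connected claim concrete, here is a minimal derivation (standard Bayes-net notation; not from the original comment). By the chain rule, any joint distribution factors as

$$P(X_1,\dots,X_n) \;=\; \prod_{i=1}^{n} P(X_i \mid X_1,\dots,X_{i-1}).$$

A Bayesian network earns its sparsity only by asserting conditional independencies, i.e. by dropping variables from each conditioning set:

$$P(X_i \mid X_1,\dots,X_{i-1}) \;=\; P(X_i \mid \mathrm{Pa}(X_i)), \qquad \mathrm{Pa}(X_i) \subseteq \{X_1,\dots,X_{i-1}\}.$$

If no conditional independence may be assumed, nothing can be dropped, so $\mathrm{Pa}(X_i) = \{X_1,\dots,X_{i-1}\}$ for every $i$: the complete DAG, which is the fully-connected network.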

In other words, you are advocating a combative, Western approach; I am bringing up a more Eastern approach, which is not to be so attached to anything in the first place, but to bend if the wind blows hard enough.
The trouble is that you cannot break new ground this way. You can't do Einstein-like feats. You should follow the direction of the wind, but engage the nitrous to follow that direction, occasionally stopping and sticking a finger out the window to make sure you are going in the right direction.
I think Einstein is a good example of both bending with ... (read more)

If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it.
It is easy to construct at least these two kinds of cases where this is false:

  • You have a set of beliefs optimized for co-occurrence, and you are replacing one of these beliefs with a more-true belief. In other words, the new true belief will cause you harm because of other untrue (or less true) beliefs that you still hold.
  • If an entire community can be persuaded to adopt a false belief, it may enable them to overcome a tragedy-of-the-commons
... (read more)
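To flesh out the second case, here is a toy sketch (all numbers invented for illustration) in which overgrazing a commons is individually rational, yet a shared false belief leaves the whole group better off:

```python
# A toy commons game: invented payoffs, purely illustrative.
N = 8  # herders sharing one pasture

def payoff_per_cow(total_cows, capacity=20):
    """Each cow is worth less as the pasture gets more crowded."""
    return max(capacity - total_cows, 0)

def my_payoff(my_cows, others_cows):
    return my_cows * payoff_per_cow(my_cows + others_cows)

# Whatever the others do, grazing 2 cows beats grazing 1 for me...
for k in range(N - 1, 2 * (N - 1) + 1):  # others graze 7..14 cows in total
    assert my_payoff(2, k) > my_payoff(1, k)

# ...so self-interested herders all graze 2, and the commons degrades:
print("All graze 2 ->", N * my_payoff(2, 2 * (N - 1)))  # total welfare 64
# A shared false belief ("a second cow angers the gods") holds everyone to 1:
print("All graze 1 ->", N * my_payoff(1, N - 1))        # total welfare 96
```

The false belief does the work a binding contract would: it moves the group off the dominant-strategy equilibrium to the higher-welfare outcome.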

Eliezer:

If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it.
I think you should try applying your own advice to this belief of yours. It is usually true, but it is certainly not always true, and reeks of irrational bias.

My experience with my crisis of faith seems quite opposite to your conceptions. I was raised in a fundamentalist family, and I had to "make an extraordinary effort" to keep believing in Christianity from the time I was 4 and started reading through the Bible, and findin... (read more)

0Kenny
The posts on making an extraordinary effort didn't explicitly exclude preserving one's existing beliefs as an effort worth making extraordinarily, so you've definitely identified a seeming loophole; yet you've simultaneously seemed to ignore all of the other posts about epistemic rationality.

From "Twelve virtues of rationality" by Eliezer:

The third virtue is lightness. Let the winds of evidence blow you about as though you are a leaf, with no direction of your own. Beware lest you fight a rearguard retreat against the evidence, grudgingly conceding each foot of ground only when forced, feeling cheated. Surrender to the truth as quickly as you can. Do this the instant you realize what you are resisting; the instant you can see from which quarter the winds of evidence are blowing against you.

Eliezer uses almost the same words as yo... (read more)

7[anonymous]
Agreed. Every time I changed my mind about something, it felt like "quitting," like ceasing the struggle to come up with evidence for something I wanted to be true but wasn't. Realizing "It's so much easier to give up and follow the preponderance of the evidence." Examples: taking an economics class made it hard to believe that government interventions are mostly harmless. Learning about archaeology and textual analysis made it hard to believe in the infallibility of the Bible. Hearing cognitive science/philosophy arguments made it hard to believe in Cartesian dualism. Reading more papers made it hard to believe that looking at the spectrum of the Laplacian is a magic bullet for image processing. Extensive conversations with a friend made it hard to believe that I was helping him by advising him against pursuing his risky dreams. When something's getting hard to believe, consider giving up the belief. Just let the weight fall. Be lazy. If you're working hard to justify an idea, you're probably working too hard.

Carl: None of those would (given our better understanding) be as bad as the great plagues that humanity has lived through before.

A forum makes more sense for a blog like this, which is not timely, but timeless.

Consider the space of minds built using Boolean symbolic logic. This is a very large space, and it is the space that was at one time chosen by all the leading experts in AI as the most promising space in which to find AI minds. And yet I believe there are *no* minds in that space. If I'm right, this means that the space of possible minds, as imagined by us, is very sparsely populated by actual possible minds.

I agree with Mike Vassar that Eliezer is using the word "mind" too broadly, to mean something like "computable function" rather than a control program for an agent to accomplish goals in the real world.

The real world places a lot of restrictions on possible minds.

If you posit that this mind is autonomous, and not being looked after by some other mind, that places more restrictions on it.

If you posit that there is a society of such minds, evolving over time; or a number of such minds, competing for resources; that places more restricti... (read more)

Phil, I don't see how the argument is obviously incorrect. Why can't two works of literature from different cultures be as different from each other as Hamlet is from a restaurant menu?

They could be, but usually aren't. "World literature" is a valid category.

The larger point, that the space of possible minds is very large, is correct.

The argument involving ATP synthase is invalid. ATP synthase is a building block: life on Earth is all built using roughly the same set of Legos, but Legos are very versatile.

Here is an analogous argument that is obviously incorrect:

People ask me, "What is world literature like? What desires and ambitions, and comedies and tragedies, do people write about in other languages?"

And lo, I say unto them, "You have asked me a trick question."

"the" ... (read more)