Comment author: [deleted] 21 April 2011 04:43:42PM 3 points [-]

Conditional probabilities are allowed to be 100%, because they are probability ratios. In particular, P(A|A) is 100% by definition.
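A minimal sketch of that ratio definition, using a toy die example (the events and helper names here are my own, purely for illustration):

```python
from fractions import Fraction

# Conditional probability as a ratio: P(A|B) = P(A and B) / P(B).
# Toy sample space: one roll of a fair six-sided die.
A = {2, 4, 6}            # "the roll is even"
B = {4, 5, 6}            # "the roll is at least 4"

def P(event):
    """Probability of an event over the six equally likely outcomes."""
    return Fraction(len(event), 6)

def P_given(a, b):
    """P(a|b) as the ratio P(a and b) / P(b)."""
    return P(a & b) / P(b)

print(P_given(A, B))     # P(even | at least 4) = 2/3
print(P_given(A, A))     # P(A|A) = P(A)/P(A) = 1, by definition
```

The second line is the parent comment's point: P(A|A) is 1 for any event A with nonzero probability, because numerator and denominator coincide.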

Comment author: Kai-o-logos 21 April 2011 05:18:26PM -1 points [-]

But P(E|D) is not 100% by any definition. Conditional probabilities are only 100% if D --> E. And if that were true, why would this argument exist?

Comment author: jake987722 21 April 2011 12:56:43AM *  2 points [-]

As an aspiring scientist, I hold the Truth above all.

That will change!

More seriously though...

As one can see, the biggest problem is determining burden of proof. Statistically speaking, this is much like the problem of defining the null hypothesis.

Well, not really. The null and alternative hypotheses in frequentist statistics are defined in terms of their model complexity, not our prior beliefs (that would be Bayesian!). Specifically, the null hypothesis represents the model with fewer free parameters.

You might still face some sort of statistical disagreement with the theist, but it would have to be a disagreement over which hypothesis is more/less parsimonious--which is really a rather different argument than what you've outlined (and IMO, one that the theist would have a hard time defending).

Comment author: Kai-o-logos 21 April 2011 01:03:51AM *  1 point [-]

I'm not saying that frequentist statistical logic actually works the way I described above. What I'm saying is that this is how many people wrongly interpret such statistics, defining their own null hypothesis in the way I outlined in the post.

As I've said before, the MOST common problem is not the actual statistics, but how the ignorant interpret those statistics. I am merely saying that I would prefer Bayesian statistics to be taught, because it is much harder to botch up and read our own interpretation into it. (For one, it is governed by a relatively simple formula.)

Also, isn't model complexity quite hard to determine for the statements "God exists" and "God does not exist"? Isn't complexity in this sense subject to easy bias?

Comment author: JoshuaZ 20 April 2011 10:29:40PM 1 point [-]

Michael:

Underestimating the significance of superintelligence. People have a delusion that humanity is some theoretically optimum plateau of intelligence (due to brainwashing from Judeo-Christian theological ideas, which also permeate so-called “secular humanism”), which is the opposite of the truth. We’re actually among the stupidest possible species smart enough to launch a civilization.

This doesn't seem to be a part of standard Christian or Jewish theology, so blaming that attitude on it seems misguided. His last sentence is also problematic: how does he know that with a sample size of one?

Michael:

Would you rather your AI be based on Hitler or Gandhi?

Seems to understate the case. Mindspace is large. The problem isn't an AI that acts like Hitler. That's not such a bad failure as things go. The worst case scenario more resembles Cthulhu than Hitler.

Anissimov correctly calls out Goertzel on his claim that he's sure he could design a properly functioning nanny AI.

Goertzel does correctly point out that it seems likely that a nanny AI would take less understanding than a full Friendly AI with stable goals under self-modification.

Comment author: Kai-o-logos 21 April 2011 12:15:20AM 4 points [-]

"This doesn't seem to be a part of standard Christian or Jewish theology,"

Actually, even if there is no outright statement in the Bible, it has commonly been accepted through the years that human supremacy is asserted in Genesis 1:26: "Then God said, 'Let us make mankind in our image.'" Man is created in God's image, making him superior to all. Also in Gen. 3:22: "The man has now become like one of us, knowing good and evil." Man is like a god; the only difference is that he is not immortal.

Not necessarily my opinion, just what I believe the theology says, and what I have heard from theist friends that the theology says.

Comment author: [deleted] 20 April 2011 11:51:21PM *  7 points [-]

I've tried to explain this when arguing with theists, and it sometimes creates the following unintentional side effect:

Me: /explains Bayesian framework

Theist: Ah ha! So if we use a simple, uniform prior, then we start out with a 50% chance that God exists!

The problem, of course, is that the theist forgot (or doesn't understand) the distinction between "(the Judeo-Christian) God exists" and "at least one deity exists." It's really important to stress that the search space of possible gods is huge, otherwise you will create even more confusion.

Overall, though, I definitely agree with the main point of this post. Upvoted.

Comment author: Kai-o-logos 21 April 2011 12:05:59AM 3 points [-]

Quite honestly, I think a bigger problem is theists assuming that P(E|D) = 100% - that given that one or more deities exist, the world would automatically turn out like this. I would actually argue the opposite: that the number is very low.

Even assuming an omniscient, omnipotent, omnibenevolent God, he could still, I argue, have made the choice to remove our free will "for our own good". Even if P(E|D) is high, it is in no way close to 100%.

Furthermore, you can never assume a 100% probability!!! (http://yudkowsky.net/rational/technical). You could go to rationalist hell for that!

A Problem with Human Intuition about Conventional Statistics:

-1 Kai-o-logos 20 April 2011 11:41PM

 

As an aspiring scientist, I hold the Truth above all. As Hodgell once said, "That which can be destroyed by the truth should be." But what if the thing that is holding our pursuit of the Truth back is our own system? I will share an example of an argument I overheard between a theist and an atheist once - showing an instance where human intuition might fail us.

*General Transcript*

Atheist: Prove to me that God exists!

Theist: He obviously exists – can’t you see that plants growing, humans thinking, [insert laundry list here], is all His work?

Atheist: Those can easily be explained by evolutionary mechanisms!

Theist: Well prove to me that God doesn’t exist!

Atheist: I don't have to! There may be an invisible pink unicorn baby flying around my head; there probably is not. I can't prove that there is no unicorn, but that doesn't mean it exists!

Theist: That's just reductio ad ridiculum - you could do infrared, Polaroid, UV, and vacuum scans, and if nothing appears it is statistically unlikely that the unicorn exists! But God is something metaphysical; you can't do that with Him!

Atheist: Well Nietzsche killed metaphysics when he killed God. God is dead!

Theist: That is just words without argument. Can you actually…..

As one can see, the biggest problem is determining burden of proof.  Statistically speaking, this is much like the problem of defining the null hypothesis.

A theist would define: H0: God exists. Ha: God does not exist.

An atheist would define: H0: God does not exist. Ha: God does exist.

Both conclude that there is no significant evidence favoring Ha over H0. Furthermore, and this is key, they both accept the null hypothesis. The correct statistical conclusion, when the evidence is insufficient to support the alternative hypothesis, is that one fails to reject the null hypothesis. Human intuition fails to grasp this concept, however; we think in black and white, and so we tend to accept the null hypothesis instead.
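To make "fail to reject is not the same as accept" concrete, here is a toy sketch (the coin-flip numbers are invented for illustration): the same weak data fail to reject two different, incompatible null hypotheses, so "accepting" both would be absurd.

```python
from math import comb

def binom_two_sided_p(k, n, p0):
    """Exact two-sided binomial test: p-value of seeing k successes
    in n trials under H0: success probability = p0."""
    pmf = [comb(n, i) * p0**i * (1 - p0)**(n - i) for i in range(n + 1)]
    # Sum the probability of every outcome at least as unlikely as k's.
    return sum(pr for pr in pmf if pr <= pmf[k] + 1e-12)

# 12 heads in 20 flips: weak evidence either way.
p_fair = binom_two_sided_p(12, 20, p0=0.5)
p_bias = binom_two_sided_p(12, 20, p0=0.65)

print(f"p-value under H0: fair coin = {p_fair:.3f}")
print(f"p-value under H0: 65% heads = {p_bias:.3f}")
# Both p-values exceed 0.05, so we fail to reject BOTH nulls -- which is
# exactly why failing to reject a null is not the same as accepting it.
```

Whichever hypothesis each debater labels H0, insignificant evidence leaves it standing; that says more about the labeling than about the world.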

This is not so much a problem with statistics as with human intuition. Statistics usually takes this form because considering 100+ hypotheses simultaneously is taxing on the human brain. Therefore, we think of hypotheses as things to be defended or attacked, not as things to be considered neutrally.

Consider a Bayesian outlook on this problem.

There are two possible outcomes: at least one deity exists (D), or no deities exist (N).

Let us consider the natural evidence (Let’s call this E) before us.

P(D or N) = 1. P(D or N | E) = 1. P(D|E) + P(N|E) = 1. Therefore P(D|E) = 1 - P(N|E).

Although the calculation of the prior probability of a god existing is rather strange, and seems to reek of bias, I still argue that this is a better system of analysis than the classical H0 and Ha, because it directly compares the probabilities of D and N, without the bias inherent in the brain's perception of the classical system.
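A minimal sketch of the update the post is describing. The prior and likelihood numbers below are invented purely for illustration; the point is only the algebra, i.e. that P(D|E) + P(N|E) = 1.

```python
# Invented numbers, for illustration only.
p_D = 0.5                  # prior P(D): at least one deity exists
p_N = 1 - p_D              # prior P(N): no deities exist
p_E_given_D = 0.2          # assumed likelihood of the observed world given D
p_E_given_N = 0.3          # assumed likelihood of the observed world given N

# Total probability of the evidence, then Bayes' theorem for each hypothesis.
p_E = p_E_given_D * p_D + p_E_given_N * p_N
p_D_given_E = p_E_given_D * p_D / p_E
p_N_given_E = p_E_given_N * p_N / p_E

print(f"P(D|E) = {p_D_given_E:.3f}")   # 0.400
print(f"P(N|E) = {p_N_given_E:.3f}")   # 0.600
# D and N are exhaustive and mutually exclusive, so the posteriors sum to 1:
# neither hypothesis is privileged as a "null" to be defended or attacked.
```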

Examples such as these, I believe, show the flaws that result from faulty interpretations of the classical system. If we instead introduced a Bayesian perspective, the faulty interpretation would vanish.

This is a case for the expanded introduction of Bayesian probability theory. Even if it cannot be applied correctly to every problem, and even if it is apparently more complicated than the standard method taught in statistics class (I disagree here), it teaches people to analyze situations from a more objective perspective.

And if we can avoid Truth-seekers going awry due to simple biases such as those mentioned above, won’t we be that much closer to finding Truth?

 

Comment author: Dustin 20 April 2011 11:18:14PM 13 points [-]

Wife, child, family, friends, business.

Comment author: Kai-o-logos 20 April 2011 11:22:54PM 9 points [-]

That is sad - I know a friend in rural Nebraska who is in a similar predicament (college), and he says that if it weren't for LW, he might have just concluded that people were un-awesome.

It is sad that demographics limit potential awesome-seekers. That is another reason why I admire Eliezer so much for making this online community.

Comment author: Kai-o-logos 20 April 2011 11:20:29PM *  4 points [-]

Lukeprog's explanation about the ancestral setting makes sense, but I still believe that the modern capacity for spreading information is significant. Take a modern college setting: Person A asks B out and gets rejected; she gossips to all her friends, it spreads around the college, and the number of possible future dates shrinks.

I am not trying to say that this fear is rational, because the probability that she is that much of a gossip is relatively low; I am merely saying that when huge negative utilities are in play, it should not be taken lightly. When there is even a 0.1% chance of death, (rational) people refuse to attempt the activity. Similarly, if getting shut down can ruin your future chances - and we have been conditioned in a school setting for most of our lives, where it can affect future chances - we develop an instinctive hesitation to make the first step.

Comment author: Kai-o-logos 20 April 2011 11:08:06PM 1 point [-]

Part of the great trouble of being a rationalist is the great, great trouble of finding like-minded people. I am thrilled at the news of such successful meetups taking place - the reason rationalists don't have the impact they should is poor organization.

On the other hand, I really like what Eliezer says about courage. It is one thing to preach and repeat meaningless words about being courageous and facing the Truth; but if we are too afraid to look like fools in society, who says we won't be too afraid to speak the Truth in the scientific community?

Comment author: Kai-o-logos 13 April 2011 03:16:15AM *  -1 points [-]

Dust specks - I completely disagree with Eliezer's argument here. The hole in Yudkowsky's logic, I believe, is not only the curved utility function, but also the fact that discomfort cannot be added like numbers. The dust speck incident is momentary: you barely notice it, you blink, it's gone, and you forget about it for the rest of your life. Torture, on the other hand, leaves lasting emotional damage on the human psyche.

Furthermore, discomfort is different from pain. If, for example, the hypothetical replaced the torture with 10000 people getting a non-painful itch for the rest of their lives, I would agree with Eliezer's theory. But pain, I believe - and this is where my logic might be weak - is different from discomfort, and Eliezer treats pain as just an extreme discomfort.

Another argument concerns the instantaneous utilitarian framework. Let us accept, for the moment, the assumption that pain is merely extreme discomfort. Eliezer's framework is that the total "discomfort" in Scenario 1 (torture) is less than that in Scenario 2 (dust specks), and if you simply add up the discomfort-points, maybe such a conclusion would be reached. But now consider an arbitrary time ti during the 50-year period, more than say 2 minutes from the start. The discomfort in Scenario 1 at that moment is some P(ti) > 0; the discomfort in Scenario 2 is 0. This holds until the end of the 50-year period. Integrating both functions with respect to t: total discomfort in Scenario 1 is INTEG(P(t) dt), while total discomfort in Scenario 2 is 0. Put in terms for a non-mathematician: the pain of the torture is experienced continuously, while the pain of the dust is momentary.

One can argue the 0 x infinity point - that the small number produced by integration is swamped by the huge 3^^^3. However, this is answered by my earlier thesis that pain is different from discomfort. I could invoke the Kantian "categorical imperative" as my third piece of logic, but everyone else has already mentioned it. If there is any error in my judgment, please let me know.
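The "integrate over time" framing above can be sketched numerically. All intensities, durations, and the stand-in for 3^^^3 below are made-up toy numbers (3^^^3 itself is far too large to represent), so this shows the shape of the argument, not real utilities:

```python
SECONDS_PER_YEAR = 365 * 24 * 3600

def total_discomfort(intensity, duration_s):
    """Discomfort integrated over time, assuming constant intensity
    (a constant integrand makes the integral a simple product)."""
    return intensity * duration_s

# Toy values: torture is intense and lasts 50 years; a speck is faint
# and lasts a fraction of a second.
torture = total_discomfort(intensity=1000.0, duration_s=50 * SECONDS_PER_YEAR)
speck = total_discomfort(intensity=0.001, duration_s=0.1)

n_people = 3**40   # tiny stand-in for 3^^^3
print(torture < n_people * speck)   # True: naive addition favors the specks
# The comment's rebuttal: pain and discomfort are different quantities,
# so summing speck-discomfort across people is not a licensed operation.
```

Under pure addition, a large enough multiplier always wins; the disagreement is over whether that addition is meaningful in the first place.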

Comment author: Kai-o-logos 07 April 2011 03:44:59AM *  3 points [-]

On Less Wrong, I found thoroughness. Society today advocates speed over effectiveness - 12-year-old college students over soundly rational adults, people who can Laplace-transform diff-eqs in their heads over people who can solve logical paradoxes. On Less Wrong, I found people who could detach themselves from emotions and appearances and look at things with an iron rationality.

I am sick of people who presume to know more than they do. Those that "seem" smart rather than actually being smart.

People on Less Wrong do not seem to be something they are not. "Seems, madam! nay it is; I know not 'seems.'" (Hamlet)
