Comment author: Morendil 22 June 2010 08:01:09AM 0 points [-]

What then do you make of Jayne's observation in the Comments: "Our present model of the robot is quite literally real, because today it is almost universally true that any nontrivial probability evaluation is performed by a computer"?

Comment author: Christian_Szegedy 22 June 2010 10:31:22PM *  0 points [-]

In my reading it means that there are already actual implementations of all the probabilistic inference operations that the author considers in the book.

This was probably already true in the '60s. It does not mean that the robot as a whole is feasible in terms of resources.

An analogy: It is not hard to implement all (non-probabilistic) logical derivation rules. It is also straightforward to use them to generate all true mathematical theorems (e.g. within ZFC). However, this does not imply that we have a practical (i.e. efficient) general-purpose mathematical theorem prover. It gives an algorithm that proves every provable theorem eventually, but its run-time consumption makes this approach practically useless.
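As a toy illustration of that analogy (my own sketch, not from the book): one can mechanically enumerate all propositional formulas over a couple of variables and check each candidate by a brute-force truth table. The procedure finds every tautology eventually, but the number of candidates explodes with nesting depth.

```python
import itertools

VARS = ["p", "q"]

def formulas(depth):
    """All formulas over VARS built from not/and/or, up to a given nesting depth."""
    if depth == 0:
        return list(VARS)
    smaller = formulas(depth - 1)
    out = list(smaller)
    out += [f"(not {a})" for a in smaller]
    for a, b in itertools.product(smaller, repeat=2):
        out += [f"({a} or {b})", f"({a} and {b})"]
    return out

def is_tautology(formula):
    """Brute-force truth-table check over all assignments to VARS."""
    return all(eval(formula, dict(zip(VARS, vals)))
               for vals in itertools.product([True, False], repeat=len(VARS)))
```

Already at depth 2 there are hundreds of candidate formulas, and the count roughly squares with each extra level of nesting: the enumeration is complete but hopeless as a practical prover.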

Comment author: Jonathan_Graehl 18 June 2010 09:30:38PM 0 points [-]

I think you misunderstood. The robot has a real number p(v) for every v. Let's grant an absolute min and max of 0 and 1. My point was simply that when p(v)=0 or p(v)=1, v can be simplified out of propositions using it.

I understand why computing the probability of a proposition implies answering whether it's satisfiable.

Comment author: Christian_Szegedy 18 June 2010 11:40:16PM 1 point [-]

Sorry for the confusion. I was very superficial. Of course, you are correct about being able to simplify out those values.

Comment author: Christian_Szegedy 18 June 2010 07:58:34AM 3 points [-]
Comment author: Jonathan_Graehl 18 June 2010 01:35:42AM 0 points [-]

amounts to solving SAT problems

I assume you mean in the sense that deciding satisfiability of arbitrary propositions (over uncertain variables; certainly true/false ones can be simplified out) is NP-complete. Of course I mean that a variable v is uncertain if 0<p(v)<1.

Comment author: Christian_Szegedy 18 June 2010 06:45:10AM *  0 points [-]

Actually, solving SAT problems is just the simplest case. Even if you have only certain variables (with either 0 or 1 plausibility), it's still NP-complete; you can't just simplify them out in polynomial time. [EDIT: This is wrong, as Jonathan pointed out.]

In the extreme case, since we also have the rule that the "robot" has to use all the available information to the fullest extent, the "robot" must be insanely powerful. For example, if the calculation of some plausibility value depends on the correctness of an algorithm (known to the "robot" with a very high probability), then it will have to be able to solve the halting problem in general.

Even if you constrain your probability values to never be certain or impossible, you can always choose small (or large) enough values so that the computation of the probabilities can be used to solve the discrete version of the problem.

For example, in the simplest case: if you just have a set of propositions (let us say in conjunctive normal form), the consistency desideratum implies the ability of the "robot" to solve SAT problems, even if the starting plausibility values for the literals fall into the open interval (0,1).
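A minimal sketch of that reduction (my own illustration; the variable names and numbers are made up): compute the exact probability that a CNF formula is true when every variable is independently true with some probability strictly between 0 and 1. Since every assignment then carries positive weight, the result is nonzero exactly when the formula is satisfiable, so an exact probability oracle answers SAT.

```python
from itertools import product

def prob_cnf_true(clauses, p):
    """Exact probability that a CNF formula holds, with each variable v
    independently true with probability p[v] in (0, 1).
    clauses: list of clauses; a clause is a list of (variable, polarity) pairs."""
    variables = sorted({v for clause in clauses for v, _ in clause})
    total = 0.0
    for bits in product([True, False], repeat=len(variables)):
        assignment = dict(zip(variables, bits))
        # A clause is satisfied when at least one literal matches the assignment.
        if all(any(assignment[v] == polarity for v, polarity in clause)
               for clause in clauses):
            weight = 1.0
            for v in variables:
                weight *= p[v] if assignment[v] else 1.0 - p[v]
            total += weight
    return total
```

The brute force is of course exponential in the number of variables, which is exactly the point: consistency ties the robot's arithmetic to an NP-hard decision problem.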

Comment author: cousin_it 17 June 2010 05:07:25PM *  3 points [-]

Ask and ye shall receive: David MacKay, Sustainable energy without the hot air. A free online book that reads like porn for LessWrong regulars.

Comment author: Christian_Szegedy 17 June 2010 06:42:09PM *  0 points [-]

Yes, I read that (pretty good) book quite a while ago, and it is also referenced in the TED talk I mentioned.

This was one of the reasons I was surprised that there is still such a huge disagreement about the figures even among experts.

Comment author: xamdam 17 June 2010 04:55:54PM *  2 points [-]

I agree that Jaynes is using the robot as a literary device to get a point across.

If I understood you correctly, it seems you're sneaking in an additional claim: that a Bayesian AI is theoretically impossible due to computational concerns. That should be discussed separately, but the obvious counterargument is that while, say, complete inference in Bayes nets has been proved intractable, approximate inference does well on good-sized problems, and approximate does not mean it's not Bayesian.
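To make the approximate-inference point concrete, here is a sketch (the network and its numbers are the standard textbook "sprinkler" toy, not anything from this thread): plain rejection sampling on a four-node Bayes net already lands close to the exact posterior with a modest number of samples.

```python
import random
from itertools import product

# Conditional probability tables for a toy Cloudy/Sprinkler/Rain/WetGrass net.
P_C = 0.5
P_S = {True: 0.1, False: 0.5}                      # P(Sprinkler | Cloudy)
P_R = {True: 0.8, False: 0.2}                      # P(Rain | Cloudy)
P_W = {(True, True): 0.99, (True, False): 0.90,    # P(WetGrass | Sprinkler, Rain)
       (False, True): 0.90, (False, False): 0.0}

def joint(c, s, r, w):
    pr = P_C if c else 1.0 - P_C
    pr *= P_S[c] if s else 1.0 - P_S[c]
    pr *= P_R[c] if r else 1.0 - P_R[c]
    pr *= P_W[s, r] if w else 1.0 - P_W[s, r]
    return pr

def exact_rain_given_wet():
    """P(Rain | WetGrass) by full enumeration -- exponential in general."""
    num = sum(joint(c, s, True, True) for c, s in product([True, False], repeat=2))
    den = sum(joint(c, s, r, True) for c, s, r in product([True, False], repeat=3))
    return num / den

def sampled_rain_given_wet(n=200_000, seed=0):
    """The same posterior by rejection sampling -- approximate but cheap."""
    rng = random.Random(seed)
    rain_and_wet = wet = 0
    for _ in range(n):
        c = rng.random() < P_C
        s = rng.random() < P_S[c]
        r = rng.random() < P_R[c]
        w = rng.random() < P_W[s, r]
        if w:                        # keep only samples consistent with the evidence
            wet += 1
            rain_and_wet += r
    return rain_and_wet / wet
```

The enumeration blows up exponentially with the number of nodes; the sampler's cost is just (samples × nodes), at the price of a controllable approximation error.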

Comment author: Christian_Szegedy 17 June 2010 06:36:17PM *  2 points [-]

Sorry, I never tried to imply that an AI built on the Bayesian principles is impossible or even a bad idea. (Probably, using Bayesian inference is a fundamentally good idea.)

I just tried to point out that easy looking principles don't necessarily translate to practical implementations in a straightforward manner.

Comment author: cata 16 June 2010 10:14:13PM *  2 points [-]

I think I was unclear. Here's what I mean:

Suppose our robot takes these two propositions:

A = "It's going to rain tonight in Michigan." B = "England will win the World Cup."

And suppose it thinks that the plausibility of A is 40, and the plausibility of B is 25.

As far as our robot knows, these propositions are not related. That is, in Jaynes' notation (I'll use a bang for "not,") (A|B) = (A|!B) = 40, and (B|A) = (B|!A) = 25. Is that correct?

Now suppose that the plausibility of A jumps to 80, because it's looking very cloudy this afternoon. I suggest that the plausibility of B should remain unchanged. I'm not sure whether the current set of rules is sufficient to ensure that, although I suspect it is. I think it might be impossible to come up with a consistent system breaking this rule that still obeys the (3c) "consistency over equivalent problems" rule.
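One way to check this numerically (my framing: I treat the plausibilities as probabilities 0.40, 0.25, 0.80 and model the "jump" in P(A) with Jeffrey conditioning, which the chapter does not itself introduce): when (B|A) = (B|!A), shifting the probability of A cannot move the probability of B.

```python
def jeffrey_update_b(p_b_given_a, p_b_given_not_a, new_p_a):
    """P'(B) after P(A) is revised to new_p_a, holding P(B|A) and P(B|!A) fixed."""
    return p_b_given_a * new_p_a + p_b_given_not_a * (1.0 - new_p_a)

# The numbers above: P(B|A) = P(B|!A) = 0.25, and P(A) jumps from 0.40 to 0.80.
# Because the two conditionals are equal, the update is a no-op for B.
```

If the two conditionals differed even slightly, the same formula would drag P(B) along with P(A), which is exactly what the independence assumption rules out.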

Comment author: Christian_Szegedy 16 June 2010 10:44:00PM *  1 point [-]

I think it is impossible to decide this based on Chapter 1 alone, since the second criterion (qualitative correspondence with common sense) is not yet formally specified.

If you look into Chapter 2, at the derivation of the product rule, he uses this rubber assumption to get the results he aims for (very similarly to you).

I think one should not take some statements of the author (like "... our search for desiderata is at an end ...") too seriously.

In some sense this informal approach is defensible; from another perspective it definitely looks quite pretentious.

Comment author: JoshuaZ 16 June 2010 08:06:47PM 2 points [-]

Imagine what people must have thought in 1910 about the feasibility of getting to the Moon or generating energy by artificially splitting atoms (especially within the 20th century).

Two problems with that sort of comparison: First, something like going to the Moon is a goal, not a technology. Thus, if we have other sources of power, the incentive to work out the details of fusion becomes small. Second, one shouldn't forget how many technologies have been tried and have fallen by the wayside as not very practical, or not practical at all. A good way of getting a handle on this is to read old issues of something like Scientific American from the 1950s and 1960s. Or read sci-fi from that time period. One example of a historical technology that never showed up on any substantial scale is the nuclear-powered airplane, despite a lot of research on it in the 1950s. Similarly, nuclear thermal rockets have not been made. This isn't because they are impossible, but because they are extremely impractical compared to other technologies. It seems likely that fusion power will fall into the same category. See this article about Project Pluto for example.

Comment author: Christian_Szegedy 16 June 2010 09:09:04PM *  1 point [-]

These are perfectly valid arguments, and I share your skepticism concerning the economic competitiveness of fusion technology. I admit that if I had to decide about buying some security whose payout depended on the amount of energy produced by fusion power within 30 years, I would not hurry to place any bet.

What I lack is your apparent confidence in ruling out the technology based on the technological difficulties we face at this point in time.

I am always surprised at how much the opinions of so-called experts diverge when it comes to estimating the feasibility and cost of different energy production options (even excluding fusion power). For example, there is a recent TED video where people discuss the pros and cons of nuclear power. The whole discussion boils down to the question: what resources do we need in order to produce X amount of energy using

  • nuclear
  • wind
  • solar
  • biofuel
  • geothermal

power. For me, the disturbing thing was that the statements about the resource usage (e.g. area consumption, but also risks) of the different technologies were sometimes off by orders of magnitude.

If we lack the information to produce numbers in the same ballpark even for technologies that we have been using for decades (if not longer), then how much confidence can we have about the viability, costs, risks and competitiveness of a technology, like fusion, that we have not even started to tap?

Comment author: JoshuaZ 15 June 2010 04:12:58AM 5 points [-]

I'm thinking of writing a top-post on the difficulties of estimating P(B) in real-world applications of Bayes' Theorem. Would people be interested in such a post?
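For what it's worth, the usual sticking point fits in two lines (illustrative numbers, mine): the denominator P(B) is typically expanded via the law of total probability, so any error in estimating its inputs propagates directly into the posterior.

```python
def posterior(p_b_given_a, p_a, p_b_given_not_a):
    """P(A|B) via Bayes' theorem, with P(B) expanded by total probability:
    P(B) = P(B|A) P(A) + P(B|!A) P(!A)."""
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1.0 - p_a)
    return p_b_given_a * p_a / p_b
```

With P(B|A) = 0.9 and P(A) = 0.01, halving the estimate of P(B|!A) from 0.10 to 0.05 nearly doubles the posterior: the answer is only as good as the hardest-to-measure input.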

Comment author: Christian_Szegedy 16 June 2010 07:40:02PM *  3 points [-]

Funny, I've been entertaining the same idea for a few weeks.

Every time I read statements like "... and then I update the probabilities, based on this evidence ...", I think to myself: "I wish I had the time (or processing power) he thinks he has. ;)"

Comment author: JoshuaZ 16 June 2010 05:24:31PM 1 point [-]

It seems very doubtful that we'll have practical fusion power any time soon, or necessarily ever. The technical hurdles are immense. Note that any form of fusion plant will almost certainly use deuterium-tritium fusion. That means you need tritium sources. It also means that the internal structure will undergo constant low-level neutron bombardment, which seriously reduces the lifespan of basic parts such as the electromagnets used. If we look at the form of proposed fusion that has had the most work and has the best chance of success, tokamaks, then we get to a number of other serious problems, such as plasma leaks. Other forms of magnetic containment have also not solved the plasma leak problem. Forms of reactors that don't use magnetic containment suffer from other similarly serious problems. For example, the runner-up to magnetic containment is laser confinement, but no one has a good way to actually get energy out of laser confinement.

That said, I think that there are enough other potential sources of energy (nuclear fission, solar (and space based solar especially), wind, and tidal to name a few) that this won't be an issue.

Comment author: Christian_Szegedy 16 June 2010 07:23:07PM *  1 point [-]

Imagine what people must have thought in 1910 about the feasibility of getting to the Moon or generating energy by artificially splitting atoms (especially within the 20th century).
