moshez

As far as complexity-of-logic-theories-as-a-reason-for-believing-in-them goes, it should be proportional to the size of the minimal Turing machine that checks whether something is an axiom or not. (Of course, in the case of a finite list, approximating it by the total length of the axioms is reasonable, because the Turing machine that does "check if the input is equal to one of the following:" followed by the list adds only a constant overhead to the length of the list -- but that approximation breaks down badly for infinite axiom schemas.)
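
To make that contrast concrete, here is a minimal toy sketch (the encodings and names are my own illustrative assumptions, not anything from the original discussion): a checker for a finite axiom list has to contain the list itself, while a checker for an infinite schema can stay constant-size no matter how many instances the schema has.

```python
# Hypothetical toy illustration; the formula encodings are invented here.

# Finite theory: the checker literally contains the axiom list, so the
# checker's size grows with the total length of the axioms, plus a
# constant for the "is the input in this set?" test.
FINITE_AXIOMS = {"0 = 0", "forall x. x + 0 = x"}

def is_axiom_finite(s: str) -> bool:
    return s in FINITE_AXIOMS

# Infinite schema: encode each induction instance as "IND(<formula>)".
# This checker stays constant-size even though the schema has infinitely
# many instances -- which is why "total length of the axioms" is a bad
# complexity measure here.
def is_axiom_schema(s: str) -> bool:
    return s.startswith("IND(") and s.endswith(")") and len(s) > len("IND()")

print(is_axiom_finite("0 = 0"))           # True
print(is_axiom_schema("IND(x + 0 = x)"))  # True
print(is_axiom_schema("IND()"))           # False (empty formula)
```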

moshez

For eating at people's houses: usually people will have enough side dishes that, if one does not make a big deal of it, one can fill up on the non-meat ones. At worst, there's always bread.

For going to a steakhouse -- yes, but at every other place, there's usually a vegetarian option, if one tries hard enough.

It does make a good case for being an unannoying vegetarian... but being a strict vegetarian is a useful Schelling point.

moshez

Of course e can be evidence even if P(X|e)=P(X) -- it just cannot be evidence for X. It can be evidence for Y if P(Y|e)>P(Y), and this is exactly the case you describe. If Y is "there is a monument and left is red, or there is no monument and left is black", then e is (infinitely strong, if Omega is truthful with probability 1) evidence for Y, even though it is zero evidence for X.

Similarly, your seeing that your shoelace is untied is zero evidence about my shoelaces...
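
A quick numerical sketch of the point above (the priors and the exact form of Y are my own illustrative assumptions, not from the original thread): enumerate a joint distribution in which the monument and the box color are independent fair coin flips, and check that conditioning on e leaves P(X) untouched while pushing P(Y) to 1.

```python
from itertools import product

# Hypothetical joint model: "monument exists" and "left box is red" are
# independent fair coin flips.  e = a truthful Omega asserting Y.
worlds = [
    {"monument": m, "left_red": r, "p": 0.25}
    for m, r in product([True, False], repeat=2)
]

X = lambda w: w["left_red"]
Y = lambda w: w["monument"] == w["left_red"]   # monument <-> left is red
e = Y                                          # Omega truthfully asserts Y

def prob(pred, given=lambda w: True):
    num = sum(w["p"] for w in worlds if pred(w) and given(w))
    den = sum(w["p"] for w in worlds if given(w))
    return num / den

print(prob(X), prob(X, given=e))  # 0.5, 0.5 -> e is zero evidence for X
print(prob(Y), prob(Y, given=e))  # 0.5, 1.0 -> e is strong evidence for Y
```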

moshez

No, it is not surprising... I'm just saying that the semantics is impoverished if you only use finite syntactic proofs -- but not to any degree that can be fixed by just being really, really, really smart.

moshez

bryjnar: I think the point is that the metalogical analysis that happens in the context of set theory is still a finite syntactic proof. In essence, all of mathematics can be reduced to finite syntactic proofs inside ZFC. Anything that really, truly requires an infinite proof in actual math is unknowable to everyone, a supersmart AI included.

moshez

Here's how I visualize Goedel's incompleteness theorem (I'm not sure how "visual" this is, but bear with me): I imagine the Goedel construction over the axioms of first-order Peano arithmetic. Clearly, in the standard model, the Goedel sentence G is true, so we add G to the axioms. Now we construct G', a Goedel sentence for this new set, and add G' as an axiom. We go on and on: G'', G''', etc. Luckily that construction is computable, so we can take the union of all these theories and construct G^w, a Goedel sentence for it. We continue on and on, until we reach the first non-computable countable ordinal, at which point we stop, because we have a non-computable axiom set. Note that Goedel is fine with that -- you can have a complete first-order Peano arithmetic (it would have non-standard models, but it would be complete!) -- as long as you are willing to live with the fact that you cannot tell whether something is a proof or not with a mere machine (and yes, Virginia, humans are also mere machines).
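
A compact way to write the same construction (my notation, just restating the comment): define a transfinite sequence of theories by

```latex
\begin{align*}
T_0 &= \mathrm{PA}, \\
T_{\alpha+1} &= T_\alpha + G(T_\alpha)
  && \text{where $G(T)$ is a Goedel sentence for $T$,} \\
T_\lambda &= \bigcup_{\alpha < \lambda} T_\alpha
  && \text{for limit ordinals $\lambda$.}
\end{align*}
```

The sequence stays computably axiomatizable only below the first non-computable ordinal (the Church-Kleene ordinal); past that point no Turing machine can recognize the axiom set, which is exactly the trade-off described above.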

moshez

I'm trying to steelman your arguments as much as I can, but I find myself confused. The best I can do is: "I'm worried that people would find LW communities unwelcoming if they do not go to rituals. Further, I'm worried that rituals are a slippery slope: once we start having rituals, they might become the primary activity of LW and make the experience unwelcoming even if non-ritual activities are explicitly open, because it feels more like 'a Church group that occasionally has secular activities.' I'm worried that this will divide people into those who properly mark themselves as 'LWers' and those who don't, thus starting our entropic decay into a cult."

So far, your objections seem to be to this being the primary activity of the LW group, in which -- honestly -- I would join you. But if a regularly meeting LW group also had a Catan night once a week (for Catan enthusiasts, obviously -- if you don't like Catan, don't come) and a filk night once a month (for filk enthusiasts, again), I am not sure this would hasten a descent into a Catan-only or filk-only group. Similarly, if a LW group has a ritual once a year (or even if every LW group has a ritual, and even if it's the same ritual), it doesn't seem likely rituals will become the primary thing the group does.

"There is a rather enormous difference between things I care whether lwers do and things I care whether lw does."

I notice I am confused. LessWrong is a web site, and to some extent a community of people, whom I tend to refer to as "Less Wrongers". If you mean these words the same way I do, then I do not understand: "LW does something" means "the community does something", which means "many members do something". I'm not really sure how "LW does something" is distinguished from "LWers doing it"...

moshez

Sorry, that's not the context in which I meant it -- I'm sure you're as willing to admit you were wrong as the next rationalist. I meant it in the context of "Barbarians vs. Rationalists" -- if group cohesion is increased by ritual, and group cohesion is useful to the rationality movement, then ritual could be useful. Wanting to dissociate ourselves from the trappings of religion seems like a case of "reversed stupidity" to me...

moshez

The same bias to... what? From the inside, the AI might feel "conflicted" or "weirded out" by a yellow, furry, ellipsoid-shaped object, but that's not necessarily a bug: maybe this feeling accumulates and eventually results in creating new sub-categories. The AI won't necessarily get into arguments about definitions, because while part of that argument comes from the neural architecture described above, the other part comes from the need to win arguments -- and the evolutionary pressure on humans to win arguments would not be present in most AI designs.

moshez

Thanks! You have already updated, so I'm not sure if you want to update further, but I'm wondering if you had read "Why Our Kind Can't Cooperate", and what your reaction to it was?
