Adirian

I think whatever you read left something out; Bell's Theorem disproved "neo-realism," the idea that there is a classical-physics explanation, i.e., one with real particles with real properties. That is the model EPR was trying to assert over the Copenhagen interpretation - indeed, that was its only purpose - and I find it odd that you bring that thought experiment up outside the context of its intent.
Well, actually, Everett's Many-Worlds permits classical physics again within its confines, and hence real particles, as do other superdimensional interpretations - within his model, you're still permitted all the trappings of classical physics. (They break an assumption of Bell's Theorem, namely that there is only one universe - or, in the case of superdimensionality, that the universe doesn't extend in other directions we can only detect abstractly.)
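(For concreteness, here is the standard CHSH form of that bound - the angle choices below are the usual textbook ones, my own addition rather than anything from this thread. Any local model of "real particles with real properties" keeps the combination at or below 2; the quantum prediction exceeds it.)

```python
import math

def E(a, b):
    """Quantum correlation for the singlet state at analyzer angles a and b (radians)."""
    return -math.cos(a - b)

# Angle choices that maximize the violation (standard textbook picks, assumed here).
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

print(abs(S))  # ~2.828, i.e. 2*sqrt(2)
# Any local hidden-variable ("real particles with real properties") model is bounded by 2.
```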
Eliezer -
"Information" in this case is the properties; my apologies, I am loose with language. The properties were transformed - and, in the case of a splitting beam, with a 1-1 function. The properties were "lost" when they were split - they weren't the same as they were before. But they weren't irrecoverably lost. (At least close enough for testing; you may have medium degradation, i/e, property attenuation, depending upon the quality of the crystals and the intermediate material provided it isn't in a vacuum.)
To irrecoverably lose properties, you need a non-1-1 function - which is exactly what we had when we sent them through the filter rather than the splitter.
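(A toy illustration of the 1-1 versus non-1-1 distinction - my own example, not a physical model: an invertible transform can always be undone, while a many-to-one transform collapses distinct inputs, and then nothing can recover the original.)

```python
# Toy illustration only (not optics): a 1-1 map preserves the original "property",
# a many-to-one map destroys it.

def split(x):
    """1-1 transform: the property is changed but recoverable."""
    return 2 * x + 1

def unsplit(y):
    """Inverse of split(): recovers the original exactly."""
    return (y - 1) / 2

def filter_(x):
    """Non-1-1 transform: many inputs map to the same output."""
    return abs(x)

original = -3.0
assert unsplit(split(original)) == original   # information transformed, not lost

# Both -3.0 and +3.0 come out as 3.0; no inverse function can tell which one went in.
print(filter_(-3.0), filter_(3.0))  # 3.0 3.0
```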
The fundamental descriptive mathematics are known - the interpretations are still debated, as has been the case for nearly a century now, and I don't see that changing anytime in the immediate future. And if you recombine all four sets of split beams, there isn't anything interesting going on there, either; half still goes through, same as before, and predictably so. That is, you direct one polarization one way and the other polarization another way, and then recombine them - and there's the snag. You can't recombine them without re-emitting both of them; you're performing an additional operation which generates or modifies information. You aren't reproducing lost information; you're generating new information which is equivalent to the lost information.
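(A quick sketch of the "half still goes through" figure - the filter angle and the equal-mixture treatment below are my own illustrative choices, not anything from the original post.)

```python
import math

def pass_prob(photon_angle_deg, filter_angle_deg):
    """Probability that a photon polarized at photon_angle passes a filter at filter_angle (cos^2 rule)."""
    delta = math.radians(photon_angle_deg - filter_angle_deg)
    return math.cos(delta) ** 2

# Beam split into two orthogonal polarizations and recombined, treated here as an
# incoherent 50/50 mixture (coherent recombination would reintroduce interference terms),
# then sent through a filter at 45 degrees - an assumed angle for illustration.
components = [0.0, 90.0]
through = sum(0.5 * pass_prob(theta, 45.0) for theta in components)
print(through)  # ~0.5 - half still goes through, same as before
```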
For the fundamental physics to be known, they must be falsifiable, and have passed that test. This is not the case. The mathematics are passing with flying colors, of course - nobody is entirely sure what the mathematics mean, however. (Everybody thinks they do, though.)
There is, of course, a fairly simple alternative solution, dealing with "real" particles; the photons coming out of the filters are not the photons that went in. Photons don't travel through the sheet; the energy is absorbed, and the properties of individual components of energy determine what happens next. The properties of some chunks of energy cause similarly-propertied energy to be re-emitted on the other side. It's not that the photons have mysteriously lost the information about their "spin" in the middle sheet - it's that we're dealing with new photons with new property sets, which are being re-emitted with the emission properties of the second sheet, rather than the first.
With this interpretation, the phenomenon makes perfect sense, and the old textbooks are right - after a fashion - that the second measurement destroyed the information that the first measurement generated.
"Well, it's physics, and physics is math, and you've got to come to terms with thinking in pure mathematical objects."
If you're trying to convince anybody here, you're going to fail,... (read more)
Will - field theory is pretty good, yup, although...
We're basically at the same point in physics that we were a little more than a century ago. Back then, there were two major camps - the atomists and the energeticists. The energeticists' position was essentially that everything was made of energy; the atomists' position was that there were these tiny particles we hadn't seen yet, but that they were in fact real.
Now, at the time, both camps had equally valid positions, although the energeticists had the stronger support - but there was a very interesting distinction between the two. If the energeticists were right, we were in a position where we knew... (read more)
Will -
The reasoning is better understood in terms of wave mechanics; if the particle states diverged in the least, then the cancellation wouldn't be complete, and the experimental results would differ.
That is, they must be identical, not merely indistinguishable, for wave cancellation to operate. (sin⁻¹(sin(x) + 0.0000000000001) isn't x.)
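(To put a number on "wouldn't be complete" - a toy calculation of my own, not anything from the thread: two equal-and-opposite amplitudes cancel exactly, but give one of them even a tiny phase offset and a residual survives.)

```python
import cmath

def residual(phase_error):
    """Magnitude left over when an amplitude and its near-opposite are summed."""
    a = 1.0 + 0.0j                    # amplitude of one path
    b = -cmath.exp(1j * phase_error)  # the other path, meant to cancel it exactly
    return abs(a + b)

print(residual(0.0))   # 0.0   - identical states: complete cancellation
print(residual(1e-6))  # ~1e-6 - slightly diverged states: cancellation is incomplete
```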
However, again, this depends upon a particular mathematical definition of the particles - in particular, a model which has already defined that particles have no discrete existence. Eliezer is by far my favorite author here, but he has a consistent fault in confusing mathematical descriptions with mathematical definitions. That is, he seems to believe a model which accurately describes and even predicts... (read more)
"But the "electrons" we see today, would still be computed as amplitude flows between simulated configuration"
- Eliezer, the argument being posted against you is that the MODEL could be wrong. Remember, it's a mathematical model - it describes, it doesn't define.
Remember, there are quite a few models of quantum physics that describe the behavior of quantum "particles" - and that already presumes the particles' very existence. It is quite possible to invent a model which describes physics perfectly but which omits the existence of electrons, photons, and other quantum particles, treating them as nothing more than artifacts of interaction between particles' fields. (The math gets ugly in a way that is... (read more)
"Bayes-language can represent statements with very small probabilities, but then, of course, they will be assigned very small probabilities. You cannot assign a probability of .1% to the Sun rising without fudging the evidence (or fudging the priors, as Eli pointed out)."
"So much for begging the question. Please do a calculation, using the theorems of Bayes (or theorems derived from Bayesian theorems), which gives an incorrect number given correct numbers as input."
Your high-capacity Einstein would come to the conclusion, left to those parameters, that the picture never changes. Thinking so quickly, the pattern for that is infinitely stronger than any of the smaller patterns within. Indeed, processing the same information so many times, it will encounter information miscopies nigh-infinitely more often than it encounters a change in the data itself - because, after all, a quantum computer will be operating on information storage mechanisms sensitive enough to be altered by a microwave oven a mile away.
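(To make "nigh-infinitely more often" concrete with made-up numbers - every figure below is an assumption of mine, not something from the post.)

```python
# Back-of-envelope comparison, with entirely assumed numbers:
# how often a fast simulated thinker would see a storage error versus a genuine change in its input.

ops_per_second = 1e18         # assumed operation rate of the simulated thinker
bit_error_rate = 1e-15        # assumed soft-error probability per operation
input_changes_per_second = 1  # assumed rate at which the picture actually changes

errors_per_second = ops_per_second * bit_error_rate
print(errors_per_second)                             # 1000.0 miscopies per second
print(errors_per_second / input_changes_per_second)  # ~1000x more errors than real changes
```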
You have a severe bootstrapping problem which you're ignoring - thought requires a subject. Consciousness requires something to be conscious of. You can't design a consciousness and then supply things for it to be conscious of after the fact. You have to start with the webcam and build up to the mind - otherwise the bits flowing in are meaningless. No amount of pattern recognition will give meaning to the patterns.