
Posted
5 hours ago, Eise said:

Sigh...

Again you are interpreting Zeilinger wrong. And, framed by your interpretation, even @joigus gets it wrong here:

No, you are reading it 'backwards'. Read precisely what Zeilinger is saying:

Superdeterminism would be (reshuffling above sentence):

That would mean that reality (even in the far past) has a very essential influence on our deciding which measurement to perform.

Read closely, so that you see the difference. If necessary, repeat in your own words, so that we can check that you really understand that you read the original sentence backwards.

In Zeilinger's own words, in a paragraph of its own about superdeterminism (he calls it 'total determinism'):

Bold by me.

I mentioned that already here:

And this is what the 'quasar-driven' experiment is about. Not about the choice of locality on one side, and realism on the other. Again, bangstrom, in their technical meanings as used in CHSH, not in what you would like to see as realism (non-locality implies non-realism). Zeilinger and co are very clear in their article: it is about closing the free-choice loophole, not about locality or realism.

Just in case you do not notice: when I boldface words in my own texts, I am using those words in their precise meanings as used by all QM authors, especially CHSH.

Nope. Correlation (consistently, not accidentally) means that the events share a common history. And that is the moment the entangled particles were produced.

Thanks for the clarifications. +1

The "that would mean" kind of gives it away. I've been worried by this for some days now... Is Zeilinger endorsing superdeterminism???

I will get hold of this book and read the paragraphs in their natural logical flow, if you know what I mean.

I did understand that the quasar experiment seems to rule out superdeterminism, though. But at that point it seemed to me that Zeilinger was considering superdeterminism as the only way out to save locality.

That would be a big "oh, no!!" by me.

Posted (edited)
2 hours ago, joigus said:

The "that would mean" kind of gives it away. I've been worried by this for some days now... Is Zeilinger endorsing superdeterminism???

No, no, definitely not. He tends to give up on realism.

2 hours ago, joigus said:

I did understand that the quasar experiment seems to rule out superdeterminism, though.

Yep. I think that on the one hand he is teasing out the maximum of his experimental capabilities; on the other hand, I think he is trying to convince people that the free-choice loophole is not very plausible anymore. The orientation of the polarizers would have to have been determined already about 8 billion years ago! And that without any meaningful hypothesis about how nature does that.

PS But Sabine Hossenfelder is in favour of superdeterminism. I will read her podcast again, see how she argues. (And if she uses the same definition as Zeilinger does...)

Edited by Eise
Posted
19 minutes ago, Eise said:

Yep. I think that on the one hand he is teasing out the maximum of his experimental capabilities; on the other hand, I think he is trying to convince people that the free-choice loophole is not very plausible anymore. The orientation of the polarizers would have to have been determined already about 8 billion years ago! And that without any meaningful hypothesis about how nature does that.

Yes, exactly. While not impossible to conceive, thinking that photons emitted 3 bya by remote quasars somehow code the decisions that primates will take here and now on a particularly rainy day in Vienna at, say, 8:00 AM, is next-to-inconceivable. That the information be so intimately interwoven as to produce this next-to-hallucinogenic coincidental effect is something I don't want to even start to consider.

I think the perplexing results of the quantum theory of measurement are better tackled by some kind of "fiduciary internal determinism." Something very much in the vein of what John Bell proposed with his beable idea.

It would involve the idea that a choice of wave function does not determine everything. Infinitely many gauge descriptions would hide the "hidden variables." The catch is: Hidden variables are hidden forever, they never show up completely. They simply can't be seen, ever. There's a logical --and experimental-- fundamental obstruction.

I think that's the way out for the problem of measurement, and I think that's the natural logical continuation of John Bell's musings. But that's probably the topic for another thread.

33 minutes ago, Eise said:

PS But Sabine Hossenfelder is in favour of superdeterminism. I will read her podcast again, see how she argues. (And if she uses the same definition as Zeilinger does...)

That seems to be the case. I cannot wait for her to give up on that, because she is very much the conscience of theoretical physics today.

Posted
On 10/27/2022 at 7:02 AM, swansont said:

No, it’s not. It’s assumed by you, but it’s not a falsifiable assumption. Much like EM waves were not evidence of an aether. Science requires evidence to back up such claims.

There is no interaction in the theory. It’s assumed because of classical physics preconceptions

You are right, the signal is an assumption based on classical preconceptions so we need to take that into consideration. Somehow entangled particles appear to ‘see’ no separation between them, in which case, there is no need for a signal. The quantum explanation is that their properties are superimposed and I don’t find that very satisfactory either. I consider the ‘signal’ to be a placeholder word for what is behind the changes we observe.

The trouble with all of our assumptions is that over time they eventually take on an undeserved reality and become unquestionable facts.

On 10/27/2022 at 7:02 AM, swansont said:

Nobody has disagreed that the states are undetermined. Everyone has confirmed it.

 

You know, I know and everyone knows that entangled states are indeterminate. What everyone doesn’t know is that the indeterminacy of entanglement is what makes it possible.

 

 

Posted
6 minutes ago, bangstrom said:

You are right, the signal is an assumption based on classical preconceptions so we need to take that into consideration. Somehow entangled particles appear to ‘see’ no separation between them, in which case, there is no need for a signal.

In science we can’t just assume this, though. And you need independent verification of the signal - the measured correlation isn’t evidence of it.

6 minutes ago, bangstrom said:

The quantum explanation is that their properties are superimposed and I don’t find that very satisfactory either. I consider the ‘signal’ to be a placeholder word for what is behind the changes we observe.

Argument from incredulity is a fallacy, though, and carries no weight.

Co-opting terminology is just bad form. “Signal” has to mean the same thing for everyone. There is no signal - no interaction - between the entangled particles.

 

6 minutes ago, bangstrom said:

The trouble with all of our assumptions is that over time they eventually take on an undeserved reality and become unquestionable facts.

Indeed. So dispense with the “undeserved reality” of a signal.

Posted
13 minutes ago, bangstrom said:

You know, I know and everyone knows that entangled states are indeterminate. What everyone doesn’t know is that the indeterminacy of entanglement is what makes it possible.

You didn't reply to my last post, but I will try this anyway.

Consider the equation sin x = 0.866...

No one is surprised when we say that the solution to this equation is x = 60° or x = 120° or x = 420° or a further infinity of solutions.

So are all these solutions in superposition or entangled, and just waiting to be selected or observed ?
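The point can be made concrete with a short numerical sketch (Python, purely illustrative; the family of solutions is fixed by the mathematics before anyone "looks"):

```python
import math

# sin(x) = 0.866... is satisfied by infinitely many angles:
# x = 60 + 360k degrees and x = 120 + 360k degrees, for any integer k.
target = math.sin(math.radians(60))  # = 0.8660...

solutions = sorted([60 + 360 * k for k in range(3)] +
                   [120 + 360 * k for k in range(3)])
for x in solutions:
    # every member of the family solves the same single equation
    assert abs(math.sin(math.radians(x)) - target) < 1e-12
print(solutions)  # [60, 120, 420, 480, 780, 840]
```

Nothing here is "waiting to be selected": the whole family satisfies the equation all along.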

 

Posted
13 hours ago, Eise said:

Nope. Correlation (consistently, not accidentally) means that the events share a common history. And that is the moment the entangled particles were produced.

Correlation means the particles are matched together in their quantum properties, in the case of entanglement, they are kept opposite, that is, anti-coordinated. Any one quantum property measured before entanglement need not be the same after entanglement is lost. This requires some form of instant transaction of information.

 

30 minutes ago, studiot said:

You didn't reply to my last post, but I will try this anyway.

I read your post, the book looked good, I took note of the book, but I don’t have the book. Perhaps sometime later I will have something to say.

33 minutes ago, studiot said:

Consider the equation sin x = 0.866...

No one is surprised when we say that the solution to this equation is x = 60° or x = 120° or x = 420° or a further infinity of solutions.

So are all these solutions in superposition or entangled, and just waiting to be selected or observed ?

No one is surprised with the math but here is the surprising part. If you measure the polarity of a local entangled photon as 60 degrees, you instantly know the polarity of the other photon should be 105 degrees. The other photon could be miles away or theoretically many galaxies away. The question is, How did the distant photon instantly 'know' what its orientation should be?

 

Posted
1 hour ago, swansont said:

Argument from incredulity is a fallacy, though, and carries no weight.

Co-opting terminology is just bad form. “Signal” has to mean the same thing for everyone. There is no signal - no interaction - between the entangled particles.

 

The anti-correlation among entangled particles is evidence of something superluminal happening. This is not explained by the argument from incredulity that there is no signal.

Posted
1 hour ago, bangstrom said:

The anti-correlation among entangled particles is evidence of something superluminal happening. This is not explained by the argument from incredulity that there is no signal.

If you claim there is a signal, the burden of proof is yours - you must provide independent evidence of it. QM doesn’t require it. QM doesn’t have an interaction you can point to.

So what’s your independent evidence? Without it this is just circular reasoning

2 hours ago, bangstrom said:

Correlation means the particles are matched together in their quantum properties, in the case of entanglement, they are kept opposite, that is, anti-coordinated.

They can be the same, as well. For example, type I parametric down-conversion gives photons of the same polarization.

 

Posted (edited)
2 hours ago, bangstrom said:

I read your post, the book looked good, I took note of the book, but I don’t have the book. Perhaps sometime later I will have something to say.

@studiot is not talking about the book he referred to before. He is now making a new argument, designed to help you understand a point very similar to one I presented before: The anticorrelations are built into the state, very much like the Pythagorean theorem is built into the relations of a right triangle, for example --a similar example to the one he is proposing. He is proposing that infinitely many possible values of an angle can be associated to a given value of a trigonometric function.

And that should be no surprise, nor does it imply a "spooky action between angles."

Nor should it be any surprise that the two non-right angles of a right triangle correlate to \( \vartheta_1 + \vartheta_2 = \pi / 2 \).

Continuing on my example of the right triangle ("the speed at which the Pythagorean theorem is true is infinity"), but inspired by Studiot's example of the angles:

You can take as analogues of quantum evolution the right triangles getting bigger and bigger; as the analogue of the correlation, the fact that the angles add up to \( \pi/2 \); and as the analogue of being a singlet state (scalar representation of the rotation group), the fact that the triangles evolve while keeping the geometric ratios (an analogue of unitary quantum evolution as well).
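The triangle analogy fits in a few lines of Python (a sketch only; the leg lengths and scale factors are arbitrary choices for this example):

```python
import math

# A right triangle with legs a and b has non-right angles theta1 and theta2.
# However the triangle is rescaled ("evolves"), theta1 + theta2 = pi/2 holds:
# the "correlation" is built into the geometry, not transmitted between angles.
a, b = 3.0, 4.0
for scale in (1.0, 10.0, 1e6):
    theta1 = math.atan2(b * scale, a * scale)  # angle opposite leg b
    theta2 = math.atan2(a * scale, b * scale)  # angle opposite leg a
    assert abs(theta1 + theta2 - math.pi / 2) < 1e-12
```

No matter how large the triangle grows, the constraint between the two angles is maintained without anything travelling between them.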


Edited by joigus
minor addition
Posted
2 hours ago, bangstrom said:

The question is, How did the distant photon instantly 'know' what its orientation should be?

You claim to have the answer. But how can you demonstrate that there is a signal?

Posted

More comments on Dance of the Photons (my emphasis in boldface throughout):

Zeilinger on how Bell's theorem is not the weirdest theorem in the world:

Quote

How is it possible that a statement as simple as Bell’s inequality might not hold in nature? The problem we have is that the considerations that led us to Bell’s inequality were extremely simple. I would argue that they are so simple that the Greek philosopher Aristotle could already have derived Bell’s inequality had he known that this was an interesting and nontrivial problem.

I said:

On 10/14/2022 at 3:57 PM, joigus said:

Some cleaning up of silliness seems necessary. I can't let this go. I don't know who said it, but it's sooo foolish...

Bell's inequality is certainly not the weirdest theorem in the world. It represents the common world. The world of "yes" and "no."

Zeilinger on standalone realism itself being at stake in the whole question:

Quote

What kind of conclusions can we now draw from the violation of Bell’s inequality? It is clear that at least one of the assumptions we used in its derivation must be wrong. What were these assumptions?

The first fundamental assumption was that of realism. This is the idea that an experimental result reflects in some way the features of the particles that we measure.

But QM already takes care of that => No-hidden-variables for spin, according to the theorems. In this particular connection, I said,

On 9/10/2022 at 1:50 AM, joigus said:

None of that happens. What happens is quantum mechanics.

Bell's theorem is a proof that whenever you have 3 propositions that are either true or not true (classical logic), eg:

A, not A

B, not B

C, not C

then the probabilities satisfy the following constraint:

\( P(A, \neg B) + P(B, \neg C) \geq P(A, \neg C) \)

Bell found a set of propositions that violate this constraint according to quantum mechanics.

Therefore, QM violates classical logic. Period.
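The classical constraint above can be checked mechanically, assuming only two-valued logic as in the post (a minimal Python sketch):

```python
from itertools import product

# For any deterministic truth assignment to A, B, C, the indicator identity
#   [A and not B] + [B and not C] >= [A and not C]
# holds, so averaging over any classical probability distribution gives
#   P(A, not B) + P(B, not C) >= P(A, not C)  -- the constraint stated above.
for A, B, C in product([False, True], repeat=3):
    lhs = int(A and not B) + int(B and not C)
    rhs = int(A and not C)
    assert lhs >= rhs  # never violated by classical yes/no propositions
```

Every one of the eight assignments satisfies the inequality, which is why a quantum violation is so striking.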

Zeilinger about how the argument of locality does not play a role in the Kochen-Specker theorem:

Quote

Now, since Kochen and Specker only considered measurements on single quantum particles, the locality hypothesis does not come into play.

and I also said,

On 10/24/2022 at 12:29 PM, joigus said:

It does it in such a way that, for Hilbert spaces of dimension 3 upwards, you can even build 3 mutually commuting observables for which attributing hidden variables to determine the 3 corresponding eigenvalues is impossible (Kochen-Specker theorem.) With space-time not playing even the remotest part in the argument.

There are more points coming, but for the time being, let this suffice to show that Zeilinger's view, at the very least, pretty much overlaps with what I've been saying here, and that you haven't the faintest idea of what you're talking about.

Now, I hadn't read Zeilinger's book, and I'm very grateful to @Eise for facilitating my reading of the most significant passages of the book. It's not very difficult for me to "read his mind," as it were, because I've been thinking, reading, and calculating about these topics very seriously for 40 years. I'm not reading anything I didn't know.

I hadn't read about superdeterminism or thought about it in any length, because I've never thought it was a serious alternative. Now I'm sure Anton Zeilinger doesn't support it either --thank you, Eise.

Now you tell me where I --and most other knowledgeable members-- have misunderstood quantum mechanics, @bangstrom. And please answer Swansont's question once and for all or, with all due respect, "collapse" into the mute state for all that concerns this thread, unless it is for the purpose of stating your thoughtful disclaimers, which I'm sure everyone would accept immediately.

 

 

Posted
3 hours ago, bangstrom said:

No one is surprised with the math but here is the surprising part. If you measure the polarity of a local entangled photon as 60 degrees, you instantly know the polarity of the other photon should be 105 degrees. The other photon could be miles away or theoretically many galaxies away. The question is, How did the distant photon instantly 'know' what its orientation should be?


 

3 hours ago, bangstrom said:

I read your post, the book looked good, I took note of the book, but I don’t have the book. Perhaps sometime later I will have something to say.

 

Really?

I find the rudeness of the dismissal the most surprising part of your reply to someone who has offered sources of genuine help for your consideration.

And this is not even your thread.

You clearly do not understand the most fundamental point about entanglement at all.

The entangled properties are set for both particles at the moment of entanglement.

In order to create entanglement you require very close proximity  -  not spooky action at a distance.

But once set, it does not matter how far they diverge; anyone who measures one automatically knows the other.

I'm a little hydrogen atom,
You're a little hydrogen atom
I've got one electron
You've got one electron
Let's get spin entangled and form a hydrogen molecule.

At this point we know that one electron is spin up and the other is spin down

But we don't know which is which until we observe one of them.
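That single-axis point can be mimicked with a purely classical simulation (a sketch; this reproduces the perfect anticorrelation for one fixed measurement axis, though not the full quantum statistics for rotated axes, which is where Bell comes in):

```python
import random

random.seed(0)
for _ in range(1000):
    # At the moment of "entanglement" one electron is up and the other down;
    # we simply don't know which is which until we look at one of them.
    mine, yours = random.choice([("up", "down"), ("down", "up")])
    inferred = "down" if mine == "up" else "up"
    # Observing mine tells me yours immediately; the correlation was
    # fixed when the pair was formed, so no signal is needed.
    assert inferred == yours
```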

 

Thanks to joigus +1 for trying to explain my sine example and providing another example.

 

Posted (edited)
1 hour ago, swansont said:

You claim to have the answer. But how can you demonstrate that there is a signal?


There is a demonstrable action and a predictable reaction. That is a demonstration of some kind of connection.

Edited by bangstrom
Tried to get rid of duplication and lost my ability to post?
Posted
32 minutes ago, bangstrom said:

There is a demonstrable action and a predictable reaction. That is a demonstration of some kind of connection.

The predictable reaction is a correlation that was already present. That’s all that the experiment demonstrates.

Where is the demonstration of a signal that does not rely on this correlation?

 

Posted
2 hours ago, joigus said:

The anticorrelations are built into the state, very much like the Pythagorean theorem is built into the relations of a right triangle, for example --a similar example to the one he is proposing. He is proposing that infinitely many possible values of an angle can be associated to a given value of a trigonometric function.

Agreed that the anti-correlations are built into the state. I question how they are maintained after entanglement, when the Bell test has ruled out the possibility that the quantum properties of the entangled particles are static.

To go back to the old gloves-in-boxes scenario: If you open your entangled box and find an RH glove, you know the other box is LH. If you could close the box and somehow re-entangle the boxes, the handedness of the gloves again becomes random.

If you open the box a second time and find your glove to be LH, the handedness of the glove in your box has changed, and you know the other box now contains the RH glove. How did the glove in the distant box 'know' it should be LH on your first observation and RH on your second observation?

8 minutes ago, swansont said:

The predictable reaction is a correlation that was already present. That’s all that the experiment demonstrates.

Where is the demonstration of a signal that does not rely on this correlation?

 

Then the correlation has become the signal.

1 hour ago, studiot said:

I find the rudeness of the dismissal the most suprising part in your reply to someone who has offered sources of genuine help for your consideration.

I am sorry if that offended you and I thank you for the suggestion. I meant no offense.

 

1 hour ago, studiot said:

You clearly do not understand the most fundamental point about entanglement at all.

The entangled properties are set for both particles at the moment of entanglement.

The entangled properties are not 'set'; they become random. The conventional explanation is that they are superimposed and indeterminate until the first observation is made, and then they become set again until something disturbs them.

 

1 hour ago, studiot said:

In order to create entanglement you require very close proximity  -  not spooky action at a distance.

Close proximity is only necessary for generating entangled particles for experimental purposes, but spontaneous entanglement at a distance among charged particles (especially electrons) is normal and common. Carver Mead is THE authority on entanglement among electrons, and he claims that any electron can spontaneously become entangled with any other electron on its light cone. He explains this in detail in his book, "Collective Electrodynamics", Part 5, Electromagnetic Interaction of Atoms.

2 hours ago, studiot said:

I'm a little hydrogen atom,
You're a little hydrogen atom
I've got one electron
You've got one electron
Let's get spin entangled and form a hydrogen molecule.

At this point we know that one electron is spin up and the other is spin down

But we don't know which is which until we observe one of them.

 

Electrons in a molecule are entangled, as are the electrons in an atom's electron cloud, and an electron in one atom can entangle with an electron in a distant atom.

"In a time-symmetric universe, an isolated system does not exist. The electron wave function in an atom is particularly sensitive to coupling with other electrons; it is coupled either to far-away matter in the universe or to other electrons in a resonant cavity or other local structure."- Carver Mead

Posted
10 minutes ago, bangstrom said:

To go back to the old gloves-in-boxes scenario: If you open your entangled box and find an RH glove, you know the other box is LH. If you could close the box and somehow re-entangle the boxes, the handedness of the gloves again becomes random.

If you open the box a second time and find your glove to be LH, the handedness of the glove in your box has changed, and you know the other box now contains the RH glove. How did the glove in the distant box 'know' it should be LH on your first observation and RH on your second observation?

Aha!!!

+1. Good. You really want to understand this.

I've found what's confusing you. The reason is this: This doesn't happen if you don't re-entangle the boxes.

If you open the box a second time, or a third time (without bringing the sub-systems together again!), it will give the same result again, and again, and again... Unless some further interaction flips the spin or makes it precess, etc. But then, the spins never become correlated again. Or...

If, on the other hand, as you say, you "somehow" make them entangled again -- get them back down to a singlet and re-start the process -- you do need to bring the particles together again. This "make them interact again" is what you say yourself (I repeat):

21 minutes ago, bangstrom said:

If you could close the box and somehow re-entangle the boxes the handedness of the gloves again becomes random.

Don't dwell on the confusion: They are always random, always! What's unmistakably quantum is not the fact that they are anticorrelated (we can reproduce that classically: boots, gloves, coins...), not even that they are random (we can reproduce that classically too, by introducing stochastic terms in the evolution equations), but the nature of the correlations that comes from non-commutativity. That, and only that, you cannot reproduce in any classical way.
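What those non-commutative correlations buy can be shown numerically with the textbook singlet prediction (a sketch; the Wigner-type bound and the angles 0°, 60°, 120° are one standard choice for exhibiting the violation):

```python
import math

# Textbook singlet prediction: the probability of getting "up" along both of
# two directions separated by angle t is P(t) = 0.5 * sin(t/2)**2.
# A classical model with pre-set values would have to satisfy the Wigner-type
# constraint P(a,c) <= P(a,b) + P(b,c). Try a=0, b=60, c=120 degrees:
P = lambda deg: 0.5 * math.sin(math.radians(deg) / 2) ** 2

lhs = P(120)          # P(a,c) = 0.375
rhs = P(60) + P(60)   # P(a,b) + P(b,c) = 0.25
assert lhs > rhs      # the quantum correlations break the classical bound
```

The anticorrelation alone is classical; it is this pattern across non-commuting measurement directions that no classical glove-or-boot story reproduces.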

I repeat: If you want to re-entangle them again, you must bring them together again. See how (obviously, I would say!) it is totally local?

The postulate of projection is what introduces irreversibility into the world by hand, by an ad hoc assumption. What I've just told you is completely covered by the projection postulate:
 

Quote

Effect of measurement on the state

When a measurement is performed, only one result is obtained (according to some interpretations of quantum mechanics). This is modeled mathematically as the processing of additional information from the measurement, confining the probabilities of an immediate second measurement of the same observable. In the case of a discrete, non-degenerate spectrum, two sequential measurements of the same observable will always give the same value assuming the second immediately follows the first. Therefore the state vector must change as a result of measurement, and collapse onto the eigensubspace associated with the eigenvalue measured.

Postulate II.c

If the measurement of the physical quantity \( \mathcal{A} \) on the system in the state \( |\psi\rangle \) gives the result \( a_n \), then the state of the system immediately after the measurement is the normalized projection of \( |\psi\rangle \) onto the eigensubspace associated with \( a_n \):

\[ \psi \quad \overset{a_n}{\Longrightarrow} \quad \frac{P_n |\psi\rangle}{\sqrt{\langle \psi | P_n | \psi \rangle}} \]

 

https://en.wikipedia.org/wiki/Mathematical_formulation_of_quantum_mechanics#Effect_of_measurement_on_the_state

IOW, immediately after a measurement, the state is an eigenstate of the observable you've just measured, so it gives you the same value with a 100% probability. When you project a projection, it gives you the same projection. That's the definition of a projection.
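The "projecting a projection" point can be verified directly (a small Python sketch with an arbitrary real-amplitude spin-1/2 state; the numbers are just for illustration):

```python
import math

# Projection postulate sketch with real amplitudes: state = (c_up, c_down).
# The projector onto "up" sends (c_up, c_down) -> (c_up, 0); then renormalize.
psi = (0.6, 0.8)                       # arbitrary normalized spin-1/2 state

prob_first = psi[0] ** 2               # Born rule: P(up) = 0.36

projected = (psi[0], 0.0)              # P_up applied to psi
norm = math.hypot(*projected)
post = (projected[0] / norm, projected[1] / norm)   # state after getting "up"

prob_second = post[0] ** 2             # immediate repeat of the measurement
assert abs(prob_second - 1.0) < 1e-12  # same value with probability 1
```

The second measurement is certain because the post-measurement state is already an eigenstate of the measured observable, which is exactly the "projection of a projection" statement.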

The problem is, of course, that we cannot make this rule (that works perfectly in every other sense) comply with the Schrödinger equation. That's the ultimate reason why Zeilinger is inevitably led to say --in the reference you provided,

Quote

At present, there is no agreement in the scientific community as to what the philosophical consequences of the violation of Bell’s inequality really are. And there is even less agreement about what position one has to assume now.

And that's because there's no agreement, really, as to what happens after the measurement.

Particles that become entangled need to do so by means of local interactions. If you study field theory, this becomes obvious. That's why all field theorists, Weinberg, Salam, Gell-Mann, 't Hooft, etc., have always been so little impressed by these claims of non-locality.

 

Posted
4 hours ago, joigus said:

I've found what's confusing you. The reason is this: This doesn't happen if you don't re-entangle the boxes.

The point I was trying to make is that the results are random whether you make the observation once or a dozen times.

 

4 hours ago, joigus said:

Don't dwell on the confusion: They are always random, always!

Agreed, they are always random.

 

4 hours ago, joigus said:

Particles that become entangled need to do so by means of local interactions. If you study field theory, this becomes obvious. That's why all field theorists, Weinberg, Salam, Gell-Mann, 't Hooft, etc., have always been so little impressed by these claims of non-locality.

What do they consider to be "local interactions"? If one electron entangles with any other electron on their common light cone, would that be considered 'local'? One peculiarity of entanglement is that the particles appear to be without any space-like separation between them. Their properties are superimposed and that may be as local as anything can get.

Posted (edited)

Bell's "Theorem": loopholes vs. conceptual flaws
by A.F. Kracklauer

Quote

Abstract
A historical overview and detailed explication of a critical analysis of what has become known as Bell's Theorem (to the effect that it should be impossible to extend Quantum Theory with the addition of local, real variables so as to obtain a version free of the ambiguous and preternatural features of the currently accepted interpretations) is presented. The central point of this critical analysis, due originally to Edwin Jaynes, is that Bell incorrectly applied probabilistic formulas involving conditional probabilities. In addition, mathematical technicalities that have complicated the understanding of the logical or mathematical setting in which current theory and experimentation are embedded are discussed. Finally, some historical speculations on the sociological environment, in particular its misleading aspects, in which recent generations of physicists lived and worked are mentioned.

Keywords: Bell’s Theorem; Projection Hypothesis; entanglement; non-locality; irreality

1 Introduction
1.1 The issue of dispute
Frequently in human affairs, be they world wars or parlor games, the participants become so fixated on the tactics of the moment that fundamental strategic considerations are overlooked. This writer holds that this is exactly the current situation with regard to the theoretical and experimental study of the nature and consequences of Bell’s Theorem. What has become known as “closing loopholes in the experimental verification of Bell’s Theorem” is a systematic attack on aspects of the experiments due to possible ancillary technicalities of the experimental setups. It is thought that peculiarities pertaining to the experimental equipment, or overlooked physical effects, may introduce erroneous data supporting misleading conclusions. Very few of those concerned with the general validity of the experimental verification of Bell’s theorem also concern themselves with the more fundamental question: is the theorem itself, aside from practical laboratory realities and exotic hypothetical effects, as a statement within the ambit of Quantum Theory, valid? Is it self-consistent? And, is it rationally related to entities in the natural world? In short, is Bell’s Theorem logically correct in its ideal form, ignoring practical subsidiary laboratory complications?

In principle, if any statement is conceptually false, then rigorous, logical analysis can identify the offending assumption or deduction in the reasoning chain taken in the attempt to “prove” the conclusion. In mathematics, this process is denoted disproving a theorem. This is a formal matter. Formal logic, however, provides a simpler means to indisputably reject a theorem: display a single counterexample.

Informally, it is likewise instinctively understood, in addition, that any statement or idea that is afflicted with a fatal error, frequently due to complexity or the nonavailability of essential information, can be seen nevertheless as false because some consequence of the statement or a derived idea which should be valid if the statement is true, is in fact invalid. Such a situation can obtain sometimes even when there is no obvious connection to a formal proof or disproof of the statement itself. These “secondary” or derivative falsehoods or inconsistencies can be called “clues.” Often there is no obvious connection of such clues to a formal statement, and, very often they are disregarded as irrelevant.

Now, the critical literature negating the consequences of Bell’s analysis, known as his “Theorem,”[1] to the best knowledge of this writer, contains three long-term schools of analysis criticizing the formal proofs of Bell’s Theorem, and in addition, many single publications proffer clues. The latter are mostly accidental discoveries made in investigations, not always of Bell’s Theorem itself, but of some phenomena used for many purposes, only one of which is involved in an experimental proof of Bell’s Theorem. Some of these “clues” may also be denoted by custom as “loopholes,” which can be distinguished from pure clues by their relevance, not to the core validity of “Bell’s Theorem,” but just to its empirical verification.

Herein, a line of critical analysis of “Bell’s Theorem” based on the observation that Bell mistook the use of coincident probabilities is described. Bell’s analysis deduced an inequality that, he asserted, must be respected by all theories that are local (i.e., conventionally causal: causes of all effects lie within the effects’ past light cone) and realistic (i.e., all material entities exist independently of human interventions or observations).

In addition, there are alternate lines of critical analysis, based on other central observations or propositions, e.g.: [2, 3].

1.2 Clues and counterexamples
The current state of the art regarding proofs of Bell’s Theorem is that experimental realizations of the inequalities deduced in abstract proofs find that Bell’s inequality is violated for auspiciously chosen parameters. Bell’s analysis states, in short, that without some contribution of irreality (wave-function collapse induced by human observation) and/or nonlocality (superluminal interaction), the observed result, i.e., the inequality violation, could not have been obtained. This conclusion is meant to be a logical deduction; its validity is a necessary condition for assertions to the effect that “experiments prove Bell’s Theorem.” If it is not valid, then all results from empirical tests of Bell’s analysis are ambiguous, insofar as they may have a conventional explanation; i.e., the experimental ‘proofs’ as such fail to support the proclaimed theoretical deductions.

However, there exist relatively numerous examples of classical phenomena, manifestly lacking any hint of irreality or nonlocality, that violate the very same inequalities. These ‘counterexamples’ usually are based on some macroscopic, classical realization of the microscopic phenomena exploited for Bell-test experiments. Results from these experiments, to the degree practical, violate Bell inequalities numerically in exactly the same way as do Bell verification experiments. As argued above, this should be impossible if Bell’s analysis is logically fault free. In other words, the conclusion that a Bell Inequality cannot be violated without irreality or nonlocality is baseless.

Herein, pioneering studies presenting examples of such nonquantum phenomena, which tend to disprove Bell’s core conclusion by means of counterexample, are first briefly reviewed: those by A. O. Barut & collaborators, Perdijon, and Mizrahi and Moussa. Thereafter, the reason these obviously classical models reproduce what is otherwise considered the “quantum result” is discussed.

2 The Vanguard
2.1 A. O. Barut & collaborators
In a series of papers beginning about 1984, A. O. Barut and various collaborators advanced the contention that, for spin-1/2 particles, the average from a classical model of an ensemble of similar particles yields the same correlations as does Quantum Mechanics [4]. They based their analysis on the known fact that Quantum Mechanics addresses only the expectation values of measurable parameters while having nothing to say about individual measurements. This allows for reasonable physical assumptions regarding individual systems, which they take to be that an ensemble of such entities with spin can have a random distribution of spin orientations over a sphere. These hypothetical inputs imply that, for their model, the singlet state is not a representation of an individual entity with spin, but rather a formalized expression for calculating expectation values for a randomly oriented ensemble of such entities.

In 1986 Barut and his student M. Božić extended their study to the triplet state [5]; and Barut reported explicit examples of hidden-variable renditions of Bell Inequality tests, thereby claiming to have found a counterexample to the widely accepted assertion that Quantum Mechanics cannot accept such variables [6].[2]

Finally, in 1991 Barut published a considerably streamlined analysis of his central assertion regarding spin [7]. Here spin of an entity is taken to be specified by a vector S(θ, φ) giving its direction in space so that the expectation of the correlation E(A, B) is then the average over the randomly distributed angles (θ, φ) for all the elements of the ensemble, namely:

[...]
where a and b are the orientations of the magnetic field interacting with spin, or the axes of polarizers filtering light pulses. Insofar as this classical result is identical to the quantum version, Barut’s model constitutes a counterexample to claims that the observed correlations can be obtained only under the effects of either or both irreality and nonlocality.
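[Ed.: The claimed agreement in angular dependence can be checked with a small Monte Carlo sketch. This is not code from [7]; the axes, sample size, and normalization below are illustrative choices. For classical spins distributed uniformly over the sphere, the normalized correlation of the projections onto two analyzer axes works out to the cosine of the angle between the axes:]

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unit_vectors(n):
    # Uniform directions on the unit sphere via normalized Gaussians
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

n = 200_000
S = random_unit_vectors(n)           # ensemble of classical spin directions

a = np.array([0.0, 0.0, 1.0])        # analyzer axis a
theta = np.pi / 3                    # 60 degrees between the two axes
b = np.array([np.sin(theta), 0.0, np.cos(theta)])

A = S @ a                            # projection data stream at station A
B = S @ b                            # projection data stream at station B

# Normalized (zero-mean by symmetry) correlation of the two streams
E = np.mean(A * B) / np.sqrt(np.mean(A**2) * np.mean(B**2))
print(E)  # close to cos(theta) = 0.5, up to Monte Carlo noise
```

[Ed.: The ensemble average E[(S·a)(S·b)] is (1/3) a·b, and E[(S·a)²] = 1/3, so the normalized correlation is cos θ: the same angular dependence as the quantum expectation, with no quantum input.]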

2.2 Mizrahi and Moussa
These authors independently extended the analysis of the basic Bell test by means of a simulation of its classical rendition [8]. They proposed an actual mechanical and optical setup to realize the conditions envisioned for the basic, two-wing Bell Inequality test. It consists of a randomly flashing light in a rotating tube, the ends of which are equipped with two polarizer filters oriented such that their axes have a fixed angular displacement. The light pulses from the flashes then exit the tube on both sides and pass through polarizer filters with axes a, b fixed in the laboratory frame, after which the pulse intensity is measured and recorded for each side. The randomness of the flashes with respect to the rotation of the tube ensures that the polarization orientation of any single pulse is random. The fixed displacement of the axes of the polarizers mounted on the ends of the tube ensures that the relationship between the polarization axes of the two pulses is nevertheless fixed. This structure constitutes the essence of the natural phenomena under study. (See Figure 1.)

Figure 1 Schematic of a proposed simulation of the classical variant of the simplest Bell test experiment. The logic of Bell’s inequality derivation would predict that for this setup the inequality would be satisfied. In fact, however, numerical results parallel those from “quantum,” single-photon versions of this setup. Thus, it constitutes a counterexample for the claims made on the basis of Bell’s analysis.

The laboratory setup to acquire the data for computing a Bell Inequality consists of two fixed polarizers with photodetectors, one at each end, with axes a and b. The photodetectors then register the intensity of the macroscopic light pulses (obviously not single photons), which in the simulation is computed according to Malus’ Law.

The simulation results parallel those obtained in actual optical realizations of the basic Bell test. The fact that this manifestly classical arrangement leads to a violation of a Bell Inequality must mean either that classical optics is also irreal or nonlocal, or that the significance of Bell’s analysis is misinterpreted, even invalid.
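[Ed.: A sketch of such a simulation can be written in a few lines. This is not the authors’ code [8]; for simplicity the fixed offset between the tube’s end polarizers is taken to be zero, and the angles below are the standard CHSH settings. The normalized correlation of the Malus-law intensities reproduces the cos 2(a − b) dependence, and the CHSH combination exceeds 2:]

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
lam = rng.uniform(0.0, np.pi, n)      # random polarization of each flash

def E(a, b):
    # Malus-law intensities at the two fixed laboratory polarizers
    IA = np.cos(a - lam) ** 2
    IB = np.cos(b - lam) ** 2
    IA -= IA.mean()                   # zero-mean, normalized correlation
    IB -= IB.mean()
    return np.mean(IA * IB) / np.sqrt(np.mean(IA**2) * np.mean(IB**2))

deg = np.pi / 180
a, ap, b, bp = 0.0, 45 * deg, 22.5 * deg, 67.5 * deg   # CHSH settings
S_chsh = abs(E(a, b) - E(a, bp)) + abs(E(ap, b) + E(ap, bp))
print(S_chsh)  # about 2.83, exceeding the CHSH bound of 2
```

[Ed.: Averaging over the uniform λ gives covariance (1/8) cos 2(a − b) and variance 1/8 for each intensity stream, so E(a, b) = cos 2(a − b) exactly, and the CHSH combination approaches 2√2, from purely classical intensities.]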

3 The underlying defect
3.1 Edwin Jaynes
The models described above contradict the conclusion of Bell’s analysis. The natural question is: how can a seemingly rigorous deduction be challenged? What error, if any, is involved?

Historically, it seems that the otherworldly consequences of Bell’s analysis were in too great a conflict with intuitively logical and otherwise exceptionlessly verified principles to be accepted by everybody, so that, in spite of sociological forces of conformity, some researchers sought non-fantastical explanations. Perhaps the first to do so, or to publish his opinion, was Edwin Jaynes. In the 1980s Jaynes was engaged in an extensive study of Bayesian methods within the whole of Physics, and was likely highly sensitized to the intricacies of probabilistic reasoning. With this competence he quickly spotted the fundamental mistake in Bell’s argumentation and made it an example of the misapplication of probability theory in the preface to the proceedings of a conference held in 1988 on Bayesian theory [9]. Therein, without any elaboration or even a single formula, he simply pointed out that Bell had misapplied the concept of a conditional probability. The missing elaboration was subsequently published by Perdijon in 1991.

3.2 Perdijon
In 1991 the French mining engineer J. Perdijon independently proposed the model described above but applied to the optical version concerning the relationship between different states of polarization [10]. His analysis is based clearly and explicitly on the observation that Bell’s expression for joint expectations, i.e.:

[...]
silently presumes that the detections in the two output channels, i.e., “photon detections,” are statistically independent or uncorrelated—contrary to a fundamental, hypothetical input into the analysis. Perdijon notes that for correlated events this formula should be expressed by

[...]
where ρA(λ,a|b) is the conditional probability that a detection is made at station A given that a detection was already made at station B. Such conditional probabilities do not imply, as mistakenly taken by Bell, that there is a causative interrelationship between the polarizers with settings a and b, but that the input signals differ in their characteristics as instilled at a “common cause” on the intersection of the past light cones of the signals.[3] Thereafter, as the signals pass through detection stations, they are registered or “seen” if their characteristics correspond to the preset parameters of the detection apparatus a, b. This is in accord with the conventional understanding of the application of probability theory to correlated events.

When this consideration for the most elementary optical version of experimental tests of Bell’s analysis is correctly taken into account, the derivation of a Bell Inequality does not go through. [Ed.: See removed mathematics from S.3.3 below at the linked page for conditional probability derivation]. Thus, conclusions drawn from the empirical violation of a Bell Inequality are rendered invalid. These results can be extended straightforwardly to more complex coincidence experiments involving more than two channels [11, 12].
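[Ed.: The chain rule invoked here can be checked numerically with any common-cause model; the one below is a generic illustration, not a model of any specific Bell test. For correlated detections, the joint probability equals P(A)·P(B|A), not P(A)·P(B):]

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# A shared "hidden" parameter lam acts as a common cause of both detections
lam = rng.uniform(0.0, 1.0, n)
A = (rng.uniform(size=n) < lam).astype(int)   # detection at station A
B = (rng.uniform(size=n) < lam).astype(int)   # detection at station B

pA, pB = A.mean(), B.mean()
pAB = np.mean(A * B)
pB_given_A = B[A == 1].mean()

print(pAB, pA * pB)          # joint != product of marginals: correlated
print(pAB, pA * pB_given_A)  # chain rule P(A,B) = P(A)P(B|A) holds
```

[Ed.: Here P(A,B) = E[λ²] = 1/3 while P(A)P(B) = 1/4: factoring the joint probability into unconditioned marginals, as in the first formula above, silently assumes away the very correlation under study.]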

3.3 An explicit demonstration
The foundation of a Bell Inequality is the definition of a coincidence probability (or wave function) for correlated events. The version of this expression used by Bell is the following:

[...]


So, here we arrive at the crux of the matter, insofar as Eq. (7) cannot follow because the term ∫ dλρ(λ)A(a′|λ)B(b|a,λ) does not equal P (a′, b). In fact it is undefined, or nonsense, as it is the product of the absolute probability A(a′|λ) times the conditional probability B(b|a, λ), which is conditioned not on a′ but on a, thereby rendering the product meaningless.

The final, general conclusion is that this Bell inequality is invalid; deductions from it are void.[4]

Exceptionally, of course, when the two detections are uncorrelated, then B(b|a, λ) = B(b|a′, λ), and Bell’s result is valid.

4 Mathematical technicalities
4.1 Quantized and non quantized spaces
There is an intrinsic characteristic related to spin and electromagnetic polarization that is often overlooked but of fundamental significance: phenomena whose mathematical rendition yields an orbital solution manifold with group structure captured by SU(2) are fundamentally nonquantum. This follows inexorably from several viewpoints. One is the fact that SU(2) is homomorphic (2-to-1) to SO(3), the group of rotations in longitude and latitude on a sphere. The non-commutativity of the generators of SO(3) is obviously geometric in nature. It has nothing to do with quantum mechanical structure, because it is not a consequence of Heisenberg Uncertainty. This is true even though factors of ħ appear; that factor merely scales the radius of the sphere upon which the displacements take place. SU(2) is the group of bi-vector transformations of the 2-D planes in 3-D space orthogonal to the vectors (generators) associated with displacements in longitude and latitude. While it is less amenable to visualization than SO(3), it is clear that the non-commutative geometric structure of the planes or bi-vectors, like the great-circle orbits on a spherical surface, is just a matter of geometry. Quantized spaces, where the non-commutativity results from Heisenberg Uncertainty, comprise just two cases, namely phase space (q, p) and quadrature space (phase and amplitude of wave complexes, (ϕ, A)). An obvious consequence of these facts is that experiments conducted on the polarization of electromagnetic signals (a structure first introduced by Stokes 70 years before Quantum Theory was envisioned, and having no relationship whatsoever to Heisenberg Uncertainty) cannot be employed for the exploration of implicit consequences of Quantum Mechanics.
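[Ed.: The 2-to-1 relationship between SU(2) and SO(3) invoked above can be exhibited directly: a full 2π turn returns an SO(3) matrix to the identity but an SU(2) element to minus the identity, a purely geometric fact with no Heisenberg input. A minimal numerical check, with the z-axis chosen for convenience:]

```python
import numpy as np

def U(theta):
    # SU(2) element for rotation by theta about z: exp(-i*theta*sigma_z/2)
    return np.array([[np.exp(-1j * theta / 2), 0],
                     [0, np.exp(1j * theta / 2)]])

def R(theta):
    # Corresponding SO(3) rotation about z
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# After a full turn the rotation is the identity, but the SU(2)
# element is -1: both U and -U map to the same rotation (2-to-1 cover).
assert np.allclose(R(2 * np.pi), np.eye(3))
assert np.allclose(U(2 * np.pi), -np.eye(2))
print("SU(2) double-covers SO(3)")
```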

This understanding of the fundamental character of this (topological) space, i.e., its nonquantum status, is in full accord with all experimental realizations of investigations using Bell Inequalities to plumb Nature as revealed by Quantum Mechanics. In all optical experiments the a’s and b’s are experimenter-chosen polarizer axes, which makes the whole setup sensible only if the λ’s are the polarization states of the photons (or electromagnetic pulses) passing through the measuring stations. This obviates the oft-encountered theoretical discussion in which it is disputed whether the λ’s are correlated with the a’s and b’s. The whole point of measurement is to exploit a correlation between some property that is not accessible to human perception (because it is too small, outside the ambit of human senses, etc.), here λ, and some quantity that is accessible, a meter reading, say. In view of the fact that all the physical processes in the selected venue, i.e., those governed by SU(2), are nonquantum in the first place, thereby rendering moot all implications for the existence of preternatural “quantum” phenomena, the character of all involved variables as prequantum physical entities is determined by the relevant physics.

These mathematical considerations are substantially reinforced by the fact that the instruments and devices employed in optical experiments are in fact capable only of making polarization determinations of electromagnetic pulses, whether such pulses correspond to single photons as imagined or not. The only means to introduce nonlocality or irreality is by hypothesizing that the polarization state of the pulses (photons) is determined by von Neumann’s “Projection,” or collapse of the wave function upon observation, in this case by interaction with the polarizers in the measuring stations. However, there is no inexorable reason to reject the nonquantum account of the relevant phenomena, specifically, prior causes.

4.2 Representative vice ontological states
Beyond the misuse of conditional probabilities that is the focus of this line of critical analysis of Bell’s Theorem, there is an ancillary issue introduced by the singlet state:

[...][Ed.:Mathematics removed d/t formatting--see link; this here is similar to what is mentioned in the thread]:

ψ = (1/√2)( |↑↓⟩ − |↓↑⟩ )
If this set of symbols is understood to represent a single, ontological entity, then, as a composition of mutually exclusive components, it is a logical abomination. Nevertheless, in the literature explicating Quantum Mechanics, it is often represented as pertaining to a single system or entity. This combination of symbols, however, turns out to be coincidentally very convenient. For the calculation of a correlation coefficient as applicable to the experiment in Figure 1, i.e.:

[...]
one sees from this formula that the data streams are to be normalized and have zero mean. In the quantum formalism, both the normalization and the zero mean are built into the definition of the singlet state, so that the calculation of the correlation coefficient conforms to the calculation of an expectation as prescribed by the Born interpretation of wave functions. Thus, in this respect and for the structure governed by SU(2), the quantum formalism merely redresses nonquantum notation.[5]

In any case, the expression for the singlet state (and many other similar “quantum” expressions) cannot irrefutably be associated with single ontological entities. Both theory and experiment pertain to ensembles of similar entities; in the case of particles with spin, for example, the ensemble may be distributed randomly over the surface of a sphere.

5 Conclusions
All of the components of the critical analysis presented above are fundamental principles known to virtually any competent practitioner in optics [Ed. Question: Can superluminal signals be caught with optical instruments?]. Thus, the question arises: just how can what has been called “... la plus grande méprise de l’histoire de la physique” [15] persist over 50 years and become ensconced as professional dogma? The response draws on yet another feature of Quantum Theory of an equally mystical character: the “Projection Hypothesis,” according to which all material entities, before measurement, are at their core completely described by a wave function consisting of a “superposition” of multiple, mutually exclusive states, one of which is held to be precipitated ultimately by the act of observation. Although von Neumann is credited with this idea by virtue of having presented it in his book on the mathematical foundations of Quantum Theory, less rigorous discussions and disputations on the interpretation of Schrödinger’s wave functions involving similar notions can be found in the historical record. In any case, Bell himself in all his presentations clearly considered that the wave functions of the “entangled” daughter particles in two-wing variations of the envisioned tests were to be ‘realized’ (i.e., converted to observable or “real” non-entangled entities) by the act of measurement at the detection stations A and B. Nonlocality (superluminal interaction of some sort) should occur, he took it, in accord with von Neumann’s Projection Hypothesis applied to separated but formerly entangled subsystems, as a consequence of measurement (which implies intervention by sentient beings). “Projection” is considered to entail ‘realizing’ all space-like separated subsystems instantaneously, even when only one is materially engaged; in this regard it violates Einstein’s Principle of Causality, that no effect can have a cause outside its past light cone.

It was the attempt to accommodate the Projection Hypothesis that induced in Bell the erroneous notion that Quantum Mechanics imposes some kind of instantaneous ‘realization’ to an unambiguous state (rather than the superposition of mutually exclusive options) of the spin direction of individual electrons passing through a Stern-Gerlach setup. The fact is, however, that probability theory, in particular the use of conditional probabilities correctly employed, has nothing to say about the origin of correlations. The mathematical structure itself would accommodate instantaneous, nonlocal phenomena, were they to exist, without alteration. The source of the issue is not Probability Theory, but strictly the interpretation of Quantum Theory.

From commentary accompanying early research, one can get the impression that the Projection Hypothesis was introduced in order to accommodate the fact that wave functions, even though interpreted as probability densities, seem also to have physical substance, as they are seen to diffract at physical slits. Strictly abstract expressions of knowledge (i.e., epistemological entities) do not also interact with concrete material (i.e., ontological) entities. Nevertheless, wave functions for single entities cannot be taken as empirically verified, so that imputing individual (vice ensemble) physical identities to them is not fully justified either. In turn, this complexity led to the introduction of yet another “spooky” notion: complementarity. Here again, weirdness is not the objection, but logical contradiction.[6]

Arguably, the tolerance of, as well as the public appetite for, mystical or preternatural “scientific” theorizing is perhaps best explained by the Forman Thesis [17]. According to Forman’s historical analysis, the psychological consequence of the unexpected and sudden loss of WWI in the postwar German Weimar Republic, the center of the development of Quantum Mechanics, was to foster a widespread loss of confidence in rationality and in the sober consideration of life’s experiences. Nowadays, in retrospect, it seems that this thesis has great merit, even if it cannot be taken as the predominant factor. At a minimum, it accounts for the psycho-social environment within which Bell’s generation of physicists (certainly its mentors) was educated, and possibly also predisposed to “open-minded” tolerance of ideas actually deserving deep skepticism.

In conclusion, the analysis presented herein supports the assertion that Bell’s analysis does not support the contention that Quantum Theory cannot, in principle, be extended by means of additional local, real variables. Einstein’s life-long conviction that an interpretation of quantum theory free of the ambiguities he criticized would ultimately be found is seen to deserve respect, and to be taken as guidance for the continued development of our understanding of the material world [18].

[...]

[Ed.: Please see link for references and mathematics]

So... conceptually sound, but holey quantification?

@Eise, you asked me a pointed question, and although I think you put it there as low hanging fruit, I will answer it and say quantum tunneling. To send a signal into the past and try to achieve that beer, I say that the emitter is a symmetric gamma ray burst. Given a vacuum energy and pervasive fields I'll blithely pretend I have an effective aether, and a preferred frame, although I read a post of @MigL's where he tried for the beer and I'm sure he wants to take the aetherwind out of my sails.

@hoola your parallel waves sound like cosine/sine waves -- they'd be out of phase.

 

For illustrative purposes, even though this is a science forum; there's been a 17-page shootout here between a group on a bandwagon and a lone @bangstrom on a horse. @Mitcher tried to help but @swansont killed him by saying he couldn't speculate (and we wouldn't want him speculating in here against our conceptually Holey but well quantified Law of Relativity). I told you Murray Gell-Mann got mad and left the wagon train to go off and was camping somewhere and devising consistent histories around the campfire. A.F. Kracklauer is now up ahead at the pass through the gorge about to blow the dynamite and close the pass on all your arguments regarding even needing to give up locality or realism. 

This other paper in, "Open Physics", is by Louis Sica who's been on this topic a while:

The ultimate loophole in Bell’s theorem: The inequality is identically satisfied by data sets composed of ±1′s assuming merely that they exist

Quote

The plan of this paper is as follows: Bell’s derivation of the inequality will be contrasted with an alternative derivation that demonstrates that it results only from the operation of computing mutual correlations of datasets consisting of ± 1’s [4]. It is a purely algebraic result. When applied to either random or deterministic variables, it restricts the values of their mutual cross correlations [...]
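Sica's algebraic point is easy to verify for oneself: for any three ±1 data sets a, b, c of equal length, a·b − a·c = a·b·(1 − b·c) trial by trial, so |⟨ab⟩ − ⟨ac⟩| ≤ 1 − ⟨bc⟩ holds identically, whatever physics produced the data. A quick sketch (my own check, not Sica's code):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000

# Any three data sets of +/-1's whatsoever -- random here for concreteness
a = rng.choice([-1, 1], size=n)
b = rng.choice([-1, 1], size=n)
c = rng.choice([-1, 1], size=n)

ab, ac, bc = np.mean(a * b), np.mean(a * c), np.mean(b * c)

# Bell's 1964 inequality, satisfied as a pure arithmetic identity:
# per trial, a*b - a*c = a*b*(1 - b*c), and |a*b| = 1
assert abs(ab - ac) <= 1 - bc + 1e-12
print(abs(ab - ac), "<=", 1 - bc)
```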

 

@joigus

The Origin of Complex Amplitudes [PDF DOWNLOAD -- researchgate]

Quote

Abstract. Physics is real. Measurement produces real numbers. Yet quantum mechanics uses complex arithmetic, in which √−1 is necessary but mysteriously relates to nothing else. By applying the same sort of symmetry arguments that Cox [1, 2] used to justify probability calculus, we are now able to explain this puzzle. The dual device/object nature of observation requires us to describe the world in terms of pairs of real numbers about which we never have full knowledge. These pairs combine according to complex arithmetic, using Feynman’s rules. Keywords: Feynman rules, complementarity.

Introduction
Measurement always involves an interaction between observed object and observing device. Presumably, the device returns to us only one of a pair of numbers that quantified the interaction. We do not probe the inaccessible detail of the interaction — plausibly there may be no classical model for it. It suffices to note that we never attain complete knowledge of either the object (which interacted with the imperfectly known device) or the device (which interacted with the imperfectly known object). We can never bootstrap our way to total knowledge [3], and this indicates that our knowledge is doomed to be, at least in part, probabilistic.

[...]

We describe physics operationally through sequences of measurements that are quantified by pairs of real numbers. This is motivated by the dual (device/object) nature of measurement, and formalized through the pair postulate which notes that our knowledge of a pair is always incomplete, hence probabilistic. Our postulate is a formal expression of quantum “complementarity” [11, 12]. Sequences of measurement obey symmetries. Commutativity and associativity of “parallel” require the sum rule.

(a₁, a₂) ⊕ (b₁, b₂) = (a₁ + b₁, a₂ + b₂)

Associativity and distributivity of “series” allows a choice of three product rules.
A pair c becomes observable through the probability p(c) associated with it. Applying
this requirement in the Markovian case of measurements with closure imposes a particular form of p for each product rule. Consideration of reverse sequences completes the
specification, and gives

(a₁, a₂) ⊙ (b₁, b₂) = (a₁b₁ − a₂b₂, a₁b₂ + a₂b₁) with p(c) = c₁² + c₂²

These are the Feynman rules, applicable generally. Pairs are known as quantum amplitudes and behave as complex numbers, adding in parallel and multiplying in series, with modulus-squared giving the observable probability. We now see why quantum mechanics uses complex numbers. Quantification is “really” in terms of real pairs, but these behave like single complex entities.
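The quoted pair rules are easy to check against ordinary complex arithmetic: the "series" rule is exactly complex multiplication of (a₁ + i·a₂) and (b₁ + i·b₂), and p(c) is the modulus squared. A quick sketch with arbitrary sample pairs (my own check, not from the paper):

```python
# "Parallel" sum rule: componentwise addition of pairs
def par(a, b):
    return (a[0] + b[0], a[1] + b[1])

# "Series" product rule from the quoted Feynman rules
def ser(a, b):
    return (a[0] * b[0] - a[1] * b[1], a[0] * b[1] + a[1] * b[0])

# Observable probability of a pair: modulus squared
def p(c):
    return c[0] ** 2 + c[1] ** 2

a, b = (0.6, 0.8), (0.5, -1.2)      # arbitrary sample pairs
za, zb = complex(*a), complex(*b)   # the same pairs as complex numbers

s = ser(a, b)
assert abs(complex(*s) - za * zb) < 1e-12         # pair product == complex product
assert abs(complex(*par(a, b)) - (za + zb)) < 1e-12
assert abs(p(s) - abs(za * zb) ** 2) < 1e-12      # p is the Born rule
print("pair arithmetic matches complex arithmetic")
```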

There is logic in there of the type you use and I think you could critique it. I noticed studiot so maybe he can tell if this is worth a damn, too.

Edited by NTuft
grammar, math critique.
Posted
1 hour ago, bangstrom said:

What do they consider to be "local interactions"? If one electron entangles with any other electron on their common light cone, would that be considered 'local'? One peculiarity of entanglement is that the particles appear to be without any space-like separation between them. Their properties are superimposed and that may be as local as anything can get.

Crudely speaking, a theory is local when you can define quantities point-by-point as densities of something (energy, probability, angular momentum, charge, etc., per unit volume), then define current densities from them: the corresponding quantity (energy, probability, angular momentum, charge, etc., per unit surface, and per unit time) that escapes out of that volume through the boundary. For local theories, every quantity that is conserved, in order to leave the volume, must do so going through the surface. In other words, all these conserved quantities satisfy a local conservation law, which reads,

rate of change of quantity Q within the volume V = -(flux of current density of Q through surface S)

where S is the boundary of V, but otherwise S and V are arbitrary, and Q is any of these conserved quantities.
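A toy discretization makes this bookkeeping concrete (a 1-D sketch with arbitrary constants, not tied to any particular theory): at every step, the change of Q inside an interval equals the net current through its two boundary bonds.

```python
import numpy as np

D = 0.2                      # diffusion constant (stable for D <= 0.5)
q = np.zeros(50)             # density of the conserved quantity per cell
q[25] = 1.0                  # everything starts in one cell
lo, hi = 20, 30              # an arbitrary interval: the "volume" V

for _ in range(200):
    j = -D * np.diff(q)               # current on each bond, positive rightward
    dQ_expected = j[lo - 1] - j[hi - 1]   # net current through V's boundary
    Q_before = q[lo:hi].sum()
    q[1:-1] += j[:-1] - j[1:]         # interior cells: inflow minus outflow
    q[0] -= j[0]                      # end cells have only one bond
    q[-1] += j[-1]
    # Local conservation: Q in V changes only via the boundary currents
    assert abs(q[lo:hi].sum() - Q_before - dQ_expected) < 1e-12

print(q.sum())  # total Q stays 1: nothing appears except through boundaries
```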

All quantum theories are local in this sense. But the tricky thing is that all these densities become probability densities when quantum mechanics operates at the bottom of it.

What people (sometimes, quite loosely, quite sloppily) call "quantum non-locality" is not that. Quantum theories allow you to prepare states that have no definite value of these locally-conserved quantum numbers Q, and then measure their value at the boundary. So you force the quantum state to "decide" (select, einselect) what particular value of the quantity Q it --for lack of a better word-- "encapsulates."

You can do pairs of measurements for space-like separated events (so no light ray can join them) or for time-like separated events (so the time ordering between the events is observer-independent: the later one is in the light cone of the previous one). That really doesn't make that much of a difference. And it doesn't because all the alternatives that were coded in the quantum superposition in some sense disappear.

Do they go to different branches of existence, no longer connected to the branch we see? Gell-Mann/Hartle style

Do they escape away in the form of empty waves, never to be found again? À la Bohm

Do they immediately disappear from existence, as the Copenhagen interpretation tells us?

Nobody knows. Only theorists really worry about this. And cosmologists, both experimentalists and theorists.

None of these possibilities affords any way of exploiting non-local effects, because they have no non-local consequences. I could go into more detail about why that is, but suffice it to say that the same flow of probability density through volumes V and their boundaries S is satisfied, only now with a sharply defined value for Q.

And the state, as you already know, is always random in other quantities (those incompatible with Q.)

1 hour ago, NTuft said:

@joigus

The Origin of Complex Amplitudes [PDF DOWNLOAD -- researchgate]

Quote

Abstract. Physics is real. Measurement produces real numbers. Yet quantum mechanics uses complex arithmetic, in which svg.image?\sqrt{-1} is necessary but mysteriously relates to nothing else. By applying the same sort of symmetry arguments that Cox [1, 2] used to justify probability calculus, we are now able to explain this puzzle. The dual device/object nature of observation requires us to describe the world in terms of pairs of real numbers about which we never have full knowledge. These pairs combine according to complex arithmetic, using Feynman’s rules. Keywords: Feynman rules, complementarity.

Introduction
Measurement always involves an interaction between observed object and observing device. Presumably, the device returns to us only one of a pair of numbers that quantified the interaction. We do not probe the inaccessible detail of the interaction — plausibly there may be no classical model for it. It suffices to note that we never attain complete knowledge of either the object (which interacted with the imperfectly known device) or the device (which interacted with the imperfectly known object). We can never bootstrap our way to total knowledge [3], and this indicates that our knowledge is doomed to be, at least in part, probabilistic.

[...]

We describe physics operationally through sequences of measurements that are quantified by pairs of real numbers. This is motivated by the dual (device/object) nature of measurement, and formalized through the pair postulate which notes that our knowledge of a pair is always incomplete, hence probabilistic. Our postulate is a formal expression of quantum “complementarity” [11, 12]. Sequences of measurement obey symmetries. Commutativity and associativity of “parallel” require the sum rule.

svg.image?\begin{pmatrix}%20a_{1}\\a_{2}\end{pmatrix}\oplus\begin{pmatrix}b_{1}%20\\%20b_{2}\end{pmatrix}%20=%20\begin{pmatrix}a_{1}+b_{1}%20%20\\a_{2}+b_{2}%20\\\end{pmatrix}

Associativity and distributivity of “series” allow a choice of three product rules. A pair c becomes observable through the probability p(c) associated with it. Applying this requirement in the Markovian case of measurements with closure imposes a particular form of p for each product rule. Consideration of reverse sequences completes the specification, and gives

\( \begin{pmatrix} a_1 \\ a_2 \end{pmatrix} \odot \begin{pmatrix} b_1 \\ b_2 \end{pmatrix} = \begin{pmatrix} a_1 b_1 - a_2 b_2 \\ a_1 b_2 + a_2 b_1 \end{pmatrix} \) with \( p(\mathbf{c}) = c_1^2 + c_2^2 \)

These are the Feynman rules, applicable generally. Pairs are known as quantum amplitudes and behave as complex numbers, adding in parallel and multiplying in series, with modulus-squared giving the observable probability. We now see why quantum mechanics uses complex numbers. Quantification is “really” in terms of real pairs, but these behave like single complex entities.
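To make the pair arithmetic concrete, here is a minimal Python sketch (my own, not from the paper) checking that the “parallel” and “series” rules above coincide with complex addition and multiplication, and that p(c) is the modulus squared:

```python
def parallel(a, b):
    """'Parallel' combination: component-wise sum (the sum rule)."""
    return (a[0] + b[0], a[1] + b[1])

def series(a, b):
    """'Series' combination: the product rule, identical to complex multiplication."""
    return (a[0] * b[0] - a[1] * b[1], a[0] * b[1] + a[1] * b[0])

def prob(c):
    """Observable probability: the modulus squared of the pair."""
    return c[0] ** 2 + c[1] ** 2

# Cross-check the pair rules against Python's built-in complex numbers.
a, b = (3, 4), (1, -2)
za, zb = complex(*a), complex(*b)

assert parallel(a, b) == ((za + zb).real, (za + zb).imag)  # (4, 2)
assert series(a, b) == ((za * zb).real, (za * zb).imag)    # (11, -2)
assert prob(a) == abs(za) ** 2                             # 25
```

So pairs behave exactly like single complex amplitudes: adding in parallel, multiplying in series, with the observable probability given by the squared modulus.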


There is logic in there of the type you use and I think you could critique it. I noticed studiot so maybe he can tell if this is worth a damn, too.

Thank you. I'll take a look ASAP.

5 hours ago, MigL said:

Nice '"Ah-Ha"

Thanks!

Posted
2 hours ago, NTuft said:

For illustrative purposes, even though this is a science forum: there's been a 17-page shootout here between a group on a bandwagon and a lone @bangstrom on a horse. @Mitcher tried to help but @swansont killed him by saying he couldn't speculate (and we wouldn't want him speculating in here against our conceptually Holey but well quantified Law of Relativity). I told you Murray Gell-Mann got mad, left the wagon train, and went off camping somewhere, devising consistent histories around the campfire. A.F. Kracklauer is now up ahead at the pass through the gorge, about to blow the dynamite and close the pass on all your arguments about even needing to give up locality or realism.

I too have been trying to introduce a measure of levity to this discussion.

 

9 hours ago, bangstrom said:

The entangled properties are not 'set'; they become random. The conventional explanation is that they are superposed and indeterminate until the first observation is made, and then they become set again until something disturbs them.

 

Close proximity is only necessary for generating entangled particles for experimental purposes, but spontaneous entanglement at a distance among charged particles (especially electrons) is normal and common. Carver Mead is THE authority on entanglement among electrons, and he claims that any electron can spontaneously become entangled with any other electron on its light cone. He explains this in detail in his book "Collective Electrodynamics", Part 5, "Electromagnetic Interaction of Atoms".

Electrons in a molecule are entangled, as are the electrons in an atom's electron cloud, and an electron in one atom can entangle with an electron in a distant atom.

"In a time-symmetric universe, an isolated system does not exist. The electron wave function in an atom is particularly sensitive to coupling with other electrons; it is coupled either to far-away matter in the universe or to other electrons in a resonant cavity or other local structure."- Carver Mead

Around 150 years of exacting spectroscopic measurements and the development of the corresponding quantum theory would say otherwise.

Spectroscopic theory is about the most complete and accurate that we possess.

Posted (edited)
1 hour ago, joigus said:

the later one is in the light cone of the previous one).

Sorry. Omission: the later one is in the future light cone of the previous one. And that is agreed upon by all observers, according to SR.

Edited by joigus
emphasis added
Posted
10 hours ago, bangstrom said:

Then the correlation has become the signal.

Correlation is not an interaction, and the correlation is present at the beginning.

3 hours ago, NTuft said:

and we wouldn't want him speculating in here

This applies to you, too.

Posted (edited)

Joigus post above, reminds us of the many interpretations of QM.
All equally valid, and all equally nonsensical from our classical macro perspective.

Maybe if Bangstrom can manage to clean up the 'loose bits' around a non-local theory, we might have yet another interpretation.

 

6 hours ago, NTuft said:

although I read a post of @MigL's where he tried for the beer and I'm sure he wants to take the aetherwind out of my sails.

I'm not an aether fan-boy; I see it as a useless ( and incredible ) add-on, much like I view non-local 'interactions'.

I do hope to some day meet Eise for a drink.
There is a 4 level department store in downtown Zurich, where the lowest level is liquor/wine/beer, and they have a bar, with drinks/snacks ( canapes  ) right in the middle.
If my aunt is still alive, I'll mail him Aug 1/25.

Edited by MigL
This topic is now closed to further replies.