
Recommended Posts

Posted (edited)

  • Were there any unexpected anomalies found in Bell's inequality test data, aside from those predicted by QM?
  • What was the frequency of Alice and Bob choosing the same test?

    Please don't answer unless you know; e.g., don't just assume it was 1/3.

    I'm not sure 1/3 is the correct expected result, and it may be significantly lower than that, such as 1 time in 4, when opposing testers chose the same measurements.

  • Have tests of analogous classical systems been used as a control, such as for what kind of variance to expect?
  • What about Monte Carlo sims of classical systems?
  • Any unexpected results from the classical controls?
  • What kind of variance was seen across all results? In general, I'm under the impression that variance was much higher than expected.
  • Were there deviations between different experimental results?
  • What reliability factors are involved?
  • Any unexpected anomalies in variance? Were probability experts consulted? Forgive my asking, but based on what I know, it seems like proper testing was not much of a concern, contrary to the rigorous attention normally applied to physics theory and experiments, especially given the logical nature of the problem, which mathematicians don't seem to appreciate much.
Edited by TakenItSeriously
Posted

Forgive my asking, but based on what I know, it seems like proper testing was not much of a concern, contrary to the rigorous attention normally applied to physics theory and experiments, especially given the logical nature of the problem, which mathematicians don't seem to appreciate much.

 


Moderator Note

 

Such insinuations need to be backed up with evidence.

 

As to the rest, they are nonspecific questions. The best approach here is for you to do the legwork and investigate individual experiments, and then ask specific questions about them.

 

Posted

 

 

Forgive my asking, but based on what I know, it seems like proper testing was not much of a concern, contrary to the rigorous attention normally applied to physics theory and experiments, especially given the logical nature of the problem, which mathematicians don't seem to appreciate much.

 

Perhaps you could cite some specific experiments and point out the shortcomings in methodology?

Posted (edited)

Perhaps you could cite some specific experiments and point out the shortcomings in methodology?

I never paid much attention before discovering the logical flaw; mostly I've just read in general about a bunch of "loopholes" or invalid assumptions in the theory and experiments, but nothing specific. See Wikipedia, Bell's inequality, under "Assumptions".

 

The flaw is based on Bob appearing to have 6 results:

test 1: up / down
test 2: up / down
test 3: up / down

But one test is an illusion, because it was based on Alice's action, not Bob's. The effect breaks the decision tree down in a strange binary way.

 

50% Alice
    test n
        50% up
        50% down
50% Bob
    50% test not-n(1)
        50% up
        50% down
    50% test not-n(2)
        50% up
        50% down
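For reference, here is a minimal Monte Carlo sketch (my construction, in Python) of the setting-match rate, under the usual assumption that Alice and Bob each choose one of three settings independently and uniformly; under that assumption the match rate comes out to 1/3. Whether that assumption actually holds in the experiments is exactly the question being raised above.

[code]
import random

# Assumption: Alice and Bob each pick one of three measurement settings,
# independently and uniformly at random, on every trial.
trials = 100_000
matches = sum(random.randrange(3) == random.randrange(3) for _ in range(trials))
print(matches / trials)  # ~0.333: a 1-in-3 match rate under this assumption
[/code]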

Edited by TakenItSeriously
Posted

This is just a repeat of your poker thread, in which you couldn't really tie your objections to actual Bell experiments. Now please - especially as this is the main fora - be rigorous. What are we testing (electron spin, photon polarization), and what is the experimental (not Alice-and-Bob Gedanken) process you believe to be logically flawed?

 

Please do not try to (wrongly) represent Bell and then point out the flaws. You have to show a Bell experiment (no need to drill into messy data yet) and why its (ideal) set-up is problematic.

Posted

The biggest anomaly has to do with detector efficiency. It has been known for decades that classical systems can perfectly duplicate the so-called "quantum correlations" at low detection efficiencies, as when only a small fraction of the particle-pairs flowing into the experimental apparatus are actually detected. Hence the key question, for all experimental Bell-type tests, becomes: "How great must the detector efficiency be, in order to rule out all possible classical causes for the observed correlations?"

 

Depending on the nature of the test, various theoretical claims have been made that detector efficiencies ranging from 2/3 up to about 85% are sufficient to rule out all possible classical causes.

 

However, a classical system with a pair-detection efficiency of 72%, corresponding to a conditional detection efficiency of sqrt(0.72) = 85% (the figure of merit reported in almost all experimental tests), has recently been demonstrated (see the discussion at http://fqxi.org/community/forum/topic/2929), and an argument has been made that strongly suggests that 90%-efficient classical systems ought to be able to perfectly reproduce the observed correlations.
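For what it's worth, the square root comes from assuming the two detector arms fire independently, so the pair-detection efficiency is the product of the two per-arm (conditional) efficiencies; a one-line check:

[code]
import math

pair_efficiency = 0.72
# Assuming independent arms: pair efficiency = (per-arm efficiency)^2,
# so the per-arm "conditional" figure is its square root.
print(math.sqrt(pair_efficiency))  # ~0.849, i.e. the ~85% quoted above
[/code]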

 

Since no experimental tests have ever been conducted at such high efficiencies, the conclusion is that fifty years of experimental attempts to prove that no classical system can produce the observed correlations have failed to prove any such thing. Furthermore, the mechanism exploited for classically reproducing the correlations at high efficiencies suggests that the physics community has a profound misunderstanding of the nature of information, as the term is used within Shannon's Information Theory, and that almost all of the supposed weirdness in the interpretations of quantum theory is the direct result of this misunderstanding.

Posted (edited)

I looked over the discussion on the other forum, as well as the vixra paper it referenced. Although the vixra paper is fairly decently detailed, I wouldn't place it in the conclusive category.

 

A large part of the problem is that entanglement's "spooky action at a distance" isn't really all that mysterious. In part it follows from the conservation laws themselves when the particles are first created in pairs. This establishes the correlation function itself.

 

Assuming you have zero interference, the correlation function will stay intact. Superposition, being a statistical probability function, is natural until you examine the particle state. Naturally you then collapse the wavefunction at this point.

 

As the conservation laws are involved, the state of the other particle is then naturally known. The misnomer is thinking there is action in the first place. The only action that occurs, by the physics definition of action, is on the examination of the first particle.

 

No action is required for the other particle (the correlation function itself provides the state of the other particle).

 

Here is an MIT course note on the quantum correlation function, which unfortunately far too many people do not understand.

 

https://ocw.mit.edu/courses/chemistry/5-74-introductory-quantum-mechanics-ii-spring-2009/lecture-notes/MIT5_74s09_lec05.pdf

Edited by Mordred
Posted

"Superposition being a statistical probability function" Superposition is a mathematical technique that has nothing whatsoever to do with probability or physical reality. It has to do with being able to process one sinusoid at a time, through a differential equation, in order to combine the solutions with the multiple sinusoidal inputs, into a single solution, corresponding to a general, non-sinusoidal input. Mistaking the mathematical model of a superposition, for a physical model of reality, is one of the biggest mistakes ever made by physicists, and the subject of an unending debate.

 

The Born Rule, for treating the square of a wavefunction (Fourier transform) as a probability estimate, arises entirely out of the little-known fact that the computation is mathematically IDENTICAL to the description of a histogram process; and histograms yield probability estimates. It has nothing to do with physics. It is pure math.

Posted (edited)

Math that accurately models reality. Consider a wave and its arrival times at the advanced detectors... after all, you want to measure all the spatial and time components of your arrival times.

 

How else are you to model a field excitation (particle)?

 

Your argument about low-sensitivity detectors: they will naturally give incomplete data. The HUP is fundamental, not just some math. When you consider field excitations with constructive/destructive interference from a field, or multi-field interactions, the HUP also makes sense.

 

A Nobel prize was given out for showing that the HUP can be greatly reduced by weak-field interactions. The pop-media articles described this wrong as well.

It has nothing to do with physics. It is pure math.

Math is the language of physics. You are not doing physics without it.

 

By the way, welcome to the forum, but please note we have a separate section for personal and non-mainstream models, i.e. anything beyond what you learn in textbooks or are taught in schools. We make exceptions, to an extent, for professionally peer-reviewed models. A vixra article doesn't count as peer reviewed.

 

For models that do not fall into the above, we have a Speculations forum to examine and discuss such models, as per the rules under Speculations.

"Superposition being a statistical probability function" Superposition is a mathematical technique that has nothing whatsoever to do with probability.

Under QM, the particle waveform is a probability; hence the statistical nature. Though that's not the only reason.

 

Under physics you literally want to predict all possible outcomes and the probability of each occurring. QM fields excel at this.

 

The other nature you already noted directly relates to the de Broglie wavelength, which partly describes wave-particle duality. The part everyone tends to ignore is the energy required to perform an observable action.

 

This is what you are referring to as the biggest mistake in physics. The article you mentioned in no way counters this, as the paper's argument about low-sensitivity detectors literally filters out the dataset that a high-sensitivity detector would pick up.

 

How is that good physics: ignore data because I don't agree with the results and implications?

Edited by Mordred
Posted (edited)

What makes you assume that there even is another component, capable of being measured?

 

This is a very old, but very seldom discussed, problem with all Bell-type theorems. If there is only one component (corresponding to one bit of information), then there is no other, uncorrelated component to ever be measured. See for example this (https://www.scientificamerican.com/media/pdf/197911_0158.pdf) 1979 article by Bernard d'Espagnat, page 166: "These conclusions require a subtle but important extension of the meaning assigned to a notation such as A+. Whereas previously A+ was merely one possible outcome of a measurement made on a particle, it is converted by this argument into an attribute of the particle itself."

 

In other words, to put it bluntly, simply because it is possible to make multiple measurements does not ENSURE that those measurements are being made on uncorrelated attributes/properties. One has to ASSUME (AKA "converted by this argument"), with no justification whatsoever, that those measurements can be placed into a one-to-one correspondence with multiple, INDEPENDENT attributes/components of a particle (such as multiple, independent spin components), in order for the theorem to be valid. But some entities, like single bits of information, render all such assumptions false - they do not have multiple components. Hence, all measurements made on such an entity will be "strangely correlated", and as the vixra paper demonstrates, the resulting correlations are identical to the supposed quantum correlations. There is nothing speculative about that. It is a demonstrated fact - quod erat demonstrandum.

 

Math that accurately models physical reality does not imply that one's INTERPRETATION of that math is either correct or even relevant to that reality. Both sides of a mathematical identity, such as a(b+c)=ab+ac, yield IDENTICAL mathematical results, to compare to observations. But they are not physically identical - one side has twice as many physical multipliers as the other. The Fourier transforms at the heart of quantum theory, have wildly different physical interpretations, than the one employed within all of the well-known interpretations of quantum theory. All of the weirdness arises from picking the wrong physical interpretation - picking the wrong side of a math identity, as the one and only one to ever be interpreted.

 

All detection phenomena are naturally incomplete; it is simply a question of how incomplete. Quantum theory is unitary precisely because it only mathematically models the output of the detection process, not the inputs to that process. In the case of quantum correlations, only the detections are counted and normalized by that count - the number of detections (not the number of input pairs) is the denominator in the equation for computing the correlation. Think of it this way - if you counted the vast majority of particles that strike a double-slit apparatus and thus NEVER get detected on the other side of the slits, the so-called interference pattern could never sum to a unitary value of 1, corresponding to 100% probability. It sums to a unitary value precisely because the math only models the detections. Assuming otherwise is a bad assumption that has confused physicists for decades.

 

A particle has no waveform. Only the mathematical description of the particle's detectable attributes, is a waveform. But descriptions of things and the things themselves are not the same: math identities do not imply physical identities. Assuming that they do, is the problem.

 

The HUP, is mathematically identical to the statement that the information content of ANY observable entity >= 1 bit of information.

 

But what happens if it is precisely equal to one, and someone tries to make multiple measurements of the thing?

 

Answer: You get quantum correlations.

 

You have missed the point entirely about detection efficiencies. The point is that the calculated, theoretical limits for the supposedly required efficiencies, needed to guarantee that no classical model can reproduce the results, have been falsified. Hence there is something very wrong with the premises upon which those theories are founded.

Edited by Rob McEachern
Posted (edited)

Nailed it. I knew you were arguing the strictly pointlike particle view while disputing the waveform characteristics.

 

I'm sorry, but you really should come up with a better reference than a 1979 Scientific American article. Science has gotten a lot farther since then, LMAO. That aside, either one of us can find tons of recognized peer-reviewed articles on what amounts to corpuscular theory, and you're right:

 

High-precision equipment does favor wave-particle duality over the corpuscular view, for good reason.

 

It is measurable support that the math describes. Physics doesn't define reality; we leave that in the hands of philosophers. We mathematically describe all observable phenomena and the relations among them. As for arguing whether or not that describes reality:

 

It can be measured, therefore it does. It is a measurable set of relations. Hence the duality itself: we measure characteristics that support BOTH wavelike and corpuscular behavior.

 

Physics therefore correctly chose NOT to ignore either.

 

Simply put, a physicist would define reality as: "if it's measurable, it describes an aspect of reality."

 

So let's not ignore data for a moment. Answer me this question.

 

Under spin statistics, which includes the magnetic moment of an electron:

 

Why does it take a 720-degree rotation (spin) for a spin-1/2 particle to return to its original quantum state, while a spin-1 particle needs only 360 degrees, under the corpuscular view?

Edited by Mordred
Posted

"It is measurable support that the math describes." No. The math only describes the observations themselves, not the causal mechanism (AKA interpretation) responsible for producing those observations. In other words, it provides no "support" whatsoever, for any interpretation. That is the origin of the well-known advice to physicists to "Shut-up and calculate."

 

"Physics doesn't define reality, we leave that in the hands of Philosophers." Nonsense. What do you suppose "interpretations" are? All interpretations of theory, are nothing more than metaphysical (AKA philosophical) speculations, regarding the unobserved and hence unknown causes, for the known effects, called observables.

 

There is no wave-particle duality, outside of metaphysical interpretations. There are only observed phenomena, like correlations and supposed interference patterns. But even the classical pattern produced by macroscopic bullets passing through slits is mathematically described as a superposition, and thus an interference pattern; the pattern just does not happen to exhibit any "side-lobes". These patterns have almost nothing to do with physics, either quantum or classical. The pattern is simply the Fourier transform of the slit's geometry. Don't take my word for it - calculate it for yourself. In other words, the pattern is a property of the initial conditions, the slit geometry (the information content within the pattern is entirely contained within the geometry of the slits). It does not matter if the things passing through the slits are particles or waves. Physicists have made the same mistake as people watching a ventriloquist act, who assume the information content being observed is coming from (is a property of) the dummy (the particles and waves striking the slits). It is not, as computing the Fourier transform will demonstrate. Quod erat demonstrandum.

 

The particles and/or waves merely act as carriers, analogous to a radio-frequency carrier, which is spatially modulated with the information content ORIGINATING within the slits, not the entities passing through the slits. Do you assume that the information content of your mother's face, encoded into the light scattering off her face and into your optical detectors (AKA eyes) originated within the sun? Of course not. It was modulated onto the reflected light, regardless of whether or not you believe that the light consists of particles, waves, or a wave-particle duality. The patterns are CAUSED by properties of the scattering geometry, not the properties of the entities being scattered; change the geometry and you change the observed pattern.

 

"It can be measured therefore it does" No one has ever measured an interpretation.

 

"Hypotheses non fingo", was how Issac Newton wisely put it:

"I have not as yet been able to discover the reason for these properties of gravity from phenomena, and I do not feign hypotheses. For whatever is not deduced from the phenomena must be called a hypothesis; and hypotheses, whether metaphysical or physical, or based on occult qualities, or mechanical, have no place in experimental philosophy."

Posted (edited)

Sorry, your arguments are fruitless. Can you explain the Pauli exclusion principle under corpuscular theory: why can an unlimited number of photons occupy the same state, as described by the Bose-Einstein distribution, while only one fermion can occupy a given state, under the Fermi-Dirac distribution?

 

Or even describe the spin statistics of a particle: why are they not symmetric, with different spin numbers requiring different rotations to reach the same quantum state?

 

Arguments that ignore other bodies of evidence about what a particle exhibits aren't science.

 

I have never come across any corpuscular-view model that can address those two questions; they always ignore them.

 

This is the problem with describing reality via philosophical arguments.

 

You pick and choose your arguments but ignore other bodies of evidence.

Edited by Mordred
Posted

"Why does it take a 720 degree angular momentum (Spin) to reach the original quantum state while under a spin 1 particle only 360 degrees under the corpuscular view?"

 

For the same reason a simple, macroscopic, "L"-shaped model with attached strings needs to be rotated 720 degrees to disentangle the strings (return to zero). Have you never seen this model? It has been used to illustrate this very point, in discussions of spin, for many decades. It is a property of the geometry, and has nothing to do with any supposed wave-functions.

Posted (edited)

"Why does it take a 720 degree angular momentum (Spin) to reach the original quantum state while under a spin 1 particle only 360 degrees under the corpuscular view?"

 

For the same reason a simple, macroscopic, "L"-shaped model with attached strings needs to be rotated 720 degrees to disentangle the strings (return to zero). Have you never seen this model? It has been used to illustrate this very point, in discussions of spin, for many decades. It is a property of the geometry, and has nothing to do with any supposed wave-functions.

Post the model then. Professionally peer-reviewed, please.

Mordred, someone (who I think was you, but I may misremember) recently posted a link to the Art Hobson paper:

 

https://arxiv.org/pdf/1204.4616.pdf

 

which I found to be absolutely wonderful. It caused me to pretty much put the particle thing to bed.

Yes, I posted it. I found it excellent: based on hard science, not philosophy.

 

The amusing part is that back in the late '80s and early '90s I strongly supported the corpuscular view. When I started studying particle physics and cosmology, I realized that view simply couldn't address all the questions that QFT could. The funny part was that my reason was literally stupid:

 

I always hated statistical mathematics ROFLMAO

 

The more I study action under the QFT treatment, the more impressed I become. There is literally no interaction that cannot be described under action.

 

For example the coupling constant for each force and the effective distance of each force. Details I posted on a thread I'm still working on.

 

I am developing a list of fundamental formulas in QFT, with a brief description of each, to provide some stepping stones to a generalized understanding of QFT treatments and terminology. I invite others to assist in this project. This is an assist, not a course. (Please describe any new symbols and terms.)

 

QFT can be described as a coupling of SR and QM; it reduces to ordinary QM in the non-relativistic regime.

 

1) Field: a field is a collection of values assigned to geometric coordinates. Those values can be of any nature, and a field does not count as a substance or medium.

2) As we are dealing with QM, we need the simple quantum harmonic oscillator.

3) Particle: a field excitation.

 

Simple Harmonic Oscillator

[latex]\hat{H}=\hbar\omega(\hat{a}^\dagger\hat{a}+\frac{1}{2})[/latex]

Here [latex]\hat{a}^\dagger[/latex] is the creation operator, with [latex]\hat{a}[/latex] being the annihilation (destruction) operator, and [latex]\hat{H}[/latex] is the Hamiltonian operator. The hat accent over each symbol identifies an operator. This formula is of key note, as it is applicable to particle creation and annihilation. [latex]\hbar[/latex] is the reduced Planck constant (also referred to as a quantum of action); more detail later.

 

Heisenberg Uncertainty principle

[latex]\Delta\hat{x}\Delta\hat{p}\ge\frac{\hbar}{2}[/latex]

 

[latex]\hat{x}[/latex] is the position operator, and [latex]\hat{p}[/latex] is the momentum operator. There is also an uncertainty relation between energy and time, given by

 

[latex]\Delta E\Delta t\ge\frac{\hbar}{2}[/latex] Please note that in the non-relativistic regime, time is a parameter, not an operator.

 

Physical observables are operators. In order to be a physical observable, you require a minimum of one quantum of action, defined by

 

[latex]E=\hbar\omega[/latex]

 

Another key detail from QM is the commutation relation

 

[latex][\hat{x},\hat{p}]=\hat{x}\hat{p}-\hat{p}\hat{x}=i\hbar[/latex]

 

Now, in QM we are taught that the symbols [latex]\varphi,\psi[/latex] are wave-functions; however, in QFT we use these symbols to denote fields. Fields can create and destroy particles. As such, we effectively upgrade these fields to the status of operators, which must satisfy the commutation relations

 

[latex][\hat{x},\hat{p}]\rightarrow[\hat{\psi}(x,t),\hat{\pi}(y,t)]=i\hbar\delta(x-y)[/latex]

[latex]\hat{\pi}(y,t)[/latex] is another type of field, one that plays the role of momentum,

 

where x and y are two points in space. The above introduces the notion of causality: if two fields are spacelike separated, they cannot affect one another.

 

Now, with fields promoted to operators, one will wonder what happened to the normal operators of QM. In QM, position [latex]\hat{x}[/latex] is an operator, with time as a parameter. However, in QFT we demote position to a parameter. Momentum remains an operator.

 

In QFT we often use lessons from classical mechanics to deal with fields, in particular the Lagrangian

 

[latex]L=T-V[/latex]

 

The Lagrangian is important, as it leaves symmetries such as rotation invariant (the same for all observers). The classical path taken by a particle is the one that minimizes the action

 

[latex]S=\int Ldt[/latex]

 

The range of a force is dictated by the mass of the gauge boson (force mediator):

[latex]\Delta E=m_0c^2[/latex], along with the uncertainty principle, determines how long the mediator can exist:

[latex]\Delta t=\frac{\hbar}{\Delta E}=\frac{\hbar}{m_0c^2}[/latex] Please note we are using the rest mass (invariant mass), with c being the speed of light.

 

[latex]velocity=\frac{distance}{time}\Rightarrow\Delta{x}=c\Delta t=\frac{c\hbar}{m_0c^2}=\frac{\hbar}{m_0c}[/latex]

 

From this relation one can see that if the invariant mass (rest mass) [latex]m_0=0[/latex], the range of the force is infinite. Prime example: gauge photons for the electromagnetic force.
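As a quick numeric check of this range formula (a standard textbook estimate, added for illustration): taking the pion, with [latex]m_\pi c^2\approx 140[/latex] MeV, as the mediator of the residual nuclear force, and using [latex]\hbar c\approx 197[/latex] MeV fm,

[latex]\Delta x=\frac{\hbar}{m_\pi c}=\frac{\hbar c}{m_\pi c^2}\approx\frac{197\ \text{MeV fm}}{140\ \text{MeV}}\approx 1.4\ \text{fm}[/latex]

which is indeed the observed range of the nuclear force.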

 

Let's return to [latex]L=T-V[/latex], where T is the kinetic energy of the particle moving through a potential V, using just one dimension x. From the Euler-Lagrange equation we get the following:

 

[latex]\frac{d}{dt}\frac{\partial L}{\partial\dot{x}}-\frac{\partial L}{\partial x}=0[/latex] The dot denotes differentiation with respect to time.

I still have to give an example solution for the last equation; I plan on adding to the thread this weekend.
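In the meantime, a minimal worked case (the standard textbook example, assuming the usual kinetic term): with [latex]L=\frac{1}{2}m\dot{x}^2-V(x)[/latex] we have [latex]\frac{\partial L}{\partial\dot{x}}=m\dot{x}[/latex] and [latex]\frac{\partial L}{\partial x}=-\frac{dV}{dx}[/latex], so the Euler-Lagrange equation gives

[latex]m\ddot{x}=-\frac{dV}{dx}=F[/latex]

i.e. Newton's second law, recovered from the action principle.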

 

Edit: @Rob - I would like the detailed mathematics of the model you mentioned above.

 

I am more than familiar with longitudinal vs. transverse wave "heuristic analogies".

 

PS: all your arguments have been based upon metaphysical arguments. You simply base your defense on ignoring a larger body of other evidence. After all, the description of reality is a metaphysical topic unto itself.

 

We can only describe reality, not define it. A fundamental concept to understand in physics is that all possibilities are considered valid until proven otherwise. That being said, physics is literally an art of making predictions.

 

That requires all the statistical mathematics that many rail against.

 

The greatest lesson I ever learned in physics is to never approach any view/interpretation with a closed door. Approach every possibility with an open mind (provided it's a feasible possibility).

 

Making the determination of feasibility is often based on familiarity. Greater study leads to greater insights and understanding (provided it is properly presented).

 

I can accurately state that the above is true to my thinking, even with a masters in the philosophies of cosmology. Yet mine is still a mere opinion.

Edited by Mordred
Posted

Mordred:

There seem to be some gaps in your study/knowledge of QFT. The "L"-shaped model with attached strings that I mentioned is not some abstract mathematical entity in string theory. It is an easily observable, real object: a piece of cardboard cut into the shape of an "L", with three real pieces of string attached to it.

Take the cardboard “L” and place it on a table in front of you. Fasten (tape) one end of each string to each of the L’s three vertices. Fasten (tape) the other end of each string to the table top, leaving some slack in the strings. Now rotate the L 360 degrees about its long axis; the two strings attached to the bottom of the L become twisted together (entangled) in such a manner that they cannot be disentangled without either another rotation or unfastening the strings.

However, if you rotate the L through another 360 degrees, for a total rotation of 720 degrees, it is possible to disentangle the strings without any additional rotation or unfastening of the strings; thus the 720-degree rotation is topologically equivalent to a zero rotation.

The device is a variation of the well-known “Plate Trick” AKA “Dirac’s Belt”
see:
https://en.wikipedia.org/wiki/Plate_trick
http://www.mathpages.com/home/kmath619/kmath619.htm
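The same 720-degree periodicity can be seen numerically in the SU(2) rotation operator for a spin-1/2 state (a minimal sketch, separate from the cardboard model itself):

[code]
import numpy as np

# Spin-1/2 rotation about z: U(theta) = cos(theta/2) I - i sin(theta/2) sigma_z
sigma_z = np.diag([1.0, -1.0]).astype(complex)
I2 = np.eye(2, dtype=complex)

def U(theta):
    return np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * sigma_z

spinor = np.array([1, 0], dtype=complex)
print(U(2 * np.pi) @ spinor)  # 360 degrees: [-1, 0], an overall sign flip
print(U(4 * np.pi) @ spinor)  # 720 degrees: [ 1, 0], back to the original state
[/code]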

In regards to your comment “PS all your arguments have been based upon metaphysical arguments”, the simulation presented in the vixra.org paper is hardly metaphysical in nature. It is a concrete demonstration of an anomalous (AKA false) prediction in the Quantum Theory associated with Bell-type experiments. That theory predicts that no classical model can reproduce the observed “quantum correlations”, while simultaneously achieving detection efficiencies as high as that exhibited in the model. That claim has been falsified: and without even attempting to maximize the model’s efficiency.

PS: I'm surprised that you, being a frequent poster on this site, have introduced so many things that have little or no relevance to the topic of this particular thread. I respectfully suggest you stick to the topic. The model presented is a direct response to the third bullet in the original post: "Have tests of analogous classical systems been used as a control, such as for what kind of variance to expect?" The model was designed to be directly analogous to the polarization measurements performed on entangled photon-pairs. That is not a metaphysical speculation. It is a statement of fact.

Posted (edited)

I am well aware of Dirac's plate. I asked for the mathematical details, under corpuscular theory, describing the rotations for those bands.

 

QFT agrees with wave-particle duality. The burden of proof is on you to show there are no waveforms involved in a particle.

 

Your vixra article is not proof.

Here are the assumptions under the spin-statistics theorem for those bands:

 

The theory has a Lorentz-invariant Lagrangian.

1) The vacuum is Lorentz-invariant.
2) The particle is a localized excitation.
3) Microscopically, it is not attached to a string or domain wall.
4) The particle is propagating, meaning that it has a finite, not infinite, mass.
5) The particle is a real excitation, meaning that states containing this particle have a positive-definite norm.

 

https://en.m.wikipedia.org/wiki/Spin%E2%80%93statistics_theorem

 

Explain particle spin under corpuscular theory without applying assumptions 1 through 3.

Edited by Mordred
  • 2 weeks later...
  • 3 weeks later...
Posted (edited)

Here is another point that ought to be considered in the context of anomalies in Bell's theorem and the related experiments. It is well known (https://en.wikipedia.org/wiki/No-communication_theorem) that measurements of the second member of each pair of entangled particles convey no information. But what on earth does it even mean to make a "measurement" that is entirely DEVOID of any information? If you go back and attempt to understand what Shannon's Information Theory is all about, you will begin to see the fundamental problem with the whole Bell-type argument, which amounts to trying to figure out the meaning of meaningless (devoid of ANY information) measurements. And this gets back to the assumption pointed out above (Post #10), in the article by Bernard d'Espagnat: a measurement, devoid of information, which "is converted by this argument into an attribute of the particle itself." Presumably an attribute which is also devoid of information.

Edited by Rob McEachern
Posted

Here is another point that ought to be considered in the context of anomalies in Bell's theorem and the related experiments. It is well known (https://en.wikipedia.org/wiki/No-communication_theorem) that measurements of the second member of each pair of entangled particles convey no information. But what on earth does it even mean to make a "measurement" that is entirely DEVOID of any information? If you go back and attempt to understand what Shannon's Information Theory is all about, you will begin to see the fundamental problem with the whole Bell-type argument, which amounts to trying to figure out the meaning of meaningless (devoid of ANY information) measurements. And this gets back to the assumption pointed out above (Post #10), in the article by Bernard d'Espagnat: a measurement, devoid of information, which "is converted by this argument into an attribute of the particle itself." Presumably an attribute which is also devoid of information.

I agree that information must be exchanged, though it may be obfuscated. Consider this protocol.

 

We know the following:

When Alice tests spin A, then Bob tests spin A, then if Alice retests spin A, her results are always the same.

 

When Alice tests spin A and Bob tests spin B, then when Alice retests spin A, the result is different 50% of the time.

 

So Alice and Bob have worked out a protocol to communicate, where Alice initializes by testing A each time.

 

Bob replies back by testing either A for 1 or B for 0.

 

Alice retests A, and if it changes (which happens 50% of the time), then she has just successfully received a 0.

 

If it doesn't change after some number of repeated trials, then she may conclude, with high probability, that Bob has been testing A each time and his intent was to send a 1 bit.

 

This process can continue to send any number of bits back and forth.
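For reference, here is a minimal state-vector sketch of this protocol under textbook collapse rules (my construction, in Python with numpy; the singlet state and projective measurements are the standard ones). Under those rules, once Alice has measured, the pair is no longer entangled, so her retest always repeats her first outcome no matter what Bob does; that is the content of the no-communication theorem, and it is where the premise "the result is different 50% of the time" fails.

[code]
import numpy as np

rng = np.random.default_rng(0)

Z = np.array([[1, 0], [0, -1]], dtype=complex)  # "test A" (spin along z)
X = np.array([[0, 1], [1, 0]], dtype=complex)   # "test B" (spin along x)
I2 = np.eye(2, dtype=complex)

def singlet():
    """Entangled pair (|01> - |10>)/sqrt(2)."""
    psi = np.zeros(4, dtype=complex)
    psi[1], psi[2] = 1 / np.sqrt(2), -1 / np.sqrt(2)
    return psi

def measure(psi, op, qubit):
    """Projective measurement of 'op' on one qubit, with textbook collapse."""
    vals, vecs = np.linalg.eigh(op)
    probs, projs = [], []
    for k in range(2):
        p = np.outer(vecs[:, k], vecs[:, k].conj())
        P = np.kron(p, I2) if qubit == 0 else np.kron(I2, p)
        projs.append(P)
        probs.append(float(np.real(psi.conj() @ P @ psi)))
    k = rng.choice(2, p=np.array(probs) / sum(probs))
    psi = projs[k] @ psi
    return vals[k], psi / np.linalg.norm(psi)

trials = 1000
for bob_op, label in ((Z, "A"), (X, "B")):
    changed = 0
    for _ in range(trials):
        psi = singlet()
        a1, psi = measure(psi, Z, 0)      # Alice tests A
        _, psi = measure(psi, bob_op, 1)  # Bob tests A or B
        a2, psi = measure(psi, Z, 0)      # Alice retests A
        changed += int(a1 != a2)
    print(f"Bob tests {label}: Alice's retest changed {changed}/{trials} times")
# Prints 0 changes in both cases: Alice cannot read Bob's bit this way.
[/code]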

Posted (edited)

“We know the following…” No, we do not. We do not even know if there is such a thing as BOTH a spin A and a spin B. You seem to be unclear on what a Bell test is all about, so let me explain.

To start with, consider three types of classical objects, a pair of balls (white and black) a pair of gloves (right and left handed) and a pair of coins. If one of the balls, gloves and coins is given to Alice and the other, from each pair, is given to Bob, then both Alice and Bob will instantly know the color of the other’s ball and the handedness of the other’s glove (assuming they were told beforehand, that the paired balls are always white and black, and that the paired gloves are always right and left handed), regardless of how far apart they are; no “spooky action at a distance.”

But they will not know the other’s coin state (or even their own!), because that state, unlike the color of the balls and the handedness of the gloves, is not an attribute of the object itself. Rather, it is an attribute of the relative, geometric relationship, between the observer and the coin; when they look at the coin from one aspect angle, they observe it as being in the state “heads”, but when they look at it from the opposite angle, they observe the state “tails”.

So before they can even determine the state of their own coin, they have to make a decision about which angle to observe the coin from. Making this decision and observing the result is what is mistakenly called a “collapse of the wave-function”. But there is no wave-function; there is only a decision-making process for determining which state, of several possible states, the object is in, relative to the observer. Unlike the balls and gloves, objects like coins are not in ANY state until an observer “makes it so.” Note also that even after the decision is made, the two-sided coin did not mysteriously “collapse” into a one-sided coin. There is no physical collapse; there is only an interpretational collapse - a decision.

But suppose that the coins were so tiny and delicate, that the mere act of observing one, totally altered its state - it gets “flipped” every time you observe it. Now it becomes impossible to ever repeat any observation of any coin’s relative state. So it is impossible to make a second measurement of such a coin’s original state.

This brings us to the EPR paradox. Since it is now impossible to remeasure any coin, as when attempting to measure a second “component”, EPR suggested, in effect, to create pairs of coins that were “entangled”, such that they are always known, a priori, to be either parallel or anti-parallel. Hence, a measurement of one coin, should not perturb the measurement of the other. The relative orientation of the coins, relative to any observer, is assumed to be completely random. But relative to each other, the coins are either parallel or anti-parallel.

It turns out that for small particles like electrons, it is much easier to create entangled-pairs that are anti-parallel, than parallel, so we will restrict the following discussion to the anti-parallel case.

Now, whenever Alice and Bob measure each coin (one from each anti-parallel, entangled-pair) in a sequence of coins, they obtain a random sequence of “heads” or “tails”, since, regardless of what angles they decide to observe a coin from, all the coins are in different, random orientations.

But what happens if they record both their individually, decided measurement angles and the resulting, observed states of their respective coins, and subsequently get together and compare their results?

As expected, whenever they had both, by chance, decided to observe their entangled-coins from the same direction, they observe that their results are always anti-parallel; if one observed “heads”, then the other observed “tails.” And if they had both, by chance, decided to observe their entangled-coins from exactly opposite directions, they observe that their results are always parallel.

But what happens when they, by chance, happened to observe their coins at other angles? This is where things start to get interesting - and subject to mis-interpretation! Because what happens, is critically dependent upon how accurately Alice and Bob can actually decide upon the observed state of their respective coins.

If both observers can clearly observe their coins, and make no errors in their decisions, even when the coins are perfectly “edge-on”, then you get one result (Figure 1, in this paper: http://vixra.org/pdf/1609.0129v1.pdf ) , when the observers get together and compute the correlations between their observations.

But if the coins are worn down, dirty, and bent out of shape, and can only be observed for a brief instant, far away, and in the dark, through a telescope (AKA with limited bandwidth, duration, and signal-to-noise ratio), then a completely different type of correlation will appear, due to the fact that many of the observers' decisions are erroneous; they mistakenly decided to call the coin a “head” when it was really a “tail”, or vice-versa. Or they may even totally fail to detect the existence of some of the coins (fail to make any decision), as the coins whiz past them, never to be seen again.

And if they attempt to mitigate these “bit-errors”, by attempting to assess the quality of their observations, and eliminating all those measurements of the worst quality, then they will get yet another correlation - one that perfectly matches the so-called “quantum correlations”, when analogous, Bell-type experiments, are performed on subatomic particles, like photons or electrons.
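As a concrete toy illustration of that last step (my construction, not the vixra paper's exact model): a shared random polarization angle, noisy readouts, and a "quality cut" that discards near-edge-on detections. Without the cut the correlation is the familiar straight-line (triangular) one; with the cut, the post-selected correlation steepens toward (and, with an aggressive enough cut, can even overshoot) the quantum cosine. The point is only that the selection reshapes the curve.

[code]
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
lam = rng.uniform(0, np.pi, n)  # shared hidden polarization angle
na = rng.normal(0, 0.4, n)      # readout noise, Alice's arm (assumed level)
nb = rng.normal(0, 0.4, n)      # readout noise, Bob's arm

def correlation(delta, cut):
    sa = np.cos(2 * (0.0 - lam)) + na       # Alice's analyzer at angle 0
    sb = -(np.cos(2 * (delta - lam)) + nb)  # Bob's at delta; anti-correlated source
    keep = (np.abs(sa) > cut) & (np.abs(sb) > cut)  # discard "edge-on" coins
    return np.mean(np.sign(sa[keep]) * np.sign(sb[keep])), keep.mean()

for deg in (0, 22.5, 45, 67.5, 90):
    d = np.radians(deg)
    e0, _ = correlation(d, 0.0)    # no cut: roughly the triangular correlation
    e1, eff = correlation(d, 0.8)  # with cut: steeper curve, lower efficiency
    print(f"{deg:5.1f} deg   no cut {e0:+.2f}   with cut {e1:+.2f}   (pair efficiency {eff:.2f})")
[/code]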

So, should quantum correlations be interpreted as a “spooky action at a distance”, or just as a misunderstood classical phenomenon? Given the (little-known) fact that the limiting case of the Heisenberg Uncertainty Principle can be shown to correspond (just an amazing coincidence!!!???) to a single bit of information being present, which thereby guarantees that every set of multiple observations must be “strangely correlated”, “spooky action at a distance” seems to be an implausible interpretation, at best.

Edited by Rob McEachern
Posted

It is worth pointing out that a classical coin, as described above, is simultaneously BOTH a heads and a tails - that is what a superposition of two states looks like - a coin - until an observer makes a decision and "calls it": either a heads or a tails. Perhaps it should be called Schrödinger's coin.

  • 3 weeks later...
Posted

The figure below depicts the fundamental problem with all Bell tests. Imagine you have received a message, in the form of a sequence of polarized coins, as shown in the top line of the figure below. Your task is to decode this message, by determining the bit-value (either 1 or -1) of each coin (a-g). If you know, a priori, the one-time pad (https://en.wikipedia.org/wiki/One-time_pad) that must be used for the correct decoding (the second line in the figure), then you can correctly decode the message by simply performing a pixel-by-pixel multiplication of each received coin and the corresponding coin in the one-time pad, and summing all those product-pixels to determine if the result is either positive (1) or negative (-1), as in the third line of the figure. (Red = +1, Blue = -1.)

But what happens if the coins are "noisy" and you do not know the one-time pad, so you use randomly phased (randomly rotated polarity) coins instead of the correct one-time pad? You get a bunch of erroneous bit-values, particularly when the received coin's polarity and its "pad" are orthogonal and thus cancel out (sum to zero). But the noise does not cancel out, so in those cases you end up with just random values, due to the noise. The statistics of these randomized bit-errors are what is being mistaken for "quantum correlations" and spooky action at a distance.
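A minimal numerical sketch of that decode step (my construction, just to make the mechanism concrete): each coin is modeled as a ring of +/-1 "pixels" split along a diameter at some polarization angle, plus noise. With the matched pad the bit comes back reliably; with an orthogonal pad the signal cancels exactly and the recovered sign is pure noise.

[code]
import numpy as np

rng = np.random.default_rng(2)
ang = np.linspace(0, 2 * np.pi, 360, endpoint=False)  # one pixel per degree

def coin(theta, bit, noise=0.0):
    """A +/-1 pattern split at polarization angle theta, plus optional noise."""
    return bit * np.sign(np.cos(ang - theta)) + noise * rng.normal(size=ang.size)

def decode(received, pad_theta):
    """Pixel-by-pixel multiply against a noiseless pad coin; take the sign of the sum."""
    return int(np.sign(np.sum(received * coin(pad_theta, +1))))

msg = coin(0.0, +1, noise=2.0)
print(decode(msg, 0.0))        # matched pad: recovers +1 reliably
print(decode(msg, np.pi / 2))  # orthogonal pad: signal sums to zero, sign is noise
[/code]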

 

 

[Attached image: "one time pad coins" - the message coins (top), the one-time pad (middle), and the decoded products (bottom)]

Posted (edited)

To be perfectly honest, I don't much care about the various loopholes of test or experimental rigor. I thought I should bring it up since your line seemed to be more along the rigors of test results, which I didn't go into that closely since I am not a physicist or an experimentalist. So experimental detail is the last thing I need to know, unless a physicist decides to test my hypothesis, in which case I'd be boning up on the rigors of experimental tests.

I think questioning the validity of experimental results is proper and fine if it looks like they made a mistake. My OP wasn't about the validity, though. I was asking if any anomalous results were found.

I can't access the data myself, since you need a student or professional registration for that.

 

Specifically, I was asking if the results showed the two testers tested the same spin 25% of the time instead of 33%.

The intuitive assumption is that they match 33% of the time, but I never pay attention to those assumptions when intuition doesn't apply.

My logic-based conclusion is that they are comparing relative results between the two testers, so there is information added from both testers, and when you look at the data that way, they should be reading the same test orientation 25% of the time.

It's non-intuitive, but so what? It's not in our normal human experience either, so why would we expect the results to be intuitive? Physicists in QM should understand this more than anyone.

But since no one seems to want to answer this question, I do have one other question, about Bell's premise of classical results.

QM predicts a sinusoidal distribution, which the test data agrees with, and I don't have a problem with that.

However, it says that for a classical system the expected results should be a triangle distribution, which doesn't seem correct. Or at least, I don't know of anything in probability that predicts a triangle distribution. I'm not saying it doesn't exist; I've just never heard how it was derived and would like to see a source that explains it.

On the other hand, I have heard of the binomial distribution, which seems like the appropriate distribution for Bell's inequalities, and binomial distributions seem closer to sinusoidal than triangular.
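For what it's worth, the "triangle" is not a probability distribution in the binomial sense; it is the correlation-versus-angle curve produced by the simplest deterministic local hidden-variable model. A minimal sketch (my construction, for photon-polarization-style settings): QM predicts [latex]E(\delta)=-\cos 2\delta[/latex] for the entangled pair, while a shared, uniformly random polarization angle with deterministic +/-1 outcomes gives a straight line from -1 at 0 degrees to +1 at 90 degrees, because the fraction of disagreements grows linearly with the angle between the settings.

[code]
import numpy as np

delta = np.radians([0, 15, 30, 45, 60, 75, 90])
quantum = -np.cos(2 * delta)         # QM prediction for entangled photons
triangle = -1 + (4 / np.pi) * delta  # simplest deterministic local model
for d, q, t in zip(np.degrees(delta), quantum, triangle):
    print(f"{d:5.1f} deg   QM {q:+.3f}   triangle {t:+.3f}")
[/code]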

 

 

 

Edited by TakenItSeriously
