Duda Jarek Posted December 21, 2008 (edited)

Some physicists believe in the possibility of instant time travel. Let's assume hypothetically something much simpler and more plausible: that the physics of the four-dimensional spacetime we live in allows for microscopic loops which include the time dimension. If such loops lasted at least microseconds and we could amplify/measure them (Heisenberg's uncertainty principle...), we could send some information back in time.

Observe that a computer based on such a loop could instantly find a fixed point of a given function. Take for example some NP problem: we can quickly check whether a given input is correct, but there is a huge (though finite) number of possible inputs. Such a computer could work as follows:
- take the input from the base of the loop,
- if it is correct, send the same input back in time to the base of the loop; if not, send the next possible input (cyclically).

If a correct input exists, it is the fixed point of this time loop; if not, the loop should return some garbage. So we would only need to verify the output once more at the end (outside the loop).

Could such a scenario be possible? General relativity says that local time arrows are given by solutions of some equations under boundary conditions (the Big Bang). CPT symmetry suggests that there shouldn't be a large difference between past and future. These arguments support the so-called eternalism/block-universe philosophical concept: that spacetime is already somehow created and we are 'only' traveling through its time dimension. I've recently made some calculations which give a new argument that this assumption actually produces quantum mechanics: pure mathematics (maximizing uncertainty) gives a statistical property, the Boltzmann distribution, so it should be a completely universal statistics. If we use it to find the distribution on a constant-time plane, we get the stationary probability distribution rho(x) ~ exp(-V(x)).
If we use it to build statistics among paths ending at this moment, we get rho(x) ~ psi(x) (the quantum ground state). If we use it among paths that do not end at this moment but continue into the future, we get rho(x) ~ psi^2(x), as in quantum mechanics. So the only way to get QM-like statistical behavior is to treat particles as their paths in four-dimensional spacetime. Spacetime then looks like four-dimensional jello: 'tension' from both past and future influences the present.
http://www.scienceforums.net/forum/showthread.php?t=36034

This suggests that particles should, for example, somehow prepare before being hit by a photon. The question is whether this can be measured (uncertainty principle). If yes, are these times long enough to be useful? Observe that if the answer is yes, such a computer could e.g. break RSA in a moment. To make cryptosystems resistant to such attacks, they should require a long initialization (for example, based on Asymmetric Numeral Systems).

-----------------------------------

I thought about whether we could reduce the required number of bits transferred back in time, and it looks like one bit (B) should be enough (though this algorithm intuitively looks less stable):
- if B, then 'input' -> next possible 'input' (cyclically);
- if 'input' verifies the problem,
-- then transfer back in time B = false,
-- else transfer back in time B = true.

If it can, the loop should stabilize on B = false and some solution. Such an algorithm uses the bit B from some physical process which can predict (microseconds ahead), for example, whether a photon will be absorbed, and at the end emits this photon or not. If physics can stabilize this causality loop, it should do so. If not, the loop would be stabilized by breaking its weakest link: making the prediction give the wrong answer.
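The one-bit loop above can be simulated classically as ordinary iteration. A minimal sketch, assuming an illustrative verifier and candidate set of my own choosing (nothing here actually travels backward in time; the classical simulation is just sequential search):

```python
import random

# Classical simulation of the one-bit loop: only the flag B would
# (hypothetically) be sent back in time; 'input' is an ordinary register.
CANDIDATES = list(range(16))      # all possible 'input' values

def verify(x):
    """Illustrative stand-in for an NP-problem verifier."""
    return x == 13

# Initialize 'input' from a good random source, as the post suggests.
state = random.choice(CANDIDATES)
B = True                          # B = true means "advance the input"

for _ in range(len(CANDIDATES) + 1):
    if B:
        state = (state + 1) % len(CANDIDATES)
    B = not verify(state)         # B = false is "sent back" iff verified
    if not B:
        break

print(state, B)                   # stabilizes on B == False and state == 13
```

Because a solution exists, the simulation always settles on B = false; in the conjectured physical loop this terminal state would be the self-consistent one.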
I believe a discussion has just started here:
http://groups.google.com/group/sci.physics/browse_thread/thread/c5f055c9fc1f0efb

-------------------------------------------------

To summarize: the verifier is a completely classical computer, but when it is coupled with some effect which can transfer data a few nanoseconds back in time, physics should make this pair form a stable causality loop. That can only happen if, along the way, it solves the given NP problem (or e.g. finds a key such that the decrypted message shows significant correlations). If a dedicated chip were built for a given problem instance, computing layer by layer (without a clock), it should perform the verification in nanoseconds; such short jumps are easier to imagine.

This suggests a nice thought experiment: make such a loop, but much simpler, purely spatial (a tube in four dimensions). Take a chip with a verifier for some problem and the algorithm from the first post, and instead of sending 'input' back in time, just connect the output to the 'input'. Such a loop should quickly check input after input and finally settle into a stable loop if it can... Can it really? This scenario requires a clock, doesn't it? What if there were no clock? Shouldn't it then find the solution practically instantly?

Edited December 21, 2008 by Duda Jarek multiple post merged
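Classically, the spatial version of the loop (verifier output wired back to its input) is just repeated iteration of the loop-body map from the first post, and a solution is exactly a fixed point of that map. A minimal sketch, assuming an illustrative subset-sum verifier; the problem instance and all names are my own choices, not from the post:

```python
# Subset-sum as an illustrative NP problem: find a subset of NUMBERS
# summing to TARGET. Inputs are bit tuples selecting a subset.
NUMBERS = [3, 7, 12, 5]
TARGET = 15

def verify(bits):
    """Classical polynomial-time check of one candidate input."""
    return sum(n for n, b in zip(NUMBERS, bits) if b) == TARGET

def next_input(bits):
    """Cyclically step to the next candidate (binary counter)."""
    value = (int("".join(map(str, bits)), 2) + 1) % (2 ** len(bits))
    return tuple(int(c) for c in format(value, "0%db" % len(bits)))

def loop_body(bits):
    """One pass of the loop: correct inputs map to themselves."""
    return bits if verify(bits) else next_input(bits)

# Classically we must iterate step by step; the conjectured time loop
# (or the clockless spatial loop) would settle on the fixed point at once.
state = (0, 0, 0, 0)
for _ in range(2 ** len(NUMBERS)):
    state = loop_body(state)
    if verify(state):
        break

print(state)  # (1, 0, 1, 0): the subset {3, 12} sums to 15
```

The simulation makes the cost contrast explicit: iterating loop_body takes up to 2^N passes, which is exactly the exponential search the hypothesized loop stabilization would bypass.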
bascule Posted December 22, 2008

If they would have at least microseconds and we could amplify/measure them (Heisenberg uncertainty principle...), we could send some information back in time.

This would violate causality and as such cannot figure into a computational model, as all computational models (particularly the von Neumann model to which your conjecture appears to subscribe) are inherently causality-dependent.
Duda Jarek Posted December 22, 2008 Author

No. As I've already written: if physics can stabilize this causality loop, it should do so. If not, the loop should be stabilized by breaking its weakest link, by making the prediction give the wrong answer.

Why this link? Because this link would require an extremely precise measurement of some process which is already not preferred energetically. Creating causality paradoxes should be preferred even less, so it should be easier for physics, for example, to shorten the time of this reverse temporal propagation, especially since the rest of the causality loop is just classical computation.

I think that such a spatial, purely classical loop should already have a (smaller but still) strong tendency to stabilize. Without a clock it would be pure hydrodynamics of electrons:
http://www.scienceforums.net/forum/showthread.php?t=37155
bascule Posted December 23, 2008

If physics could stabilize this causality loop, it should be done. If not - it should be stabilized by breaking it's weakest link - by making that the prediction would gave wrong answer.

Okay, well what machine model are you trying to implement? A von Neumann architecture? Because that's inherently sequential...
Duda Jarek Posted December 24, 2008 Author (edited)

I don't see a problem with any particular architecture. The machine would make the calculation only once, but physics would ensure that it is the correct one. For practical reasons (prediction) it should be extremely fast, so it doesn't even have to use a clock. For many problems, like universal 3SAT, we could build the verifier from some number of layers (O(log N)) of basic logic gates.

--------------------------------------

Physicists usually believe in CPT conservation, which means that on a small scale past and future should be symmetric. So if we produce a high-energy scattering in some accelerator, some amount of particles should also be created which travel into the past and hit e.g. some detector. From the perspective of our time perception, such a particle was created in this detector and flies straight into the scattering point - so we should be able to detect the scattering BEFORE it actually happens. Using an accelerator as a 'part' of a time-loop computer, we should then be able to close the causality loop.
http://www.scienceforums.net/forum/showthread.php?p=455246

Edited December 24, 2008 by Duda Jarek multiple post merged
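A layered 3SAT verifier of the kind mentioned above can be sketched as one constant-depth layer of 3-input OR gates (one per clause) followed by a balanced AND tree of logarithmic depth. This is a sketch with a hypothetical 3-variable formula of my own, not an instance from the post:

```python
# A tiny 3-SAT instance: literals are 1-indexed ints, negative = negated.
# Formula: (x1 v !x2 v x3) & (!x1 v x2 v x2) & (x2 v !x3 v !x1)
CLAUSES = [(1, -2, 3), (-1, 2, 2), (2, -3, -1)]

def literal(assignment, lit):
    """Evaluate one literal under a list-of-bools assignment."""
    v = assignment[abs(lit) - 1]
    return v if lit > 0 else not v

def clause_layer(assignment):
    """First gate layer: one 3-input OR per clause (constant depth)."""
    return [any(literal(assignment, l) for l in c) for c in CLAUSES]

def and_tree(bits):
    """Balanced AND tree: O(log #clauses) layers of 2-input AND gates."""
    while len(bits) > 1:
        paired = [a and b for a, b in zip(bits[0::2], bits[1::2])]
        bits = paired + (bits[-1:] if len(bits) % 2 else [])
    return bits[0]

def verify(assignment):
    return and_tree(clause_layer(assignment))

print(verify([True, True, True]))   # True: the formula is satisfied
print(verify([True, False, True]))  # False: the second clause fails
```

Since every gate's output depends only on the layer below, such a circuit needs no clock, which is the property the post relies on for nanosecond-scale verification.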
Mr Skeptic Posted December 25, 2008

That would be possible if there were time travel and the "many worlds" interpretation of quantum mechanics were correct. Otherwise I think it would be a paradox.
Duda Jarek Posted December 25, 2008 Author

In the many-worlds interpretation such a causality loop doesn't even have to close. We have plenty of mystical QM interpretations, and all of them want to look at reality from the perspective of a constant-time plane - maybe that is the problem! GR and field theories strongly suggest that we have to think about physics four-dimensionally: we live in some already created spacetime, and the time arrow we perceive is only some local solution (eternalism). As I've shown, with this view there shouldn't be a problem with QM interpretation. Even better - QM should naturally occur there!

Making this assumption - eternalism, that the 4D spacetime is already there and is stabilized - the causality loops/questions we ask are already stabilized/answered. If physics couldn't close the loop, it can easily break the weakest link: the prediction. For example, by choosing the statistics of the scattering so that we wouldn't spot these 'virtual' particles.

-------------------------

If the idea really works, transferring many bits would require much more resources than one. Especially since in practical problems the first algorithm would require hundreds or thousands of them. So I thought about the second algorithm; let's recall it (we transfer one bit, B):
- if B, then 'input' -> next possible 'input' (cyclically);
- if 'input' verifies the problem,
-- then transfer back in time B = false,
-- else transfer back in time B = true.

If it can, the loop should stabilize on B = false and some solution. The problem is that 'input' has to be initialized somehow beforehand. The solution could be a really good random number generator, for example measuring a photon at 45 degrees to its polarization (before the loop). The causality loop should then cause this generator to have already chosen a good solution - causing something even in its own past.
But because of this, the algorithm looks less stable - physics could make the prediction harder, for example by making the scattering statistics worse.
Duda Jarek Posted December 26, 2008 Author

So please explain why you think this cannot be a realistic scenario. We have our subjective perception of the direction of time, but it is only the result of boundary conditions with relatively small entropy (the Big Bang). What is less subjective is causality: cause-effect chains. And CPT conservation, which allows switching past and future, suggests that such relations should be possible in both time directions.