fredrik Posted September 26, 2007

I am curious whether anyone here could, under some circumstances, imagine a sensible fundamentally non-unitary model of physical reality?

Arguing about conservation of probability may seem foolish, because it follows from the axioms of probability theory. That is not the level at which the question is intended. The key question is: in reality, as opposed to pure mathematics, can we really properly attach the axioms of probability theory to a fundamental theory? That we can do so with great success in many effective theories we all already know, so that isn't the question either.

We can calculate a probability, then collect all the data from the lifetime of the universe and get the relative frequency, which we take to represent the probability. Is this completely satisfactory, or just almost satisfactory? And does the distinction make a difference?

I don't mean to make this a lengthy discussion of the topic; I am mainly curious to hear comments, reflections and opinions from everyone who has reflected on this once or twice.

/Fredrik
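As a minimal sketch of the relative-frequency point above: any finite run of trials (and even "all the data from the lifetime of the universe" is a finite run) only approximates the probability it is taken to represent. The event probability p and the sample sizes below are arbitrary assumptions chosen for illustration, not anything from the thread:

```python
# Toy sketch: a finite relative frequency only approximates the probability
# it represents. The value p = 0.3 and the sample sizes are arbitrary choices.
import random

random.seed(0)
p = 0.3  # "true" probability assumed by the model
for n in (10, 1_000, 100_000):
    hits = sum(random.random() < p for _ in range(n))
    freq = hits / n
    print(f"n = {n:>6}: relative frequency = {freq:.4f}, deviation = {abs(freq - p):.4f}")
```

The deviation shrinks with n but never identically vanishes for any finite sample, which is one way of phrasing the "completely satisfactory or just almost satisfactory" question.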
fermion Posted September 26, 2007

I don't think there is any mathematical argument for unitarity (i.e. for conservation of probability). We insist on unitarity based on experimental observation. Among other things, for example, the fundamental particles do not occasionally disappear into nothing, or appear from nowhere. All conservation laws (both continuous and discrete) have some connection to unitarity, and in the absence of unitarity they all get modified in strange ways. While we don't know for sure that we live in a unitary universe, no observation has been reported to the contrary.

So, I think you can build non-unitary theories :-) , but you will have a very difficult time keeping the predicted rates of the unwanted (more precisely, yet unobserved) processes under control :-( . You may be able to tune some parameters to suppress some of them, but probably not all.
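To make "conservation of probability" concrete, here is a small numerical sketch (an illustration of the standard fact, not fermion's own construction): a state evolved by a unitary matrix keeps its norm exactly, while even a percent-level non-unitary deformation makes the total probability drift at every step. The rotation angle and the 1% deviation are arbitrary assumptions:

```python
# Toy sketch: unitary evolution preserves total probability; a slightly
# non-unitary map does not. The matrix and the 1% factor are arbitrary.
import numpy as np

theta = 0.4
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # unitary: U.T @ U = identity
V = 1.01 * U                                     # non-unitary: V.T @ V = 1.0201 * identity

psi_u = np.array([1.0, 0.0])
psi_v = np.array([1.0, 0.0])
for _ in range(50):
    psi_u = U @ psi_u   # norm stays exactly 1
    psi_v = V @ psi_v   # norm grows by 1% per step

print("norm after 50 unitary steps:    ", np.linalg.norm(psi_u))  # 1.0
print("norm after 50 non-unitary steps:", np.linalg.norm(psi_v))  # ~1.01**50, about 1.64
```

This is fermion's point in miniature: once unitarity is relaxed even slightly, "unwanted" amplitude accumulates unless something tunes it away.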
Severian Posted September 26, 2007

You could very easily have a non-unitary theory by placing current theories in a box. When the wavefunction of a particle leaves the box, you have a unitarity violation, because the probability of finding it somewhere in the box (where the theory is defined) is no longer one. That may sound trivial, but in essence this is what a discovery of non-unitarity would tell you: that you have forgotten to include something (like an extra dimension). Or at least, that is how it would be interpreted...
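Severian's box can be put in numbers with a free Gaussian wavepacket: it spreads with time, so the probability of finding the particle inside a fixed box [-L, L] falls below one, and restricted to the box alone the evolution looks non-unitary even though the "missing" probability is simply outside. This is a back-of-the-envelope sketch of that picture; the box size, initial width and spreading time scale are arbitrary assumptions:

```python
# Toy sketch: probability inside a fixed box for a freely spreading Gaussian
# wavepacket. For a Gaussian of width sigma centred at 0, the probability of
# |x| <= L is erf(L / (sigma * sqrt(2))). All parameter values are arbitrary.
from math import erf, sqrt

L = 1.0        # half-width of the box
sigma0 = 0.2   # initial wavepacket width
tau = 1.0      # spreading time scale (2 m sigma0**2 / hbar in physical units)

for t in (0.0, 1.0, 3.0, 10.0):
    sigma = sigma0 * sqrt(1.0 + (t / tau) ** 2)   # free-particle spreading
    p_inside = erf(L / (sigma * sqrt(2.0)))       # integral of |psi|^2 over the box
    print(f"t = {t:5.1f}: P(inside box) = {p_inside:.4f}")
```

At t = 0 the packet fits comfortably in the box and P is essentially 1; by t = 10 most of the probability has leaked out, exactly the kind of "violation" that signals the theory's domain was too small.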
ajb Posted September 26, 2007

You might violate unitarity on space-times that are not globally hyperbolic. Such space-times can have CTCs (closed timelike curves), and these must make things more complicated. So, setting aside non-globally-hyperbolic space-times, I would agree with Severian: either we must throw away the theory or try to "cure" it somehow. One possible way might be to include more fields, as is done in Yang-Mills theory, viz. FP ghosts and anti-fields.
fredrik Posted September 27, 2007 (Author)

Thanks for the comments! In a sense I'm with you all; I think what you say is pretty much the standard methodology: unitarity is a sort of consistency check, and if new evidence arrives, the "scientist" needs to replace or rework his theory to recover unitarity. I will not argue with this.

...but here we are stepping over the non-trivial step: the response that recovers unitarity. This response itself does not seem to respect unitarity in the general case. My point was that in the quest for a fundamental model, one might expect a consistency of reasoning beyond what is demanded from an effective theory. Shouldn't one such requirement be a uniform treatment of interactions? Why should the scientist's response to observations be fundamentally different from a particle's response to a field, or to a collision with another particle?

Indeed, one solution is to inflate our models: more dimensions and fields, basically increasing the information capacity of the model. But if those additional degrees of freedom are introduced on a generic basis and taken to be objective, then I see another issue with that: the model is far more complex than called for in most cases, because it is designed to give unitarity even in the worst case. We are conjuring hidden / not-yet-seen structures to explain what we might possibly see. Isn't this severely disturbing?

Sure, unitarity makes things easier in a certain sense: it provides us with a solid playground. In non-unitary models the playground itself is not solid. But that's exactly what reality is, isn't it?

Quoting fermion: "We insist on unitarity based on experimental observation."

Is this *really* what we see, or is it what we think we see? Anyway, it is not what I see. I see a bunch of scientists who do unitary theories, and every time they are wrong they respond by reworking the theory to make it unitary again, and they pretend this "reworking" isn't part of reality. Why would the life of a scientist be _fundamentally_ different from that of any physical system? The one obvious difference is an enormous difference in scale of complexity, but if we are looking for a fundamental understanding of reality, I cannot accept this special treatment.

/Fredrik

What I'm after is a fundamental understanding of physics in terms of a dynamic inflation that is observer invariant. The, let's say, "differential" deviation from unitarity is what inflates and tweaks the observer's state. That we cannot just keep inflating things ad absurdum seems clear, because a limited observer can hardly (information-wise) relate to it. And then the question is: what happened to the idea that science is about what the observer can measure? I see a consistency issue in the methodology here. Comments?

/Fredrik

Quoting fermion: "...you will have a very difficult time keeping the predicted rates of the unwanted (more precisely, yet unobserved) processes under control :-( . You may be able to tune some parameters to suppress some of them, but probably not all."

Yes, since non-unitary modelling is a fundamental relaxation, it gets more sensitive. But in my thinking there may be fundamental principles that yield stability: self-organisation. Unstable structures simply don't survive; they exist transiently but they don't persist. In my thinking, any time an observer is close to equilibrium (in a wide sense) with his environment, he will see unitarity, consistent with all experience.
But one can imagine cases where we are so far from equilibrium that unitarity is a poor approximation. This is where I associate equilibration with learning, and it provides (in my thinking) the analogy between the scientist and any other system. The knowledge a particle has attained needs to be stored in correlation with its environment. This equilibration is, I think, also related to the arrow of time, and explains why the arrow of time is hard to see at the micro level but becomes evident as we beef up the complexity. To me it's an indication of something missing.

/Fredrik
foodchain Posted September 27, 2007

Quoting fredrik: "I am curious whether anyone here could, under some circumstances, imagine a sensible fundamentally non-unitary model of physical reality? [...]"

Perhaps I don't understand the question enough, but are you by chance looking more for a dynamic selection over a constant? Such as, maybe, how we would observe, or what would exist, if most of the visible universe were made of helium or carbon, or maybe some other exotic material? I think I understand the question to the point that you may just be referencing the mathematical tools or models used for data, and if that's the case, well, I don't think I know enough about that, really. I have always wondered what really happens when all the natural phenomena of the universe are fed into math operators and the like, but it's just wonder on my behalf, and I doubt I could word it well enough for debate.
fredrik Posted September 27, 2007 (Author)

Quoting foodchain: "Perhaps I don't understand the question enough, but are you by chance looking more for a dynamic selection over a constant?"

The question is remotely related to many things, but the main point was to hear what people think of the application of ordinary probability theory to describing reality, with a focus on natural and physical phenomena.

Quoting foodchain: "I have always wondered what really happens when all the natural phenomena of the universe are fed into math operators and the like."

One can probably reflect on this in many different ways. One way of seeing it: if we want to make quantitative predictions in a structured way, so that we can keep some book-keeping on our progress, we seem naturally led to develop symbolics and the concepts of number and mathematics. This tends to provide us with a consistent language, and sometimes, considering the possible mathematical formulations can hint at how reality might behave consistently with the chosen framework. I see mathematics as something that has been developed by humans, and humans are part of nature. It seems to guide many physicists; for example, extending the intuitive ideas of geometry to higher dimensions and to non-Euclidean geometries has attracted many people.

Assuming you think that reality must be described by such a framework, you can study the formalisms themselves and find that only certain constructions are internally consistent, meaning consistent with the framework itself. The frameworks are built from axioms, which themselves can neither be proven nor disproven; they need only be consistent with the rest of the axioms. One can construct many different frameworks from different axioms, and the question is how to choose the more efficient one. Also, nothing stops us from constructing new axioms.

Another framework is the probabilistic one. My question was whether the probabilistic framework, originating from the axioms of probability, really provides a sound basis for a fundamental theory as is. This is almost a philosophical question, but then even science has its roots in philosophy. My focus is not just to find the best theory, but also to describe that very process better, and to consider theories as dynamical objects too.

/Fredrik
foodchain Posted September 27, 2007

Quoting fredrik: "My question was whether the probabilistic framework, originating from the axioms of probability, really provides a sound basis for a fundamental theory as is. [...]"

I have to agree. Regardless of opinion, a certain pattern of thought or worldview has to surround the subject, which then of course leads to philosophical underpinnings. An example I can think of is biology. For instance, it's easy for many to quickly view life from the aspect of DNA. What about a view of billions of cells, differentiated even, working in concert? Of course this did not come about overnight, but evolved. So regardless of any current thought on the subject, the one thing that instantly becomes apparent to me is that evolution naturally occurred, or was allowed to occur, or some other wording of it. To take the concept of evolution from the biological realm and look for other applications of it, such as in regard to other natural phenomena, is something of an interest to me.

Now, physics is supposed to be the same throughout the universe, right? Well, given QM, that suggests probability to me, but a probability that has bounds, or else physics would become variable relative to a given environment, I would think. It would seem, though, that all the matter or energy in the universe has to some extent formalized to express what allows physics, as understood, to be universal throughout the visible or known universe. The problem, as I see it, is how far into the past we can go with that. Take conservation laws and QM: do they correctly explain each other, or need to?
Much like the drive to connect QM with relativity, which to this date seems to elude any empirical answer. I just wonder how the current spectrum of physical laws came to dominate in the universe, and whether it was dynamically selected at a certain point from the reality of QM. Then of course many other questions exist, such as the dark stuff, dark matter and dark energy, which for some reason seem to exist but escape current means of interacting with them. The universe overall does not seem to be static, and one of the prime reasons I say this, again, comes from the biological realm: things are dynamic. From the laws of chemistry and physics, life has evolved and continues to do so; could that give us other ways of viewing why we have hydrogen, or galaxies for that matter, or even black holes? That would imply, at first, some degree of philosophy toward making even a hypothesis. I think a neat place to look would be time as it relates to the QM level, even in regard to the simple concept of multiple universes. But all of this is so large, and currently the big means of testing seems to be atom smashers; yet all of this is ultimately conducted in the presence of, or in an overall environment of, that same universe, not in some void, which I think has an impact.