
Posted
dude law of conservation of mass is simple chemistry. i could go and quote it to you out of my chemistry book if you would like.

 

Dude, it's an approximation in chemistry, because the mass difference from the few eV of a chemical reaction isn't large enough to worry about and probably won't cause a problem. But it's still an approximation.


Posted
Dude, it's an approximation in chemistry, because the mass difference from the few eV of a chemical reaction isn't large enough to worry about and probably won't cause a problem. But it's still an approximation.

 

can u give me a link or some such proof for this?

Posted
can u give me a link or some such proof for this?

 

It's relativity

 

E = mc²

 

Perhaps you've seen this equation before; mass is one form of energy, and it's total energy that is conserved.

 

c² is about 931.5 × 10⁶ eV/amu

 

So for each eV in a reaction, your mass difference is about 10⁻⁹ amu, and the reactant molecules typically have tens or hundreds of amu of mass. Nobody worries about a part in 10¹⁰ (or smaller) if your calculations don't go to that many digits. Hence, mass being conserved is a reasonable approximation in chemistry, and fluid flow, and many parts of science.

 

But in a nuclear reaction, the binding energies are a million times or more larger, and a reaction liberating MeV of energy results in a reasonable fraction of an amu difference in mass. It is no longer something you can validly ignore.
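The comparison above can be sketched numerically. This is a rough, order-of-magnitude illustration; the 5 eV chemical and 5 MeV nuclear energy releases, and the 100 amu reactant mass, are assumed example values, not figures from the post.

```python
# Rough comparison of the fractional mass change in a chemical vs a
# nuclear reaction, using c^2 ~ 931.5e6 eV/amu as quoted above.
# The 5 eV, 5 MeV, and 100 amu figures are assumed example values.
C2_EV_PER_AMU = 931.5e6

def fractional_mass_change(energy_eV, reactant_mass_amu):
    """Mass defect (in amu) divided by the reactant mass."""
    return (energy_eV / C2_EV_PER_AMU) / reactant_mass_amu

chem = fractional_mass_change(5.0, 100.0)    # ~5 eV bond energy, ~100 amu molecule
nuc = fractional_mass_change(5.0e6, 100.0)   # ~5 MeV nuclear reaction, ~100 amu nucleus

print(f"chemical: {chem:.1e}")   # parts in ~10^10: safely ignorable
print(f"nuclear:  {nuc:.1e}")    # a million times larger: not ignorable
```

The ratio between the two cases is exactly the ratio of the energy scales, which is the point of the post: the same physics, but six orders of magnitude apart.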

Posted
can u give me a link or some such proof for this?

 

The proof is simple---electron-positron annihilation. Photons are massless, and electrons aren't. An electron and a positron annihilate into two photons.
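The energy bookkeeping for annihilation at rest is a one-liner. A minimal sketch, assuming the standard electron rest energy of about 0.511 MeV:

```python
# Sketch: energy bookkeeping for e+ e- annihilation at rest.
# Assumes the electron rest energy ~0.511 MeV. All the mass
# disappears; the same amount of energy appears as photons.
ELECTRON_REST_MEV = 0.511

total_rest_energy = 2 * ELECTRON_REST_MEV   # electron + positron
photon_energy = total_rest_energy / 2       # two photons share it equally
print(photon_energy)  # ~0.511 MeV per photon: mass not conserved, energy is
```

This is why the 511 keV line is the signature of positron annihilation: the initial mass is nonzero, the final mass is zero, yet total energy balances.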

Posted

sorry i be quite the amateur. im going to be taking sum college courses on this, but i have no formal education. all i know is wat ive learned from reading on my own. so y do scientists approximate that and say that mass is conserved? i thot u werent supposed to approximate like that? wouldnt it mess up more complex equations, or do they factor that in?

Posted

1. Yes. That's why we never say mass is conserved.

2. If you plan to be taking ``sum'' college courses, you should learn some Enrish so you can m-press your professors.

Posted
sorry i be quite the amateur. im going to be taking sum college courses on this, but i have no formal education. all i know is wat ive learned from reading on my own. so y do scientists approximate that and say that mass is conserved? i thot u werent supposed to approximate like that? wouldnt it mess up more complex equations, or do they factor that in?

 

Scientists and engineers make approximations all the time, and it's OK if you check and make sure that whatever you're ignoring is small compared to other terms and ignoring it won't affect the answer.

 

"Conservation of mass" is really conservation of certain types of particles. In chemistry, it's the number of nuclides. You know that in the reactions under study, whatever you start with, you will end up with the same number of each type of atom; only the configuration will change. "Conservation of mass" is a somewhat sloppy way of saying that.

Posted
destroying information utterly, this is in no way consistent with quantum mechanics. One can show (as I said earlier) that unitary operators forbid information from being lost. This means that, if you want a good interpretation of probabilities, the evolution of the system has to be governed by unitary operators. If information is lost, the evolution is unambiguously non-unitary

 

I agree this is the eye of the needle.

 

Not everyone might agree that standard QM does offer a "good interpretation of probabilities" in the first place, even though I suppose most people think so(?).

 

If probabilities are conserved at all times, things get easier. But I see no fundamental reason for elevating such an (admittedly natural) expectation to something that must never be violated. To me it's an expectation supported by the accumulated experience of science, making it a good case, but to assume that expectations can't change is naive in principle.

 

It's often argued that probabilities are conserved by definition. But the missing point is that real life is not a mathematical system. The ensembles and the imagined infinite series of experiments are the weak point in the logic, ultimately making it (IMO) nothing more than an expectation, however good.

 

I think it will be very exciting to see what comes out once we have some better proper understanding of gravity in the context of quantum information. My personal guess is that the interpretation of probabilities and information from standard QM will not be left untouched.

 

/Fredrik

Posted
Not everyone might agree that standard QM does offer a "good interpretation of probabilities" in the first place, even though I suppose most people think so(?).

 

Ummm...who is everyone? Quantum mechanics superficially offers a good interpretation of probability because that's how it is constructed---you know the song, hermitian operators make the time evolution operator unitary...

 

I don't know who in the world would disagree with this...
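The "song" above can be sketched numerically. A minimal illustration (the random 4×4 Hamiltonian is an assumed toy example): a Hermitian generator gives a unitary evolution operator, which preserves the norm of the state, i.e. total probability.

```python
# Minimal numerical sketch: Hermitian H gives unitary U = exp(-iH),
# which preserves the norm (total probability) of any state.
# The random 4x4 "Hamiltonian" is an assumed toy example.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (A + A.conj().T) / 2                 # Hermitian "Hamiltonian"

evals, V = np.linalg.eigh(H)             # H = V diag(evals) V†
U = V @ np.diag(np.exp(-1j * evals)) @ V.conj().T   # U = exp(-iH)

psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)               # normalized state
print(np.linalg.norm(U @ psi))           # norm stays 1: nothing is lost
```

Information loss would require U†U ≠ I somewhere, which a Hermitian generator simply cannot produce; that is the content of the unitarity argument.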

 

If probabilities are conserved at all times, things get easier. But I see no fundamental reason for elevating such an (admittedly natural) expectation to something that must never be violated. To me it's an expectation supported by the accumulated experience of science, making it a good case, but to assume that expectations can't change is naive in principle.

 

I'm not REAL sure what you are talking about, but one would expect the evolution operators to ALWAYS be unitary. Are you making a general statement? Or is there some counterexample you have?

 

The idea of ``conserving probabilities'' seems pretty meaningless to me, and I don't have any idea how one would do physics if one couldn't do statistics.

 

I am thoroughly confused by your post. You seem to be open to the possibility that probability is not something fundamental. But this is wrong, which has been shown in the hidden-variables experiments. So I really don't know what you are saying here.

Posted

This gets both speculative and philosophical... so forgive me. But I figure the original topic was a bit speculative by definition.

 

Ummm...who is everyone? Quantum mechanics superficially offers a good interpretation of probability because that's how it is constructed---you know the song, hermitian operators make the time evolution operator unitary...

 

I don't know who in the world would disagree with this...

 

Ben, at the level of "superficial" or effective theories I have no problems with this - in QM as we know it, time evolution must be unitary for consistency, this is OK. There is no case for having second opinions on a proven effective theory - if it works it works. My opinions are not at the level of effective theories. It would be begging to play a fool to suggest that QM is anything but highly successful. QM is certainly one of the prides of modern science.

 

But the real-life problem, which is also a philosophical problem, and per my personal guess a likely key to a more fundamental framework for our models in the generic case, is how to properly define a *probability* (unlike relative frequencies, which are not the same thing) from real experiments. So a "good" interpretation may be interpreted in two ways, simple or realistic. The standard interpretation is simple but IMO not realistic.

 

This touches the philosophical interpretation of probability theory, as in frequentist vs Bayesian. We don't measure probabilities; we indirectly measure relative frequencies, and then various information is merged into, at best, an expectation of a probability. The assumption is that if the same experiment is repeated sufficiently many times, the relative frequency will converge to something, and we take this to be the objective probability. It's not hard to see that this contains a number of assumptions that are sometimes acceptable, but vague in the general case.

 

For example, pulling balls out of an urn. It's easy to see how an unambiguous probability can be inferred from experiment to arbitrary accuracy. But this is a simple special case. In the general case this appeal to intuition is not nearly satisfactory, to me at least. I prefer to speak for myself, and I am not pleased with the standard procedure. Without supplying any list, there are others who share this criticism. Anyway, I agree with you that resolving this is not done over a coffee break. I don't want to imply that I have the answers, at least not yet, but I want to highlight some points that I think are relevant. However, there is a logic in rejecting all this unless there are at least some faint ideas of how to really resolve it by abandoning the standard probability interpretation. I have ideas, and these motivate me. In my view this is not at the level of effective theories; it's at the level of "modelling the models".
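The urn case can be simulated in a few lines. A toy sketch, where the "objective" probability p_true = 0.3 and the sample sizes are assumed illustrative values; the relative frequency only approaches p_true in the limit of many repetitions, which is exactly the assumption under discussion.

```python
# Toy urn experiment: relative frequency vs an assumed "objective"
# probability p_true of drawing a red ball. Convergence only holds
# in the limit of many repetitions of the identical experiment.
import random
random.seed(42)

p_true = 0.3
for n in (10, 100, 10_000, 1_000_000):
    freq = sum(random.random() < p_true for _ in range(n)) / n
    print(f"n = {n:>9}: relative frequency = {freq:.4f}")
```

For small n the frequency can be far from p_true; the identification of the two rests on an idealized infinite experiment series that real life never provides.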

 

The idea of ``conserving probabilities'' seems pretty meaningless to me, and I don't have any idea how one would do physics if one couldn't do statistics.

I think your standpoint is the most common one. Given no other options, any one option is defensible.

 

I am aware I'm in the minority, but that's why I wrote "not everyone".

My idea is that one can still do statistics, but a tweak is required, and consistency leads me to an evolutionary formalism. Bayesian probability is a component. I think of "second quantization" as a simple application of these ideas, but I think it can be generalized.

 

I am thoroughly confused by your post. You seem to be open to the possibility that probability is not something fundamental. But this is wrong, which has been shown in the hidden-variables experiments. So I really don't know what you are saying here.

 

I think you misunderstood me, probably because I was unclear; see above.

 

I am not talking about hidden variables; I am no "Bohmian" at all. My "suggestion" is not at all hidden-variable thinking.

 

QM typically says that we cannot, in general, determine the outcome of a particular experiment with arbitrary precision, but that we can determine the probability of a given outcome with arbitrary precision. I am suggesting that not even the probabilities can be known precisely. Standard QM uses the concept that they can, and in many cases they can of course be known with almost arbitrary precision for all practical purposes.

 

My suggestion, however, generates a second problem: everything suddenly seems very uncertain, we get more unknowns than we need, and we seem to lose our grip on things. But I see some possible resolutions to this, and they can only be understood in an evolutionary context.

 

I have not found what I am looking for yet, but intuitively I associate close-to-unitary behaviour with dynamics near information equilibrium. "Near equilibrium" here refers to information exchange. Far from equilibrium our expectations are sometimes poorly met, and we learn a lot; during this process I see non-unitary behaviour as a solution, not a "problem".

 

This also explains why particle physics experiments reproduced in controlled labs tend to stay unitary: they are close to information equilibrium, relatively speaking. I define information equilibrium as the case where your expectations stay unitary. When you are FAR from equilibrium, on the other hand, even your very _reference_ is shaking, and at that point I would expect non-unitary processes relative to certain observers. I also think the most general observer-to-observer transformation is hardly unitary, because sometimes observers simply can't encode the same information. There is an element of learning or equilibration involved, which (IMO) will also bring time into the picture.

 

The exact mathematical formalism for all the terms I use is yet to be found. To me this is all intuitive at this point. I'm not sure I was able to express my point; there are multiple focal points here, but the interpretation of probabilities and unitarity is one, and that was my point.

 

/Fredrik

Posted

Ben, perhaps you've seen these old notes from John Baez's FUN section

 

http://math.ucr.edu/home/baez/nth_quantization.html

 

It's obviously some fun toying, but it is tangent to some of my thinking, and toying aside it's not as silly as it might seem at first. But to make it into something sensible, I think one has to combine it with some other constructions, in order to avoid getting lost in an infinite landscape ;) This is in part what I'm trying to understand for myself.

 

This touches some of the string/brane thinking; however, I am not into string theory. It seems several alternative approaches are sniffing each other's backs, and that's possibly not a coincidence either.

 

/Fredrik

Posted

I don't think modern black holes destroy information, since they are finite. The point singularity is the theoretical limit, not reality. I would wager that GR and SR both face the exact same practical limitation: the point reference (like c) would require infinite mass/energy to occur. Black holes in the universe are still trying to achieve that state, as reflected by them still pulling matter in. The point limit would occur when all the matter/energy is pulled in.

 

Look at it this way, if you fell into a black hole one would never reach the point singularity in the center. That is because it doesn't exist. But also one would be accelerating, causing the GR to undergo SR conversion. The SR can conserve information since relativistic mass behaves differently. The twin paradox conserves information but transposes the time. Or the information would be sort of useless since things would have evolved.

Posted

pioneer---your point is that black holes don't exist because they take an infinite amount of time to form. This is not the case.

 

For example,

 

Look at it this way, if you fell into a black hole one would never reach the point singularity in the center.

 

This is flat-out wrong. The infalling observer always reaches the singularity in a finite amount of proper time. A stationary external observer (at infinity) never sees the infalling observer reach even the horizon. But this is only true for an observer at infinity. Nearby, an outside observer watches the infalling observer fall into the hole, perhaps with some weird effects.

 

But also one would be accelerating, causing the GR to undergo SR conversion. The SR can conserve information since relativistic mass behaves differently.

 

Meaningless.

 

The twin paradox conserves information but transposes the time.

 

More meaningless crap.

 

Or the information would be sort of useless since things would have evolved.

 

Missing the point completely.

 

Like I said, pioneer. Information loss is EASY TO SHOW for a black hole. Your main argument, that black holes take an infinite amount of time to form so there are no black holes, is wrong.

Posted
Collapse in the more analytically soluble models occurs asymptotically in time, though the time constant is not that long; you tell me...

 

I'm not sure norm---I think this is only for observers at infinity. Locally, the observers see a black hole form, and information IS lost.

Posted

You may be confusing what happens when. Why can I not argue that the proper time experienced by those "falling through" never gets to happen, to the "outer observers"? This is not a simple mapping of delayed light pictures. Where it gets interesting, to my understanding, is that the asymptote is only logarithmic, so that within ponderable time, we get to quantum separations from the horizon. Otherwise, you cannot have your relativistic spacetime cake and eat it too.

Posted
Why can I not argue that the proper time experienced by those "falling through" never gets to happen, to the "outer observers"?

 

Probably because there is another frame which kills your argument :) Essentially what you are saying is correct, but only for an observer at infinity. For all other observers, the observer really does fall in.

 

Where it gets interesting, to my understanding, is that the asymptote is only logarithmic, so that within ponderable time, we get to quantum separations from the horizon.

 

Not sure what this means. Logs never asymptote? Either way, this sounds like exactly what I said.

 

Generally, these GR thought experiments fail to mention that the observers are point-like. Having extended observers causes many problems. So, for example, a spaceship falling into a black hole causes the mass to increase, and the horizon to increase in area. This is why non-asymptotic observers actually see the ship fall in: the ship gets closer and closer to the horizon, then the horizon increases in size and swallows the ship.

 

Again, my GR is not that strong, so don't take any of this at face value. But I am fairly certain (i.e. I discussed this with a bunch of people who knew more than I do) that observers that aren't at infinity definitely see the infalling observer fall in.

 

This is also how Hawking has resolved the information paradox, at least in his mind.

Posted

I figure the expanding event horizon scenario holds for accretion in the large, but this is not at all clear for a "particle" falling in. My book solves this as: [math] r-2m=8m\, e^{-(t-t_0)c/2m}[/math] and the authors say, "It is thus apparent that r=2m is approached but never passed by the falling body if one uses t as a time label... it corresponds to the proper time of an observer at rest far away from the central body. Thus in the finite proper time in which a test body falls to r=2m we would expect that the entire evolution of the physical universe exterior to r=2m has already occurred, so that the physical meaning of further fall becomes questionable in the context of Schwarzschild geometry."
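The relation can be evaluated numerically to see how fast the gap closes. This is a sketch under assumed values: geometrized units with c = 1 and m = 1 (chosen only for illustration), and the coefficient 8m from the poster's own later correction.

```python
# Numerical look at r - 2m = 8m * exp(-(t - t0) c / 2m), i.e. the
# coordinate-time approach to the horizon. Assumed illustrative
# values: geometrized units c = 1, m = 1, t0 = 0.
import math

m, c, t0 = 1.0, 1.0, 0.0
for t in (0.0, 5.0, 20.0, 50.0):
    gap = 8 * m * math.exp(-(t - t0) * c / (2 * m))
    print(f"t = {t:5.1f}: r - 2m = {gap:.3e}")
```

The gap r - 2m shrinks exponentially but never reaches zero at finite t, which is exactly the "approached but never passed" statement in the quoted passage.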

Posted
it corresponds to the proper time of an observer at rest far away from the central body.

 

This is the kicker. If the observer is not very far away, then the conclusion fails.

Posted

Maybe not. What is r? Is it the Schwarzschild radius, or the distance of the observer from the horizon?

 

You'll need something like [math]1-r_s/R \sim 1[/math], I think. R is the observer's radial coordinate, and r_s is the Schwarzschild radius.

Posted
1. Yes. That's why we never say mass is conserved.

2. If you plan to be taking ``sum'' college courses, you should learn some Enrish so you can m-press your professors.

 

ah yes...the tiresome attempt at trying to make one look foolish. very childish of you ben. you see i could easily spell it right if i wanted to, i'm just used to texting and IM'ing plus i was saying nothing scientific so therefore no reason for spelling correctly. and, query, if you were a real scientist would you not be encouraging the furthering of knowledge by college courses rather than trying to discourage by making snide remarks? hmm...confusing.... in any case hate not on those you have not met and who are not hating on you.

Posted

The far-field metric is assumed to be Lorentzian flat space, so r is that radial coordinate. As you go inward, things are compressed by the appropriate metric expression; there is more there, there. If the starting point of an infalling object is not "really far away", it amounts to a lessening of the coefficient 8m. It's a cute challenge to analyze this, and I cussed for a while today, but it is a variation of the limit one can figure as r approaches 2m. You really don't want to see the full display, and I don't want to compose it. It ought to be clear why I said that we logarithmically approach the horizon, but also that "we get" infinitesimally close in a reasonable coordinate time. Physics here is a glorious mess. (My apologies, but dyslexia strikes: where I wrote [math]8\pi[/math] above, it should read 8m.)

 

For a test body falling from rest at radius [math]r_0[/math], the trajectory is described by: [math] c(t-t_0) = \frac{2}{3\sqrt{2m}}(r_0^{3/2} - r^{3/2} +6m\sqrt{r_0} - 6m\sqrt{r}) \;\; -2m\log{\frac{(\sqrt{r_0}+\sqrt{2m})(\sqrt{r}-\sqrt{2m})}{(\sqrt{r}+\sqrt{2m})(\sqrt{r_0}-\sqrt{2m})}} [/math]. When we consider approaching the event horizon, the singularity from the 'zero' in the numerator of the logarithm takes over, and you can have fun working out the correct [math] \lim_{r\to 2m} [/math].
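The divergence in that limit can be seen by evaluating the quoted expression directly. A sketch under assumed values (geometrized units with c = 1, m = 1, and a starting radius r0 = 10 chosen only for illustration): as r → 2m the numerator factor √r − √(2m) goes to zero, the logarithm blows up, and the coordinate time diverges.

```python
# Sketch: evaluate the quoted infall expression for c(t - t0) as r -> 2m.
# Assumed illustrative values: geometrized units c = 1, m = 1, fall from
# rest at r0 = 10. The log term diverges as r approaches 2m.
import math

def coord_time(r, r0=10.0, m=1.0):
    s = math.sqrt(2 * m)
    poly = (2 / (3 * s)) * (r0**1.5 - r**1.5
                            + 6 * m * math.sqrt(r0) - 6 * m * math.sqrt(r))
    log_arg = ((math.sqrt(r0) + s) * (math.sqrt(r) - s)) / \
              ((math.sqrt(r) + s) * (math.sqrt(r0) - s))
    return poly - 2 * m * math.log(log_arg)

for eps in (1e-1, 1e-4, 1e-8):
    print(f"r = 2m + {eps:.0e}: c(t - t0) = {coord_time(2.0 + eps):.3f}")
```

Each factor-of-10⁴ reduction in the distance to the horizon adds only a fixed amount of coordinate time, which is the logarithmic approach discussed above.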
