
Distances in space-time: "now" vs. past and future



Posted

I'm aware (or at least it's my layman's understanding) that there is no such thing as a universally agreed point in time known as the present between any number of independent observers. However, is it correct to say that any observer has a current point in space-time, which could be marked as the present, and against which the distance to all other points in space-time could be measured?

 

Is this still allowed if the observer himself moves through space-time, or doesn't it matter, since in the relativistic sense he isn't moving but everything else is?

 

If someone defined that point in space-time as the present, centered on the self, does that point then become the origin (0,0,0,0) which defines the coordinates of all other points?

 

Does the size of the point, for example if we narrowed it to the eyes and brain, say 10 cm³, define the resolution, accuracy and result of any conclusions drawn using it as the origin?

 

Does this mean that, being 35, a sphere of radius 35 light years exists around me, at the surface of which I can observe the universe, relative to myself, as it was at the moment of my birth?

 

If I had an extremely powerful supercomputer, could I create a 3D model of a present moment? Starting at t = 0 when I begin observing, I would record the position of everything in the universe as its light arrives, sphere by sphere, the recorded region expanding outward at the speed of light. If I continued to do so for 10 years, could I thereby assemble a 3D slice, 10 light years in radius, of the moment I started, i.e. of my present moment as it was 10 years before?

 

Is there any work where simulations are made which plot the stars in 3D, relative to the Earth, as they actually are at a given point in time? How different would familiar sights, say for instance Orion, look if displayed as a relative spatial moment, rather than as we see them normally (space-time dependent)?

 

I often hear it said that when we look at stars 65 million light years away, we are seeing them as they were during the final days of the dinosaurs. I always found it odd that people were highly impressed by this, as if we are looking at the same stars that the dinosaurs could see. Hasn't that region of space been changing for 65 million years? While we can presently look at the slice of time from 65 million years ago, it is impossible for us to observe, unchanged, any region of space which the dinosaurs also saw.

 

 

______________________________________________________________

 

Split this if you want - related to the cosmological principle.

 

This all leads to the question: if all of cosmology, and the observations on which its conclusions are based, is limited to only those parts of the past which we can observe, isn't it a massive assumption that the current state of the universe actually looks anything like we'd predict it to?

 

Why do we have so much faith in the assumption that our observations backwards through time are an accurate representation of the universe? Aren't we really drawing massive conclusions from a tiny fraction of existence? Isn't it like guessing the plot of a movie from a single still frame, one which is itself made up of a tiny slice of each of the movie's frames all stuck together? I know it's all we've got, but do those who study this area acknowledge that, as new information arrives from past moments, the future will almost certainly reveal a picture of the universe which changes our current view? Why isn't it assumed that the information we have is incomplete and that new information will very likely change the "facts"? How can any hypothesis be tested in a meaningful way with a view of the universe that is distorted, as we look further out in time towards the past, never giving a complete snapshot of any moment?

 

For instance, Andromeda is 2.5 million light years away and we predict it merging with our galaxy in about 4 billion years. What certainty do we have that within the 2.5 million years between its current state and our observed state of it there have been no major changes to its structure or course, from unseen sources (intergalactic black holes or intergalactic clumps of dark matter, for instance) or from not-yet-observable, unpredicted ones? Since our prediction is based on information 2.5 million years old, isn't there a region of space 2.5 million light years around it which could contain objects that could interact with it before then? Why do we have such certainty, and to what level of detail have we searched this region?

 

 

As a rough example, let's assume there was an object in line with us and Andromeda, but behind it by 1 million light years, so 3.5 million light years away from us. That object is travelling towards it (and us) at 50% of c, so it will reach Andromeda and affect its course in 2 million years, altering the current model, and we won't know it for another 2.5 million years after that, when the light of the encounter reaches us. Now, I know that any visible object within any likely collision distance of Andromeda has probably been factored into the models, but that was just an example. Let's assume we try to model the fate of a galaxy or a star several billion light years away; suddenly the number of visible objects within range of altering any models we construct increases (and they can also be travelling at more plausible speeds). So as we increase in distance from the Earth, the accuracy of any model of an object's fate decreases, simply due to the increase in the number of possible objects which could potentially interact with it, each of which, if modelled, has its own causality sphere containing objects that might interact with it in turn, changing our predictions of the original interactions, and so on.
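
(To make the timeline explicit, here is a rough sketch of the bookkeeping, assuming for simplicity that Andromeda is static and that the quoted distances are the distances "now": the object covers the 1 million light year gap in [latex]t = \frac{1\,\text{Mly}}{0.5c} = 2\,\text{Myr}[/latex], and the light from the encounter then needs a further 2.5 Myr to cross the 2.5 million light years to us, so the news arrives about 4.5 Myr from now.)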

 

We are only able to see slices of the present moment from here back to 13.8 or so billion years ago, giving a smeared picture of the universe changing over time, while preventing us from directly observing any entire 3D slice of the past. Rather, every slice of the past is a shell, and even a Planck length away is a different moment. It would only be possible to view a slice of the universe in an Earth observer's present if we began recording from now, storing a huge amount of data, which at an estimate becomes unsustainable over a distance nowhere near the size of our current observable universe, or requires a resolution so coarse that it would produce an image of very little use at all. My estimate, and the requirement for a limit on either the radius or the resolution (effectively restricting the value of r), comes from the fact that the number of points, and thus the data, in a sphere increases as [latex]\frac{4}{3}\pi r^3[/latex].
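
As a quick sketch of that scaling in Python (the bytes-per-sample and resolution figures below are arbitrary placeholders, purely to show how fast the volume term dominates):

[code]
# Rough sketch: data volume for a spherical "snapshot" grows as (4/3) * pi * r^3.
# The resolution and bytes-per-sample are arbitrary placeholder assumptions.
import math

BYTES_PER_SAMPLE = 16          # placeholder: a position plus a little metadata
SAMPLES_PER_CUBIC_LY = 1.0     # placeholder resolution: one sample per cubic light year

def snapshot_bytes(radius_ly: float) -> float:
    """Storage needed for a full 3D snapshot of the given radius in light years."""
    volume = (4.0 / 3.0) * math.pi * radius_ly ** 3
    return volume * SAMPLES_PER_CUBIC_LY * BYTES_PER_SAMPLE

for r in (10, 100, 1000, 10_000):
    print(f"r = {r:>6} ly -> {snapshot_bytes(r):.3e} bytes")

# Each factor of 10 in radius costs a factor of 1000 in storage,
# which is why either the radius or the resolution has to give.
[/code]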

Posted (edited)

The considerations you've posted amount to not understanding how the FLRW metric works. The FLRW metric takes into consideration time dilation and observer influences.

 

Because it is a homogeneous and isotropic metric, though, time dilation isn't a problem. You need a mass-density gradient at a given point in time for time dilation, so even though the universe is denser in the past, there isn't any time dilation.

 

As far as the observer is concerned, the FLRW metric uses a fundamental observer in a comoving coordinate system.

 

A fundamental observer is one whose local average energy density is the same as the average energy density of the universe at his present, i.e. one not in a gravity well.

 

Cosmology doesn't base all its data on distance measurements. A large portion of determining expansion etc. lies in measuring the thermodynamic properties at a given time period and applying the ideal gas laws.

 

Everyone who hasn't truly studied the metric tends to assume we determine expansion strictly by redshift. This is wrong.

 

The truth is we always look for correlating evidence in multiple theories (thermodynamic, particle physics, distance measurements, GR etc) to determine the dynamics of our universe and how it evolves.

 

If say the thermodynamic data doesn't match the distance measurements etc we know something is wrong.

 

 

For example, this formula determines the rate of expansion at any given time. You can see the ideal gas law applications in the terms under the square root.

 

[latex]H_z=H_o\sqrt{\Omega_m(1+z)^3+\Omega_{rad}(1+z)^4+\Omega_{\Lambda}}[/latex]
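
As a minimal illustration of how that formula is evaluated, here is a Python sketch; the density parameter values below are illustrative, roughly Planck-like numbers, not an official fit:

[code]
# Evaluate H(z) = H0 * sqrt( Om*(1+z)^3 + Orad*(1+z)^4 + OLambda ).
# Parameter values are illustrative, roughly Planck-like; not an official fit.
import math

H0 = 67.7                       # Hubble constant today, km/s/Mpc
OM, ORAD, OLAMBDA = 0.31, 9e-5, 0.69

def hubble(z: float) -> float:
    """Expansion rate at redshift z, in km/s/Mpc."""
    return H0 * math.sqrt(OM * (1 + z) ** 3 + ORAD * (1 + z) ** 4 + OLAMBDA)

for z in (0, 0.5, 1, 3, 1100):
    print(f"z = {z:>6}: H = {hubble(z):.1f} km/s/Mpc")
[/code]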

 

You're right that we can't see a 3D slice of now, but we can see how the universe evolves over time. As it's been consistent over 13 billion years, it's highly unlikely to change its trends anytime soon.

Edited by Mordred
Posted (edited)

Taking all of these thoughts together, and looking at the cosmological principle's assumption that the universe is homogeneous on large scales, brings that conclusion into question. How can we assume we have a complete picture of the universe on a large scale, when what we actually have is a continuous overlaid sequence of events showing only a small section of any one moment in time? We have no complete large-scale picture of the universe which isn't biased by the time scale extending further as the spatial scale grows, compared with the smaller time scales that go with smaller spatial scales. This means we cannot compensate for the inevitable changes in the positions of objects over time; movement in time ensures that, by comparison, a smaller, more static picture of the universe will almost certainly look less homogeneous. It is my hypothesis, therefore, that a dynamic picture of space and time is statistically more likely to appear homogeneous than a static picture of a 3D moment of time, provided the 3D size is equivalent; the larger the scale of the picture, the greater, proportionally, the observational artifact. Just as a still of the solar system shows the planets dotted around with little apparent order between one another, over a longer amount of time, say 165 years (roughly Neptune's orbital period), every planet would have completed an orbit and the picture would be a series of concentric rings encircling the sun. A moment of space evens itself out over time, because as things move they cover a higher proportion of all possible positions.
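
The solar system example can be made concrete with a toy Python sketch; the orbits, radii and periods below are made up, and the point is only to compare one instant against a long exposure:

[code]
# Toy model of "snapshot vs long exposure": sample circular orbits
# (made-up radii, phases and Kepler-like periods) at one instant vs many.
import math, random

random.seed(0)
orbits = [(r, random.uniform(0, 2 * math.pi)) for r in (1, 2, 3, 4)]  # (radius, phase)
periods = [r ** 1.5 for r, _ in orbits]     # Kepler-like: period grows with radius

def position(i: int, t: float) -> tuple:
    r, phase = orbits[i]
    angle = phase + 2 * math.pi * t / periods[i]
    return (r * math.cos(angle), r * math.sin(angle))

snapshot = [position(i, 0.0) for i in range(len(orbits))]       # 4 scattered dots
exposure = [position(i, t / 100 * max(periods))                 # 4 filled rings
            for i in range(len(orbits)) for t in range(100)]

print("snapshot points:", len(snapshot), "| exposure points:", len(exposure))
# The exposure covers every angle of every orbit, so it looks far more
# uniform (concentric rings) than any single instant does.
[/code]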

 

What we currently have the ability to observe is a picture of the universe which is "smeared out" over the course of all observable history. As an analogy, let's compare our knowledge of the universe to observing a game of pool. If you were to take a long-exposure picture of a long game of pool, this can be said to be analogous to our view of the universe back towards the CMB. (A more accurate analogy is a concentric series of rectangles, from the center of the pool table at the beginning of the game to the edge at the end; either way, both show and hide information, but the selective yet continuous nature of what we are able to see gives the illusion of homogeneity.)

 

What we are really looking at is a picture which only shows us the details of sequential moments, because over the entire course of the game the pool balls cover a larger area of the table than at any given moment. The long-exposure picture will produce, as an artifact, a pattern of pool balls on the table which appears at large scales to be homogeneous. But in reality, the largest observable scale is made of units which each show a frame of the pool table at a given moment over the course of the game, and the true largest scale is a complete video of the game. The long exposure smears our sample out, which creates the illusion that it is a representation of the largest scale.

 

When we compare the three:

 

1. The long exposure: our observable universe as a temporal picture receding into the past.

 

This, by its very nature, selectively shows us a linear progression of time in block form, obscuring the true nature of the universe. What should be shown is either a single moment in time or the entirety of time. Being only a sample of the entirety of time, the image on which we base our assumption of the universe's large-scale homogeneity is actually a misrepresentation, because it shows events which are separated in both time and space. Just like the pool-game long exposure, this increases the apparent uniformity of the distribution of the balls (which represent matter). Because it could be that the way the information is shown causes the conclusion, we should get a larger, more accurate sample.

 

2. The snapshot of a moment: this would be the 3D slice of the universe as it would appear in a present moment, which I talked of earlier. It would take 10 years to create a picture of 10 light years radius, and it would by my estimation require massive amounts of data storage, or force the use of a low data-point resolution.

 

This hypothetical snapshot would provide a better sample from which to conclude whether the universe is homogeneous or not. It is a large-scale picture of the universe and, if left to grow in detail for a few millennia, even though it would lack the depth in time of our current map of the universe, it would provide a full 3D view of a moment, which on our current map is simply a thin shell of sky, outside which are, sequentially, only slivers of a universe which existed before.

 

If we created an image like this, it would enable us to compare it to our current map and allow us, with an accuracy proportional to the size of the snapshot, to control for any artifacts created by the inherently "smeared out" nature of the large-scale image of the universe. Even if a map of this sort did show the need to question the cosmological principle, it would also be put in doubt, because it would require a length of time equal in years to the size in light years of our current moment's Hubble volume, which is greater than the age of the universe, and it could possibly be cut short by a future end of the universe.

 

3. The full video: this would be a continuous series of snapshots run together, so we could explore a moment in time in 3D and also have the image of the following moment and the minor details of its changes. The frame rate would be determined by the resolution within each snapshot, which in turn would be dependent on our capacity to store information. Fortunately, by synchronising the frame rate with the resolution, we could create overlap between data sets, meaning only two extra data points would be needed per 3D slice of space. The frame rate, which is the distance in time between consecutive 3D maps of a moment, and the resolution, which is the distance in time and space between the data points of an individual 3D map, would need to share the same value. For instance, a photo of the light arriving at a given location could be taken every 12 hours, starting from t = 0 at the origin (0,0,0), building a 3D image with a resolution of 12 light hours. The first frame of the video would begin at t = 0 hrs and the second at t = 12 hrs, so the first and second frames differ only by the addition of one new shell of data and the removal of one old one (see the sketch below). Choosing instead frame rates and resolutions which share common factors or common denominators would also save on storage; these would be less efficient, but would yield a higher diversity of data points. However, since the goal would be to determine how scale affects homogeneity in the way we currently map the sky, and our attempt is to control for the assumption that a still shot of the universe over an outwardly receding, contiguous space-time is representative of any given moment of space, the best way to be sure would be to have as much unique data as possible. In that case it would be best to keep the frame rate and resolution out of phase instead.
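
A toy Python sketch of that bookkeeping, with purely hypothetical "shells" standing in for 12-hour data sets; it only demonstrates the overlap idea, not any real pipeline:

[code]
# Sketch of the shared-shell bookkeeping: each 12-hour tick adds one newly
# observed shell and retires the oldest, so consecutive "frames" of the video
# share every shell but two. Shells here are just labels, not real data.
from collections import deque

FRAME_SHELLS = 20                      # placeholder: shells per 3D frame

buffer = deque(maxlen=FRAME_SHELLS)    # rolling window of light-shells
frames = []

for tick in range(25):                 # 25 twelve-hour observation ticks
    buffer.append(f"shell@t={12 * tick}h")
    if len(buffer) == FRAME_SHELLS:
        frames.append(list(buffer))    # a complete 3D frame

shared = set(frames[0]) & set(frames[1])
print(f"consecutive frames share {len(shared)} of {FRAME_SHELLS} shells")
# -> 19 of 20: only the newest and oldest shells differ between frames.
[/code]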

A video showing the change over time from one 3D slice of time to the next, giving a complete 4D representation, can, depending on its complexity, provide vastly more information and a more detailed map in which large scales and small scales can be compared with more accuracy, which would allow the question of homogeneity to be answered. Taking a picture of all the past moments as they progress back in time, studying them for homogeneity at successively greater scales, and then determining that the largest scales are homogeneous needs to be ruled out as an outcome which occurs simply because of the data used: data which are, conveniently, a linear (smoothed-out), contiguous (and by nature smoothly sampled) representation of all causally linked past events up to the present moment.

 

Conveniently ignoring all the inhomogeneous results at all the smaller sample sizes and stating that on large scales the universe is homogeneous is counterintuitive: how can a lot of uneven objects make a smooth object without very intricate fine-tuning? The answer is that they don't, because the scales being observed don't take into account that time causes change. Where we see galaxy clusters and voids on a smaller scale OF TIME, at another point in time there may have been a cluster where the void was and a void where the cluster was. This is not a large-scale measurement; the data points are spread throughout a causally connected 4D image. Objects tend to move into places which have space for them, and spaces tend to be created when objects move. Homogeneity over space and time shouldn't be any surprise at all, but it shouldn't be any great revelation either, because it's a consequence of how we measure, not a measurement of what we're measuring. Simplifying the universe while ignoring the complexities only creates wrong ways of looking at things.

Edited by Sorcerer
Posted (edited)

In all honesty you will never understand unless you understand how the thermodynamics work.

 

You keep basing your assertions on visual limitation. That's not how this works.

 

Every major equation in the FLRW metric has thermodynamic correlations.

 

We can measure and test those trends over time quite well.

 

Measurements of the CMB give us an extremely wide-view snapshot of the cosmological principle. It's highly uniform, which was well predicted long before it was discovered.

 

This is the aspect you're overlooking.

 

Many of the thermodynamic properties, including big bang nucleosynthesis, were predicted well before they were confirmed.

 

You might not understand it but the equations do make highly accurate predictions that match observational evidence.

 

That's without needing a large 3D snapshot of now.

The only question is: will you take the time to understand it, or will you continue to deny that the metrics work because you don't understand how they work?

 

The other aspect people ignore is that the cosmological principle wasn't merely assumed to be correct.

 

There were billions of dollars of research performed over several decades of measurements to confirm the cosmological principle.

 

The truly neat part is using supercomputers we can test our known theories and run a simulation.

 

http://www.cfa.harva...du/news/2014-10

 

http://www.illustris-project.org/

 

This simulation tested the LCDM model to an extremely high degree, including producing the types of galaxies we observe today.

 

How would you explain that degree of accuracy if our metrics were wrong?

Or, for that matter, all the predictions that LCDM made prior to observational evidence, such as the correct % of elements in the CMB.

 

Keep in mind the FLRW metric also employs the Einstein field equations.

Edited by Mordred
Posted

Keep in mind, Sorcerer, that causal effects also propagate at the speed of observation (light).

Nothing can affect you faster than you can see it coming.

 

When a pitcher throws a ball at you, the batter, the ball will not reach you until you see it reach you, and you swing accordingly.

Anything could happen to the ball along the way.

Similarly, if Andromeda is on a collision course with the Milky Way, it seems kind of foolish to throw away the laws of motion because we don't know what will happen in the intervening 4 billion years.

Posted


Keep in mind the FLRW metric also employs the Einstein field equations.

Doesn't FLRW still work without the assumption of isotropy and homogeneity? If it is dependent on them, one isn't proof of the other; instead, if someone required isotropy and homogeneity, there would be an incentive to find evidence of it at all costs. Why is it only at the very largest of scales that any evidence appears? I suggest that, because larger scales are representative of more positions of space within time, it is not in reality that smaller scales aren't homogeneous or isotropic; they are, but they must be observed for an amount of time proportional to their size.

 

Similarly, it isn't actually true that large scales are isotropic and homogeneous. Large scales of space independent of time, say the 3D slice of the universe at its greatest size (supposedly this moment), wouldn't show the homogeneity and isotropy that would be expected when compared to the apparently homogeneous nature of all the moments in time that precede it. Those moments, when overlaid on each other, by the very nature of their change over time, give a more even distribution, because the same amount of matter existing at more moments will be measured in more possible places than that matter spread out over a large area in a single moment alone.

 

 

You keep basing your assertions on visual limitation. That's not how this works.

I think I initially began with a purely visual-spatial model in my mind, in an attempt to cement the difference between a scale in space-time and simply a scale in space. I have since begun basing my assertions on the dynamic nature of space-time vs the static nature of space without time. The limits of our observations aren't of consequence here: dynamic systems will naturally pass through a greater number of possible configurations, and thus homogeneous and isotropic observations of large scales are primarily a consequence of the extent of the time dimension. The ratio of the size of the space observed to how much time that space encompasses is proportional to its probability of appearing uniform (I'm getting sick of "homogeneous and isotropic"; I'll use "uniform" from now on).

 

 

 

Every major equation in the FLRW metric has thermodynamic correlations.

Isn't it possible that the laws of thermodynamics are a consequence of a forward time direction combined with an expanding space dimension? For instance, entropy increases simply because the initial condition confined entropy to a minimum, and when those confines were loosened by a universe that expanded over time, the only direction was towards less order. Space's expansion ensures that hot areas expand and thus cool, so the natural direction is from hot to cold, and the natural direction of order is from order to disorder. That would suggest that a universe which expands over time is a requirement for the laws' existence. It could be, then, that the requirement of a large-scale uniform universe for FLRW and its correlation with thermodynamics share a common prerequisite: rather than being explanations of an expanding universe, an expanding universe explains why they exist.

 

 

We can measure and test those trends over time quite well.

We are biased to measure things over the scale of that which is easiest to observe. We could attempt to measure what trend, if any, they'd show over only a moment of space, and how that would differ as the spatial scale increases. It would control for the time factor, which by its very nature ensures change, and larger amounts of change increase the chances of a more uniform-looking universe. If we knew by what amount the uniform appearance of the universe is influenced simply by overlap in the possible configurations of matter over multiple moments, it would allow us to create models of the universe which could be independent of scale, successfully modelling everything from the quantum to the entire universe, and even possibilities beyond.

 

 

 

Measurements of the CMB give us an extremely wide-view snapshot of the cosmological principle. It's highly uniform, which was well predicted long before it was discovered.

 

The problem with the CMB is that its current state is dependent on the universe at its release, yet it also has had the most amount of time of anything in existence to travel free of interaction. The very first time it interacts is when we measure it; this is implied because, if it had been absorbed by something else and re-emitted, it would no longer be CMB.

 

The universe when the CMB was released was vastly smaller than today. Between the size it was then and the size it is now, the universe has provided the CMB with a huge number of ways it could possibly arrange itself and still continue on its path to our observing it without interaction. Even with extreme precision of measurement, a uniform CMB isn't 100% conclusive evidence of the universe being uniform. Rather, the uniformity of the CMB could be a consequence of its history: the first released source of energy, with the greatest shared amount of time in common, all existing without interaction, from a small, hot universe which cooled as it expanded in vast proportion compared to the universe today. The CMB naturally arose from a uniform plasma which condensed and allowed its release; the conditions at the beginning were, but for small differences, uniform, and it is over time that those differences have amplified into the disordered universe. The uniformity of the CMB, rather than being evidence of the large-scale uniformity of the universe, is rather evidence of the universe progressing from a uniform state and gradually becoming more and more structured. Again, it is the addition of time which masks this trend: when the distant past's high order is added to the near future's disorder and an average taken, it is mistaken for uniformity on a large scale, rather than a progression from a uniform past to a structured future.

 

 

 

This is the aspect you're overlooking.

As you can see, I've actually thought about that one before. It's not overlooked; it might be wrong? Maybe we're saying the same thing, but in different ways which make it seem like we're disagreeing. 2+2=4, but so does 1+3.

 

 

 

Many of the thermodynamic properties, including big bang nucleosynthesis, were predicted well before they were confirmed.

A uniform universe which arises from an extrapolation back to a point is a necessity of that initial condition: it must start with the greatest possible degree of uniformity. If it had been disputed that the universe was uniform on a large scale back when these predictions were made, how would changing the assumption, to the universe simply arising uniform, have affected the prediction? The scope of time and space, and whether its nature is uniform or not after nucleosynthesis, would then be rendered moot.

 

 

 

You might not understand it but the equations do make highly accurate predictions that match observational evidence.

If the assumption they require to do so is false, though, couldn't it mean that they are simply equations which have been altered from a better set of equations, to compensate for a universe which is described as uniform but isn't? How do the equations fare at making predictions over small scales?

 

If the universe isn't actually uniform on large scales, and our observation is actually an artifact of the passage of time and of the permutations of matter positions having overlap with past permutations, couldn't there be another set of equations which might also make predictions, maybe more accurate ones, which match the evidence to an even closer degree of certainty?

 

I recall Newton's equations are quite accurate at making predictions that match observations, but Einstein's are more accurate; why stop the progression?

 

The accuracy of our observations may be biased by our interpretation of the data they give, because we must necessarily assume that the universe is uniform for the predictions to match. If we remove that bias and assume the universe isn't uniform on the large scale, how do the other possible interpretations, A: a non-uniform universe on large scales, or B: an initially uniform universe which tends towards less uniformity and more structure over time, influence how the observations are interpreted, and how in turn does that affect the accuracy of the predictions? We must be sure not to beg the question, prevent ourselves from seeing other options, and confuse our assumptions for the only possibility.

 

 

The only question is: will you take the time to understand it, or will you continue to deny that the metrics work because you don't understand how they work?

I'll have a look at them later; going to help a friend move now...... JOY.... not. When I do look, I'll continually be considering how it would be possible for the assumption of uniformity to be altered. I doubt my level of understanding will actually allow me to find it, though.

 

 

 

There were billions of dollars of research performed over several decades of measurements to confirm the cosmological principle.

The idea that large-scale observations of a uniform universe are an artefact of large scales necessarily including more time along with more space, and thus of simply averaging the positions of matter over time rather than observing a progression from most uniform to a steadily more complex, non-uniform structure, was free. But were any of those billions spent ruling this simple concern out? Or was the assumption of uniformity so crucial that it was the only thing tested, the only thing which gave results to be interpreted, and thus the only thing about which conclusions were drawn? How did they rule out the artefact of extended time averaging non-uniform matter, viewed out of the context of the moment and in the context of the continuous past?

 

 

 

This simulation tested the LCDM model to an extremely high degree, including producing the types of galaxies we observe today.

Yes, but can't you fine-tune models with extra parameters to get the results you desire? Were there any unexplained tweaks in this model? To what degree did it really model reality, when we barely know what 95% of all the stuff is?

Posted (edited)

I don't think you realize that the LCDM model already predicts that the Universe will have less uniformity as the Universe expands. Same goes with greater entropy.

 

These details are already considered in the model. Large scale structure formation and void size increase is an obvious consequence.

 

But consider this, the main thermodynamic player today isn't matter or radiation distribution.

 

The main player is the cosmological constant itself which is uniform.

 

We're now in the Lambda dominant era. The less dense matter and radiation becomes the greater influence the cosmological constant has.

 

This is one of the reasons why the homogeneous and isotropic ideal gas laws still work today.

 

We may have to change the scale at which one can consider the Universe uniform but those terms already involve scaling.

 

So in your last post you haven't stated anything that isn't already considered. Granted, I might have missed something in your lengthy post.

The equation I posted is a reflection of how the matter, radiation and cosmological constant energy densities evolved over time. They don't evolve at the same rate.
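
To spell that out: matter dilutes with volume, radiation additionally redshifts, and the cosmological constant stays fixed,

[latex]\rho_m \propto a^{-3}, \qquad \rho_{rad} \propto a^{-4}, \qquad \rho_{\Lambda} = \text{const}[/latex]

which, with [latex]1+z = 1/a[/latex], is exactly where the [latex](1+z)^3[/latex] and [latex](1+z)^4[/latex] factors in the formula I posted earlier come from.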

 

 

I've already explained we don't naturally assume uniformity.

It always amazes me that, more often than not, the ones declaring a model wrong never studied the model in the first place. They never look at the math, they never see how interconnected one model is with other theories, nor have they ever picked up an introductory textbook; instead they make their determination based upon pop-media videos and literature.

 

For example do you know the formula for entropy in Cosmology applications?

Do you know what the thermodynamic properties of an adiabatic and isentropic fluid are?

 

Are you familiar with the term thermal equilibrium?

 

The reason I ask these questions is that they directly relate to entropy density. Judging from the above, you're under the assumption that the arrow of time itself, along with expansion, is the determining factor. Yes, they are involved, but they are certainly not the full story.
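
As a hint, the standard textbook result for the entropy density of a relativistic plasma in thermal equilibrium (in natural units) is

[latex]s = \frac{2\pi^2}{45}g_{*s}T^3[/latex]

where [latex]g_{*s}[/latex] counts the effective relativistic degrees of freedom.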

 

 

Let's take, for example, the Planck epoch, when the universe is in thermal equilibrium. No particles can be distinguished from photons and antiphotons. What is the entropy value in that state?

Edited by Mordred
Posted

Fair enough. I briefly studied entropy, enthalpy and thermodynamics as they relate to physical biochemistry in my final year of uni; however, having not used it for so long, I remember very little.

 

To your final question, I would at a guess answer infinite, but only if there was a gradual increase to it, because that would follow as the end value. If it was a sudden phase change, however, I would say it's undefined, since it relies on there being no way to differentiate, and thus no point against which its value can be measured.

Posted

No, photons and antiphotons are different. They are not the same.

Chapter 3.

 

 

Are they just going in different directions? https://van.physics.illinois.edu/qa/listing.php?id=1153

 

The short answer to "are there anti-photons" is "yes", but the disappointment here is that anti-photons and photons are the same particles. Some particles are their own antiparticles, notably the force carriers like photons, the Z boson, and gluons, which mediate the electromagnetic force, the weak nuclear force, and the strong force, respectively. Particles that are their own antiparticles must be electrically neutral, because an antiparticle has the opposite electrical charge as its partner particle.

Is that a fair description?

Posted

Yeah, but if they're indistinguishable, what's the point in labelling them as such? Does it simply fulfil the requirement of the standard model that every particle must have a corresponding antiparticle?

 

Are there energies at which they become separate and distinguishable? Can we reach these currently? Are they likely to ever be within our reach, or are they purely hypothetical energies which are predicted only in the early universe?

 

Your link regarding antiphotons is dead BTW Mordred

 

If two photons could be isolated and forced to collide at high energies and produce an electron/positron pair, must it be assumed one was anti and the other not, even though which is which is unknown? Or is there some symmetry in the produced pair, reversing depending on the direction of each photon in the collision, which would show us which is which?
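
(For scale, a back-of-envelope threshold: two head-on photons can produce an electron/positron pair only if [latex]E_1E_2 \geq (m_ec^2)^2[/latex], so two equal-energy photons each need at least about 0.511 MeV.)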

 

If we instead try this with two identical photons, and after repeated attempts at the required energies they fail to produce an electron/positron pair, does it then follow that they must both be the same?

 

Or is it as I said before, is it moot because they're indistinguishable?

____________________

 

Anyway, back to the original topic, Mordred. What I was getting at is that large scales which produce a uniform distribution are time dependent. A large area of space measured for a brief interval is far less likely to look uniform than the same area observed over a large interval.

Posted (edited)

http://www.wiese.itp.unibe.ch/lectures/universe.pdf

That fixes the link; for some reason the copy-paste screws it up.

One of the common misunderstandings is charge on antiparticles. It doesn't strictly mean electric charge; flavor and color are also charges.

Are they just going in different directions? https://van.physics.illinois.edu/qa/listing.php?id=1153

Is that a fair description?.

Yes the description is fair

Yeah, but if they're indistinguishable, what's the point in labelling them as such? [...] Anyway, back to the original topic, Mordred. What I was getting at is that large scales which produce a uniform distribution are time dependent.

Ask yourself the question "what is a particle?"

 

A particle is an excitation of a field. If you provide enough energy, the coupling constants, i.e. weak, strong and electromagnetic, become the same.

 

https://en.m.wikipedia.org/wiki/Coupling_constant

 

This is all part of unification theory and symmetry breaking.
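
To put a rough equation to that: at one loop, each inverse coupling runs logarithmically with the energy scale [latex]Q[/latex],

[latex]\alpha_i^{-1}(Q) = \alpha_i^{-1}(\mu) - \frac{b_i}{2\pi}\ln\frac{Q}{\mu}[/latex]

where the slope [latex]b_i[/latex] is set by the particle content, which is why the three lines can converge at very high energy.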

Please don't tell me you have no interest in understanding how LCDM predicted BB nucleosynthesis prior to measurements, yet assume it's wrong.

 

Or, for that matter, why the first equation I posted in this thread can be used to calculate the proper distance to any object?

(Including the distance now, and the distance at the time of emission, i.e. the distance then.)
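
A minimal sketch of that calculation in Python (not the cosmocalc's actual code; the density parameters are the same illustrative, roughly Planck-like values as before). It just numerically integrates [latex]D = c\int_0^z \frac{dz'}{H(z')}[/latex]:

[code]
# Proper distance now: D = c * integral from 0 to z of dz' / H(z'),
# using illustrative, roughly Planck-like parameters (not an official fit).
import math

C = 299792.458                            # speed of light, km/s
H0, OM, ORAD, OL = 67.7, 0.31, 9e-5, 0.69

def hubble(z: float) -> float:
    return H0 * math.sqrt(OM * (1 + z) ** 3 + ORAD * (1 + z) ** 4 + OL)

def proper_distance_mpc(z: float, steps: int = 100_000) -> float:
    """Trapezoid-rule integral of c/H(z') from 0 to z, in Mpc."""
    dz = z / steps
    total = 0.5 * (C / hubble(0.0) + C / hubble(z))
    total += sum(C / hubble(i * dz) for i in range(1, steps))
    return total * dz

print(f"D(z=1)    ~ {proper_distance_mpc(1.0):.0f} Mpc")     # distance now
print(f"D(z=1090) ~ {proper_distance_mpc(1090.0):.0f} Mpc")  # ~ CMB surface
# Dividing by (1 + z) gives the distance at the time of emission.
[/code]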

 

Use the cosmocalc in my signature; the equation I posted is part of those values.

 

If you compare the graph functions in the calc, they are an exact match to Lineweaver and Davis.

 

http://arxiv.org/pdf/astro-ph/0402278v1.pdf

 

Yet the calc uses all of 7 key equations. It can handle both the WMAP and Planck datasets.

Edited by Mordred
