Markus Hanke
Everything posted by Markus Hanke

  1. I’d bet there’s something to it. In fact, I’d go so far as to say that complexity, chaos and emergence in general are seriously underrated and under-utilised in modern physics. Just my opinion, though.
  2. I wouldn’t put it so strongly - it just means that the data places strong constraints on which models might be viable. On the other hand, we have good reason to believe that there is entropy associated with the horizon of BHs - and since the concept of entropy only really makes sense for a system that exhibits discrete microstates in some form or another, the interior region cannot be smooth and continuous empty space everywhere. So I’d still bet my money on some deeper structure that underlies classical spacetime, even if that turns out to have nothing to do with spin foams.
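To get a feel for the number of microstates involved, one can evaluate the Bekenstein-Hawking entropy S = k_B·A/(4·l_p²) for a one-solar-mass black hole. A back-of-envelope sketch; the constants and the example mass are my own choices, not from the post above:

```python
import math

# Physical constants (SI units)
G    = 6.674e-11      # gravitational constant
c    = 2.998e8        # speed of light
hbar = 1.055e-34      # reduced Planck constant
M_sun = 1.989e30      # solar mass, kg

# Schwarzschild radius and horizon area of a one-solar-mass black hole
r_s = 2 * G * M_sun / c**2
A   = 4 * math.pi * r_s**2

# Bekenstein-Hawking entropy in units of k_B:  S/k_B = A / (4 * l_p^2)
l_p_sq   = G * hbar / c**3          # Planck length squared
S_over_k = A / (4 * l_p_sq)

print(f"r_s = {r_s:.0f} m,  S/k_B = {S_over_k:.2e}")   # ~1e77
```

That’s on the order of 10⁷⁷ units of entropy for a single stellar-mass horizon - hard to reconcile with a featureless, continuous interior.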
  3. Shame, that…I had some hopes for LQG. But this is just how science works.
  4. This is the trouble, really: all the known and proposed alternative models of gravity have some form of problem. Generally speaking, they tend to model one particular (class of) phenomenon better than GR, but then fail spectacularly in other situations. Most are also mathematically complicated and unwieldy to work with, and they often rely on additional assumptions (extra fields etc.) for which we have no evidence.
  5. Sabine Hossenfelder, for instance - though I wasn’t immediately able to find a reference (I’ll have to look some more later). The idea isn’t new, and isn’t mine either, but I think it never really came to the forefront, since it’s essentially untestable given the current limitations in computing power.
  6. I have experienced this switch from LaTeX to RTF numerous times also - for me this happens when I simply press “Edit” after submitting a post that contains LaTeX.
  7. I cannot, of course, be sure about such specifics either, nor even about whether anything special would happen at all in an n-body problem. It’s really just a hypothesis, based on emergent dynamics in non-linear systems. Basically I’m sceptical about both the particle and the alternative-model options, so it’s good to have a third alternative. I agree with DanMP’s earlier comment that DM is a big part of our current model of the universe, so this is a very important issue. Well, I’ve never been formally trained. I simply base my thinking on what we do know - on situations with small n that can be exactly solved. For example, the n=2 case: the spacetime of a binary body system is nothing like our naive idea of two Schwarzschild metrics superimposed. What happens for n on the order of billions is anyone’s guess, because I don’t think those non-linearities smooth out. So maybe DM is really a chaos-theoretic problem.
  8. Yes, I get you, but to me this simply is extrapolating a model which we know works very well on solar-system scales to larger scales. After all, there is no immediate reason to assume that gravity works differently on galactic scales than on solar scales - while that could be so, we have no evidence that it’s actually the case. I therefore think it’s important to try and find ways to figure out what GR actually predicts for large n-body systems, rather than just simplified models with an unknown error factor. One might also say that possible alternatives such as MOND etc are “cheating”, because all of those models postulate things (extra fields, new universal constants etc) for which we otherwise have no evidence. At the very least, GR is the simplest possible metric model of gravity that is fully relativistic.
  9. This is just the point - if the hypothesis is correct, then, if we were able to actually solve the Einstein equations for a very large n along with the correct boundary conditions, the result would exactly match observation, without having to postulate the presence of anything else. In other words, Dark Matter would just be the error introduced by using idealised computations, rather than an actual n-body problem with large n.

The domain of applicability of GR is naturally limited - as a classical model, it can’t eg account for quantum effects, so it won’t ever be able to explain all observations. You just use it within its proper domain. The question is just how far this domain extends up and down, and we won’t be able to definitively answer that until we have clarity as to the precise nature of DM. It is definitely possible that an alternative model of gravity is required on larger scales, which would restrict the validity of GR to intermediate scales (~tens/hundreds of ly?). The problem with that is of course that all currently known alternatives to GR themselves suffer from a variety of problems, and don’t match all observations.

The idea with the n-body dynamics is just a hypothesis, to add to the “particles” and “alternative model” options. Of course it could be wrong, though we have no way to check right now. But then again, we already know that ordered structures can emerge from otherwise chaotic dynamics in non-linear n-body systems; so the concept isn’t that far-fetched at all.
  10. These are all excellent points. Unfortunately I’m up to my eyeballs in real life at the moment, so I’ll need to come back to this at a later point. Consider the following, though. Suppose you have an alien scientist whose species lives down in the ocean of a water-world (no solid land). One day he notices some sand on the bottom of the ocean, and begins to wonder: what would happen if you had a very large number of grains of sand, without water, just under the influence of wind and gravity? He knows Newtonian physics, and he knows the Navier-Stokes equations. Based on these, he figures that each grain is blown about by the wind, pulled down by gravity, bounces about a bit in pretty much chaotic patterns, and might come to rest somewhere. Over large areas and long times, each point on the sand plain is equally likely to become the resting spot of a sand grain - so it’s reasonable to expect that all inhomogeneities smooth out over time, and you eventually end up with a more or less flat expanse of sand. So now he jumps into his (water-filled) UFO and visits Earth. He lands in a desert, and imagine his surprise when he sees this: A naive application of Newtonian gravity and Navier-Stokes fluid dynamics would give no indication that a large number of essentially isolated sand grains undergoing essentially chaotic dynamics would give rise to large-scale ordered structures such as these. So our alien scientist could be forgiven for concluding that there must be some other influence that leads to the formation of dunes. The situation in GR is similar. Each star or galaxy taken in isolation is locally near-Newtonian, and would thus be expected to behave that way on all scales. However, an n-body system with very large n undergoing chaotic dynamics under the laws of GR might form global spacetime geometries that are not immediately predictable, just like sand grains and the formation of dunes (which is just meant as an analogy, btw).
This holds for stars in a galaxy, or for the interaction between galaxies, or for galaxies in the universe. The point is we don’t know if that’s the case or not, because we don’t have the computing power necessary to model a GR n-body problem with very large n. So this is just a hypothesis, based on the fact that metrics don’t add linearly; the overall metric of an n-body system is not the sum of n metrics for the n constituent bodies. So it’s possible at least in principle that the actual global spacetime might look like it contains more mass than we can observe, even though in actual fact it doesn’t. That’s not really what I’m saying. We can use GR quite accurately so long as it is permissible to make enough simplifying assumptions to render the maths manageable. For example, a single body that can be considered isolated (asymptotic flatness) and is symmetric enough can be easily modelled, and the result matches observation very closely. I think the problems arise only if we are dealing with n-body systems, because the non-linearities inherent in GR may not smooth out and become negligible; they might in fact compound in large enough systems. And the trouble is we don’t have enough computing power to actually run such simulations, for large n.
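Just to illustrate the sensitivity-to-initial-conditions part of this argument - in the Newtonian regime only, since a GR n-body code is precisely what we don’t have - here is a toy sketch of my own: the classic Pythagorean three-body problem (masses 3, 4, 5, released from rest), integrated twice from initial conditions differing by one part in a billion. The softening length, time step and run length are arbitrary choices for the demo.

```python
import math

def accelerations(pos, masses, eps=0.1):
    """Direct-summation Newtonian accelerations with Plummer softening (G = 1)."""
    n = len(pos)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r3 = (dx*dx + dy*dy + eps*eps) ** 1.5
            acc[i][0] += masses[j] * dx / r3
            acc[i][1] += masses[j] * dy / r3
    return acc

def evolve(pos, vel, masses, dt=2e-4, steps=100000):
    """Leapfrog (kick-drift-kick) integration up to t = dt * steps."""
    acc = accelerations(pos, masses)
    for _ in range(steps):
        for i in range(len(pos)):
            vel[i][0] += 0.5*dt*acc[i][0]; vel[i][1] += 0.5*dt*acc[i][1]
            pos[i][0] += dt*vel[i][0];     pos[i][1] += dt*vel[i][1]
        acc = accelerations(pos, masses)
        for i in range(len(pos)):
            vel[i][0] += 0.5*dt*acc[i][0]; vel[i][1] += 0.5*dt*acc[i][1]
    return pos

# Pythagorean three-body problem: masses 3, 4, 5 at rest at the triangle corners
masses = [3.0, 4.0, 5.0]
ic = [[1.0, 3.0], [-2.0, -1.0], [1.0, -1.0]]

p1 = evolve([list(p) for p in ic], [[0.0, 0.0] for _ in ic], masses)

# Second run: identical, except one coordinate nudged by 1e-9
ic2 = [list(p) for p in ic]
ic2[0][0] += 1e-9
p2 = evolve(ic2, [[0.0, 0.0] for _ in ic], masses)

sep = max(math.hypot(a[0]-b[0], a[1]-b[1]) for a, b in zip(p1, p2))
print(f"trajectory separation after t = 20: {sep:.3e}  (started at 1e-9)")
```

The two runs separate by orders of magnitude after a few close encounters - and this is for n = 3 in plain Newtonian gravity, before any GR non-linearities even enter the picture.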
  11. The current overall state of affairs seems to be that:

1. All efforts at directly detecting the more plausible types of DM particles have come up negative, and particle physicists are forced to consider more and more exotic extensions to the Standard Model to come up with workable alternatives
2. There appears to be little to no statistically significant evidence to support any one of the various alternative gravity models, since they all suffer from more or less significant problems

While I think the current evidence isn’t strong enough to definitively rule out either the particle or the alternative-gravity option, I personally tend towards a third option, which avoids both of those - namely that DM is actually an artefact of our inability to produce solutions to the ordinary GR field equations that aren’t idealised. For example, even the best numerical approaches to modelling a spiral galaxy in the context of the GR equations need to be idealised - it’s going to be some sort of continuous dust distribution, with appropriate density curves and initial and boundary conditions. But a real galaxy is not that - it’s a discrete set of a large number of individual sources of gravity, all of which interact gravitationally and often also mechanically. Due to the non-linear nature of the GR equations, it is really not possible (with our current tools) to tell what kind of error is introduced by idealising this situation to make it modellable. We don’t have nearly enough computing power (by many orders of magnitude) to numerically solve a GR n-body problem with n on the order of ~100 billion, plus realistic boundary conditions. At least in principle, the discrepancy between model and observation which we call Dark Matter could just be the error introduced by idealising a real-world scenario in a non-linear model. DM could be nothing more than a mathematical artefact.
  12. I think this is the issue. From what I know, running an exact numerical solution with realistic background parameters (as opposed to idealised metrics) yields merging times far in excess of the age of the universe, so we shouldn’t be seeing SMBH mergers. Yet we do, so it seems we’re missing some mechanism or another that bleeds away angular momentum quickly enough.
  13. The physical content of Maxwell’s equations is invariant under Lorentz transformations, so any physics predicted by them is guaranteed to be compatible with relativity. As you say, this effect is well known - the speed here is the phase velocity of the wave, not the speed at which the wave front (and hence any signal) propagates. There’s no upper limit to phase velocity, and no information can be transmitted superluminally in this way.
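As a concrete, entirely standard example of my own choosing: take the waveguide dispersion relation ω² = ω_c² + c²k². Near cutoff, the phase velocity exceeds c while the group velocity (the signal speed) stays below c, and their product is exactly c².

```python
import math

c = 2.998e8                    # speed of light, m/s
omega_c = 2 * math.pi * 5e9    # hypothetical waveguide cutoff frequency, 5 GHz
omega   = 2 * math.pi * 8e9    # operating frequency, 8 GHz

# Dispersion relation: omega^2 = omega_c^2 + c^2 k^2
k = math.sqrt(omega**2 - omega_c**2) / c

v_phase = omega / k            # exceeds c - but carries no information
v_group = c**2 * k / omega     # d(omega)/dk - always below c

print(f"v_phase = {v_phase/c:.3f} c,  v_group = {v_group/c:.3f} c")
# Note: v_phase * v_group = c^2 for this dispersion relation
```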
  14. Sorry, I don’t immediately know of any good treatments on this particular subject. I’m rather busy in real life for the time being, but if I come across something, I’ll post it here. This deserves a +1 from me Learning new things and improving our understanding is what these forums are fundamentally about.
  15. The calculation is correct, but it’s a local approximation, and thus not applicable across distant frames in a curved spacetime, which is what we’re dealing with here. The global transformation is not a Lorentz transformation. So why did you open a thread on a discussion forum if you’re not interested in what others have to say? What is your intent in being here? Proper length is defined in an object’s rest frame; it has nothing to do with simultaneity of remote observers in a curved spacetime. Even within that frame alone, it doesn’t depend on simultaneity. As said before, the locally measured speed of light in all frames is always c, whereas distant coordinate speeds can vary depending on the observer. We’re not in Minkowski spacetime here. No one said this. It is always possible to foliate classical spacetimes into a family of space-like hypersurfaces of simultaneity, which is the basis for the ADM formalism of GR. The thing with this is that each physical observer will foliate spacetime differently, according to his own clocks and rulers. There are in fact infinitely many such foliation schemes for any given spacetime. That’s because there’s no single notion of simultaneity that is globally agreed on by all observers. So what we are saying is not that no notions of simultaneity exist, but rather that such notions are not globally shared by distant observers separated across a curved spacetime. In essence, each observer has his own natural hypersurface of simultaneity, which isn’t shared by other observers. I suggest the section on the ADM formalism in Misner/Thorne/Wheeler for further study. Natural Lemaître observers are “raindrop” observers - they are in free fall starting from rest. Therefore \(a^{\mu}=0\) at all times. Free fall is defined as the vanishing of proper acceleration at all points - meaning the resulting world line is a solution to the geodesic equation.
This is coordinate-independent, and clearly holds for raindrop observers, irrespective of coordinate basis. As it so happens, I have in fact gone through the procedure of formally solving the geodesic equations for free fall in Schwarzschild spacetime, back when I first learned GR, so I know first hand that time-like free fall geodesics and null geodesics do in fact exist in this spacetime. Unsurprisingly so. We’re not in Minkowski spacetime here - for the I-don’t-know-how-many-th time. It is how distant events in a spacetime are causally related, ie how they can be connected by smooth, everywhere-differentiable curves, and the tangent spaces on those curves (ie time-like, null, space-like). This structure is independent of coordinate choices. --- To return to the original topic of this thread: your claim was that GP coordinates do not constitute a valid frame of reference. This thread is in the mainstream sections, so I’ve used the agreed-upon textbook definition of what constitutes a reference frame to show that the original assertion doesn’t hold. If you wish to suggest an alternative definition, the “Personal Theories” section would be the appropriate place. Your question has been answered, even if you don’t agree with the answer.
  16. At this point in time, AIs are not considered a valid source of scientific information, because they make too many mistakes. I suggest you stick to proper textbook sources. But regardless, we’re not dealing in local approximations at a single event here, since the frame of the falling observer is spatially removed from the Schwarzschild observer, so you need to use a global transformation that accounts for curvature. So once again - the global map between these charts is not a Lorentz transformation, not even approximately. Again - there is no notion of global simultaneity in curved spacetimes. This statement is meaningless. The causal structure of spacetime is not coordinate dependent. This has already been addressed - proper vs coordinate speed. No, what you gave wasn’t the same. No it does not. The definition is from the text I quoted, which is an authoritative source. I didn’t just make this up. You might not agree with the accepted definition, but that’s your own issue. The point is that, according to this definition, the GP observer constitutes a valid frame - which is what every single textbook on GR says. Working in the GP frame (sometimes called “raindrop coordinates”) is a standard exercise, you know. …and then, a few lines later: You seem to be contradicting yourself now. No such thing exists. There are only hypersurfaces of simultaneity, as used for example in the ADM formalism - but these are specific to a given observer, and are not generally shared by other observers in that curved spacetime. So once again, there’s no global notion of simultaneity that all observers agree on.
  17. It would still happen regardless, the only difference is where. For very massive BHs you’d need to get far below the horizon before tidal forces become noticeable, whereas for small ones this might happen long before you even reach the horizon.
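A back-of-envelope way to see the mass dependence, using the Newtonian tidal estimate Δa ≈ 2GMh/r³ across a body of height h (the round numbers - a 2 m observer, a 10 g pain threshold - are my own choices for illustration):

```python
G, c = 6.674e-11, 2.998e8     # SI units
M_sun = 1.989e30              # solar mass, kg
h, g = 2.0, 9.81              # ~2 m tall observer; tidal-stretch threshold of 10 g

def radii(M):
    """Schwarzschild radius, and the radius where the Newtonian tidal
    estimate 2*G*M*h / r^3 reaches the 10 g threshold."""
    r_s = 2 * G * M / c**2
    r_tidal = (2 * G * M * h / (10 * g)) ** (1.0 / 3.0)
    return r_s, r_tidal

r_s_small, r_t_small = radii(10 * M_sun)     # stellar-mass black hole
r_s_big,   r_t_big   = radii(1e9 * M_sun)    # supermassive black hole

# Stellar-mass: the tidal threshold is reached well outside the horizon
print(f"10 M_sun:   r_s = {r_s_small:.2e} m, tidal radius = {r_t_small:.2e} m")
# Supermassive: the same threshold lies far below the horizon
print(f"1e9 M_sun:  r_s = {r_s_big:.2e} m, tidal radius = {r_t_big:.2e} m")
```

For the 10-solar-mass case the tidal radius comes out around a hundred Schwarzschild radii out; for the billion-solar-mass case it lies deep inside the horizon - exactly the point made above.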
  18. The transformation between the charts is explicitly given here, under the “Metric” section. This is not a Lorentz transformation. This is not the meaning of “isotropy”. You talk about synchronisation as if there was some meaningful notion of global simultaneity here. But there isn’t, because we’re in a curved spacetime. A clock stationary far away will never be synchronous with a clock in free fall towards the horizon, irrespective of coordinate choices. A Schwarzschild spacetime diagram is a diagram of Schwarzschild spacetime - unsurprisingly. You are free to choose your coordinates as you wish, but it still remains Schwarzschild spacetime. If you draw the diagram in GP coordinates, the cones both rotate and distort; if you draw it in SS coordinates, the cones just become narrower, but don’t rotate. That’s a consequence of how these coordinate charts work, but you’re always in the same spacetime. The locally measured speed of light is always c. It’s only the coordinate speed that will differ in (eg) Rindler coordinates - which is why I pointed out earlier that one must carefully distinguish between these. In curved spacetime, notions of space and time are purely local. Schwarzschild coordinates represent an observer who remains stationary far away from the central mass, and this coordinate system describes well the local physics associated with this observer. But the point is that they are only locally physical - if you try to use Schwarzschild coordinates to draw physical conclusions about distant frames (like eg a test particle in free fall), you’ll quickly run into problems. So I wouldn’t say they are unphysical, you just need to be very careful how you apply them in practice. In curved spacetimes, the difference between local and global is crucial. 
In particular, you can’t use Schwarzschild coordinate time to draw conclusions about what distant clocks record in their own frames; there’s simply no global notion of simultaneity here that can form a basis for this. --- Let’s return to your original claim that GP coordinates can’t be associated with a physically valid reference frame. I think we agree that the GP metric is a mathematically valid solution to the Einstein equations; if you disagree, it’s up to you to provide mathematical proof that it’s not. The question then is first and foremost what “reference frame” even means, mathematically speaking. The precise definition is given in (eg) Sachs/Wu, General Relativity for Mathematicians (1977), which is the one I’m using below: Suppose we are given a spacetime, being a semi-Riemannian manifold endowed with a metric and the Levi-Civita connection. An observer in that spacetime is then defined to be a future-oriented time-like curve that is everywhere smooth and differentiable. Finally, a reference frame is a vector field in that spacetime whose integral curves are observers. Straight away we notice that a reference frame isn’t the same as a coordinate chart. So what is a Gullstrand-Painlevé observer? It’s a free-fall geodesic of our spacetime (not necessarily purely radial) that connects an event far away to another event spatially closer to the central mass in a time-like manner, with the express boundary condition that at t=0 the observer be at rest. This geodesic gives our future-oriented time-like curve. So what is the vector field? It’s simply the 4-velocity field given by those very geodesics in spacetime. Recall that there’s no proper acceleration in free fall, thus (we’re in a curved spacetime, so covariant derivatives must be used): \[\frac{D u^{\mu}}{d\tau}=\frac{d^{2}x^{\mu}}{d\tau^{2}}+\Gamma^{\mu}{}_{\alpha\beta}\,\frac{dx^{\alpha}}{d\tau}\,\frac{dx^{\beta}}{d\tau}=0\] This system of equations, along with our boundary conditions, determines both the geodesic of our observer and the associated 4-velocity field.
But here’s the thing - we know that geodesics parallel-transport their own tangent vectors, and, since 4-velocity precisely is the tangent vector at every point of our motion, we are by the above equation already guaranteed that the geodesic is in fact an integral curve of its own 4-velocity field. This is hardly surprising! Thus, the GP observer (more precisely - his 4-velocity field) does indeed constitute a valid reference frame. If you still don’t agree, you need to show us explicitly and mathematically how a free-fall geodesic (which is what a GP observer is) is not in fact an integral curve of its own 4-velocity field.
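For what it’s worth, the raindrop claim is also easy to check numerically. For purely radial free fall in Schwarzschild spacetime, the geodesic equation reduces (in units G = M = c = 1, so r_s = 2) to d²r/dτ² = −1/r², and a raindrop released from rest at infinity must satisfy dr/dτ = −√(2/r) all along its world line. A quick RK4 sketch of my own, with arbitrarily chosen start and end radii:

```python
import math

def rk4_step(state, dt):
    """One RK4 step for radial free fall: dr/dτ = v, dv/dτ = -1/r² (G = M = c = 1)."""
    def f(s):
        r, v = s
        return (v, -1.0 / r**2)
    r, v = state
    k1 = f(state)
    k2 = f((r + 0.5*dt*k1[0], v + 0.5*dt*k1[1]))
    k3 = f((r + 0.5*dt*k2[0], v + 0.5*dt*k2[1]))
    k4 = f((r + 0.5*dt*k3[0], v + 0.5*dt*k3[1]))
    return (r + dt*(k1[0] + 2*k2[0] + 2*k3[0] + k4[0])/6,
            v + dt*(k1[1] + 2*k2[1] + 2*k3[1] + k4[1])/6)

# Raindrop initial data at r = 40 (with r_s = 2): dr/dτ = -sqrt(2/r)
r0 = 40.0
state = (r0, -math.sqrt(2.0 / r0))

dt = 1e-3
while state[0] > 10.0:              # integrate inward to r = 10
    state = rk4_step(state, dt)

r, v = state
err = abs(v + math.sqrt(2.0 / r))   # deviation from the raindrop relation
print(f"r = {r:.3f}, dr/dτ = {v:.6f}, expected {-math.sqrt(2.0/r):.6f}, error = {err:.2e}")
```

The integrated world line stays on the raindrop relation to numerical precision, as the geodesic equation guarantees it must.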
  19. Ok, if you’re looking at a purely local (ie small and short enough) region, then you’ll have Minkowski spacetime, and only Minkowski spacetime - it makes no sense to refer to GP coordinates at all there. While curved spacetimes are everywhere locally Minkowskian, globally these frames at different events are related in non-trivial ways, so you need to account for this. The transformation between the Lemaître chart and the Schwarzschild chart isn’t a Lorentz transformation. No, it’s not. In Schwarzschild coordinates, the coordinate (!) speed of light goes to zero as the horizon is approached. In curved spacetimes you have to be very careful with coordinate and proper quantities, and their physical meanings. GP coordinates naturally correspond to the frame of a test particle freely falling from far away towards a central mass in Schwarzschild spacetime; this is very much a valid frame, irrespective of which coordinates you choose to describe it in. Note also that orthogonality of coordinate axes is not a necessary condition for the existence of reference frames in GR; many useful metrics have off-diagonal components, another obvious example being the Kerr metric. That statement makes no sense. For example, if you’re standing still on the surface of the Earth, you can describe this in spherical coordinates, or you can choose to describe it in Cartesian coordinates centered on the Sun - all coordinate expressions are different here, but they still describe the same physical situation. Both coordinate charts share the same time coordinate, with proper time between events just being the geometric length of the free fall world line. This is obviously different from the coordinate time of a clock left stationary far away - unsurprisingly so, since there’s no global notion of simultaneity in curved spacetimes. Of course, that’s the whole point of being in a curved spacetime - there’s no global concept of simultaneity. Time is a purely local concept here.
Yes it is - it’s from visualrelativity.com, and originally taken from a Scientific American article by Roger Penrose. It’s important to remember though that the shape and orientation of the cones on the various diagrams you find in textbooks depend on your chosen coordinate system, since light cones are always drawn in a particular set of coordinates. It’s always possible to find coordinates where the cones don’t get rotated. PS. It seems to me that you’re forgetting that we’re in a curved spacetime here. Locally things are Minkowski, but the relationships between those local patches are non-trivial - you can’t just Lorentz-transform between distant frames, there’s no global notion of simultaneity, and the difference between coordinate and proper quantities is even more crucial than in SR. As studiot has pointed out to you, a “reference frame” - and the relationships between frames - are richer than just an isolated set of coordinate axes. You might also wish to take a look at the tetrad formalism of GR, which builds on this fact.
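To make the coordinate-vs-proper distinction explicit: setting ds² = 0 for a radial null ray in the Schwarzschild metric gives |dr/dt| = c(1 − r_s/r), which goes to zero at the horizon even though every local measurement of the speed of light yields exactly c. A trivial sketch (the sample radii are arbitrary):

```python
# Radial coordinate speed of light in Schwarzschild coordinates, in units of c:
# from ds^2 = 0 with dΩ = 0, one gets |dr/dt| = 1 - r_s/r.
def coord_speed(r_over_rs):
    """|dr/dt| / c for a radial null ray, as a function of r / r_s."""
    return 1.0 - 1.0 / r_over_rs

for x in (100.0, 10.0, 2.0, 1.1, 1.001):
    print(f"r = {x:>7} r_s:  coordinate speed = {coord_speed(x):.4f} c  (local speed: 1 c)")
```

The coordinate speed is an artefact of the chart; the locally measured speed is the physical invariant.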
  20. By “tilted” we mean a change of orientation of the “cone opening” when drawn onto a spacetime diagram. Here’s such a spacetime diagram for Schwarzschild spacetime, with light cones drawn in: As you can see, the opening of light cones at events with smaller r-values is both rotated in spacetime and narrower, relative to light cones at larger r-values; on the diagram they are thus facing more towards the horizon/singularity. Whether you wish to call this “tilting” or “narrowing” is really just a matter of semantics. A light cone encodes causal structure - everything “inside” the cone corresponds to time-like world lines, whereas the “walls” are traced out by null geodesics. As you approach the horizon, there are fewer and fewer future-directed time-like paths available that would lead you back out and away from the black hole (necessitating more and more extreme acceleration profiles), whereas more and more of the physically available paths lead into the black hole - IOW, the future light cones in the above diagram are increasingly facing towards the horizon. Very close to the horizon, only some null geodesics - and only those - potentially lead back out. Below the horizon, the singularity is always in the future of any time-like or null world line, so there’s no escape. The walls of the cone are always null geodesics by definition, so the locally measured value of c is of course not affected. I have no idea what you mean by this, especially since we’re not in Minkowski spacetime. Can you restate this mathematically? What exactly do you mean by “equivalence”? They are not the same coordinate chart, but they are related by a valid diffeomorphism (so equivalent perhaps in that sense), so they describe patches of the same spacetime. The physical meaning of the time coordinate in Lemaître coordinates is the same as in GP coordinates, but the physical meaning of the radial coordinate is different between the two.
  21. The choice of coordinates is not necessarily the same as a choice of physical observers. Of course, particular coordinate choices may naturally correspond to the “point of view” of particular observers (in the sense that the coordinates correspond to what clocks and rulers in that frame physically read), making the maths simpler; for example, Schwarzschild coordinates naturally describe the point of view of a stationary observer very far from the central mass, whereas Gullstrand-Painlevé coordinates correspond to the point of view of an observer freely falling towards such a mass. But you’re not restricted to those choices - you are free to pick different coordinates to describe the same physical scenario, so long as they are related by valid diffeomorphisms. Thus you’ll have many reference frames coexisting in the same spacetime. So there’s no problem with describing the same world line in different coordinate charts - you’ll find that its geometric length is an invariant, it is always the same no matter what coordinates you choose. There are also valid coordinate charts that nevertheless don’t correspond to any physical observer - such as eg Kruskal-Szekeres. The Schwarzschild chart and the Gullstrand-Painlevé chart are diffeomorphisms of one another, so they cover the exact same spacetime (though the latter covers a larger part of the manifold than the former does). This isn’t true - in curved spacetimes, light cones will naturally “tilt” along geodesics.
  22. Yes, another excellent question. The thing here is that the Alcubierre metric, as a solution to the Einstein equations, can in some global sense be considered, mathematically speaking, a modification of flat Minkowski spacetime; more precisely, it modifies its geodesic structure by mapping what are non-geodesic paths in ordinary Minkowski spacetime (ie events connected by space-like paths) into proper light-like geodesic paths in the presence of the warp bubble. IOW, the ship enclosed in the warp bubble follows a proper geodesic everywhere, but without the warp bubble, the same spatial trajectory could not be a geodesic. For this reason it’s actually meaningless to ask what level of background curvature the Alcubierre metric can tolerate - if the background outside the bubble isn’t Minkowski, there is no Alcubierre solution. IOW, you can’t construct an Alcubierre drive if there are other sources of gravity anywhere, just like you can’t ever have a proper Schwarzschild black hole in a universe that isn’t otherwise completely empty. So in that sense the Alcubierre solution is highly unstable.

Unfortunately the same is also true for all of the other known warp solutions to the Einstein equations, and there are a few of those by now. Most notably you have the Natario metric (a moving warp bubble that is volume-preserving, ie without contraction and expansion regions), and the Lentz metric (a warp metric that does not require exotic matter, but is restricted to subluminal effective speeds). All of these require a Minkowski background; I’m not aware of any solutions that work generically, irrespective of background curvature.

As I said before, I think one would have to work out a specific warp solution for a given, specific flight route and gravitational background; working the equations backwards, this would allow us to construct a specific (not necessarily unique) energy-momentum distribution to realise such a journey. This is mathematically difficult, but should be possible in principle, given enough computing power. But it would mean that there is no generic warp drive à la Star Trek, not even in principle; you’d have to re-configure the entire drive geometry for each individual journey you wish to undertake.

And then of course there’s the sheer amount of energy required for such a drive. To put it into perspective - the Lentz drive, which allows travel without exotic matter at subluminal speeds, requires the energy equivalent of ~1/10 solar mass to create a single bubble that can enclose a ship of approx ~100m length. Not very feasible or desirable, in my opinion.

Yes, pretty much. But again, this is for an Alcubierre warp drive - it may or may not be possible to construct other warp-like solutions that are not restricted in the same way. It’s hard to tell, due to the complexity of Einstein’s equations.

At least you’ve found a way to end the journey at all!

Because I don’t see how such warp bubbles, once created, could be influenced (steered, decelerated, …) from the inside, where your ship is.
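For reference, the flat-background assumption is explicit in the line element itself - this is the standard form of the Alcubierre metric (units with G = c = 1):

```latex
ds^2 = -dt^2 + \bigl(dx - v_s(t)\,f(r_s)\,dt\bigr)^2 + dy^2 + dz^2
```

Here v_s(t) is the coordinate velocity of the bubble centre and f(r_s) is a shape function equal to 1 inside the bubble and falling to 0 far away from it. Setting f → 0 recovers pure Minkowski spacetime, which is precisely why the construction presupposes a flat background outside the bubble.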
  23. That’s an excellent question. First and foremost, the original Alcubierre metric requires that spacetime outside and inside the “warp bubble” is Minkowskian, and thus flat; it is only the “bubble wall” which exhibits non-trivial curvature. If you take away this condition of asymptotic flatness by allowing non-negligible background curvature, the Alcubierre metric is no longer a valid solution to the Einstein equations. This is because GR is a non-linear theory, so one can’t simply add metrics together and expect the result to again be a valid solution to the field equations. IOW, the warp bubble wouldn’t remain stable if it came under the influence of a gravitating body; you might suddenly get strong tidal forces acting on your ship, or the bubble might simply break down and disperse.

Which raises the question - is there any kind of topological construct that behaves similarly to Alcubierre’s warp bubble, but can exist in the presence of strong background curvature? I don’t know the answer for sure, but potentially this is possible. But then such a construct would depend on the specifics of the gravitational environment, so if it propagates from a region of strong curvature to a region that is nearly flat, it would almost certainly not remain stable - you’d have the same problem again.

So is it possible to have a warp bubble metric that remains stable irrespective of the gravitational background? Due to how the Einstein equations work, I would say almost certainly not. What may be possible, though, is to find a specific warp metric for a specific flight path through a given, specific gravitational environment. You’d have to know where you want to start and where you want to end up, and the exact spacetime curvature in all regions in between. If you then had a powerful enough computer, you could try and find a metric that describes a stable warp bubble propagating through this setup. You would have to perform this calculation anew for every journey you want to undertake, since it’s specific to the parameters describing each journey. It’s another interesting question to ask whether a solution is guaranteed to always exist; perhaps some routes cannot be flown at warp speeds…?
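The non-linearity obstruction can be stated compactly: the Einstein tensor is not additive in the metric, so superposing two exact solutions is generally not itself a solution:

```latex
G_{\mu\nu}\!\left[g^{(1)} + g^{(2)}\right] \;\neq\; G_{\mu\nu}\!\left[g^{(1)}\right] + G_{\mu\nu}\!\left[g^{(2)}\right]
```

This is why “an Alcubierre bubble plus a Schwarzschild background” does not simply add up to a warp bubble near a star - the cross-terms in the curvature spoil the construction.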
  24. It’s meaningless, and thus not a valid concept.
  25. Indeed. There’s also the issue of circumnavigating Antarctica. On the real Earth, that’s a distance of roughly ~16,000 km when done by boat, and you have to make course corrections towards land. On a flat Earth, the distance would be at least ~40,000 km, and you would have to course-correct away from land. Obviously this has been done many times, so we know which option is right.
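The two distances are easy to check with a back-of-the-envelope calculation. This is a sketch only; it assumes a spherical Earth of mean radius 6371 km, the Antarctic Circle (66.5°S) as the sailing latitude, and the azimuthal-equidistant projection (centred on the North Pole) that flat-Earth maps typically use:

```python
import math

R = 6371.0    # mean Earth radius in km (assumed)
lat = 66.5    # sailing latitude in degrees south (Antarctic Circle, assumed)

# Globe: a circle of latitude has radius R*cos(latitude)
c_globe = 2 * math.pi * R * math.cos(math.radians(lat))

# Flat-Earth disc (azimuthal equidistant from the North Pole):
# distance from the pole equals the meridian arc length, R * colatitude
r_flat = R * math.radians(90 + lat)
c_flat = 2 * math.pi * r_flat

print(f"globe: {c_globe:.0f} km, flat disc: {c_flat:.0f} km")
```

This gives roughly 16,000 km on the globe but well over 100,000 km on the disc model - comfortably above the conservative “at least ~40,000 km” lower bound quoted above.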