michel123456 Posted November 10, 2015 (edited)
And in this (physically unrealistic) scenario it is still not isotropic - there will be increasing separation between the drops in one direction but not the other. (And did you ever show us the maths to prove that speed of separation is proportional to distance? Or did I just miss it?) Also, sooner or later, the drops will hit the ground. So none of this seems relevant to cosmology.
I say that the system is expanding. From here http://www.physicsclassroom.com/class/1DKin/Lesson-1/Acceleration

Time (s)   Distance fallen (m)
1          5
2          20
3          45
4          80

Now look at the table above, considering 2 objects falling at a 1 sec interval. You are the first to fall and you look behind you. Take the values from the right column of the table.
_after 1 sec you have traveled 5 m; your friend behind you is still at point zero. The apparent average velocity is the difference between your average velocities over that second: 5 - 0 = 5 m/s.
_after 2 sec you have traveled 20 m, your friend 5 m; the difference equals 15 m (which corresponds to your average velocity over the second second, 15 m/s). The apparent average velocity is 15 - 5 = 10 m/s.
_after 3 sec you have traveled 45 m, your friend 20 m; the difference equals 25 m (your average velocity over the third second, 25 m/s). The apparent velocity is 25 - 15 = 10 m/s.
_after 4 sec you have traveled 80 m, your friend 45 m; the difference equals 35 m (your average velocity over the fourth second, 35 m/s). The apparent velocity is 35 - 25 = 10 m/s.
My conclusions are
_that the system is expanding, because the difference in traveled distance keeps increasing.
_that the apparent average velocity between you and your friend is constant. You observe the one behind you going away from you at a constant velocity, in this case 10 m/s.
And
_if you are observing another falling friend at a 2 sec interval behind you, the apparent velocity will be larger, in this case 20 m/s. It is obviously a function of time (the delay). Now you are asking me to prove that it is a function of distance. That would have been easier for me thirty years ago...
Edited November 10, 2015 by michel123456
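As a quick check on the arithmetic above, here is a minimal numerical sketch. The helper name position and the loop structure are illustrative, not from the thread; it assumes the same constant acceleration of 10 m/s² and release delays of 1 s and 2 s:

```python
# Two objects in free fall with the same constant acceleration,
# released `delay` seconds apart; d = 0.5 * a * t^2 after release.
a = 10.0  # m/s^2, the rounded value used in the physicsclassroom table

def position(t):
    """Distance fallen t seconds after release (zero before release)."""
    return 0.5 * a * t**2 if t > 0 else 0.0

for delay in (1.0, 2.0):  # release delay of the trailing object
    print(f"release delay = {delay:.0f} s")
    prev_gap = 0.0
    for t in range(1, 5):  # seconds since the first object was released
        gap = position(t) - position(t - delay)
        # average separation velocity over the last one-second interval
        print(f"  t = {t} s  gap = {gap:5.1f} m  apparent velocity = {gap - prev_gap:5.1f} m/s")
        prev_gap = gap
```

The apparent velocity settles at a times the delay: 10 m/s for a 1 s gap and 20 m/s for a 2 s gap, matching the figures quoted in the post.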
Strange Posted November 10, 2015
There will be increasing separation between the falling objects in one axis but not the others. Also, sooner or later, falling objects will hit whatever it is they are falling towards. So none of this seems relevant to cosmology.
michel123456 Posted November 11, 2015
There will be increasing separation between the falling objects in one axis but not the others. Also, sooner or later, falling objects will hit whatever it is they are falling towards. So none of this seems relevant to cosmology.
What is relevant is that in cosmology time (the delay) is a function of distance. Suppose you have a galaxy "falling" side by side with you, in the same "present time", at the same acceleration, at the same velocity. Because the galaxy is far away, you do not observe it "in the present" but as it was in the past. The galaxy in ancient times had the same acceleration, but not the same velocity. IOW, whatever the direction, the galaxy will look like it is going away. Of course, the velocity that you will observe will not be the same as the velocity of another galaxy at the same delay but aligned with your motion; the velocity will be less. But because you don't know offhand the distance to the galaxy, the result of the observation will remain consistent with any other galaxy getting away from you.
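A minimal formalization of the claim in this post, on its own assumptions: every galaxy shares the same uniform acceleration $a$, and a galaxy at distance $D$ is seen with light-travel delay $\tau = D/c$ (the symbols here are illustrative, not from the thread). The observed velocity is then the velocity at the retarded time:

$$v_{\mathrm{obs}} = v(t-\tau) = a\,(t-\tau), \qquad \Delta v = v(t) - v_{\mathrm{obs}} = a\,\tau = \frac{a\,D}{c}.$$

On these assumptions the velocity deficit grows linearly with distance in every direction; whether such a deficit would actually be observed as recession is the question disputed in the following posts.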
Strange Posted November 11, 2015 (edited)
Because the galaxy is far away, you do not observe it "in the present" but as it was in the past. The galaxy in ancient times had the same acceleration, but not the same velocity. IOW, whatever the direction, the galaxy will look like it is going away.
You keep saying this but provide no justification. Why would seeing the light delayed by X years cause it to appear to be moving away? Feel free to show the appropriate maths.
Edited November 11, 2015 by Strange
michel123456 Posted November 11, 2015
Because the source of the light has a lower velocity than you do.
Strange Posted November 11, 2015
Because the source of the light has a lower velocity than you do.
And ... ? Are you thinking of Doppler shift or time dilation? Or something else? Whichever it is, please show (mathematically, no more handwaving please) that this effect (a) exists; (b) is the same as that you claim for objects ahead of and behind you; and (c) changes linearly with distance.
DevilSolution Posted November 12, 2015
When's it due? Must be soon, has its waters dropped yet? ()(O.o)()
michel123456 Posted November 12, 2015 (edited)
Let's go to the beginning again. You have 3 objects. One has velocity V1, the other V2, the last V3, where V1 < V2 < V3. They are aligned and heading in the same direction with the configuration V1-V2-V3. Do you agree that V2 will observe V1 and V3 going away?
Edited November 12, 2015 by michel123456
Strange Posted November 12, 2015
Do you agree that V2 will observe V1 and V3 going away?
Maybe. If you mean that they are all moving on the same path, one behind the other, then yes. But: 1) That has nothing to do with what I was asking. 2) In the case of free fall, you haven't yet shown mathematically that their speed of separation is proportional to separation distance. But I am happy to leave that for the time being.
michel123456 Posted November 12, 2015
Maybe. If you mean that they are all moving on the same path, one behind the other, then yes. But: 1) That has nothing to do with what I was asking. 2) In the case of free fall, you haven't yet shown mathematically that their speed of separation is proportional to separation distance. But I am happy to leave that for the time being.
In the case of free fall I don't know. It may not be proportional to the distance. It is proportional to the delay, and that is sufficient for me. I am escaping from the free fall example at this point and jumping into cosmology, where we know that the delay is linearly proportional to distance.
Strange Posted November 12, 2015
In the case of free fall I don't know. It may not be proportional to the distance.
It would be very easy to work out (but I am not going to do it for you).
I am escaping from the free fall example at this point and jumping into cosmology, where we know that the delay is linearly proportional to distance.
Well, you might say that: of course delay is proportional to distance, because the speed of light is constant. But hang on, no it isn't: because the universe is expanding, it takes proportionately longer for light to get to us from more distant objects. And you still haven't explained why the raindrops (or galaxies) on either side of you would appear to be receding ...
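For reference, the free-fall calculation alluded to here is short. A sketch, for two drops released $\Delta t$ apart from rest with constant acceleration $a$, where $s(t)$ is their separation at time $t$ after the first release:

$$s(t) = \tfrac{1}{2}a t^{2} - \tfrac{1}{2}a\,(t-\Delta t)^{2} = a\,\Delta t\left(t - \tfrac{\Delta t}{2}\right), \qquad \frac{ds}{dt} = a\,\Delta t = \frac{s}{t - \Delta t/2}.$$

So the separation velocity is constant in time and proportional to the release delay $\Delta t$, not to the separation $s$; only when different drops are compared at one fixed instant $t$ is it roughly proportional to $s$ (with factor about $1/t$ when $t \gg \Delta t$).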
michel123456 Posted November 12, 2015 (edited)
At the risk of repeating myself: because a long time ago their velocity was less than ours today. Or because a long time ago their velocity was more than ours (in my paradigm that is not allowable). The only thing that could eventually happen is to find a galaxy that had, a long time ago, the exact same velocity that we have today. You must imagine the coincidence: the same velocity at exactly the right time and spot. But this situation is not allowable either, because in my paradigm all galaxies are in a state of the same acceleration and the same velocity at any particular instant in the history of the universe, and travelling on parallel paths towards the same direction. Which means that all observable galaxies around us have less velocity than us, simply because what we see of them is ancient.
Edited November 12, 2015 by michel123456
Strange Posted November 12, 2015
At the risk of repeating myself: because a long time ago their velocity was less than ours today.
At the risk of repeating myself: why would this make them look as if they were receding? It is no use simply repeating that it would. You need to explain what mechanism is involved and then show the maths for how you calculate the speed at which they appear to be receding.
michel123456 Posted November 13, 2015 (edited)
At the risk of repeating myself: why would this make them look as if they were receding? It is no use simply repeating that it would. You need to explain what mechanism is involved and then show the maths for how you calculate the speed at which they appear to be receding.
There is no need. Take random velocities in parallel motion. Take the same starting point in time. Don't you see that it is an expanding model? What maths do you need to show that?
-----------
There is no need to CALCULATE the velocity; the fact that the velocities are different is enough.
Edited November 13, 2015 by michel123456
Strange Posted November 13, 2015
There is no need. Take random velocities in parallel motion. Take the same starting point in time. Don't you see that it is an expanding model?
No. It seems a completely random claim. I see absolutely no basis for it at all. And we are not talking about random velocities, but two things moving side by side at the same velocity. What, exactly, will make them appear to be moving apart? What mechanism are you suggesting?
What maths do you need to show that?
The maths that shows that they appear to be moving apart, even though they aren't. (And that the speed of this apparent movement is proportional to distance.)
There is no need to CALCULATE the velocity; the fact that the velocities are different is enough.
So you keep claiming. If it is so obvious, why
michel123456 Posted November 13, 2015 (edited)
No. It seems a completely random claim. I see absolutely no basis for it at all.
Take 2 objects. They start at the same instant in the same direction with velocities V1 & V2, where V1 is different from V2.
Question: do the 2 objects
1. keep a constant distance between each other?
2. get closer to each other?
3. go away from each other?
Edited November 13, 2015 by michel123456
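For the record, the one-line kinematics behind this question, assuming both objects start from the same point (the post does not say so explicitly):

$$s(t) = \lvert V_1 - V_2 \rvert\, t,$$

which grows without bound whenever $V_1 \neq V_2$.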
Strange Posted November 13, 2015
Take 2 objects. They start at the same instant in the same direction with velocities V1 & V2, where V1 is different from V2.
Question: do the 2 objects
1. keep a constant distance between each other?
2. get closer to each other?
3. go away from each other?
But we are talking (I thought) about two objects falling side by side and therefore with the same velocity (at any instant). (Obviously if they have different velocities then they will not stay the same distance apart. But that does not appear to be relevant.)
michel123456 Posted November 13, 2015 (edited)
But we are talking (I thought) about two objects falling side by side and therefore with the same velocity (at any instant). (Obviously if they have different velocities then they will not stay the same distance apart. But that does not appear to be relevant.)
OK. Answer 3.
-----------------
Now: the 2 objects falling parallel, side by side, will stay at the same distance from each other. But do you agree that if the 2 accelerated objects have a delay, say 1 sec, they will observe each other going apart?
Edited November 13, 2015 by michel123456
Strange Posted November 13, 2015
Do you agree that if the 2 accelerated objects have a delay, say 1 sec, they will observe each other going apart?
No. That is why I keep asking you to explain why you think that.
michel123456 Posted November 13, 2015 (edited)
No. That is why I keep asking you to explain why you think that.
Your failure to understand is beyond my understanding. If there is a delay, that means the 2 objects have different velocities, which takes us back to the former example.
Edited November 13, 2015 by michel123456
zapatos Posted November 13, 2015
Your failure to understand is beyond my understanding. If there is a delay, that means the 2 objects have different velocities, which takes us back to the former example.
What do you mean when you say 'there is a delay'? A delay between what and what? Are you talking about a delay in the time it takes for a signal to be sent from object 1 to object 2? Also, are you talking about objects that are starting with the same velocity?
michel123456 Posted November 13, 2015 (edited)
What do you mean when you say 'there is a delay'? A delay between what and what? Are you talking about a delay in the time it takes for a signal to be sent from object 1 to object 2? Also, are you talking about objects that are starting with the same velocity?
A delay between the start times. Object A starts with acceleration a. Object B starts with the same acceleration a, but 1 sec later, in the same direction, on a parallel path.
---------------------------
The 1 sec delay will remain the same all along. But since the objects are accelerating, the 1 sec gap will correspond to an increasing distance.
Edited November 13, 2015 by michel123456
Strange Posted November 13, 2015
If there is a delay, that means the 2 objects have different velocities, which takes us back to the former example.
But the former example is for objects in a line:
A -----> B ---------> C ---------------->
Where I agree that the distance between them will increase. But you are talking about things side by side (Scenario 1):
A ---->
B --->
We are on A, we look at B and (as you say) we will see it as it was in the past. So we should see it having a lower velocity than it actually does now. But it is travelling parallel to us. Why does it look as if it is moving away? Now, I can see that you might argue that we will see it fall behind because we are seeing it when it was moving more slowly. So after a while we see (Scenario 2):
A ----->
B ---->
BUT...
1) We don't see distant galaxies moving, we only see redshift. There would be no redshift when the galaxies are next to each other (Scenario 1).
2) I don't believe (and you need to demonstrate) that this effect would be the same for galaxies to the side of us as it is for galaxies the same distance ahead of and behind us.
zapatos Posted November 13, 2015 (edited)
(Edit: This was in response to michel's last post)
And the assumption is their acceleration is continuous? If so, then I agree the distance between them will increase. In the first second, object A's average velocity is greater than the average velocity of object B. In the second second, object A's average velocity is again greater than the average velocity of object B. And so on. Therefore the distance between them will increase.
Edited November 13, 2015 by zapatos
michel123456 Posted November 13, 2015 (edited)
(Edit: This was in response to michel's last post)
And the assumption is their acceleration is continuous? If so, then I agree the distance between them will increase. In the first second, object A's average velocity is greater than the average velocity of object B. In the second second, object A's average velocity is again greater than the average velocity of object B. And so on. Therefore the distance between them will increase.
Thank you. Yes, the acceleration is continuous, see post 14 http://www.scienceforums.net/topic/91928-universe-contracting/?p=890460
But the former example is for objects in a line:
A -----> B ---------> C ---------------->
Where I agree that the distance between them will increase. But you are talking about things side by side (Scenario 1):
A ---->
B --->
We are on A, we look at B and (as you say) we will see it as it was in the past. So we should see it having a lower velocity than it actually does now. But it is travelling parallel to us. Why does it look as if it is moving away? Now, I can see that you might argue that we will see it fall behind because we are seeing it when it was moving more slowly. So after a while we see (Scenario 2):
A ----->
B ---->
The scenario is the following:
Start Time0 : A0 -----> A1 ---------> A2 ----------------> A3 ------------------------------> A4 -------------------------------------------------> A5
Start Time1 : B1 -----> B2 ---------> B3 ----------------> B4 ------------------------------> B5
Start Time2 : C2 -----> C3 ---------> C4 ----------------> C5
At Time2, A2 observes B2 & C2. At Time5, A5 observes B5 & C5: the distance has increased.
BUT...
1) We don't see distant galaxies moving, we only see redshift. There would be no redshift when the galaxies are next to each other (Scenario 1).
2) I don't believe (and you need to demonstrate) that this effect would be the same for galaxies to the side of us as it is for galaxies the same distance ahead of and behind us.
1) In cosmology the delay is a function of distance. IOW, the other galaxies did not "start" at another time; the delay is caused by the finite speed of light. AND the redshift is a function of distance also. So indeed we observe less redshift for galaxies closer to us.
2) That has been your question from the beginning. I think for galaxies side by side, maybe the small diagram above will help. For galaxies ahead, what we observe behind us is the same as what we observe in front: the system is reversible. I mean, what A5 observes looking at B5 is the same as what B5 observes looking at A5. Except that the cosmological delay restricts us to seeing only galaxies in the past, that is to say only galaxies behind us. That is not easy to put on a diagram, because on paper the delay in time results in a shift in space and makes things blurry.
Edited November 13, 2015 by michel123456
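A small numerical rendering of the staggered-start diagram above. This is a sketch only: the diagram gives no units, so the acceleration of 10 m/s² and the start times 0, 1 and 2 s are illustrative assumptions, and the helper pos is mine:

```python
# Objects A, B, C share the same constant acceleration but start at
# t = 0 s, 1 s and 2 s respectively (Start Time0/Time1/Time2 in the diagram).
a = 10.0  # m/s^2, illustrative value only

def pos(t, t0):
    """Distance travelled by global time t by an object that started at t0."""
    dt = t - t0
    return 0.5 * a * dt**2 if dt > 0 else 0.0

starts = {"A": 0.0, "B": 1.0, "C": 2.0}
for t in (2.0, 5.0):  # the two observation times in the diagram
    a_pos = pos(t, starts["A"])
    print(f"Time{t:.0f}:  A-B gap = {a_pos - pos(t, starts['B']):5.1f} m, "
          f"A-C gap = {a_pos - pos(t, starts['C']):5.1f} m")
```

Between Time2 and Time5 the A-B gap grows from 15 m to 45 m (10 m/s) and the A-C gap from 20 m to 80 m (20 m/s): in this toy model the rate of separation is proportional to the start delay, which is the pattern the diagram is meant to show.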