I am not very knowledgeable on this subject, but from what I understand, a clock moving at some velocity relative to a synchronized clock at rest on the ground will "slow down" and be out of sync with the ground clock once the motion is over, and this is something predicted by special relativity.
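If I have it right, the size of the effect is given by the standard time dilation formula (I'm quoting it as I've seen it written, so apologies if my notation is off):

$$\Delta t_{\text{moving}} = \Delta t_{\text{ground}}\,\sqrt{1 - \frac{v^2}{c^2}},$$

so that, for example, a clock moving at $v = 0.6c$ would tick off only $\sqrt{1 - 0.36} = 0.8$ seconds for every second elapsed on the ground clock.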
Now my question is regarding how time is actually measured, and I guess it relates to the nature of time itself. A clock does not measure time by somehow tapping into whatever we call "time"; it is just an arbitrary measure of intervals, using a quartz crystal, an atom, or whatever other standard you want to use. If you want to measure the velocity of a stream of water, you can literally put something into the stream, or "tap" into it, to find its velocity. Clocks do not tap into the "stream" of time, do they? They are driven by energy, such as a battery that powers the clock. If the battery weakens, the clock may slow down or stop altogether, but this does not mean that time is slowing down or stopping.

So how exactly does the slowing of a clock in the situation above show that time itself is changing or slowing down? Is it not just a slowing of the physical mechanisms operating the clock? The quartz crystal is vibrating or ticking more slowly, or the atom (I don't know how atomic clocks work) is just "ticking" more slowly. If an object is travelling at a certain velocity and then slows to a lesser velocity, that does not mean that time has slowed down. I just can't seem to grasp how a slowed clock can provide evidence for slowed time.

I'm not trying to come up with a new theory of time or trying to disprove relativity; I am just confused on this particular issue, and any help is welcome. I've put some concrete numbers below to show where my confusion lies.
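To make this concrete (the numbers are just an illustration): a typical quartz watch crystal oscillates at $2^{15} = 32\,768$ Hz, and the electronics simply count oscillations, calling every $32\,768$ of them one second. If a weak battery made the crystal run slow by one part in $32\,768$, the watch would lose about

$$\frac{86\,400 \ \text{s/day}}{32\,768} \approx 2.6 \ \text{s per day},$$

but nobody would conclude that time itself had slowed; we would just say the oscillator is misbehaving. Why is the relativistic case any different?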