herme3 Posted February 6, 2005 I was wondering, is there a limit on how far radio waves can travel? Is there ever a point where a radio wave signal disappears completely and can't be recovered no matter how powerful the receiver is? For example, my computer's wireless network card and the wireless receiver connected to my other computer both have a maximum range of 1 mile. This means that if I moved the computers more than one mile apart, I would not be able to send a file from one computer to the other. However, if I had a stronger receiver on my other computer, would I be able to increase the distance the computers could be apart, or would the radio wave transmitted by my computer completely disappear after one mile?
timo Posted February 6, 2005 There is no general limit on how far electromagnetic waves (radio waves) travel. However, due to absorption in the air, and because radio waves do not travel in a single straight line but spread out in all directions, the intensity of the signal decreases with increasing distance. The intensity I(d) at distance d should, to a first approximation, behave something like I(d) = C/d² · exp(-kd), where C is a constant that depends on the power of your transmitter and k is a constant that describes the absorption in the air.

So, as the intensity of your signal decreases with distance, you have two options to still read the signal:

a) A better receiver. The better your receiver, the smaller the signal intensities you can still detect. Due to background noise, however, there should be a lower limit to the intensities you can receive, and I'd guess that today's receivers are able to reach that lower limit. Below that level you can still detect the signal by filtering if you have some information about it beforehand (like knowing it's a periodic signal), but that's another topic.

b) A more powerful transmitter. Given an unlimited power source, you could simply increase the original power of your signal so much that it still has sufficient intensity when it reaches the receiver. Theoretically you can send over any distance given a sufficient power source.

To sum up: the signal your computer sends does not completely disappear, but its intensity will become so low that it is undetectable against the background noise.
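A minimal numeric sketch of the intensity model described above, with made-up values for C and k (both are assumptions chosen purely for illustration, not real hardware figures):

```python
import math

def intensity(d, C=1.0, k=0.001):
    """Relative signal intensity at distance d (metres), following
    I(d) = C/d^2 * exp(-k*d): inverse-square spreading plus exponential
    absorption. C and k are placeholder constants for illustration only."""
    return C / d**2 * math.exp(-k * d)

# The intensity drops off quickly with distance but never reaches exactly zero.
for d in (10, 100, 1_000, 10_000):
    print(f"d = {d:>6} m  ->  I = {intensity(d):.3e}")
```

Whatever values you plug in, I(d) stays positive; it only becomes arbitrarily small, which is the point of the argument above.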
herme3 Posted February 6, 2005 So, if I send a file from one computer to another using a wireless network, could someone on a planet billions of miles away receive the file if their receiver were strong enough and they filtered out all the background noise?
Cap'n Refsmmat Posted February 6, 2005 Only if the receiver had a huge amplifier and could filter out EVERYTHING else. The signal would be ridiculously weak by the time it got to the receiving alien.
timo Posted February 6, 2005 Posted February 6, 2005 Clasically: Yes. But at some point the intensity will be so small that quantization of radiation will play a role (I did not take into account this in above). So when the intensity of the signal falls below a certain level there is a chance to miss the wave (or part of) - regardless of the quality of the reciever. EDIT: But I´d like to add that this quantum mechanical limit is probably only a theoretical one. I would be very surprised if today´s recievers came even close to that limit.
herme3 Posted February 7, 2005 I just find it interesting that radio waves never disappear completely. I thought radio waves and light were made of photons. Don't the photons in a radio wave break off or get absorbed into other matter until the radio wave completely disappears?
timo Posted February 7, 2005 Posted February 7, 2005 They do get absorbed in matter, yes. The absorbtion was accounted for in the exp(-kd) term in my 1st post. It´s just an approximation that assumes that for each distance interval traveled a certain percentage of the photons are absorbed. However, there will allways be a certain percentage of photons that aren´t absorbed, no matter how far the wave has travelled. This is in analogy of radioactive decay. The number of objects N(t) that didn´t decay till time t is N(t) = N(t=0)*exp(-kt). At no time there will be a certain chance that all objects decayed. However, since N(t=0) is a finite number there will be a time T where N(T)<1 which means (a bit losely speaking) that you cannot be sure to have any object left. This is what I meant with the chance to miss the wave. The wave will disappear due to absorbtion but for all distances there is a finite chance that it doesn´t.
YT2095 Posted February 8, 2005 Try to think of the radio transmitter like a candle: when you're close to the candle you can see the light quite well, but the further away you get, the less light reaches you, and so on. Eventually the candle could be so far away that you'd need a telescope to see it (like an amplifier in a radio). Radio waves, like light, can also be "blocked" by particles (you can see the candle 100 metres away, but not if it were foggy outside). So there are two ways a signal may be diminished. In a vacuum (like outer space), "fog" wouldn't be the issue; it would be the distance alone that decreases the light (signal strength).