Hello,
I've run into a (probably very easy) problem while writing about information theory and the transmission of information, and I'm stuck. Since it is a rather specific area of science, I'll give a brief introduction so that somebody who isn't familiar with it can help me as well.
This is an excerpt that I am having trouble understanding.
"For simplicity, we will assume that our source emits symbols into the channel at a rate of one per second.
The transmission rate R is defined to be the average number of bits of information transmitted across the channel per second (the observant reader will note that R is nothing other than the mutual information between the source and the receiver).
Suppose that we transmit information for t seconds; then we transmit, on average, tR bits of information.
If we are using a binary code, then the average total number of symbols transmitted in t seconds is floor(2^{tR})."
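To make sure I'm reading it correctly, here is the excerpt in symbols as I understand it (the expansion of the mutual information into entropies is my own addition, not part of the excerpt):

R = I(X;Y) = H(X) - H(X|Y)   [bits per second, since one symbol is emitted per second]
bits transmitted in t seconds = tR
"average total number of symbols" transmitted in t seconds = floor(2^{tR})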
My problem is this.
Let's take the simplest case, where t = 1 and we have a noiseless channel, so R = 1 as well.
So, according to the formula, the average total number of symbols transmitted in one second is 2^{tR} = 2^1 = 2, but the source emits symbols into the channel at a rate of one per second. The source sends one symbol per second, yet two are transmitted. How can that be?
Or take t = 3 and R = 1: we transmit information for three seconds, and the source emits one symbol per second, yet somehow the average number of symbols transmitted is 2^{tR} = 2^3 = 8. What am I missing?
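Just to make my arithmetic explicit, here is a small Python sketch of how I am computing these numbers. The noiseless binary channel with a uniform, independent input is my own assumption; the excerpt itself only says "binary code".

```python
import math

def mutual_information(p_joint):
    """I(X;Y) = sum over x,y of p(x,y) * log2( p(x,y) / (p(x) * p(y)) )."""
    px = [sum(row) for row in p_joint]        # marginal of the source X
    py = [sum(col) for col in zip(*p_joint)]  # marginal of the receiver Y
    mi = 0.0
    for i, row in enumerate(p_joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# My assumption: noiseless binary channel with uniform input, one symbol per second,
# so the joint distribution puts probability 1/2 on (0,0) and 1/2 on (1,1).
p_joint = [[0.5, 0.0],
           [0.0, 0.5]]

R = mutual_information(p_joint)  # 1.0 bit per symbol, i.e. 1.0 bit per second here
for t in (1, 3):
    print(t, t * R, math.floor(2 ** (t * R)))
# prints:  1 1.0 2   and   3 3.0 8
```

The last line reproduces exactly the 2 and the 8 from above, which is what I cannot reconcile with a source that only emits one symbol per second.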
Thank you in advance!