Measuring Information


Quetzalcoatl


Hi,

 

Recently, I've been looking into the subject of information. After seeing a video on fractals, I figured that a fractal is, in some sense, a dimension with missing pieces. The missing pieces can then be interpolated, if one likes, though without adding any new information.

 

The example I've been looking at is the function f:R->R, f(x)=sin(x).

It seems obvious that f, being a periodic function, would contain less information than some other function, say, sinc(x) (defined at zero to equal its limit).

It also seems logical that the function f(x)=0 would have even less information than both.

 

You could say that whenever there is less information spread over an entire dimension (like the x-axis), the limited information needs a rule/symmetry that tells us how to fill in the gaps. In the sine's case we say that sin(x)=sin(x+2pi). Of course, one has to somehow count the information contributed by the rule itself.
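To make that concrete, here's a toy sketch (my own illustration, assuming numpy): store only one period of sin plus the periodicity rule, and you can regenerate the function anywhere. The stored samples plus the rule are the "limited information".

```python
import numpy as np

# Store only one period of sin(x) on [0, 2*pi) -- the "limited information".
xs = np.linspace(0.0, 2 * np.pi, 1000, endpoint=False)
one_period = np.sin(xs)

def fill_in(x):
    """Use the rule sin(x) = sin(x + 2*pi) to extend one period everywhere."""
    # Reduce x modulo 2*pi, then look up the nearest stored sample.
    idx = int((x % (2 * np.pi)) / (2 * np.pi) * len(one_period))
    return one_period[idx]

# The reconstruction matches the full function up to sampling error,
# even far outside the stored interval.
print(abs(fill_in(100.0) - np.sin(100.0)))
```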

 

All this led me to the conclusion that the more unpredictable a function is, the more information it contains, so noise actually has the most information one can get. :confused:
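That actually matches Shannon's definition: the "surprise" of an outcome with probability p is -log2(p), and a source's entropy is its average surprise, which is maximized when all outcomes are equally likely, i.e. by uniform noise. A quick sketch:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the expected surprise -log2(p) of a distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit per flip (maximal "noise")
print(entropy([0.9, 0.1]))  # biased coin: ~0.47 bits -- more predictable
print(entropy([1.0, 0.0]))  # constant outcome: 0 bits, like f(x) = 0
```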

 

Maybe physical laws and symmetries are just a cover for the lack of information in the universe, or, in an equivalent description, maybe the universe's dimensions are not full but are actually fractals?

 

I'd like to hear any thoughts and comments you people have about this... :)


so, noise actually has the most information one can get.
Yup.

 

For a simple thing like a string of characters, it contains more information if it takes more data to describe it. This is an okay introduction.

 

Maybe physical laws and symmetries are just a cover for the lack of information in the universe
Kinda. There's an idea in biology that a lack of information causes a default towards symmetry.

All this led me to the conclusion that the more unpredictable a function is, the more information it contains, so noise actually has the most information one can get. :confused:

 

Correct. It takes more information to accurately describe noise. Whether the noise conveys anything or not is a different question. Incidentally, compression results in a string of data that looks almost like noise.
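You can see both effects with zlib (a sketch of my own; the particular sine/noise byte strings are just illustrative):

```python
import math
import random
import zlib

random.seed(0)
n = 4096

# Quantize a periodic signal and pure noise into byte strings of equal length.
sine = bytes(int(127 * (1 + math.sin(2 * math.pi * i / 64))) for i in range(n))
noise = bytes(random.randrange(256) for _ in range(n))

print(len(zlib.compress(sine)))   # tiny: "one period + repeat" describes it all
print(len(zlib.compress(noise)))  # near 4096: incompressible
# Compressing compressed data gains almost nothing --
# it already looks statistically like noise.
print(len(zlib.compress(zlib.compress(noise))))
```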


How do you measure the "surprise" in a message/function? My first guess was that predictability has something to do with auto-correlation, but that doesn't seem to be enough: independence implies zero correlation, but zero correlation doesn't necessarily imply independence. I feel I need to measure the auto-dependence of a function to measure its predictability.

 

But how do I do that? How do I measure the auto-dependence (the "surprise") of a function?
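One standard answer is mutual information: estimate the joint distribution of x(t) and x(t+tau) and compute I(X;Y) in bits. It is zero exactly when the two are independent, and it catches nonlinear dependence that correlation misses. A rough histogram-based sketch (assuming numpy; serious estimators need more care with binning bias):

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of I(X;Y) in bits: ~0 when x and y are independent."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])).sum())

rng = np.random.default_rng(0)
t = np.arange(100000)
sine = np.sin(0.1 * t)                # predictable: x(t) pins down x(t + lag)
noise = rng.standard_normal(t.size)   # independent samples

lag = 5
print(mutual_information(sine[:-lag], sine[lag:]))    # large: strong auto-dependence
print(mutual_information(noise[:-lag], noise[lag:]))  # near zero
```

The same estimator also flags the classic zero-correlation case: for symmetric noise x, the pair (x, x**2) has correlation near zero but large mutual information.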

