

Posted

Hi all,

I know the equations and definitions etc. of entropy in information theory, but I wanted to know what it means for something to have an entropy of 0.875 bits, or 30 bits, and so on. I've heard that passwords need to have higher entropy. And what does it mean when people say the English language has an entropy of about 2.5 bits?
What do these bits convey?



-Devanand T

Posted (edited)

I haven't done this since back in school. But if my memory serves me, the entropy value in this case is the minimum number of bits needed, on average, to encode one symbol of a message.
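
To make that concrete, here's a minimal sketch (Python is my choice here, and the function name is mine) of the usual formula, H = -sum(p * log2(p)): the more predictable the symbols, the fewer bits per symbol you need.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less,
# because the outcome is more predictable.
print(entropy([0.5, 0.5]))   # 1.0 bit
print(entropy([0.9, 0.1]))   # ~0.47 bits
print(entropy([0.25] * 4))   # 2.0 bits (four equally likely symbols)
```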

 

So if you say the English language has an entropy of 2.5 bits per character (which sounds high btw), that means a perfectly encoded message will be about n*2.5 bits long, where n is the number of characters.
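
As a rough illustration of that n*H idea (this just counts single-character frequencies in an arbitrary sample string, so it's only an estimate, not real English statistics):

```python
from collections import Counter
from math import log2

def per_char_entropy(text):
    """Estimate bits per character from single-character frequencies."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

sample = "the quick brown fox jumps over the lazy dog"
h = per_char_entropy(sample)
print(f"~{h:.2f} bits/char -> ideally ~{h * len(sample):.0f} bits "
      f"for {len(sample)} characters")
```

The real per-character entropy of English is lower than a single-letter estimate like this, since letters are correlated (think of how predictable "q_" is), which is why figures you see quoted vary depending on how much context is taken into account.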

 

Again, I'm pulling all this from memory and could be wrong.

 

http://en.m.wikipedia.org/wiki/Entropy_(information_theory)

Edited by pwagen
