dexterdev Posted April 8, 2013

Hi all, I know the equations and definitions of entropy in information theory, but I want to understand what it means for entropy to be 0.875 bits, 30 bits, etc. I've heard that passwords need to have high entropy, and that the English language has an entropy of about 2.5 bits. What do these bits convey?

-Devanand T
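On the password side of the question: one common way to interpret "bits of entropy" is as the base-2 logarithm of the number of equally likely possibilities an attacker must search. A short illustrative sketch (the alphabet size and length here are my own example numbers, not from the thread):

```python
from math import log2

# Entropy of a password chosen uniformly at random from an alphabet:
#   bits = length * log2(alphabet_size)
# Example values below are illustrative assumptions.
alphabet_size = 26 + 26 + 10   # lowercase + uppercase + digits = 62 symbols
length = 12                    # 12 characters chosen independently

bits = length * log2(alphabet_size)
print(f"{bits:.1f} bits of entropy")  # ~71.5 bits
```

Each extra bit doubles the number of guesses a brute-force attacker needs on average, which is why "higher entropy" is shorthand for "harder to guess". Note this only holds if the password really is chosen uniformly at random; human-chosen passwords have far less entropy than this formula suggests.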
pwagen Posted April 8, 2013 (edited)

I haven't done this since back in school, but if my memory serves me, the entropy value in this case is the minimum average number of bits needed to encode one piece of information in a message. So if you say the English language has an entropy of 2.5 bits per character (which sounds high, btw), that means a perfectly encoded message will have a length in bits of about n*2.5, where n is the length of the message in characters. Again, I'm pulling all this from memory and could be wrong. http://en.m.wikipedia.org/wiki/Entropy_(information_theory)

Edited April 8, 2013 by pwagen
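The definition pwagen is recalling is Shannon entropy, H = -Σ p(x) log2 p(x), summed over the symbol probabilities. A minimal sketch that estimates it from a string's character frequencies (my own illustration, not code from the thread):

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Shannon entropy, in bits per character, of the character
    distribution observed in `text`: H = -sum(p * log2(p))."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Four equally likely symbols need log2(4) = 2 bits each:
print(shannon_entropy("abcd"))  # 2.0
# A message with only one symbol carries no information:
print(shannon_entropy("aaaa"))  # 0.0
```

This single-character frequency estimate overstates the true entropy of English, since it ignores correlations between neighbouring letters; Shannon's experiments with longer contexts gave roughly 1 to 1.5 bits per character.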
dexterdev Posted April 8, 2013 (Author)

The link you gave is related to thermodynamics, not information theory.
pwagen Posted April 8, 2013

Try again, the link was cut off.