Luke0292 Posted January 20, 2019 Hi everybody, I have a question about machine learning: I'm not sure how joint entropy and mutual information work. Since: https://imgur.com/mZU383m For the first equation, we have: https://imgur.com/C1zIHFT Here H(x,y) seems to be 'everything that is not in common between x and y'. But for the second: https://imgur.com/a/3iWIyps In this case H(x,y) cannot be 'everything that is not in common between x and y', otherwise the result would not be the mutual information I(x,y). So, how should I read H(x,y)?
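The identities in the question can be checked numerically. A minimal sketch, using a small made-up joint distribution (the probability values are only for illustration): it computes H(X), H(Y), the joint entropy H(X,Y), and then I(X;Y) = H(X) + H(Y) - H(X,Y). Note that H(X,Y) comes out as the *total* uncertainty of the pair (the union of the two circles in a Venn diagram), which is why subtracting it from H(X) + H(Y) leaves exactly the overlap, I(X;Y):

```python
from math import log2

# Hypothetical joint distribution p(x, y) over X in {0,1}, Y in {0,1}.
# (Values chosen only for illustration; any valid joint distribution works.)
p_xy = {
    (0, 0): 0.4,
    (0, 1): 0.1,
    (1, 0): 0.1,
    (1, 1): 0.4,
}

# Marginal distributions p(x) and p(y), obtained by summing out the other variable.
p_x = {x: sum(p for (xx, _), p in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in p_xy.items() if yy == y) for y in (0, 1)}

def entropy(dist):
    """Shannon entropy in bits: H = -sum_i p_i * log2(p_i)."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

H_x = entropy(p_x)        # H(X)
H_y = entropy(p_y)        # H(Y)
H_xy = entropy(p_xy)      # H(X,Y): uncertainty of the pair taken together
I_xy = H_x + H_y - H_xy   # I(X;Y): the shared (overlapping) information

print(f"H(X)={H_x:.3f}  H(Y)={H_y:.3f}  H(X,Y)={H_xy:.3f}  I(X;Y)={I_xy:.3f}")
```

For independent variables H(X,Y) would equal H(X) + H(Y) and I(X;Y) would be 0; here the variables are correlated, so H(X,Y) < H(X) + H(Y) and the mutual information is positive.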
Sensei Posted January 21, 2019 (edited) It looks very similar to bitwise operations, e.g. AND, OR, XOR, NAND, NOT, etc. https://en.wikipedia.org/wiki/Bitwise_operation If you do the operations: 1 | 2 = 3 (binary %01 | %10 = %11) 3 & 2 = 2 (binary %11 & %10 = %10) etc. Bitwise operations can be applied not only to single bits, but also to vectors, lists, arrays, images, sounds, etc. (which are also just bunches of bits). XOR operator: %110 ^ %011 = %101 Edited January 21, 2019 by Sensei
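The bitwise examples above can be reproduced directly in Python (the `%` binary prefix in the post corresponds to Python's `0b` literal prefix):

```python
# OR: combining the set bits of both operands
assert 1 | 2 == 3              # %01 | %10 == %11

# AND: keeping only the bits set in both operands
assert 3 & 2 == 2              # %11 & %10 == %10

# XOR: keeping the bits set in exactly one operand
assert 0b110 ^ 0b011 == 0b101  # %110 ^ %011 == %101

print("all bitwise identities hold")
```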