
Posted

We know from information theory that the entropy of a function of a random variable X is less than or equal to the entropy of X: H(f(X)) ≤ H(X).


 

Does this break the second law of thermodynamics?
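The inequality H(f(X)) ≤ H(X) for a deterministic f can be checked numerically. A minimal sketch in Python (the distribution and the choice f(x) = x mod 2 are made-up examples, not from the thread): applying f merges outcomes of X, and the merged (pushforward) distribution can never have more Shannon entropy than the original.

```python
import math
from collections import defaultdict

def entropy(dist):
    """Shannon entropy in bits of a probability distribution {outcome: prob}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# A random variable X over {0, 1, 2, 3} with a non-uniform distribution.
p_x = {0: 0.5, 1: 0.25, 2: 0.125, 3: 0.125}

# Y = f(X) with f(x) = x mod 2: a deterministic function merges outcomes,
# giving the pushforward distribution of Y.
p_y = defaultdict(float)
for x, p in p_x.items():
    p_y[x % 2] += p

print(entropy(p_x))        # H(X) = 1.75 bits
print(entropy(dict(p_y)))  # H(f(X)) ≈ 0.954 bits, so H(f(X)) <= H(X)
```

Intuitively, f can only collapse distinct outcomes together, so it can destroy information about X but never create uncertainty beyond what X already had.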

 

 

Posted

No relationship: information theory calls the quantity "entropy" only by analogy with the thermodynamic one.

And in any case, the second law's statement that entropy increases applies only under certain conditions, such as in an isolated system.
