
Posted (edited)

I have some difficulty making sense of entropy. If I inspect a system operating under changing physical circumstances, such as increased or decreased temperature, the system constantly tries to adapt to the new circumstances in order to maintain equilibrium.

 

The question is why we call the state of equilibrium maximum entropy (or maximum disorder), when it seems to be maximum order relative to the physical circumstances in which we inspect the system. As I understand it, an isolated system will always approach equilibrium. But why do we call this state maximum disorder? At that equilibrium I can predict the relationships between the components of the whole system (like the density of a NaCl solution). If we speak about maximum disorder, wouldn't that mean that the system behaves chaotically and I cannot predict its properties?

 

 

Where my confusion originates:

 

Does information increase entropy?

 

If I inspect a human body and all of its molecular functions, the system is in constant change and motion. It is a very well orchestrated order, yet disorder of the system, presented as disease, is more likely to happen. In this case I would say that the entropy of a system increases with the amount of information present in its operations.

 

Another example: the concentration of white blood cells when a bacterium is present in the circulation. The equilibrium of the overall white blood cell concentration in the system has changed (disorder increased), so I would say entropy increased. As soon as the bacterium is eliminated, the system returns to equilibrium, where the WBC concentration is more balanced again. Order increases. Or does it?

 

What am I misunderstanding about entropy?

Edited by 1x0
Posted

 

1x0

Where my confusion originates:

 

It depends on what you understand the words entropy, order, and disorder to mean.

 

 

1x0

If we speak about maximum disorder, wouldn't that mean that the system behaves chaotically and I cannot predict its properties?

 

That's a rather extreme view.

Why would disorder being at a maximum make a system completely unpredictable?

Could there not be a scale of predictability?

Posted

No living thing is ever even close to thermodynamic equilibrium.

Trying to do much thermodynamics with the human body isn't going to get you anywhere, especially if you are not sure what entropy is.

Posted (edited)

Why do people always pick complex systems to try to understand entropy?

 

Take a box, with a partition down the middle separating two differing gases. The difference between the gases is arbitrary, it could be temperature, composition, or even colour. It is in equilibrium, but there is only one way to 'organize' this separation.

Now remove the partition from the box and the gases quickly arrive at a new equilibrium, whether a median temperature, a mixed composition, or even a mixed colour. However, there are now a multitude of ways, or 'organizations', for this mixing.

This increase in the number of ways to organize the system, or degrees of freedom, is a measure of the increase in entropy of the system (note also that with gases of differing temperatures, work can be done, but after the partition is removed and equilibrium is reached, no more work can be done, as there is no temperature difference).

 

This is what we mean by order and disorder, and how entropy increases in irreversible processes.
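To put rough numbers on the counting argument above, here is a minimal Python sketch of a toy lattice model of the two-gas box. The site and particle counts are arbitrary illustrative assumptions, not anything specified in the example, and it uses Boltzmann's relation S = k ln W to turn the counts into an entropy:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Toy lattice model: each half of the box has `cells` sites,
# gas A has n_a particles, gas B has n_b particles.
cells, n_a, n_b = 50, 10, 10

# Before the partition is removed: A is confined to the left half,
# B to the right half, so the arrangements in each half multiply.
W_before = math.comb(cells, n_a) * math.comb(cells, n_b)

# After removal: both gases can occupy any of the 2*cells sites.
W_after = math.comb(2 * cells, n_a) * math.comb(2 * cells - n_a, n_b)

# Boltzmann's relation S = k_B * ln(W): more ways to arrange the
# particles means higher entropy.
S_before = k_B * math.log(W_before)
S_after = k_B * math.log(W_after)

print(f"ways before: {W_before:.3e}, after: {W_after:.3e}")
print(f"entropy increase: {S_after - S_before:.3e} J/K")
```

Running it shows that W_after is several orders of magnitude larger than W_before: the entropy rises simply because the mixed state can be arranged in far more ways.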

Edited by MigL
