restfull Posted February 14, 2006 I was wondering if anyone could explain the underlying significance of Boltzmann's formulation of entropy, S = k ln(Ω), where Ω is the number of microstates. Why does the entropy increase with the log of Ω?
timo Posted February 15, 2006 Well, it's simply a definition, although it's chosen so that it matches the entropy defined in (non-statistical) thermodynamics. Some properties that make it look like a good choice to me:

1) As long as the map from the number of microstates to entropy is monotonically increasing (A > B => log(A) > log(B)), the very important axiom "the system will be in the macrostate with the most associated microstates" still translates to "entropy will be at a maximum".

2) It seems like a practical definition: you often encounter problems where you multiply numbers of microstates. Since log(A*B) = log(A) + log(B), entropy is an additive quantity for those problems. Sorry for being so vague in point 2, but I don't have a good example in mind right now; perhaps someone else does.

Either way: from the physics side, it doesn't really matter whether you take the log or not; it changes the equations, but it's still the same physical entity. Perhaps it's comparable to measuring temperature in Fahrenheit or Kelvin, only a tick more sophisticated.
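One concrete instance of point 2 is two independent systems: the combined system's microstate count is the product of the individual counts, so taking the log makes the entropies add. A minimal numerical sketch (the microstate counts here are made-up illustrative numbers, not from any real system):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Hypothetical microstate counts for two independent systems
omega_A = 1e20
omega_B = 3e15

# Boltzmann entropy of each system: S = k ln(omega)
S_A = k_B * math.log(omega_A)
S_B = k_B * math.log(omega_B)

# Independent systems: every microstate of A can pair with every
# microstate of B, so the counts multiply...
omega_total = omega_A * omega_B
S_total = k_B * math.log(omega_total)

# ...and because ln(A*B) = ln(A) + ln(B), the entropies simply add.
print(math.isclose(S_total, S_A + S_B))  # True
```

Without the log, the "entropy" of the combined system would be the product of the parts, which would clash with the additive entropy of classical thermodynamics.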