Ambition On Acid Posted August 26, 2011

Hello everyone. I have two questions that I've been turning over in my head; I hope someone can clarify them.

The first one: when a particle (or some other quantum system) is in an uncollapsed superposition of states, does each of these states, considered individually, have a different (presumably very small) entropy value from the others?

The second (unrelated) one: when the probability amplitudes of a wavefunction are normalised, must the sum always equal 1?

Thanks!
ajb Posted August 26, 2011

"The second (unrelated) one: when the probability amplitudes of a wavefunction are normalised, must the sum always equal 1?"

I will answer this one first; I will have to think about the first question. Maybe someone else has a nice answer to hand for you.

Anyway, we have decided that probability must lie between 0 and 1: zero means no chance and one means it must happen. As the modulus squared of the wave function is interpreted as a probability density, it makes perfect mathematical sense to normalise the wave function so that the probability of finding the particle somewhere is one. This is really tied to how we define probability rather than to some deep quantum meaning.
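A minimal numerical sketch of that normalisation condition (the Gaussian wave packet, the grid range, and the variable names are illustrative assumptions, not anything specified in the posts): rescale a discretised wave packet so that the summed probability equals 1, then check it.

```python
# Sketch only: discretised normalisation check (assumes NumPy; all numbers are illustrative).
import numpy as np

x = np.linspace(-10.0, 10.0, 2001)       # position grid
dx = x[1] - x[0]

psi = np.exp(-x**2 / 2.0)                # unnormalised Gaussian wave packet

# Normalise so that the total probability sum(|psi|^2) * dx equals 1.
norm = np.sqrt(np.sum(np.abs(psi)**2) * dx)
psi /= norm

total_probability = np.sum(np.abs(psi)**2) * dx
print(total_probability)                 # ~1.0: the particle must be found somewhere
```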
timo Posted August 26, 2011

"When a particle (or some other quantum system) is in an uncollapsed superposition of states, does each of these states, considered individually, have a different (presumably very small) entropy value from the others?"

Entropy is a property of a concept called an "ensemble" (or "macrostate" in less math-oriented fields), not a property of what is called a state (a "microstate") in introductory QM books.
Ambition On Acid (author) Posted August 26, 2011

"Entropy is a property of a concept called an "ensemble" (or "macrostate" in less math-oriented fields), not a property of what is called a state (a "microstate") in introductory QM books."

Okay, but if we were to treat the microstates like macrostates, would each one (hypothetically) have an independent entropy value?
timo Posted August 26, 2011

If you define ensembles that each contain only a single state, then the entropy of all those ensembles is zero by definition: in an ensemble where each element is equally likely, the entropy is just a constant times the logarithm of the number of elements (presumably the number of independent elements in QM, but that doesn't really matter here).
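Written out explicitly (the notation here is the standard Boltzmann form, not timo's own), that statement reads:

```latex
% Entropy of an ensemble of W equally likely (micro)states:
S = k_B \ln W
% A single-state ensemble has W = 1, so S = k_B \ln 1 = 0.
```

With only one element in the ensemble there is nothing left to be uncertain about, which is why the entropy comes out exactly zero.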
ajb Posted August 26, 2011

There is the notion of the von Neumann entropy, which uses the density matrix formulation of mixed states. The von Neumann entropy for a pure state is zero.
Ambition On Acid (author) Posted August 26, 2011

"There is the notion of the von Neumann entropy, which uses the density matrix formulation of mixed states. The von Neumann entropy for a pure state is zero."

So this pure state you're referring to is just the wavefunction?
timo Posted August 26, 2011

Strictly speaking: the other way round, the wavefunction is a representation of a pure state. Very loosely speaking: yes, a pure state is just a wavefunction.
Ambition On Acid (author) Posted August 26, 2011

"Strictly speaking: the other way round, the wavefunction is a representation of a pure state. Very loosely speaking: yes, a pure state is just a wavefunction."

So is it possible for a wavefunction to have an entropy higher than zero?
swansont Posted August 26, 2011

This might help. http://www2.ph.ed.ac.uk/teaching/course-notes/documents/70/1453-lecturesp14.pdf
ajb Posted August 26, 2011

"So is it possible for a wavefunction to have an entropy higher than zero?"

If the state is not a pure state, then the entropy is non-zero. The von Neumann entropy is a measure of how "mixed" the state is, and for finite-dimensional systems I think you can make this more precise. The von Neumann entropy is an important idea in quantum information theory.
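A small numerical sketch of this point (the function name, the qubit examples, and the use of NumPy are illustrative assumptions): the von Neumann entropy S(rho) = -Tr(rho ln rho) can be computed from the eigenvalues of the density matrix, and it comes out zero for a pure state but ln 2 for a maximally mixed qubit.

```python
# Sketch only: S(rho) = -Tr(rho ln rho) from the eigenvalues of rho (assumes NumPy).
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy of a density matrix, in nats."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]      # drop zero eigenvalues (0 * log 0 -> 0)
    return float(-np.sum(eigvals * np.log(eigvals)))

# Pure state |0><0|: entropy is zero.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])

# Maximally mixed qubit, rho = I/2: entropy is ln 2.
mixed = np.eye(2) / 2.0

print(von_neumann_entropy(pure))    # ~0.0
print(von_neumann_entropy(mixed))   # ~0.693
```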
Ambition On Acid (author) Posted August 26, 2011

"If the state is not a pure state, then the entropy is non-zero. The von Neumann entropy is a measure of how "mixed" the state is, and for finite-dimensional systems I think you can make this more precise. The von Neumann entropy is an important idea in quantum information theory."

So, to put this into simple layman's terms: a pure state is ONE wavefunction (Ψ), while a mixed state is an ensemble of several wavefunctions?
ajb Posted August 26, 2011

"So, to put this into simple layman's terms: a pure state is ONE wavefunction (Ψ), while a mixed state is an ensemble of several wavefunctions?"

Loosely, that is okay. A mixed state is a statistical ensemble of pure states.
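To connect this to the entropy sketch above (again, the weights, states, and names are illustrative assumptions, not anything from the posts): a mixed state is represented by a density matrix rho = sum_i p_i |psi_i><psi_i| built from the pure states of the ensemble, and its von Neumann entropy is non-zero whenever more than one p_i is non-zero.

```python
# Sketch only: build rho from a statistical ensemble of pure states (assumes NumPy).
import numpy as np

# Two pure qubit states, |0> and |+>, mixed with illustrative weights 0.7 and 0.3.
psi_0 = np.array([1.0, 0.0])
psi_plus = np.array([1.0, 1.0]) / np.sqrt(2.0)

ensemble = [(0.7, psi_0), (0.3, psi_plus)]

# rho = sum_i p_i |psi_i><psi_i|
rho = sum(p * np.outer(psi, psi.conj()) for p, psi in ensemble)

print(np.trace(rho))   # 1.0: a valid density matrix
# Feeding rho into the von_neumann_entropy sketch above gives a value between 0 and ln 2.
```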
Ambition On Acid (author) Posted August 26, 2011

"Loosely, that is okay. A mixed state is a statistical ensemble of pure states."

As I said: layman's terms.
ajb Posted August 26, 2011

You should just think of a large collection of pure states.