Everything posted by joigus
-
Physicists create compressible optical quantum gas
joigus replied to beecee's topic in Quantum Theory
"Degeneracy" in quantum mechanics means that states that differ in the values of other quantum numbers have the same energy. -
Physicists create compressible optical quantum gas
joigus replied to beecee's topic in Quantum Theory
"The photons begin to overlap" means that the photons, bosons that they are, begin to favour forming a common state (Bose condensation). The thing going on in BH's is the opposite quantum-degeneracy force: neutrons are fermions, so they cannot be in the same quantum state. In the case of BH formation, quantum-degeneracy force is overcome by gravitation. -
How can information (Shannon) entropy decrease ?
joigus replied to studiot's topic in Computer Science
Thanks!!! These notes are quite a bit more complete than the video. -
How can information (Shannon) entropy decrease ?
joigus replied to studiot's topic in Computer Science
Yeah, that's a beautiful idea, and I think it works. You can also subdivide reversible processes into infinitesimally small irreversible sub-processes and prove it for them too. I think people realised very early on that entropy must be a state function, and the arguments were very similar. I tend to look at these things more from the point of view of statistical mechanics. Boltzmann's H-theorem has suffered waves of criticism through the years, but it seems to me to be quite robust. https://en.wikipedia.org/wiki/H-theorem#Criticism_and_exceptions I wouldn't call these criticisms "moot points," but I think all of them rest on some kind of oversimplification of physical systems. A note that some members may find interesting: I've been watching a lecture by John Baez dealing with Shannon entropy and the second principle of thermodynamics in the context of studying biodiversity. It seems that people merrily use Shannon entropy to run ecosystem simulations. Baez says there is generally no reason to suppose that biodiversity-related Shannon entropy reaches a maximum, but there are interesting cases where this is true; namely, when there is a dominant, mutually interdependent cluster of species in the whole array of initial species.
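To make the diversity measure concrete, here is a minimal Python sketch with made-up species proportions (my own toy numbers, nothing taken from Baez's lecture): the Shannon entropy of a community is largest when the abundances are spread out evenly.

```python
import numpy as np

def shannon_entropy(proportions):
    """Shannon entropy -sum p ln p of a set of species proportions (zeros ignored)."""
    p = np.asarray(proportions, dtype=float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log(p))

even   = [0.25, 0.25, 0.25, 0.25]   # four equally abundant species
skewed = [0.85, 0.05, 0.05, 0.05]   # one dominant species

print(shannon_entropy(even))    # ln(4) ~ 1.386, the maximum for four species
print(shannon_entropy(skewed))  # ~ 0.59, lower diversity
```
-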
Exiobiology and Alien life:
joigus replied to beecee's topic in Evolution, Morphology and Exobiology
My thoughts exactly. -
Yeah, there's an element of revenge there, I think. Tit for tat is very strongly etched in our behaviours, biologically. It's very hard to let go of it. And thanks, DR.
-
Brilliant points there!
-
How can information (Shannon) entropy decrease ?
joigus replied to studiot's topic in Computer Science
Actually, I don't think you're overthinking this at all. It does go deep. The \( P\left(\alpha,t\right) \)'s must be a priori probabilities or propensities and do not correspond to equilibrium states at all. I'm not familiar with this way of dealing with the problem. The way I'm familiar with is this: these \( P\left(\alpha,t\right) \)'s can be hypothesized by applying the evolution equations (classical or quantum) to provide you with quantities that play the role of evolving probabilities. This is developed in great detail, e.g., in Balescu.

You attribute time-independent probabilities to the (e.g. classical-mechanics) initial conditions, and then you feed those initial conditions into the equations of motion to produce a time flow:
\[q_{0},p_{0}\overset{\textrm{time map}}{\mapsto}q_{t},p_{t}=q\left(q_{0},p_{0};t\right),p\left(q_{0},p_{0};t\right)\]
That induces a time map on dynamical functions:
\[A\left(q_{0},p_{0}\right)\overset{\textrm{time map}}{\mapsto}A\left(q_{t},p_{t}\right)\]
So dynamics changes dynamical functions, not probabilities. But here's the clever bit: we now trade averages of time-evolving variables weighted with a fixed probability density of initial conditions for averages of fixed variables weighted with a time-evolving probability density on phase space. All of this is done by introducing a so-called Liouville operator that can be written in terms of the time-translation generator, the Hamiltonian, through the Poisson bracket. This Liouville operator produces the evolution of dynamical functions:
\[\left\langle A\right\rangle \left(t\right)=\int dq_{0}dp_{0}\,\rho\left(q_{0},p_{0}\right)e^{-L_{0}t}A\left(q_{0},p_{0}\right)\]
Because of the properties of this Liouville operator (it's a first-order differential operator), you can swap its action by integration by parts and get
\[\int dq_{0}dp_{0}\,e^{L_{0}t}\rho\left(q_{0},p_{0}\right)A\left(q_{0},p_{0}\right)=\int dq_{0}dp_{0}\,\rho\left(q_{0},p_{0}\right)e^{-L_{0}t}A\left(q_{0},p_{0}\right)\]
You can think of \(e^{L_{0}t}\rho\left(q_{0},p_{0}\right)\) as a time-dependent probability density, while \(e^{-L_{0}t}A\left(q_{0},p_{0}\right)\) can be thought of as a time-dependent dynamical function. For all of this to work, the system should be well-behaved, meaning that it must be ergodic. Ergodic systems are insensitive to initial conditions.

I'm sure my post leaves a lot to be desired, but I've been kinda busy lately.
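If it helps, here's a toy numerical check of that tradeoff. The assumptions are all mine (a 1-D harmonic oscillator with a Gaussian initial density and \( A=q^{2} \); this is not Balescu's example): averaging the time-evolved observable against the initial density gives the same number as averaging the fixed observable against the evolved density.

```python
import numpy as np

rng = np.random.default_rng(0)
t = 0.7                                    # arbitrary time
M = np.array([[np.cos(t),  np.sin(t)],     # phase-space flow of a unit harmonic
              [-np.sin(t), np.cos(t)]])    # oscillator: (q0, p0) -> (q_t, p_t)

mean0 = np.array([1.0, 0.0])               # Gaussian initial density rho_0
cov0  = np.array([[0.30, 0.05],
                  [0.05, 0.20]])

A = lambda q, p: q**2                      # the dynamical function A(q, p)
N = 200_000

# "Evolve the observable": fixed rho_0, time-evolved A(q_t, p_t)
z0 = rng.multivariate_normal(mean0, cov0, size=N)
zt = z0 @ M.T
avg_evolved_observable = A(zt[:, 0], zt[:, 1]).mean()

# "Evolve the density": pushed-forward rho_t, fixed A(q, p)
z = rng.multivariate_normal(M @ mean0, M @ cov0 @ M.T, size=N)
avg_evolved_density = A(z[:, 0], z[:, 1]).mean()

print(avg_evolved_observable, avg_evolved_density)  # agree up to Monte Carlo noise
```
-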
How can information (Shannon) entropy decrease ?
joigus replied to studiot's topic in Computer Science
OK, I've been thinking about this one for quite a while and I can't make up my mind whether you're just overthinking this or pointing out something deep. I'm not sure this is what's bothering you, but for the \( P\left( \alpha \right) \) to be well-defined probabilities they should be fixed frequency ratios, to be checked in an experiment that can be reproduced indefinitely many times. So this \( -\int P\left(\alpha\right)\ln P\left(\alpha\right) \) already has to be the sought-for entropy, and not some kind of weird thing like \( -\int P\left(\alpha,t\right)\ln P\left(\alpha,t\right) \). IOW, how can you even have anything like time-dependent probabilities? I'm not sure that's what you mean, but it touches on something that I've been thinking about for a long time. Namely: that many fundamental ideas that we use in our theories are tautological to a great extent in the beginning, and they start producing results only when they're complemented with other ancillary hypotheses. In the case of the second principle "derived" from pure mathematical reasoning, I think it's when we relate it to energy, number of particles, etc., and derive the Maxwell-Boltzmann distribution, that we're really in business.
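For the record, this is the step I mean, sketched schematically (standard textbook reasoning, nothing original): maximise the entropy subject to normalisation and a fixed average energy, and the exponential (Maxwell-Boltzmann) distribution drops out,
\[\delta\left[-\sum_{i}p_{i}\ln p_{i}-\lambda\left(\sum_{i}p_{i}-1\right)-\beta\left(\sum_{i}p_{i}E_{i}-E\right)\right]=0\quad\Rightarrow\quad p_{i}=\frac{e^{-\beta E_{i}}}{\sum_{j}e^{-\beta E_{j}}}\]
-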
How can information (Shannon) entropy decrease ?
joigus replied to studiot's topic in Computer Science
Thanks, @studiot. From skimming through the article I get the impression that prof. Ben-Naim goes a little bit over the top. He must be a hard-line Platonist. I'm rather content with accepting that whenever we have a system made up of a set of states, and we can define probabilities, frequencies of occupation, or the like for these states, then we can meaningfully define an entropy. What you're supposed to do with that function is another matter and largely depends on the nature of the particular system. In physics there are constraints that have to do with conservation principles, which wind up giving us the Maxwell-Boltzmann distribution. But it doesn't bother me that it's used in other contexts, even in physics. I'll read it more carefully later. -
Yeah, they wish!! All Andalusian horses are Andalusian; but not all Andalusians are horses, as you well know. Amen. 🤣🤣🤣
-
A Sevillana. And she's singing with Andalusian pronunciation. Pepe Marchena: "La Rosa del Jardinero" (The Gardener's Rose). This is cante jondo (deep singing).
-
How can information (Shannon) entropy decrease ?
joigus replied to studiot's topic in Computer Science
I haven't heard about it. I've just read a short review and it sounds interesting. Thank you. I would agree that entropy doesn't necessarily represent ignorance. Sometimes it's a measure of the information contained in a system. An interesting distinction --that I'm not sure corresponds to professor Ben-Naim's criterion-- is between fine-grained entropy --which has to do with overall information content-- and coarse-grained entropy --which has to do with available or controlled information. OTOH, entropy seems to be the concept responsible for the most people telling the rest of the world that nobody has understood it better than they have. A distinguished example is Von Neumann*, and a gutter-level example is the unpalatable presence that we've had on this thread before you came to grace it with your contribution, Genady.
*https://mathoverflow.net/questions/403036/john-von-neumanns-remark-on-entropy
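To illustrate the distinction with a toy example (mine; I'm not claiming it's Ben-Naim's way of putting it): take a distribution over microstates, then average it within the cells you can actually resolve. The cell-averaged (coarse-grained) Shannon entropy is never smaller than the fine-grained one.

```python
import numpy as np

def entropy(p):
    """Shannon entropy -sum p ln p of a discrete distribution (natural log)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Fine-grained: a distribution over 8 microstates
p_fine = np.array([0.30, 0.02, 0.08, 0.10, 0.25, 0.05, 0.15, 0.05])

# Coarse-graining: replace each probability by its average over cells of 2
# microstates, i.e. keep only the information available at the cell level.
p_coarse = np.repeat(p_fine.reshape(-1, 2).mean(axis=1), 2)

print(entropy(p_fine))    # fine-grained entropy (~1.80)
print(entropy(p_coarse))  # coarse-grained entropy (~2.05): never smaller, by concavity
```
-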
How can information (Shannon) entropy decrease ?
joigus replied to studiot's topic in Computer Science
Thanks both for your useful comments. I'm starting to realise that the topic is vast, so fat chance that we can get to clarify all possible aspects. I'm trying to get up to speed on the computer-science concept of entropy, beyond the trivial difference of a scale factor that comes from using logs in base 2 instead of base e. It may also be interesting to notice that Shannon entropy is by no means only applied to messages, channels, senders, and receivers (with my emphasis, and from https://en.wikipedia.org/wiki/Entropy_(information_theory)#Relationship_to_thermodynamic_entropy).

In the meantime I've reminded myself as well that people are using entropy (whether base 2 or base e) to discuss problems in evolution dynamics. See, e.g.,
https://en.wikipedia.org/wiki/Entropy_(information_theory)#Entropy_as_a_measure_of_diversity
https://www.youtube.com/watch?v=go6tC9VSME4&t=280s

Even in the realm of physics alone, there are quantum concepts, like entanglement entropy, quantum entropy a la Von Neumann, defined as a trace of matrices, or even entropy for pure states, \( -\int\left|\psi\right|^{2}\ln\left|\psi\right|^{2} \). The unifying aspect of all of them is that they can all be written, one way or the other, schematically as
\[ S\left(1,2,\ldots,n\right)=-\sum_{i=1}^{n}p\left(i\right)\ln p\left(i\right) \]
for any partition of your system into cells or states that allows the definition of probabilities \( p\left(1\right),\ldots,p\left(n\right) \). Entropy, quite simply, is a measure of how statistically flattened-out whatever system you're dealing with is, provided it admits a partition into identifiable cells that play the part of "states"; it is extensive (additive) for independent systems when considered together, and concave with respect to partitions into events (there's a small numerical sketch of this at the end of this post). Being so general, it seems clear why it has far outstripped the narrow limits of its historical formulation.

Control parameters
Control parameters are any parameters that you can change at will. In the case of an ideal gas, we typically take the control parameters to be the extensive variables of the system, like the volume and the energy (those would be, I think, analogous to input data, program, ROM, because you can change them at will). Other things are going on within the computer that you cannot see. The result of a computation is unpredictable, as is the running state of a program. As to the connection to physics, I suppose whether a particular circuit element is in a conducting or an interrupted state could be translated into physics. But what's important to notice, I think, is that entropy can be used in many different ways that are not necessarily very easy to relate to the physical concept (the one that must always grow for the universe as a whole). As @studiot and myself have pointed out (I'm throwing in some others I'm not totally sure about):
- No analogue of thermodynamic equilibrium
- No cyclicity (related to ergodicity) => programs tend to occupy some circuits more than others
- No reversibility
- No clear-cut concept of open vs closed systems (all computers are open)
But in the end, computers (I'm making no distinction with message-sending, so potentially I'm missing a lot here) are physical systems, so it would be possible in principle to write out their states as a series of conducting or non-conducting states subject to conservation of energy, and all the rest. Please, @Ghideon, let me keep thinking about your distinctions (1)-(4), but for the time being, I think physics has somehow been factored out of the problem in (1) and (2), though it's not completely absent, if you know what I mean.
It's been made irrelevant as a consequence of the system having been idealised to the extreme. In cases (3) & (4) there would be no way not to make reference to it. Going back to my physics/computer-science analogy, it would go something like this:
Input (control parameters: V, E) --> data processing (equation of state f(V,P,T)=0) --> output (other parameters: P, T)
Message coding would be similar, including message, public key, private key, coding function, etc. But it would make this post too long.
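Here is the small sketch I promised above: a minimal Python illustration (my own toy numbers) of the schematic formula and of the additivity property for independent systems.

```python
import numpy as np

def entropy(p):
    """Shannon entropy -sum p ln p of a discrete distribution (natural log)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

p = np.array([0.5, 0.3, 0.2])       # states of system 1
q = np.array([0.6, 0.4])            # states of system 2
joint = np.outer(p, q).ravel()      # the two systems taken together, independently

print(entropy(p) + entropy(q))      # additivity for independent systems:
print(entropy(joint))               # ... the two numbers coincide
```
-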
Which came first, the chicken or the egg?
joigus replied to Jalopy's topic in Brain Teasers and Puzzles
I'm a stickler for clear definitions. I just don't know what this is doing on Brain Teasers and Puzzles. -
How can information (Shannon) entropy decrease ?
joigus replied to studiot's topic in Computer Science
I assume you mean that @studiot's system doesn't really change its entropy? Its state doesn't really change, so there isn't any dynamics in that system? It's the computing system that changes its entropy by incrementally changing its "reading states." After all, the coin is where it is, so its state doesn't change, and thereby neither does its entropy. Is that what you mean? Please, give me some more time to react to the rest of your comments, because I think a bridge can be built between the physical concept of entropy, which I know rather well, and the computer-science one used by people like you, which I'm just trying to understand better. Thank you for your inexhaustible patience, @Ghideon. I try to keep my entropy constant, but it's not easy. -
Great Oxygenation Event: MIT Scientists’ New Hypothesis
joigus replied to Genady's topic in Science News
No, I'm sure I made a mistake there. RuBisCO is about the Calvin-Benson cycle, right? Producing sugars. 😅 Thanks for catching me. OK. Thank you. I'm dabbling in chemistry/biochemistry, so some of the details went over my head, but I think I got the gist of it. Were those oxygenation events really that sudden? The banded iron formations suggest some kind of periodicity in that massive oxidation of the oceanic iron. Could the picture that you just sketched be compatible with some kind of periodicity? The closest thing I know from my past study of differential equations is the periodicity in populations that appears in models of competing species, like the Volterra model, which is about predator-prey dynamics. Does a pattern like that make any sense at all in the biochemistry of the oceans? I address this mainly to @CharonY and @exchemist, but other members should feel free to answer too, of course. Calvin-Benson always reminds me of underwear 😆
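In case the analogy is unclear, here is a minimal sketch of the kind of periodicity I have in mind, with completely made-up coefficients (an illustration of the predator-prey idea, not a model of ocean chemistry):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Classic Lotka-Volterra predator-prey model with illustrative coefficients
a, b, c, d = 1.0, 0.1, 1.5, 0.075

def lotka_volterra(t, y):
    prey, pred = y
    return [a * prey - b * prey * pred,    # prey grow and get eaten
            -c * pred + d * prey * pred]   # predators die off and feed on prey

sol = solve_ivp(lotka_volterra, (0, 50), [10.0, 5.0], dense_output=True)
times = np.linspace(0, 50, 6)
print(sol.sol(times))   # both populations cycle periodically instead of settling down
```
-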
Great Oxygenation Event: MIT Scientists’ New Hypothesis
joigus replied to Genady's topic in Science News
Very interesting thread and comments... RuBisCO is the single most abundant protein in the biosphere. I would be surprised if it didn't have something to do with major oxygenation events in the past. But the origins of this oxygenation event may be buried in complexity. -
How can information (Shannon) entropy decrease ?
joigus replied to studiot's topic in Computer Science
😆 I would seriously like this conversation to get back on track. I don't know what relevance monoidal categories would have in the conversation. Or functors and categories, or metrics of algorithmic complexity (those, I think, came up in a previous but related thread and were brought in out of the blue by the offended member). Said offended member then shifts to using terms such as "efficient" or "surprising," apparently implying some unspecified technical sense. Summoning highfalutin concepts by name without explanation, and dismissing everything everyone else is saying on the grounds that... well, that they don't live up to your expectations in expertise, is not, I think, the most useful strategy. I think entropy can be defined at many different levels, depending on the level of description that one is trying to achieve. In that sense, I think it would be useful to talk about control parameters, which I think say it all about what level of description one is trying to achieve. Every system (whether a computer, a gas, or a coding machine) would have a set of states that we can control, and a set of microstates --programmed either by us or by Nature-- that we can't see, control, etc. It's in that sense that the concept of entropy, be it Shannon's or Clausius/Boltzmann's, etc., is relevant. It's my intuition that in the case of a computer, the control parameters are the bits that can be read out, while the entropic degrees of freedom correspond to the bits that are being used by the program but cannot be read out --hence the entropy. But I'm not sure about this and I would like to hear other views on how to interpret it. The fact that Shannon entropy may decrease doesn't really bother me because, as I said before, a system that's not the whole universe can have its entropy decrease without any physical laws being violated.
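Just to put a number on that intuition, here's a deliberately crude toy (the register size and the assumption that the hidden bits are equally likely are both mine):

```python
import math

total_bits    = 16    # toy register (illustrative size)
readable_bits = 6     # "control parameters": the bits we can actually read out
hidden_bits   = total_bits - readable_bits

# If every configuration of the hidden bits is taken as equally likely, the
# entropy of what we cannot see is just the log of the number of hidden
# configurations: one bit of entropy per hidden bit.
entropy_bits = hidden_bits
entropy_nats = hidden_bits * math.log(2)
print(entropy_bits, "bits =", round(entropy_nats, 3), "nats")
```
-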
How can information (Shannon) entropy decrease ?
joigus replied to studiot's topic in Computer Science
My code seems to have worked. True colours in full view. -
Which came first, the chicken or the egg?
joigus replied to Jalopy's topic in Brain Teasers and Puzzles
Samuel Butler Yes. As many have said or implied before, the egg is the arrangement of genetic material to be tested against the environment. The chicken is but the sequence of later developmental stages of that egg, set against different kinds of environments: pre-natal, peri-natal, young, reproductive, post-reproductive. There's no chicken that didn't come from an egg. There are thousands upon thousands of eggs that never make it to become a chicken. -
How can information (Shannon) entropy decrease ?
joigus replied to studiot's topic in Computer Science
What's with the quotation marks? You seem to imply that no word in a code like Swedish would be surprising. What word in Swedish am I thinking about now? -
I thought the last one was a Namib elephant, but I don't think it is. It's probably a big tusker from Tsavo, in Kenya. https://www.theguardian.com/environment/gallery/2019/mar/20/the-last-of-africas-big-tusker-elephants-in-pictures Thanks for the pictures. That's exactly what it looks like. Thank you.
-
As to scale symmetries in general, studying the invariance properties of a certain theory under re-scalings can be a useful tool, but I wouldn't try to read too much into it physically*. The reason is that, while symmetries like rotations and translations have a very transparent, very direct interpretation, that's not the case for scale transformations. Rotations can be easily viewed from the active point of view. I can rotate a piece of experimental equipment; I can rotate the whole laboratory (active transformations). I can think of extrapolating this operation to include the whole universe. In that purely theoretical, ideal scenario, actually rotating the whole universe would be mathematically equivalent to the inverse passive transformation (a simple re-labelling of coordinates) of my frame of reference. The same goes for translations. You can't do the same with scalings. IMO, you would have to be very careful to explain how these observers could tell that their scales change from point to point or region to region (continuously as you move from one to the other?). I think @studiot has made this point before, and what I'm doing is basically rephrasing, or elaborating a little bit on, what he said: That's what I meant by 'slippery slope.'
*By 'physically' I mean considering different observers that 'see' different scales. How do they know?
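To make the active/passive point concrete, here's a tiny numerical check (my own illustration): actively rotating a vector by an angle theta gives exactly the same components as leaving the vector alone and rotating the axes by the opposite angle.

```python
import numpy as np

def rot(angle):
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s], [s, c]])

theta = 0.4
v = np.array([2.0, 1.0])

# Active transformation: physically rotate the vector by +theta, same axes.
v_active = rot(theta) @ v

# Passive transformation: leave the vector alone, rotate the axes by -theta,
# and read off the vector's components along the new (orthonormal) basis.
e1_new = rot(-theta) @ np.array([1.0, 0.0])
e2_new = rot(-theta) @ np.array([0.0, 1.0])
v_passive = np.array([e1_new @ v, e2_new @ v])

print(np.allclose(v_active, v_passive))   # True: the two points of view agree
```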