Leaderboard
Popular Content
Showing content with the highest reputation on 03/16/22 in all areas
-
https://www.cnn.com/2022/03/15/europe/ukraine-russian-prisoners-of-war-intl/index.html

Seems more and more to be the case. Putin may not so much have overestimated his military's competence as underestimated the decency of the average Russian soldier. If humanity is going to survive long term, we're going to need to realize we're all cousins, and we're all in this together.

4 points
-
Hopefully Ukrainian nuclear power plants built in recent years have redundant safeguards against fires, like Western power plants. I'd hate to see another Chernobyl.

1 point
-
To expand on swansont's answer, scientific theories grow in certainty over time, as they continue matching observational and experimental evidence and making correct predictions. The theory of evolution has gained so much indisputable evidence that it is now regarded as fact.

1 point
-
Thanks for your input. I'll try to clarify by using four examples based on my current understanding. Information entropy in this case means the definition from Shannon. By physical entropy I mean any suitable definition from physics*; here you may need to fill in the blanks or highlight where I may have misunderstood** things.

1: Assume information entropy is calculated as per Shannon for some example. In computer science we (usually) assume an ideal case; the physical implementation is abstracted away. Time is not part of the Shannon definition, and physics plays no part in the outcome of the entropy calculation in this case.

2: Assume we store the input parameters and/or the result of the calculation from (1) in digital form. In the ideal case we also (implicitly) assume unlimited lifetime of the components in computers, or an unlimited supply of spare parts, redundancy, fault tolerance and error correction, so that the mathematical result from (1) still holds; the underlying physics has been abstracted away by assuming nothing ever breaks, or that any error can be recovered from. In this example there is some physics, but under the assumptions made the physics cannot have an effect on the outcome.

3: Assume we store the result of the calculation from (1) in digital form on a real system (rather than modelling an ideal system). The lifetime of the system is not unlimited, and at some future point the results from (1) will be unavailable; or, if we try to repeat the calculation based on the stored data, we may get a different result. We have moved from the ideal computer science world (where I usually dwell) into an example where the ideal situation of (1) and (2) does not hold. In this third case my guess is that physics, and physical entropy, play a part. We lose (or possibly get incorrect) digital information due to faulty components or storage, and this has an impact on the Shannon entropy of the bits we manage to read out or calculate. The connection to physical entropy here is one of the things I lack knowledge about but am curious about.

4: Assume we store the result of the calculation from (1) in digital form on an ideal system (limitless lifetime) using lossy compression****. This means that at a later stage we cannot repeat the exact calculation or expect an identical outcome, since part of the information is lost and cannot be recovered by the digital system. In this case we are still in the ideal world of computer science, where the predictions and outcomes are determined by computer science theorems. Even though there is loss of information, physics is still abstracted away and physical entropy plays no part.

Note here the similarities between (3) and (4). A computer scientist can analyse the information entropy change and the loss of information due to a (bad) choice of compression in (4). The loss of information in (3), due to degrading physical components, seems to me to be connected to physical entropy. Does this make sense? If so, it would be interesting to see where control parameters*** fit into "example 3 vs 4", since both have a similar outcome from an information perspective but only (3) seems related to physics.

*) Assuming a suitable definition exists.
**) Or forgotten; it's a long time since I (briefly) studied thermodynamics.
***) Feel free to post extra references; this is probably outside my current knowledge.
****) This would be a bad choice of implementation in a real case; it's just used here to illustrate and compare reasons for loss of information.
https://en.wikipedia.org/wiki/Lossy_compression

1 point
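[Editor's note: to make examples (1) and (4) above concrete, here is a minimal Python sketch. The function name and the nibble-masking "compression" are illustrative choices, not anything from the post: it computes the Shannon entropy of an abstract message, then discards information in a crude lossy step, after which the original can no longer be reconstructed.]

    import math
    from collections import Counter

    def shannon_entropy(data: bytes) -> float:
        # H = -sum(p_i * log2(p_i)), in bits per symbol
        counts = Counter(data)
        n = len(data)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    message = b"abracadabra"               # example (1): an abstract message
    print(shannon_entropy(message))        # ~2.04 bits/symbol

    # Example (4): a crude lossy step -- keep only the high nibble of each
    # byte. Distinct symbols collapse together, so the original message
    # cannot be recovered from the stored form.
    lossy = bytes(b & 0xF0 for b in message)
    print(shannon_entropy(lossy))          # ~0.68 bits/symbol: information lost

[Nothing physical appears anywhere in this sketch, which is the poster's point about examples (1) and (4): the loss is purely a property of the chosen encoding.]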
-
We define what we mean by speculation.

But he didn't call it "the theory of relativity" when he first aired it. The paper was "On the Electrodynamics of Moving Bodies", and Einstein later referred to it as the relativity principle.

1 point
-
Not in any scientific sense. Theory is as strong as it gets in science; I'd suggest your proposal renders the term meaningless. I'm assuming you define "hard" as "a bar I can set wherever I want". Theories have models and mountains of evidence to support them. When they lack observational support, such as with String Theory, we rely more heavily on the models, but they still have to have evidence to be considered a theory.

No evidence? Are you using your earlier definition of "hard evidence"? His work had plenty of observational evidence, and much of it was based on experiments carried out by others. His explanation solved the mystery of the orbital precession of Mercury. His explanations wouldn't have been credible at all if "he had no evidence".

You make assertions rather than ask questions or pose problems, and many of those assertions are either incomplete or just plain wrong. It makes you appear to be doing it consciously, to waste time or provoke reactions. I wanted you to know so you can adjust your posting style, or at least know why you get so much pushback on many of your posts.

1 point
-
No. Vax (2 shots plus booster) plus confirmed past infection equals the strongest possible protection. What you have now is an immune system that saw the caterpillar when it was young and can fight that caterpillar, but that same immune system can't recognize the butterfly in its current state, with its current colors. The caterpillar is no longer relevant, and the butterfly is what's flying around today. Just as getting the flu in 2020 doesn't do much to protect you from getting the flu in 2022, covid has evolved a LOT since you had your infection 2 years ago, so that infection should be mostly ignored in terms of predicting risk.

1 point
-
Do you have some evidence for that, or is it a willingness on your part to disparage the Russians? (I don't mean that Stalin didn't purge intellectuals, but that "gene pool" language sounds racist to my ears.)

1 point
-
https://www.theonion.com/oil-companies-lament-rising-price-of-joe-manchin-1848656304

1 point
-
I assume you mean that @studiot's system doesn't really change its entropy? Its state doesn't really change, so there are no dynamics in that system? It's the computing system that changes its entropy by incrementally changing its "reading states." After all, the coin is where it is, so its state doesn't change; thereby its entropy doesn't either. Is that what you mean? Please give me some more time to react to the rest of your comments, because I think a bridge can be built between the physical concept of entropy, which I know rather well, and the computer people like you, whom I'm just trying to understand better. Thank you for your inexhaustible patience, @Ghideon. I try to keep my entropy constant, but it's not easy.

1 point
-
It is worth noting and remembering the difference between two derived quantities that sound almost the same: molarity and molality.

Not quite. 1 part per thousand of solution, so 1 g salt in 999 g water. Since the solution is very dilute the difference is small, but still important. Osmolarity is different again.

Edit: Sleep well, and I hope it resolves soon, without any serious effects.

1 point
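[Editor's note: for readers who want to see the molarity/molality distinction numerically, here is a rough Python sketch of the 1-part-per-thousand case above. It assumes the salt is NaCl and a solution density of about 1 kg/L, neither of which is stated in the post.]

    M_NACL = 58.44                     # g/mol, molar mass of NaCl (assumed salt)

    grams_salt = 1.0                   # 1 g salt ...
    grams_water = 999.0                # ... in 999 g water = 1 g per 1000 g of solution
    mol_salt = grams_salt / M_NACL     # ~0.0171 mol

    # Molality: moles per kilogram of SOLVENT
    molality = mol_salt / (grams_water / 1000.0)   # ~0.01713 mol/kg

    # Molarity: moles per litre of SOLUTION
    # (assumed density ~1.00 kg/L for this dilute solution, so 1000 g ~ 1.00 L)
    molarity = mol_salt / 1.000                    # ~0.01711 mol/L

    print(molality, molarity)          # very close for dilute solutions, but not equal

[The two values agree to about 0.1% here, which is the "small, but still important" difference the post describes; in concentrated solutions the gap grows.]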
-
The lovely blue there... Why would they have joined NATO? I wonder why they didn't join Putin?

1 point
-
That looks almost intelligent. What you've clearly forgotten is that languages are context-dependent. Strange, too, that after dismissing the need for a sender, a receiver and a channel, you invoke the concept of noise in a channel, and filtering. You don't realise how inane that is. You don't because you have immunity, right? The difference between information and data: there is no physical difference; it's entirely artificial, one of those things called a choice. You can't or won't agree, of course, because this discussion is all about how much you can disagree with whatever you choose to disagree with. What fun you must be having.

-1 points