Everything posted by studiot
-
Works for me, if you can take the distance when Q is directly above F and V (ie on the y-axis) to be (QV + FV). That is a doubling back. As an example, if the parabola is (KISS) y = x² with focus at (0, 0.25), and the line L is y = 1, then the length is 1 + 0.25 = 1.25, or 5/4.
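To check the arithmetic, here is a minimal sketch (assuming 'the line L = 1' means the horizontal line y = 1, so that Q = (0, 1) sits on the y-axis directly above both F and V):

```python
# Minimal sketch: focus of y = x^2 and the doubled-back distance QV + FV,
# assuming Q = (0, 1) on the line y = 1 (my reading of "the line L = 1").

def parabola_focus(a):
    """Focus of y = a*x^2, written as x^2 = 4p*y with p = 1/(4a)."""
    return (0.0, 1.0 / (4.0 * a))

V = (0.0, 0.0)            # vertex of y = x^2
F = parabola_focus(1.0)   # (0, 0.25)
Q = (0.0, 1.0)            # point on the line y = 1, directly above F and V

QV = Q[1] - V[1]          # 1.0
FV = F[1] - V[1]          # 0.25
print(QV + FV)            # 1.25, i.e. 5/4
```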
-
Another adjustment please. A closed system does not allow mass (matter) to cross its boundaries. Energy, however, may do so. An isolated system bars both mass and energy.
-
How about an acorn cup falling to the ground, detaching from its acorn and becoming fossilised? Did you catch my post#20?
-
I have tried to avoid discussion about the cup because I think it is a distraction. However, there are some points that need adjustment. Actually you can, and we do. That is what thermodynamics is all about. You work from (equilibrium) state A to (equilibrium) state B... to (equilibrium) state C, and so on. So you can, and we do, start at any defined state and finish at another defined state, without worrying about what comes before or after. Actually it was a cup of tea.
-
Thermodynamic arrow of time. Equating Entropy and Disorder?
studiot replied to Sorcerer's topic in Classical Physics
puppypower, you may like to look at Mollier diagrams. They plot the relationship between enthalpy and entropy. https://www.google.co.uk/search?hl=en-GB&source=hp&biw=&bih=&q=mollier+diagram&gbv=2&oq=mollier&gs_l=heirloom-hp.1.0.0j0i10j0j0i10l3j0j0i10l2j0.1593.4156.0.6875.7.7.0.0.0.0.219.985.0j6j1.7.0....0...1ac.1.34.heirloom-hp..0.7.985.aV-4v27-bBE

It is not often explained, but the relationship between entropy and the 'arrow of time' is classical and arises thus.

Most equations (note I said equations; there are a few laws but lots of equations) in thermodynamics do not contain time. Time is not a (thermodynamic) state variable. This is also true of the second law, which leads to the arrow statement. However, thermodynamics is about (thermodynamic) processes, and processes are loosely about changes of (thermodynamic) state. Underlying the arrow of time is the assumption that a system cannot be in two (thermodynamic) states at once. So a variable is required that can serve as the independent variable in charting the progress of a process. This variable is chosen to be time.

So in changing from one defined state to another, the second law suggests that entropy is a state variable that can never have decreased in value at the end of any such change of state. A (thermodynamic) state is defined as a complete set of values of all (thermodynamic) variables, so whatever happens, the other variables must adjust to meet this requirement on entropy. So the only permitted changes are those which produce a compatible set of values of state variables for the second state in the process.

This is colloquially known as the arrow of time. When you break it down, there are several indirect steps that lead to the result.
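As an aside, here is a minimal sketch of what such a state-to-state, second-law calculation looks like in practice, using the textbook free expansion of an ideal gas in an isolated container (my own illustration, not drawn from the thread; the mole number and volumes are assumed values):

```python
# Minimal sketch: entropy change between two equilibrium states of an ideal gas
# undergoing free expansion in an isolated container. Time never appears in the
# calculation; only the two end states matter, and dS comes out positive.
import math

R = 8.314          # J/(mol*K), gas constant
n = 1.0            # mol (assumed)
V1, V2 = 1.0, 2.0  # m^3, initial and final volumes (assumed)

# Entropy is a state function, so for an isothermal free expansion of an ideal gas
# dS = n*R*ln(V2/V1), regardless of the (irreversible) path actually taken.
dS = n * R * math.log(V2 / V1)
print(f"dS = {dS:.2f} J/K")   # positive, as the second law requires for an isolated system
```
-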
Scientific testing (split from goal of science)
studiot replied to Reg Prescott's topic in General Philosophy
Hear! Hear! +1 -
You asked for an equation. I do not have a suitable definition of disorder (or order, for that matter) to offer one, so I did the next best thing and gave you an answer specifying the conditions (as well as I could) needed to obtain an equation.
-
Probability, Options, and Determination...
studiot replied to Scott Mayers's topic in General Philosophy
Need I say more? -
Entropy can be quantified by definition, but you would need a quantifying definition of disorder, using connecting variables, to achieve an equation. Do you have one?
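For reference, the standard textbook quantifications of entropy are the Clausius and Boltzmann forms below (general statements, not anything specific to this thread); a disorder-to-entropy equation would need 'disorder' pinned down with comparable precision:

```latex
% Standard quantitative definitions of entropy (textbook forms, added for reference):
% Clausius (classical) and Boltzmann (statistical).
\[
  dS = \frac{\delta Q_{\mathrm{rev}}}{T},
  \qquad
  S = k_B \ln W
\]
```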
-
Example of what please?
-
Thermodynamic arrow of time. Equating Entropy and Disorder?
studiot replied to Sorcerer's topic in Classical Physics
puppypower, you have obviously studied some thermodynamics. Unfortunately you have some misconceptions mixed in there. No, internal energy and enthalpy are different properties: enthalpy is defined as H = U + pV. Enthalpy is sometimes called the heat content. Yes, entropy is a state variable, but states are defined as equilibrium states. What if the system is not in an equilibrium state? Most of the time, most of the universe is not in an equilibrium state. -
Probability, Options, and Determination...
studiot replied to Scott Mayers's topic in General Philosophy
You claim to have had philosophy training in logical thinking. But this and this are prime examples to the contrary. I believe the phrase is 'Hoist with your own petard'.

Whilst you steadfastly ignore my actual point, you are the one who introduced probability in relation to both the Monty Hall problem and quantum mechanics. But the problem is that you do not understand probability, so your reasoning is flawed. I said your example was a poor one to use, but you insisted on using it, so I am trying to make do with it. The above analysis makes no sense. Here is my version; I make no claims that it is perfect, and I may have forgotten something vital.

Your traveller approaches the junction for the first time in history. As a result you have no information about the probabilities as to which way she will turn. So your best estimate is equal probabilities. But you also know that people drop dead, and let us say the probability is 0.2. So your probabilities are now 0.4 + 0.4 + 0.2 = 1.

This is very important because these are known as prior or anterior probabilities. In this case they are subjective prior probabilities. There are also objective prior probabilities. Prior probabilities are assigned before the beginning of the event and are set on the basis of the best information known at that time. Thus they may not be 'truth', whatever that means.

The crux of the Monty Hall problem is what happens next. The traveller turns, say, left, and the probabilities are now changed as a result of better information. This is also the key theorem that Bayes introduced in the mid 1700s and Laplace developed in the late 1700s. The probabilities are now
P(turn left) = 1
P(turn right) = 0
P(drop dead) = 0
But this cannot occur until after the event.

But there is still more to probability. What does the probability P(event) = 1 mean, particularly in the case of future events, which are, after all, the reason for doing all this? Well, there are several cases, and each has a different meaning.

If we can take our prior probabilities to be correct then we call them a priori probabilities (Laplace was French), and P(E) = 1 means that the event will (= must always) occur.

If our prior probabilities are objectively acquired, P(E) = 1 says that the event has always occurred in the past, but does not mean that it will always (or ever) occur in the future.

If our prior probabilities are subjective, then P(E) = 1 means that we think the event will occur in the future, but again does not imply that it will.

As I said, probability gets complicated as you go deeper.

Finally, if there is any more mud slinging you will need to find a new discussion partner.
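A minimal sketch of the prior-to-posterior update described above, using the post's own numbers (0.4, 0.4, 0.2) and assuming the traveller is directly observed to turn left:

```python
# Minimal sketch: priors set before the event from the best available information,
# then revised (Bayes-style) once the traveller is observed to turn left.

priors = {"left": 0.4, "right": 0.4, "drop_dead": 0.2}
assert abs(sum(priors.values()) - 1.0) < 1e-12   # probabilities must sum to 1

def update_on_observation(priors, observed):
    """For a directly observed outcome the likelihood is 1 for that outcome and 0
    for every other, so the posterior collapses onto what actually happened."""
    likelihood = {k: (1.0 if k == observed else 0.0) for k in priors}
    unnormalised = {k: likelihood[k] * priors[k] for k in priors}
    total = sum(unnormalised.values())
    return {k: v / total for k, v in unnormalised.items()}

posterior = update_on_observation(priors, "left")
print(posterior)   # {'left': 1.0, 'right': 0.0, 'drop_dead': 0.0}
```
-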
You need first to understand that the word 'observing' in this context means 'interacting'. Any interaction with the outside world, animate or inanimate, is classed as an observation.
-
Probability, Options, and Determination...
studiot replied to Scott Mayers's topic in General Philosophy
Rudeness does not improve the validity of your claim. Probabilities cannot be imposed; they are what they are. And yes, your analysis is flawed. No, of course you did not say that; I did. Your analysis is flawed because there exist possibilities you cannot exclude. For instance, your traveller may drop dead at the junction and simply not proceed. There is always a finite possibility of this, and therefore a finite 'a priori' probability (do you know what this means?). -
With the greatest respect, perhaps you need to consider taking your own advice.

The (ancient) Greeks did indeed start with definitions. But they then introduced a set of statements they called postulates, which we now call axioms. They also introduced a second set of statements they called common notions, which we also call axioms today. All three of these types are statements made without proof. The Greeks distinguished postulates as those special to the discipline under consideration, and common notions as those with wider, more general application beyond it. We call both of these axioms because we start with the most general and say that the common notions are induced from the more general.

What you have described above as postulates they called propositions. Propositions are statements deduced from the original statements by some system of proof. Today we call these theorems and lemmas.
-
Thermodynamic arrow of time. Equating Entropy and Disorder?
studiot replied to Sorcerer's topic in Classical Physics
Why have an analogy when you can have the real thing? Entropy refers to the occupancy of energy states. What you say about always being at maximum entropy is true if all the energy states are fully occupied. But most systems are not like this. Most systems have many more unoccupied states than occupied ones, so they present a huge number of possible occupancy arrangements.
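A minimal sketch of the counting argument, assuming distinguishable states with at most one particle per state (a simplifying assumption for illustration, not something claimed in the post):

```python
# Minimal sketch: number of ways N particles can occupy M energy states (one particle
# per state), and the corresponding Boltzmann entropy S = k_B * ln(W). When every state
# is occupied (M == N) there is only one arrangement, so it is the surplus of unoccupied
# states that opens up the huge number of possible occupancy arrangements.
import math

k_B = 1.380649e-23  # J/K

def arrangements(num_states, num_particles):
    """Ways to choose which states are occupied."""
    return math.comb(num_states, num_particles)

N = 10                         # particles
for M in (10, 20, 100):        # available energy states
    W = arrangements(M, N)
    S = k_B * math.log(W)
    print(f"M={M:3d}, N={N}: W={W}, S={S:.3e} J/K")
# M=10 gives W=1 and S=0; M=100 gives a vastly larger W and hence a larger S.
```
-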
Probability, Options, and Determination...
studiot replied to Scott Mayers's topic in General Philosophy
Just to repeat this from my previous post: you really do need to gain a more complete grasp of the nature of probability before making assumptions about assigning probabilities. Every possibility has an associated probability. Whatever the type of probability employed, the total of all associated probabilities must sum to 1. Since there are more possibilities than left and right, the probabilities for left and right alone cannot sum to 1. With regard to assigning probabilities to a left or right turn, there are constraints in action that you are not accounting for, so your model is flawed. -
Probability, Options, and Determination...
studiot replied to Scott Mayers's topic in General Philosophy
I'd rather leave God out of the discussion. As regards the left v right turn only, this means that something is forcing the left or right turn; that is, there are additional, undefined constraints involved. If you can quantify the relative strengths of the left and right forcing functions, you can use these to create an estimated frequency value for each turn. This frequency can be interpreted as an 'a priori' probability. There, I have gone and mentioned one of the recognised forms of probability, so let me observe that I don't think you have fully appreciated the types of probability available in your response to the other main point in my last post.
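A minimal sketch of that conversion, with made-up forcing strengths (the 3:1 ratio is purely illustrative):

```python
# Minimal sketch: turning assumed relative strengths of the left/right forcing functions
# into estimated frequencies, which can then be read as 'a priori' probabilities.

strengths = {"left": 3.0, "right": 1.0}           # assumed relative forcing strengths
total = sum(strengths.values())
a_priori = {turn: s / total for turn, s in strengths.items()}
print(a_priori)                                    # {'left': 0.75, 'right': 0.25}
```
-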
Probability, Options, and Determination...
studiot replied to Scott Mayers's topic in General Philosophy
Before discussing probabilities in general you need to set out what exactly you mean. Mathematics recognises more than one form of probability, and the meaning of a statement such as 'the probability of event X is 1' will depend upon which definition you are using. Further, to establish a probability you have to be able to identify the beginning and the end of an event.

With regard to your example, in the quantum world there is a probability that the traveller will turn neither right nor left but
1) Be reflected back along her path
2) Go straight on regardless and reappear at the other side of the square (this is called quantum tunnelling).

Funnily enough, using Sunshaker's response in relation to traffic flow can also lead to these responses.
1) Years ago I was cycling around the Fens and came to a T junction that had a signpost pointing left that said Cambridge, as well as a signpost pointing right that said Cambridge. Since my destination was not Cambridge, I turned round and went back.
2) When working in the desert I found that many former nomads who had converted from camel transport to truck would go straight on across such a junction. They said, "This is what we would have done with our camels, so why not with our trucks?" -
Probability, Options, and Determination...
studiot replied to Scott Mayers's topic in General Philosophy
I thought as much. So why didn't you say so in the first place? I think your hastily thrown together example is a poor one to use for this purpose. I also have this trouble when constructing examples because making a good one is very difficult and usually involves lots of work. -
Probability, Options, and Determination...
studiot replied to Scott Mayers's topic in General Philosophy
Given the title, I will be interested to see where this opening post is leading and what Scott has not yet told us. -
Scientific testing (split from goal of science)
studiot replied to Reg Prescott's topic in General Philosophy
Thank you for completing the references; I can now re-read them fully. So did you miss the rest of my post, or have you set yourself to silent mode again? If you can't be bothered to answer my comments, how do you expect me to reciprocate? -
Scientific testing (split from goal of science)
studiot replied to Reg Prescott's topic in General Philosophy
You are almost doing it again, that is, extrapolating from the qualified to the definite. It would be helpful if you could include the post number in references to older posts; I cannot remember who said those exact words. Then the context can be checked. Without this, I would make of it that the originator qualified his or her statement.

This actually ties in with the discussion I have been trying to promote, about statistics in relation to scientific testing. The instances where science can actually state the unequivocal truth are few and far between, and most of these are in hindsight, as in my examples. Of course much of science takes place in the present or future, and we can find even less certainty there.

It is not impossible to make a certain mathematical statement about the future, for example: the probability of RedRum winning the 2016 Derby is precisely and exactly zero. I can be certain about this because RedRum died a few years ago. But I cannot correctly make any certain statement about the other horse.

Moving on to wider applications, suppose I wish to build a house, so I test the bearing capacity of the soil. Obviously the testing must be scientific to be of any use. But the result that I measure will not be the certain 'truth'. I will never know the exact bearing capacity, and my figure will be reported as a range, say between 75 and 150 kN/m². My report may even include a statistical probability that the 'true value' lies between these limits. Much, if not most, scientific testing is of this nature.
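A minimal sketch of how such a reported range with an attached probability might be produced; the sample figures and the 95% level are assumptions for illustration, not data from any real test:

```python
# Minimal sketch: reporting a soil bearing capacity as a range with a confidence level,
# rather than as a single 'true' value, from made-up plate-test measurements.
import math
import statistics

samples_kN_m2 = [92.0, 110.0, 98.0, 125.0, 104.0, 117.0]   # assumed test results

mean = statistics.mean(samples_kN_m2)
sem = statistics.stdev(samples_kN_m2) / math.sqrt(len(samples_kN_m2))

t_95 = 2.571   # two-sided Student t value for n - 1 = 5 degrees of freedom
lo, hi = mean - t_95 * sem, mean + t_95 * sem
print(f"Estimated bearing capacity: {mean:.0f} kN/m^2 "
      f"(95% confidence interval roughly {lo:.0f} to {hi:.0f} kN/m^2)")
```
-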
Scientific testing (split from goal of science)
studiot replied to Reg Prescott's topic in General Philosophy
It's not so much refusal to discuss as inability to discuss that of which I know nothing. I've read a little on Bayesian confirmation with respect to scientific methodology, but I'm not quite sure what "non-Bayesian statistics" refers to in this context. I'm all ears if you'd care to explain. Thanks.

Hindsight: the probability of RedRum winning the 2015 Derby is precisely and exactly zero. That is a mathematical statement of non-Bayesian statistics, also applicable in that other silly thread currently running about applying mathematics to reality. It uses after-the-event knowledge (hindsight) to make a statistical statement. I could make another mathematical statement in a similar vein: the probability of Dettori winning the 2015 Derby is precisely and exactly 1.

I have chosen examples as simple as I can make them, a good technique when testing a theory or methodology. This line of thinking has enormous implications in the field of testing and can be developed much further.

Oh, and +1 for a sensible response at last. -
Scientific testing (split from goal of science)
studiot replied to Reg Prescott's topic in General Philosophy
You respond little enough to my posts as it is, so please don't mistake what I said or draw false conclusions from it. I said, "...may never know the truth...." No conclusion as to certainty can be drawn from such phraseology. On occasion, we may also know the truth in hindsight. Non-Bayesian statistics is about this, as are some of the other aspects of scientific testing I offered and you have refused to discuss.