Function Posted July 5, 2014
Hi everyone. Let me get straight to the point: in my opinion, treating "chance" and "probability" as equal isn't very well chosen... Let me illustrate with an example. Imagine a sheet of paper of whatever area. On this paper, a part is covered with honey or some other sort of sweet stuff. The chance that a wasp, or a bee, or a fly lands on one very specific point of the sheet not covered with honey is almost 0 (1/infinity). This chance is equal to the chance that it will land on a point covered with honey. In my opinion, though, the probability that it will land on a point covered with honey is much larger than the probability that it will land on a point which isn't covered with honey. Is there someone who can follow my reasoning? If not, what the hell am I talking about? Is this still a math-related subject? Thanks. A confused Function.
Prometheus Posted July 5, 2014
It's a good question, very much maths related. I think you are conflating two slightly different questions. Let's assume the bee is point-sized. The probability that it will land on any one specific point is zero, whether that point is covered in honey or not. The chance that it lands on any specific point covered in honey is the same as the chance that it lands on any specific point in the un-honeyed area. Here we have two conditional probabilities: the chance it will land on a specific point given that it lands somewhere honeyed, and the chance it will land on a specific point given that it lands somewhere un-honeyed. Both are zero.
This is different to asking what the chance is of the bee landing on honeyed versus un-honeyed places. We are no longer concerned with specific points but with a sum of many points. The sum of all points covered in honey adds up to some area. We then assign a probability that the bee will land in that area - assigning greater probability to it landing on honeyed areas than un-honeyed areas, as we know bees like sweet things. However, given that it has landed somewhere honeyed one time, and somewhere un-honeyed another time, the chance that it landed on any specific point is the same: zero. This is a consequence of dealing with infinitesimals. Chance and probability are synonyms - at least I've not come across any distinction in probability texts. I'm not rigorous with my maths but hope that makes sense.
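A minimal Python sketch of the point-versus-area distinction. The unit-square sheet, the corner honey patch and the uniform landing model are assumptions chosen purely for illustration, not anything stated in the thread; a bee genuinely drawn to honey would only strengthen the area effect.

[code]
import random

# Treat the sheet as the unit square and the honeyed patch as a smaller
# rectangle inside it (both hypothetical choices, just for illustration).
HONEY = (0.0, 0.3, 0.0, 0.3)         # x_min, x_max, y_min, y_max -> area 0.09
TARGET = (0.5, 0.5)                  # one specific un-honeyed point

def in_honey(x, y):
    x0, x1, y0, y1 = HONEY
    return x0 <= x <= x1 and y0 <= y <= y1

trials = 1_000_000
honeyed_landings = 0
exact_point_hits = 0

for _ in range(trials):
    x, y = random.random(), random.random()    # bee lands uniformly at random
    if in_honey(x, y):
        honeyed_landings += 1
    if (x, y) == TARGET:                       # an exact point: probability 0
        exact_point_hits += 1

print("P(lands somewhere honeyed)  ~", honeyed_landings / trials)   # ~0.09
print("P(lands on one exact point) ~", exact_point_hits / trials)   # ~0.0
[/code]

The honeyed region is hit roughly in proportion to its area, while the single exact point is essentially never hit, which is the point-versus-region distinction the post describes.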
Function (Author) Posted July 5, 2014
Well yes, we dealt with this stuff in high school, but what I really mean is that the "psychological chance" of the bee landing in a honeyed place is larger than that of it landing on a non-honeyed place, even if the honeyed place is much smaller than the non-honeyed place. It matters to the bee which type of surface it lands on (food or nothing) - but then I'm afraid we aren't in mathematics any longer?
studiot Posted July 5, 2014 (edited)
Good morning. I typed out a reply, but when I looked back here it was gone. Sigh! I think your difficulty is due to language differences. In English, chance and probability both have multiple meanings. In particular, chance usually conveys the idea of 'without apparent cause or direction', whereas probability compares different outcomes, regardless of cause. Scientific English has taken one particular meaning for both chance and probability and codified it. The particular meanings codified are those stated above. You should also look up the words fluke and random; both are connected to this subject. Unfortunately, the scientific definition of random has (at least) two different meanings. Edited July 5, 2014 by studiot
Prometheus Posted July 5, 2014
Damn, you go to a good school if you learn this at high school. By 'psychological chance' do you mean your personal degree of belief that it will land on a specific point? If so, we are getting onto Bayesian probabilities (not my forte), which it sounds like you know about. But Bayesian probability is just as rigorous as traditional probability - they come from the same axioms - so the answer is the same. I've not read around the subject, but my lecturer made a distinction between 'subjective' Bayesian stats and 'empirical' Bayesian stats. The former involves probabilities plucked from thin air, while the latter tries to justify why a particular probability was picked. So I would say it is still mathematics, but the end of mathematics that tries to meet the real world. Physicists probably do this best - maybe one would like to comment.
Function (Author) Posted July 5, 2014
"Damn, you go to a good school if you learn this at high school." Well, just Belgian high school. "By 'psychological chance' do you mean your personal degree of belief that it will land on a specific point?" I mean the chance that the bee will be 'convinced' to land on the honeyed surface.
studiot Posted July 5, 2014 (edited)
"Chance and probability are synonyms - at least I've not come across any distinction in probability texts." Can't agree with this; they are not the same, even in scientific English. Function, please look at my edit to post #4. To continue: random (in the statistical sense) implies that the probabilities are the same for all outcomes. If the probabilities are not the same, it implies that there is some preference or selection or other driving agent involved. Edited July 5, 2014 by studiot
Function (Author) Posted July 5, 2014 (edited)
"Random (in the statistical sense) implies that the probabilities are the same for all outcomes. If the probabilities are not the same, it implies that there is some preference or selection or other driving agent involved." Ah, now that's what this example is all about: selection and preference. Thanks. +1 It's like "pick a random number from 1 to 10". There's an infinitesimal chance of picking the number 2, and the same chance for pi or e. Yet the probability, in my opinion, of picking 2 is much larger than of picking pi or e. The problem was that on a Dutch forum, everyone said that "kans" (chance) and "waarschijnlijkheid" (probability) are both the same, no matter what. Edited July 5, 2014 by Function
Prometheus Posted July 5, 2014
"Can't agree with this; they are not the same, even in scientific English." I've searched through my probability texts. In one, the only time the word chance is used is when referring to gambling problems. In another it is used in several different ways, one of them similar to how you have used it. Is there a firm consensus on how the words are used in scientific English?
studiot Posted July 5, 2014
Here is what I think is a clearer example than your honey trap. Suppose you play one-handed catch with your little sister and throw the ball to her 1000 times. Suppose she catches it 700 times with her left hand and 300 times with her right hand. Then the probability that she catches it left-handed is 0.7, and right-handed 0.3. The implication that can be drawn is that she is left-handed and that the hand she chooses to catch with is not down to chance. Now suppose the outcome is different: this time she catches the ball 500 times with each hand. Now the implications are that she is ambidextrous and that it is pure chance which hand she uses to catch a particular ball. So in outcome (2) she catches the ball purely at random with either hand, but in outcome (1) her choice of hand is not random.
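As a small sketch, here are the relative-frequency estimates behind those two outcomes, using only the counts given in the post:

[code]
# Relative-frequency estimates for the two imagined catch outcomes.
def estimate(left, right):
    total = left + right
    return left / total, right / total

print(estimate(700, 300))   # outcome (1): (0.7, 0.3) -> a left-hand preference
print(estimate(500, 500))   # outcome (2): (0.5, 0.5) -> consistent with pure chance
[/code]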
petrushka.googol Posted July 5, 2014
From semantics it appears that the two are analogous but not synonymous. A chance event could crystallize with some probability (random in this case, not definable by, say, a simple distribution), but a probabilistic event could be defined as a plot on some underlying curve or distribution (i.e., it is not totally random, unlike in the first instance, but "pseudo-random", if you get my drift).
Bignose Posted July 5, 2014
"Random (in the statistical sense) implies that the probabilities are the same for all outcomes. If the probabilities are not the same, it implies that there is some preference or selection or other driving agent involved." This is not the common usage in statistics today. The variable that is the sum of two fair six-sided dice is obviously random, but the probability of rolling a 2 is not the same as the probability of rolling a 7. Random variables can be distributed in all manner of ways, and only when a random variable is described as 'uniformly random' can you take that to mean that every outcome has the same probability. The word random in the scientific sense simply means non-deterministic.
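A short Python sketch of the dice example: enumerating the 36 equally likely ordered pairs gives a random variable whose outcomes are far from equally likely.

[code]
from collections import Counter
from itertools import product

# All 36 equally likely (die1, die2) pairs, tallied by their sum.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))

for total in range(2, 13):
    print(total, counts[total], "/ 36")
# A sum of 2 occurs 1/36 of the time, a sum of 7 occurs 6/36:
# random, but not uniform.
[/code]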
studiot Posted July 5, 2014 (edited)
"From semantics it appears that the two are analogous but not synonymous." Not even analogous; they embody different concepts. People often (wrongly) say "what are the chances of...?" when they should say "what is the probability of...?". This is both bad science and bad English, because there is only one probability value per outcome; asking in the plural makes no sense in either English or science. To continue my example further, suppose we change the rules again and remove chance entirely. Now, instead of being the nice big brother, you are the nasty big brother and you tie your little sister's right hand behind her back. There is no longer any chance that she will catch the ball with the right hand, so chance no longer enters into this game. The probability still exists for a catch with either hand, however: 1.0 for the left hand and 0.0 for the right.
Edit: "The word random in the scientific sense simply means non-deterministic." I am describing single events, not combinations of events, but you are correct that for compound events the probabilities need not be evenly distributed. Would you say that a probability of 1.0 is not deterministic? How does your statement above fit with the statistics of a random walk and diffusion? Edited July 5, 2014 by studiot
Bignose Posted July 5, 2014
"I am describing single events, not combinations of events. But you are correct that for compound events the probabilities need not be evenly distributed. Would you say that a probability of 1.0 is not deterministic? How does your statement above fit with the statistics of a random walk and diffusion?" It doesn't matter whether we talk about single events or combinations: it is exceptionally rare for the probabilities of events to be uniform. My example, a random variable that describes the sum of two fair six-sided dice, is a 'single event' because I defined it as such. Sure, in this example it can also be looked at as two random variables with uniform distributions, but that is an exceptional case, not something that happens every time. A more complex example: what is the chance that a pill manufactured by a drug company has the correct amount of medicine in it? Hopefully that isn't a uniform distribution of 50% yes and 50% no. The assumption of a uniform distribution is usually pretty terrible, all in all.
I don't understand what you are asking with "Would you say that a probability of 1.0 is not deterministic?" And the above fits perfectly well with the statistics of random walks and diffusion. In those cases it is usually explicitly stated up front that a particle has a uniform distribution of directions it can walk in; when you get a very large number of particles obeying that rule, you get diffusion, which can be exceedingly well described by a deterministic equation. An equivalent example would be quantum mechanics and everyday mechanics. Sure, a baseball is made up of atoms and quarks and all these things that obey, to the best of our current knowledge, random quantum mechanical rules. But when I throw it 97 mph with a little spin at the end, I can still make it dart away from the batter's bat and get the strikeout, following deterministic drag and lift fluid mechanics equations. The difference between deterministic and non-deterministic is often a question of scale and a question of numbers. But all this is an aside to my original point: the word 'random' does NOT mean all outcomes are equally likely, and that is typically a fairly terrible assumption. If that is what is meant, you need to use the term 'uniformly random'.
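A minimal random-walk sketch of that remark, assuming +1/-1 steps, 5,000 particles and a few arbitrary step counts (all illustrative choices): each individual path is unpredictable, yet the spread of the ensemble grows like the square root of the number of steps, which is the deterministic behaviour that diffusion equations capture.

[code]
import random
import statistics

# Each particle takes uniformly random +1/-1 steps.  Individual paths are
# unpredictable, but the spread across many particles grows like sqrt(steps).
def final_position(steps):
    return sum(random.choice((-1, 1)) for _ in range(steps))

particles = 5_000
for steps in (100, 400, 1600):
    positions = [final_position(steps) for _ in range(particles)]
    print(steps, round(statistics.pstdev(positions), 1))   # ~10, ~20, ~40
[/code]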
studiot Posted July 5, 2014
"I don't understand what you are asking with 'Would you say that a probability of 1.0 is not deterministic?'" And I, in turn, don't understand your difficulty with the question. How can I make it any plainer? Note that I introduced the word 'random', not any of several phrases incorporating that word, all of which have special meanings. Your phrase random variable is one such, and I have already acknowledged that. OED, page 2474:
Random: governed by or involving equal chances for each of the actual or hypothetical members of a population; produced or obtained by a process of this kind (and therefore completely unpredictable in detail).
Random number: a number selected from a given set of numbers in such a way that all the numbers in the set have the same chance of selection.
Random walk: the movement of something in successive steps, the direction etc. of each step being governed by chance independently of preceding steps.
Random variable: a variable whose values are distributed in accordance with a probability distribution.
So I see nothing wrong with what I said, nor do I see how it is not the common usage. I also said that science has more than one definition for the word random. I was trying to avoid confusion by not including that second usage, but are you familiar with the phrase 'random access'? I would venture a guess that there are more computer engineers using this than statisticians in our modern world, but would not claim that this entitles their definition to override all others. I am very open to the idea of cooperation for the benefit of the OP and his thread, so if you have better definitions and/or explanations, please state them. Your expanded detail about random variables takes things beyond what I said, but in no way detracts from the validity of my statements, so why present it as an argument?
Bignose Posted July 5, 2014
"To further continue, random (in the statistical sense) ..." studiot, it starts from here. You invoked the clause "in the statistical sense", implying this was the definition in use in the branch of mathematics known as statistics. But it isn't. Wikipedia's definition is actually pretty good: http://en.wikipedia.org/wiki/Random_variable "In probability and statistics, a random variable, aleatory variable or stochastic variable is a variable whose value is subject to variations due to chance (i.e. randomness, in a mathematical sense). A random variable can take on a set of possible different values (similarly to other mathematical variables), each with an associated probability (if discrete) or a probability density function (if continuous), in contrast to other mathematical variables." I don't give a hoot about what the Oxford English Dictionary says; it is not a mathematical text. You invoked the 'statistical sense'; I merely corrected what you said because you misrepresented what is actually done by the mathematicians who work in this area. I thought it was important to get this correct.
studiot Posted July 5, 2014 (edited)
Again the insulting condescension. I note your quote is identical in substance to mine, but your whole perspective is too narrow. Did you not read my post #15? The word 'random' is an adjective. When applied to one particular noun (as you and I have both agreed) it has a particular meaning. When, across the full gamut of science and engineering, it is applied to other nouns, it has other meanings. Do you actually disagree with any of the reasoning or statements in the example I developed? Edited July 5, 2014 by studiot
Bignose Posted July 5, 2014 (edited)
"Again the insulting condescension. ... Do you actually disagree with any of the reasoning or statements in the example I developed?" Yeah, I disagree with being called condescending when I correct what I think are mistakes. If you have a problem with me, report it to the mods; there is no need for personal name-calling. My quote was NOT identical to yours. You used the OED to say "governed by or involving equal chances for each of the actual or hypothetical members of a population". But, as I have said several times now, that is NOT how it is used in science. Look, let's go back to the example of the variable that is the sum of two six-sided dice. The values this variable can take are 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, and 12. But it is completely and wholly wrong to assume that just because there are 11 members in this population, the random variable takes each of those values 1/11th of the time (i.e. equal chances for each member). The definition you cited from the OED is wrong in this case, and is typically wrong in all but a very few cases. And I'm sorry, but if we're on a science forum, in the mathematics subforum, and someone has invoked "in the statistical sense", I really think the word ought to be used correctly. I apologize if my pointing this out has offended you in any way; that was not my intention. My sole intention is to make sure the word is used per its most common definition in mathematics and science. Edited July 5, 2014 by Bignose
Cap'n Refsmmat Posted July 5, 2014
Random events most certainly do not need to have equal probabilities. Probability theory is based on the idea of a measure space, where a measure (a function) assigns probabilities to different events. The measure need not be - and usually isn't - a constant function. Bignose is right to say that equal probabilities are the special case of "uniformly random", or a "uniform distribution". I learned probability theory out of Probability & Measure Theory by Ash and Doleans-Dade, if you're interested. But I also teach basic probability to undergraduates, and we cover events with unequal probabilities; they are nonetheless random. The OED is plainly wrong. Steve Stigler's book Statistics on the Table discusses the history of this idea, I think. Some of the earliest work in probability did assume all events have equal probability; cases where events had unequal probabilities were decomposed into combinations of cases where they did. But advances in the mathematical theory made this completely unnecessary. Basically, studiot, your interpretation is about 200 years behind.
"I've searched through my probability texts. ... Is there a firm consensus on how the words are used in scientific English?" The two are synonymous, but gamblers like to say "chance" and statisticians like to say "probability". If I say "the event has a 20% chance of occurring" and "the event has a 20% probability of occurring", I mean exactly the same thing each time. Now, odds are different from chances: the odds are the probability of an event divided by the probability that it won't occur. Hence "3 to 1 odds" corresponds to a 75% chance.
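A tiny sketch of the odds-probability relationship just described:

[code]
# Converting between a probability and the corresponding odds in favour.
def probability_to_odds(p):
    return p / (1 - p)

def odds_to_probability(odds):
    return odds / (1 + odds)

print(probability_to_odds(0.75))   # 3.0  -> "3 to 1" odds
print(odds_to_probability(3.0))    # 0.75 -> a 75% chance
[/code]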
Prometheus Posted July 6, 2014 (edited)
"The two are synonymous, but gamblers like to say 'chance' and statisticians like to say 'probability'." Thanks for clarifying. I use the books Probability and Measure by Billingsley and A First Look at Rigorous Probability Theory by Rosenthal. So how do we interpret the OP in light of the above discussion? "It's like 'pick a random number from 1 to 10'. There's an infinitesimal chance of picking the number 2, and the same chance for pi or e. Yet the probability, in my opinion, of picking 2 is much larger than of picking pi or e. The problem was that on a Dutch forum, everyone said that 'kans' (chance) and 'waarschijnlijkheid' (probability) are both the same, no matter what." Let's take two continuous probability distributions: the normal and the uniform. Picture their density curves on the same axes - a flat uniform density (U) and a bell-shaped normal density (N). For small intervals around x = 0 there is more chance that an outcome from the normal distribution falls there, but the chance of exactly x = 0 is zero, and so the same, for both distributions. Why? Because [latex]\int_a^a f(x)\,dx=0[/latex]. I still can't help feeling this is where the problem lies, and that the chance/probability definitions are a red herring. Edited July 6, 2014 by Prometheus
Bignose Posted July 6, 2014 (edited)
"For small intervals around x = 0 there is more chance that an outcome from the normal distribution falls there, but the chance of exactly x = 0 is zero, and so the same, for both distributions. Why? Because [latex]\int_a^a f(x)\,dx=0[/latex]. I still can't help feeling this is where the problem lies, and that the chance/probability definitions are a red herring." Prometheus, your integral is correct here, but not your feeling that the definitions are at fault. Because there are infinitely many numbers in any interval on the real line, the chance of getting any individual number does indeed go to zero. That is, 1.99999999999 [math]\ne[/math] 2.0 [math]\ne[/math] 2.000000000001, and so on. So, given that this is the answer to the question as asked, does asking that question make sense? What I am driving at is that usually one doesn't care whether the exact value of the variable is 1.98435638726323636752387532276354... One normally only cares whether it is in some range: "Is the value within 5% of the mean?", "Is the value more than 3 standard deviations from the mean?" Similarly, every measuring device we have has some margin of error; if I put a ruler down, the best I can say with confidence is that the length is between a pair of tick marks.
So the question that needs to be asked is: how likely is the variable to be in some range? Or how likely is it that the fly lands in a certain area? That changes your integral not from a to a, but from a to some b not equal to a. Then, looking at the two density curves, you can plainly see that [math]\int^{1}_{-1} f_{Gaussian}(x)\,dx > \int^{1}_{-1} f_{Uniform}(x)\,dx[/math], as the chance of the variable taking a value near the mean is much higher for a Gaussian distribution than for a (sufficiently wide) uniform one. In short, the definitions work as intended; we just need to take care in asking the right questions. Edited July 6, 2014 by Bignose
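A short sketch of that interval comparison, assuming a standard normal against a uniform on [-5, 5] (the interval and the uniform's support are arbitrary illustrative choices), using SciPy's norm and uniform distributions:

[code]
from scipy.stats import norm, uniform

# P(-1 <= X <= 1) under a standard normal versus a uniform on [-5, 5].
a, b = -1.0, 1.0
U = uniform(loc=-5, scale=10)          # uniform distribution on [-5, 5]

p_gauss = norm.cdf(b) - norm.cdf(a)    # ~0.683
p_unif = U.cdf(b) - U.cdf(a)           # 0.2

print(p_gauss, p_unif)
# Either way, the chance of hitting exactly x = 0 is zero: the integral
# of any density from 0 to 0 is 0.
[/code]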
Prometheus Posted July 6, 2014
"So the question that needs to be asked is: how likely is the variable to be in some range? Or how likely is it that the fly lands in a certain area?" Yes. Asking for the probability of exactly x = 0 is meaningless in this context, and so that should satisfy the OP. I was wondering about this problem though: "It's like 'pick a random number from 1 to 10'. There's an infinitesimal chance of picking the number 2, and the same chance for pi or e. Yet the probability, in my opinion, of picking 2 is much larger than of picking pi or e." I agree with Function that there is more chance of someone selecting 2 than of them selecting some irrational number. My initial thought was that we are not actually dealing with a continuous distribution, but a large discrete one. If someone states a number, they are not selecting from an infinite set - people would select from quite a narrow set (I imagine), and even if they tried to pick an irrational number, they couldn't: physically stating it would prove prohibitive. Then I considered the case where people are given a line measured 1 to 10 in some way and asked to pick a point. We could then consider it continuous, but the limits of precision in measuring that point would provide intervals that give a meaningful probability. Haven't we then, in some way, made the case discrete, with each sample element being some interval on a continuous scale?
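A small sketch of that last idea, assuming picks are uniform on [1, 10] and then rounded to one decimal place (both assumptions for illustration): the rounding turns the continuous pick into a discrete one, so each rounded value has a small but non-zero probability, while an exact irrational value still never occurs.

[code]
import random
from collections import Counter

# Round each continuous pick to one decimal place, i.e. a finite precision.
samples = (round(random.uniform(1, 10), 1) for _ in range(1_000_000))
counts = Counter(samples)

print(counts[2.0] / 1_000_000)                         # ~1/90, about 0.011
print(counts.get(3.141592653589793, 0) / 1_000_000)    # 0.0: pi never appears
[/code]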
Bignose Posted July 6, 2014
"I agree with Function that there is more chance of someone selecting 2 than of them selecting some irrational number." Correct. What is really interesting is that if you ask someone to "pick a number between 1 and 10", there is a much larger than 10% chance they will say 7. But this is much more a question of psychology and how the human mind works than a mathematical or statistical problem.
Function (Author) Posted July 6, 2014
"What is really interesting is that if you ask someone to 'pick a number between 1 and 10', there is a much larger than 10% chance they will say 7." Indeed. Suggestion plays a big role here: say a lot of sentences (in a subtle way, of course) containing the sound "eight" - wait, late, gate, ... - and the person will most likely say 8. That's why I never let this game decide my fate; people may have tricked my mind.
overtone Posted July 7, 2014
In ordinary English, a chance is usually a probability strictly between 0 and 1 of some event happening in some way. Both "event" and "way" are informally defined and not rigorously differentiated in ordinary English. More than one chance, "chances", means more than one way with non-zero probability is involved. If the probability is zero, you have no chance. If it is 1, no chance is involved. So in normal usage we hear phrases such as "what is your chance of having a heart attack from shoveling snow today" and "what are your chances of having a heart attack this winter", or such as "if you see them in time, you have a chance" and "what are my chances of seeing them in time", and, as a joke, "you have two chances: slim and none". You do not have two probabilities of any given event.