uncool Posted September 14, 2019

19 minutes ago, Conjurer said:
The probability of getting heads or tails is 1/2! P(H or T) = 1/2. You have 1 out of two possible outcomes to get either a heads or a tails. The numerator is the number of outcomes for a success, and the denominator is the number of possible outcomes. What you are asking for doesn't seem to make sense. I don't even think the mathematical tools to describe what you are asking have been discovered yet. You are just asking me to discover this. I don't know how you could incorporate a probability into the law of large numbers to show that. I gave you my best guess, but maybe you would have to subtract the probability from an average probability instead...

P(H or T) means the probability of getting a result that is either heads or tails. In other words, if the flip is heads, or if it's tails, it's accepted. Any result (other than "landing on the side") is accepted, so P(H or T) = 1.

What I'm asking you for is basic probability that has been known for literal centuries. It's a generalization of the following question: what is the probability that after 4 flips of a fair coin, the number of heads is between 2 and 4, inclusive? (The answer is 11/16.)

To be blunt: you are attempting to make pronouncements when you don't know the basics of this field of mathematics. You don't know what you are talking about, and are striking out randomly, resulting in exactly what I said before: quasi-randomly putting formulae together without understanding what they are for.

Not knowing probability theory isn't a problem. Everyone was there at some point. The problem is that you don't know probability theory, but you are confidently making assertions about it anyway. Probability theory is a very well-studied area of mathematics, and has been a part of it for centuries. The law of large numbers is a basic theorem in probability, and has been mathematically proven, again, for centuries.

Edited September 14, 2019 by uncool
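[The 11/16 figure is easy to verify by brute force; a minimal Python sketch, enumerating all 2^4 equally likely flip sequences (illustrative, not how the poster computed it):]

```python
from itertools import product

# All 2**4 = 16 equally likely sequences of 4 fair-coin flips.
outcomes = list(product("HT", repeat=4))

# Count the sequences whose number of heads is between 2 and 4, inclusive.
favorable = sum(1 for seq in outcomes if 2 <= seq.count("H") <= 4)

print(favorable, "out of", len(outcomes))  # 11 out of 16
```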
Conjurer Posted September 14, 2019 (Author)

26 minutes ago, uncool said:
Not knowing probability theory isn't a problem. Everyone was there at some point. The problem is that you don't know probability theory, but you are confidently making assertions about it anyway. Probability theory is a very well-studied area of mathematics, and has been a part of it for centuries. The law of large numbers is a basic theorem in probability, and has been mathematically proven, again, for centuries.

This is enough to put someone into an insane asylum if you don't have any reference for it. I see no reason to take your word for it. Where is this proof you have had for centuries?
uncool Posted September 14, 2019

3 minutes ago, Conjurer said:
This is enough to put someone into an insane asylum if you don't have any reference for it. I see no reason to take your word for it. Where is this proof you have had for centuries?

From Wikipedia: "A special form of the LLN (for a binary random variable) was first proved by Jacob Bernoulli.[2] It took him over 20 years to develop a sufficiently rigorous mathematical proof which was published in his Ars Conjectandi (The Art of Conjecturing) in 1713. He named this his 'Golden Theorem' but it became generally known as 'Bernoulli's Theorem'."

The actual proof (using modern terminology and notation) of the weak law of large numbers is also in the article: https://en.wikipedia.org/wiki/Law_of_large_numbers#Proof_of_the_weak_law

Edited September 14, 2019 by uncool
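[The theorem can also be watched in action. A short simulation sketch, offered as illustration rather than proof (the proof is in the linked article): the running fraction of heads settles toward 1/2 as n grows.]

```python
import random

# The fraction of heads in n fair flips tends toward 1/2 as n grows.
# A simulation illustrates the weak law; it does not prove it.
random.seed(0)
for n in (10, 100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(n, heads / n)
```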
Conjurer Posted September 14, 2019 (Author)

2 minutes ago, uncool said:
From Wikipedia: "A special form of the LLN (for a binary random variable) was first proved by Jacob Bernoulli.[2] It took him over 20 years to develop a sufficiently rigorous mathematical proof which was published in his Ars Conjectandi (The Art of Conjecturing) in 1713. He named this his 'Golden Theorem' but it became generally known as 'Bernoulli's Theorem'." The actual proof (using modern terminology and notation) of the weak law of large numbers is also in the article: https://en.wikipedia.org/wiki/Law_of_large_numbers#Proof_of_the_weak_law

Where does this have the probabilities of events occurring in a row in it?
uncool Posted September 14, 2019

7 minutes ago, Conjurer said:
Where does this have the probabilities of events occurring in a row in it?

It doesn't directly, because the law of large numbers isn't about events occurring in a row. If you mean, does it consider the possibility (by including it in probability calculations), then yes, it does: the possibility is included in the probability space.

Edited September 14, 2019 by uncool
Conjurer Posted September 14, 2019 (Author)

13 minutes ago, uncool said:
It doesn't directly, because the law of large numbers isn't about events occurring in a row.

You should hold on to that thought right there. I think you almost got to the point I was trying to say. Then forget about whatever else you may have thought I meant by it, because I didn't. That is basically what I have been trying to explain this entire time. There is no proof of probabilities being accurate, because there is no connection between them and the law of large numbers. You cannot add or multiply a series of them in such a way that you get the probability of an event occurring one single time.
uncool Posted September 14, 2019

If what you mean is that there's always a chance of getting it wrong, of a fair coin coming up heads, heads, heads, heads, etc., then yes, that's how probabilities work. There's a chance. But it's really unlikely, and as the number of flips gets higher, the outcome becomes more and more unlikely. And that has very little to do with some of the things you were saying earlier.

Otherwise, I still have no idea what you are trying to say.

Edited September 14, 2019 by uncool
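[To put numbers on "more and more unlikely": for a fair coin, the chance of n heads in a row is (1/2)^n, which collapses quickly. A quick sketch (the values of n are arbitrary examples):]

```python
# P(n heads in a row) = (1/2)**n for a fair coin.
for n in (1, 5, 10, 50):
    print(n, 0.5 ** n)
# 1  0.5
# 5  0.03125
# 10 0.0009765625
# 50 8.881784197001252e-16
```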
Conjurer Posted September 14, 2019 (Author)

4 minutes ago, uncool said:
If what you mean is that there's always a chance of getting it wrong, of a fair coin coming up heads, heads, heads, heads, etc., then yes, that's how probabilities work. There's a chance. But it's really unlikely, and as the number of flips gets higher, the outcome becomes more and more unlikely. And that has very little to do with some of the things you were saying earlier. Otherwise, I still have no idea what you are trying to say.

That isn't one of my concerns. I accept that as being a likely scenario. My concern was that if you add or multiply probabilities together to find the most likely outcome over an increasing number of series of trials, you wouldn't get something that represents anything close to what you should arrive at from the law of large numbers, which is approximately half of them being heads or tails.
uncool Posted September 14, 2019

And your concern is misplaced, as shown by the law of large numbers itself (which is, I repeat, a proven theorem).
Conjurer Posted September 14, 2019 (Author)

Just now, uncool said:
And your concern is misplaced, as shown by the law of large numbers itself (which is, I repeat, a proven theorem).

I am not trying to say that the law of large numbers needs to be proven. I am saying that the way we calculate probabilities needs to be proven, and arriving at the law of large numbers could be considered that proof. You just turned everything I said on its head and arrived at a conclusion which was completely backwards of my intent.
studiot Posted September 14, 2019

I can't see that the law of large numbers has any place in this discussion: it is a diversion.

@Conjurer I think you basically have failed to distinguish between:

a) The probability of a head or a tail in any particular throw, and how it varies with the number of throws n.

b) The probability of throwing any particular pattern (or sequence) of heads and tails, and how that varies with n.

And also what happens when n becomes transfinite.
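[The difference between (a) and (b) is easy to see numerically. A small sketch; the choice of n = 10 and the all-heads pattern are arbitrary illustrations:]

```python
n = 10

# (a) The probability of a head on any particular throw is 1/2,
#     no matter how many throws have been or will be made.
p_single = 0.5

# (b) The probability of one particular pattern of n throws
#     (say, all heads) shrinks as (1/2)**n.
p_pattern = 0.5 ** n

print(p_single, p_pattern)  # 0.5 0.0009765625
```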
uncool Posted September 14, 2019

@studiot If anything, I see it the other way around. The law of large numbers is what Conjurer has referred to over and over (including in the OP, with "the outcome of flipping more and more coins in a row approaches closer and closer to half of the coins being heads or tails."); if anything, I see "n becoming transfinite" as the diversion.

16 minutes ago, Conjurer said:
I am not trying to say that the law of large numbers needs to be proven. I am saying that the way we calculate probabilities needs to be proven, and arriving at the law of large numbers could be considered that proof. You just turned everything I said on its head and arrived at a conclusion which was completely backwards of my intent.

What do you mean by "the way we calculate probabilities"? Do you mean, for example, the axiom that the probabilities of disjoint events should add?
Conjurer Posted September 14, 2019 (Author)

If you wanted to accomplish that goal, wouldn't you then try to take the average of the probabilities of an event occurring as it continued on to infinity?

I am saying that first, you would have to consider how many combinations there are of the event:

nCr = n!/(r! (n - r)!)

Then to find the probability of that event you would have to divide by the total number of possible outcomes, or permutations with replacement:

nPr = n^r

Then you get:

nCr/nPr = n!/(n^r r! (n - r)!)

Then you would want to add all of those probabilities together so you can take the average:

lim_(n->∞) integral[n!/(n^r r! (n - r)!) dn]

Then you would want to divide by the total number of trials to find the average probability:

lim_(n->∞) 1/n integral[n!/(n^r r! (n - r)!) dn]

This would be like saying that:

1/n (nCr/nPr + nCr/nPr + ...) = lim_(n->∞) 1/n integral[n!/(n^r r! (n - r)!) dn]

Then today, Wolfram Alpha seems to suggest that it is actually 0. How could the average probability be zero when it should come out to what we would expect the probability for an event to occur a single time?

Edited September 14, 2019 by Conjurer
studiot Posted September 14, 2019

13 minutes ago, uncool said:
@studiot If anything, I see it the other way around. The law of large numbers is what Conjurer has referred to over and over (including in the OP, with "the outcome of flipping more and more coins in a row approaches closer and closer to half of the coins being heads or tails."); if anything, I see "n becoming transfinite" as the diversion.

Yes, but he is wrong to quote this, just as he is wrong to introduce integration and 'the calculus' in general.

As I understand it, the question and opening argument run thus: because the probability of getting a head in a fair toss is 1/2, the probability of getting an equal number of heads must tend to 1/2 as n increases. This is a false argument.

I consider the required probability does not contain a fraction as suggested, but reduces to

[math]{}^nC_{n/2}\,p^n[/math] (with p = 1/2) for n even, and 0 for n odd.

I make this, for up to 12 tosses:

n    P(equal heads & tails)
2    0.500
4    0.375
6    0.313
8    0.273
10   0.246
12   0.226

Edited September 14, 2019 by studiot
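[The table can be reproduced directly; a short Python check, printing the exact values, which round to the figures above:]

```python
from math import comb

# P(exactly n/2 heads in n fair tosses) = C(n, n/2) / 2**n for even n.
for n in range(2, 13, 2):
    print(n, comb(n, n // 2) / 2 ** n)
# 2 0.5
# 4 0.375
# 6 0.3125
# 8 0.2734375
# 10 0.24609375
# 12 0.2255859375
```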
uncool Posted September 14, 2019

7 minutes ago, Conjurer said:
I am saying that first, you would have to consider how many combinations there are of the event: nCr = n!/(r! (n - r)!)

This is the number of sequences of n flips such that r of them are heads.

8 minutes ago, Conjurer said:
Then to find the probability of that event you would have to divide by the total number of possible outcomes, or permutations with replacement

The "possible outcomes" are sequences of n flips; their number is 2^n. The number of outcomes does not depend on r.

9 minutes ago, Conjurer said:
Then you would want to add all of those probabilities together

So you take a sum. Not an integral, a sum. Further, the sum is over r, not over n:

sum_{range of r} nCr/2^n

Further, there is no reason to be taking a limit at this point.

11 minutes ago, Conjurer said:
Then you would want to divide by the total number of trials to find the average probability.

No, you already have a probability. The division by n was to get an average value, not an average probability.
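[That sum over r is exactly what connects the binomial counts to the law of large numbers. A sketch (the ±5% window around n/2 is an arbitrary illustrative choice): the probability mass inside the window grows toward 1 as n increases.]

```python
from math import comb

# P(the number of heads r lies within 5% of n/2) = sum over that range
# of r of C(n, r) / 2**n  -- a finite sum over r, not an integral over n.
for n in (10, 100, 1000):
    lo, hi = int(0.45 * n), int(0.55 * n)
    p = sum(comb(n, r) for r in range(lo, hi + 1)) / 2 ** n
    print(n, p)
# 10 0.451...
# 100 0.728...
# 1000 0.998...
```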
Conjurer Posted September 14, 2019 (Author)

22 minutes ago, studiot said:
Yes, but he is wrong to quote this, just as he is wrong to introduce integration and 'the calculus' in general.

https://www.khanacademy.org/math/ap-calculus-ab/ab-integration-new/ab-6-1/v/introduction-to-integral-calculus
studiot Posted September 14, 2019

5 minutes ago, Conjurer said:
https://www.khanacademy.org/math/ap-calculus-ab/ab-integration-new/ab-6-1/v/introduction-to-integral-calculus

18 minutes ago, uncool said:
So you take a sum. Not an integral, a sum. Further, the sum is over r, not over n.

Read this three times.
Conjurer Posted September 14, 2019 (Author)

19 minutes ago, uncool said:
sum_{range of r} nCr/2^n. Further, there is no reason to be taking a limit at this point.

Wouldn't you have to take the derivative of the integral to get the same result as the sum?

20 minutes ago, uncool said:
No, you already have a probability. The division by n was to get an average value, not an average probability.

You already forgot what we just talked about. The law of large numbers doesn't consider probabilities!

3 minutes ago, studiot said:
Read this three times.

Did you even watch the video?
uncool Posted September 14, 2019

3 minutes ago, Conjurer said:
Wouldn't you have to take the derivative of the integral to get the same result as the sum?

This is what I mean by "quasi-randomly putting formulae together". There is no reason to be integrating in the first place.

3 minutes ago, Conjurer said:
You already forgot what we just talked about. The law of large numbers doesn't consider probabilities!

You didn't read my answer closely. The law of large numbers is entirely about probabilities. It's a theorem in probability theory. It talks about convergence in probability.

Edited September 14, 2019 by uncool
Conjurer Posted September 14, 2019 (Author)

5 minutes ago, uncool said:
This is what I mean by "quasi-randomly putting formulae together". There is no reason to be integrating in the first place.

I wanted to add all of the possible outcomes. I wasn't concerned with it only applying to that one single example. I am looking for the proof of the addition and multiplication of probabilities in a series.

6 minutes ago, uncool said:
You didn't read my answer closely. The law of large numbers is entirely about probabilities.

Where does it have a variable for probabilities in it?
uncool Posted September 14, 2019

10 minutes ago, Conjurer said:
I wanted to add all of the possible outcomes.

In which case you use a sum, because the outcomes are discrete. As I have repeatedly shown.

10 minutes ago, Conjurer said:
Where does it have a variable for probabilities in it?

In the expression "converge in probability" (for the weak law, which is all that is necessary here). The "Pr" in "lim_{n -> infinity} Pr(...) = 0" denotes probability.

Edited September 14, 2019 by uncool
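[That limiting probability can be estimated empirically. A Monte Carlo sketch (eps = 0.05 and 2000 trials are arbitrary choices) of Pr(|fraction of heads in n flips - 1/2| > eps), which the weak law says tends to 0:]

```python
import random

# Estimate Pr(|fraction of heads in n flips - 1/2| > eps) by simulation.
# The weak law of large numbers says this probability tends to 0 with n.
random.seed(1)
eps, trials = 0.05, 2000

for n in (10, 100, 1000):
    bad = sum(
        abs(sum(random.random() < 0.5 for _ in range(n)) / n - 0.5) > eps
        for _ in range(trials)
    )
    print(n, bad / trials)
# estimates land near 0.75, 0.27, and 0.002 respectively
```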
studiot Posted September 14, 2019

14 minutes ago, Conjurer said:
I am looking for the proof of the addition and multiplication of probabilities in a series.

Then you need to learn how to get the one you are dealing with correct.
Conjurer Posted September 14, 2019 (Author)

7 minutes ago, uncool said:
In which case you use a sum. As I have repeatedly shown.

I don't think you have. You just keep saying that without giving a real reason. You should start watching this video at three minutes. He says they are the same. https://www.khanacademy.org/math/ap-calculus-ab/ab-integration-new/ab-6-1/v/introduction-to-integral-calculus

8 minutes ago, uncool said:
In the expression "converge in probability" (for the weak law, which is all that is necessary here). The "Pr" denotes probability.

That still says nothing about how an increasing number of probabilities could come into play here. It is just stating that the expected value is the same as the average of random occurrences of data.
uncool Posted September 14, 2019

4 minutes ago, Conjurer said:
You should start watching this video at three minutes. He says they are the same.

"But instead of taking the sum of a discrete number of things, you're taking the sum of an infinite number". We're looking at discrete numbers of heads.

6 minutes ago, Conjurer said:
That still says nothing about how an increasing number of probabilities could come into play here.

That comes into play because of the expression:

|X̄_n - μ| > ε

is equivalent to

|(X_1 + X_2 + ... + X_n)/n - μ| > ε

is equivalent to

|X_1 + X_2 + ... + X_n - nμ| > nε

The right-hand side grows with n. It says that the sum can be nε away from nμ.
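[A tiny numeric check of that equivalence; n = 100, heads = 56, and eps = 0.05 are arbitrary example values:]

```python
# The same event written two ways: |mean - mu| > eps  <=>  |sum - n*mu| > n*eps.
n, mu, eps = 100, 0.5, 0.05
heads = 56  # one hypothetical outcome of 100 flips

print(abs(heads / n - mu) > eps)      # True: 0.06 > 0.05
print(abs(heads - n * mu) > n * eps)  # True: 6 > 5, the same event
```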
Conjurer Posted September 14, 2019 (Author)

6 minutes ago, uncool said:
"But instead of taking the sum of a discrete number of things, you're taking the sum of an infinite number". We're looking at discrete numbers of heads.

To me, it is like you are just fantasizing about some kind of mathematics where summation is used in probability to show something. It should be a matter of choice, because the integral is just the summation as it approaches infinity for all the numbers in between. Why should I just use summation to satisfy your opinion? I wanted something that can be true for all cases, and I didn't want to make a statement that is only true for one individual case and then have to go back and use an integral to make it true for all cases. Even though, it could possibly make it easier, with how stubborn you are about this. Then that problem could also be solved by me just trying to figure this out on my own.

11 minutes ago, uncool said:
That comes into play because of the expression: |X̄_n - μ| > ε is equivalent to |(X_1 + X_2 + ... + X_n)/n - μ| > ε is equivalent to |X_1 + X_2 + ... + X_n - nμ| > nε. The right-hand side grows with n. It says that the sum can be nε away from nμ.

I take it this means to imply that you are able to make a summation of probabilities that can show P(H) = 1/2 (IFF you can even be satisfied with that statement).