Conjurer (Author) Posted September 13, 2019

On 9/13/2019 at 10:25 PM, Ghideon said:
did you see the rightmost column in my result?

That's what I would have expected, because a random number generator simulates true randomness. Then there is no such thing as true randomness that can be coded into a computer, as far as we know.

Possibly, a quantum computer could generate true randomness. I don't know if they have programmed one to do that yet. That has more to do with the basis of how a quantum computer gains its bits.

I guess I should restate my problem. My difficulty with accepting probability theory is based on this: the law of large numbers has not been proven based on the probabilities of all the possible outcomes. I state this because, in the example of a die roll, it doesn't use the probabilities of getting a certain die roll, which would be 1/6 for each side. Then it says X_1=X_2=... I don't think X_n can stand for a probability. If it can, I don't see how it could, or how someone would show any example where it does use probabilities to obtain the expected value from an increasing number of outcomes.
uncool Posted September 13, 2019

On 9/13/2019 at 10:45 PM, Conjurer said:
The law of large numbers has not been proven

It has been proven. The law of large numbers is a theorem.

On 9/13/2019 at 10:45 PM, Conjurer said:
based on the probabilities of all the possible outcomes.

I have no idea what you mean by this.

On 9/13/2019 at 10:45 PM, Conjurer said:
If it can, I don't see how it could, or how someone would show any example where it does use probabilities to obtain the expected value from an increasing number of outcomes.

"I don't see how" doesn't mean it can't be done.
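For reference, the weak form of the theorem can be stated as follows: for independent, identically distributed random variables X_1, X_2, ... with expected value μ, the sample mean converges in probability to μ, that is,

\[ \lim_{n \to \infty} P\!\left( \left| \frac{X_1 + \cdots + X_n}{n} - \mu \right| > \varepsilon \right) = 0 \quad \text{for every } \varepsilon > 0. \]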
Ghideon Posted September 13, 2019

On 9/13/2019 at 10:45 PM, Conjurer said:
That's what I would have expected, because a random number generator simulates true randomness.

Ok. And the middle column? Do you claim that it does not agree with the law of large numbers?

On 9/13/2019 at 10:45 PM, Conjurer said:
Then there is no such thing as true randomness that can be coded into a computer, as far as we know.

Does that make the simulation invalid?

On 9/13/2019 at 10:45 PM, Conjurer said:
I guess I should restate my problem. My difficulty with accepting probability theory is based on this: the law of large numbers has not been proven based on the probabilities of all the possible outcomes. I state this because, in the example of a die roll, it doesn't use the probabilities of getting a certain die roll, which would be 1/6 for each side. Then it says X_1=X_2=...

Can you explain the problem in detail so that a simulation can be performed or a proof provided? Which theorem(s) of probability theory do you distrust?

On 9/13/2019 at 10:45 PM, Conjurer said:
Possibly, a quantum computer could generate true randomness. I don't know if they have programmed one to do that yet. That has more to do with the basis of how a quantum computer gains its bits.

How is the above applicable to the discussion in this thread?
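The simulation code itself isn't reproduced in this thread; a minimal sketch of that kind of experiment, assuming a fair coin and Python's cryptographically secure secrets module (the function name proportion_of_heads is only illustrative), could look like this:

```python
import secrets

def proportion_of_heads(n):
    """Flip a fair coin n times using a cryptographically secure generator
    and return the proportion of heads."""
    heads = sum(secrets.randbits(1) for _ in range(n))
    return heads / n

# Repeat the experiment with increasingly large sample sizes; the
# proportion of heads should settle near 0.5 as n grows.
for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"{n:>7} flips: proportion of heads = {proportion_of_heads(n):.4f}")
```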
Conjurer (Author) Posted September 13, 2019

On 9/13/2019 at 11:13 PM, uncool said:
It has been proven. The law of large numbers is a theorem.
I have no idea what you mean by this.
"I don't see how" doesn't mean it can't be done.

It has been proven experimentally, but not mathematically. I am sure you would have no idea about a basic math concept. I don't understand why you are trying to help me with math, when you cannot even take a basic average to see that the numbers do not add up.
uncool Posted September 13, 2019 (edited)

On 9/13/2019 at 11:18 PM, Conjurer said:
It has been proven experimentally, but not mathematically.

The law of large numbers has been proven mathematically. I could provide a basic proof, if you want.

On 9/13/2019 at 11:18 PM, Conjurer said:
I don't understand why you are trying to help me with math, when you cannot even take a basic average to see that the numbers do not add up.

Please explain what you mean.
Conjurer (Author) Posted September 13, 2019 (edited)

On 9/13/2019 at 11:17 PM, Ghideon said:
Ok. And the middle column? Do you claim that it does not agree with the law of large numbers?

I don't think it or the third column really does. I thought it was a 1 with a decimal in front of it, but it was actually a comma. The average of heads/tails = 1.

On 9/13/2019 at 11:17 PM, Ghideon said:
Does that make the simulation invalid?

It means it only proves what the programmer was capable of coding to simulate randomness where it actually didn't exist. If he/she went to these forums and asked for a public opinion on how that program should act, then assuredly, yes, it would make it invalid.

On 9/13/2019 at 11:17 PM, Ghideon said:
Can you explain the problem in detail so that a simulation can be performed or a proof provided? Which theorem(s) of probability theory do you distrust?

I don't think it is possible to add probabilities and take an average to get the expected value. In the example, they just used the variation of the data to calculate something using the law of large numbers. Then it is just an average of the data values, not the actual probabilities themselves, that is involved in the equation.

On 9/13/2019 at 11:17 PM, Ghideon said:
How is the above applicable to the discussion in this thread?

You appear to be under the impression that what I am saying is impossible, because you have a computer spitting out possible outcomes.
uncool Posted September 13, 2019 (edited)

On 9/13/2019 at 11:31 PM, Conjurer said:
I don't think it is possible to add probabilities and take an average to get the expected value.

That (or at least, some reasonable version of that: take the sum of the products of the probabilities and the values) is the definition of the expected value of a random variable.
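For a single flip of a fair coin, coding tails as 0 and heads as 1, that definition gives

\[ E[X] = \sum_i p_i x_i = \tfrac{1}{2}\cdot 0 + \tfrac{1}{2}\cdot 1 = \tfrac{1}{2}. \]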
Conjurer (Author) Posted September 13, 2019 (edited)

On 9/13/2019 at 11:37 PM, uncool said:
That (or at least, some reasonable version of that: take the sum of the products of the probabilities and the values) is the definition of the expected value of a random variable.

I think we should start over.

The law of large numbers is just showing how someone would get the expected value from an experiment. They would add all the outcomes and take the average to determine the expected value. I am saying that it is not possible to use the probabilities of a series of heads or tails to obtain the expected value, or the answer you would get from the law of large numbers.

Say I was going to use the law of large numbers. I would assign 0 to heads and 1 to tails. I got 100 heads and 101 tails after a total of 201 flips. I add all of the X_n to get 101, since all the heads add up to zero. I then divide that by 201. Then I get approximately 1/2.

Then by that experiment, I proved that the probability of the coin is 1/2 and it is a fair coin.

Now I want to calculate the probability of me getting 100 heads and 101 tails. It comes out to some ridiculously low chance that will ever happen...
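As a sketch of that last calculation, assuming a fair coin: the probability of getting exactly 100 heads in 201 flips is the single binomial term C(201, 100)·(1/2)^201, which can be evaluated directly:

```python
from math import comb

n, k = 201, 100
# Probability of exactly k heads in n flips of a fair coin: C(n, k) * (1/2)^n
p_exact = comb(n, k) / 2**n
print(p_exact)  # roughly 0.056
```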
uncool Posted September 14, 2019 (edited)

On 9/13/2019 at 11:43 PM, Conjurer said:
Now I want to calculate the probability of me getting 100 heads and 101 tails. It comes out to some ridiculously low chance that will ever happen...

But you would also accept 101 heads and 100 tails, right? And if you flipped 2001 times, you would probably accept anywhere between 991 and 1010 heads as evidence of a fair coin. And if you flipped 20001 times, you would probably accept anywhere between 9901 and 10100 heads. And so on.

So the question isn't about the probability of a specific number of heads, but a range that grows with the number of flips. And the theorem says that the total probability of landing within that range limits to 1.
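A minimal sketch of that claim, assuming a fair coin and, as one illustrative choice, a band of ±1% of the flips around one half (the helper name prob_in_band is hypothetical):

```python
from math import comb

def prob_in_band(n, frac=0.01):
    """Probability that the number of heads in n flips of a fair coin lies
    within frac*n of n/2, i.e. in [n*(0.5 - frac), n*(0.5 + frac)]."""
    lo = int(n * (0.5 - frac))
    hi = int(n * (0.5 + frac))
    return sum(comb(n, k) for k in range(lo, hi + 1)) / 2**n

for n in (100, 1_000, 10_000):
    print(n, prob_in_band(n))  # climbs toward 1 as n grows
```

The probability of landing inside the band rises toward 1 as n grows, which is the sense in which the theorem is a statement about a range of outcomes rather than a single "perfect" outcome.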
Conjurer (Author) Posted September 14, 2019

On 9/14/2019 at 12:04 AM, uncool said:
So the question isn't about the probability of a specific number of heads, but a range that grows with the number of flips. And the theorem says that the total probability of landing within that range limits to 1.

No, it approaches the expected value, which is 1/2.

You would have to assign values to heads or tails, so that the number of flips is twice the number of possible outcomes for a single flip.
uncool Posted September 14, 2019

On 9/14/2019 at 12:11 AM, Conjurer said:
No, it approaches the expected value, which is 1/2.

What does "it" refer to here? "The total probability of landing within that range"?
Conjurer (Author) Posted September 14, 2019

On 9/14/2019 at 12:20 AM, uncool said:
What does "it" refer to here? "The total probability of landing within that range"?

The law of large numbers.

By definition, a larger number of flips should approach the expected value or probability of getting heads or tails, which is 1/2.
uncool Posted September 14, 2019 (edited)

On 9/14/2019 at 12:23 AM, Conjurer said:
The law of large numbers.

"No, the law of large numbers approaches the expected value which is 1/2" makes no sense.

On 9/14/2019 at 12:23 AM, Conjurer said:
By definition, a larger number of flips should approach the expected value or probability of getting heads or tails, which is 1/2.

That isn't a definition; it's a garbled version of the law of large numbers itself. And it doesn't address my point: as the number of flips increases, the range of "acceptable" outcomes also increases. Which means you don't only calculate the probability of the "perfect" outcome (i.e. the number of heads closest to half the flips), but of an entire range of numbers of heads.
Conjurer (Author) Posted September 14, 2019 (edited)

On 9/14/2019 at 12:28 AM, uncool said:
"No, the law of large numbers approaches the expected value which is 1/2" makes no sense.

https://en.wikipedia.org/wiki/Law_of_large_numbers

For example, a fair coin toss is a Bernoulli trial. When a fair coin is flipped once, the theoretical probability that the outcome will be heads is equal to 1/2. Therefore, according to the law of large numbers, the proportion of heads in a "large" number of coin flips "should be" roughly 1/2. In particular, the proportion of heads after n flips will almost surely converge to 1/2 as n approaches infinity.

On 9/14/2019 at 12:28 AM, uncool said:
That isn't a definition; it's a garbled version of the law of large numbers itself. And it doesn't address my point: as the number of flips increases, the range of "acceptable" outcomes also increases. Which means you don't only calculate the probability of the "perfect" outcome (i.e. the number of heads closest to half the flips), but of an entire range of numbers of heads.

Is the coin going to start landing on its side or something? I don't see why you are so caught up in this.

When I did my calculation earlier, I just used the variables that could represent any possible range of the number, including the ones that were not even close to the expected value. The average of all the combinations per permutations was 1/r! for the entire range of possible outcomes of any replaceable event as it approached an infinite number of trials.
uncool Posted September 14, 2019 (edited)

On 9/14/2019 at 12:51 AM, Conjurer said:
In particular, the proportion of heads after n flips will almost surely converge to 1/2 as n approaches infinity.

Yes, and that doesn't contradict the statement I made: "And the theorem says that the total probability of landing within that range limits to 1." (In fact, it says exactly the same thing.)

On 9/14/2019 at 12:51 AM, Conjurer said:
Is the coin going to start landing on its side or something?

No, and that's not even close to what I said.

On 9/14/2019 at 12:51 AM, Conjurer said:
I don't see why you are so caught up in this.

Caught up in what? The fact that the theorem deals with a range, and not only "perfect" outcomes? Because you keep on talking only about the "perfect" outcomes.

On 9/14/2019 at 12:51 AM, Conjurer said:
When I did my calculation earlier, I just used the variables that could represent any possible range of the number,

You did not (or at least, did not do so correctly). If you had, you would have gotten the formula I stated.

Here's an exercise for you: write a formula for the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010.
Conjurer (Author) Posted September 14, 2019 (edited)

On 9/14/2019 at 1:04 AM, uncool said:
Yes, and that doesn't contradict the statement I made: "And the theorem says that the total probability of landing within that range limits to 1." (In fact, it says exactly the same thing.)

No, it almost surely converges to 1/2, which means that it is guaranteed to approach a 1/2 probability, or it has a probability of 1 of approaching 1/2.

On 9/14/2019 at 1:04 AM, uncool said:
Caught up in what? The fact that the theorem talks about a range? Because you keep on talking only about the "perfect" outcomes.

You don't ever seem to be able to grasp anything while only dealing with perfect outcomes. How could you be capable of understanding something more complex than that to set up?

On 9/14/2019 at 1:04 AM, uncool said:
Here's an exercise for you: write a formula for the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010.

n = 2000
990 < X_n < 1010
X_n = (1/2000)(X_1 + ... + X_n)
uncool Posted September 14, 2019

On 9/14/2019 at 1:23 AM, Conjurer said:
No, it almost surely converges to 1/2

Once again: what does "it" refer to here?

On 9/14/2019 at 1:23 AM, Conjurer said:
You don't ever seem to be able to grasp anything while only dealing with perfect outcomes.

Hahaha, whatever you say.

On 9/14/2019 at 1:23 AM, Conjurer said:
n = 2000
990 < X_n < 1010
X_n = (1/2000)(X_1 + ... + X_n)

...this is nowhere near the formula I asked for. To demonstrate: please show me how you would use this formula to calculate the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010.
Conjurer (Author) Posted September 14, 2019

On 9/14/2019 at 1:36 AM, uncool said:
Once again: what does "it" refer to here?

The proportion of heads after n flips will almost surely converge to 1/2 as n approaches infinity.

On 9/14/2019 at 1:36 AM, uncool said:
...this is nowhere near the formula I asked for. To demonstrate: please show me how you would use this formula to calculate the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010.

990/2000 = 0.495
1010/2000 = 0.505
0.495 < x < 0.505
uncool Posted September 14, 2019 (edited)

On 9/14/2019 at 2:08 AM, Conjurer said:
The proportion of heads after n flips will almost surely converge to 1/2 as n approaches infinity.

Yet again, that not only doesn't contradict what I said, it means the precise same thing: "And the theorem says that the total probability of landing within that range limits to 1." "The proportion of heads after n flips" is not "the total probability of landing within that range".

On 9/14/2019 at 2:08 AM, Conjurer said:
990/2000 = 0.495
1010/2000 = 0.505
0.495 < x < 0.505

And? I still don't see an answer for the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010.
Conjurer (Author) Posted September 14, 2019

On 9/14/2019 at 2:11 AM, uncool said:
I still don't see an answer for the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010.

(n!/(n^r (r! (n - r)!)) - 1/r!) < P(H or T) < (n!/(n^r (r! (n - r)!)) + 1/r!)
uncool Posted September 14, 2019

That's still not an answer, and worse, it is notationally nonsense. What do you mean by P(H or T)? As I said before:

On 9/13/2019 at 1:03 AM, uncool said:
It looks to me like you are quasi-randomly putting formulae together without understanding what they are for
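For comparison, one standard way to write the quantity being asked for, assuming a fair coin, is the binomial sum P(990 ≤ heads ≤ 1010) = Σ_{k=990}^{1010} C(2000, k)·(1/2)^2000. A short sketch of the computation:

```python
from math import comb

# P(990 <= number of heads <= 1010) after 2000 flips of a fair coin:
# sum of C(2000, k) * (1/2)^2000 for k = 990 .. 1010
p_range = sum(comb(2000, k) for k in range(990, 1011)) / 2**2000
print(p_range)  # roughly 0.36
```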
Ghideon Posted September 14, 2019

On 9/13/2019 at 11:31 PM, Conjurer said:
I don't think it or the third column really does. I thought it was a 1 with a decimal in front of it, but it was actually a comma. The average of heads/tails = 1.

Can you provide some evidence? "I don't think so" is not a strong argument. You claim that mainstream math is incorrect and reject the current proofs in the links provided. Is it maybe the very definitions of the math of probabilities you question? That would make discussions about existing proofs not very fruitful.

Note: the decimal comma "," should be read as a decimal point ".". It is the language setting of the computer running the simulation that causes the confusion; for instance, 0.997 is written 0,997 in that locale.

On 9/13/2019 at 11:31 PM, Conjurer said:
I don't think it is possible to add probabilities and take an average to get the expected value. In the example, they just used the variation of the data to calculate something using the law of large numbers. Then it is just an average of the data values, not the actual probabilities themselves, that is involved in the equation.

You want a result based on every possible outcome instead of an outcome based on a large number of randomly selected samples?

On 9/13/2019 at 11:31 PM, Conjurer said:
It means it only proves what the programmer was capable of coding to simulate randomness where it actually didn't exist. If he/she went to these forums and asked for a public opinion on how that program should act, then assuredly, yes, it would make it invalid.

A discussion about random number generation in computing is better suited for a separate thread. I deliberately used secure random numbers and specified so in my note to reduce this kind of issue. csrc.nist.gov has information about certifications and research.
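As a sketch of that distinction, assuming fair flips: for a small number of flips every possible outcome can be enumerated and weighted by its probability instead of being sampled, and the exact average proportion of heads comes out to 1/2 for every n (the function name exact_mean_proportion is only illustrative):

```python
from fractions import Fraction
from itertools import product

def exact_mean_proportion(n):
    """Enumerate every possible sequence of n fair-coin flips (all 2^n
    sequences are equally likely) and return the exact average proportion
    of heads over all of them."""
    total = sum(Fraction(sum(seq), n) for seq in product((0, 1), repeat=n))
    return total / 2**n

for n in (1, 2, 5, 10):
    print(n, exact_mean_proportion(n))  # exactly 1/2 every time
```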
Conjurer (Author) Posted September 14, 2019

On 9/14/2019 at 4:52 AM, uncool said:
That's still not an answer, and worse, it is notationally nonsense. What do you mean by P(H or T)? As I said before:

That is the proper notation for the probability of getting a heads or tails. That should be one of the first things you learn about probability... It shows where you are having this problem of it just looking like I am quasi-randomly putting formulae together...

On 9/14/2019 at 7:08 AM, Ghideon said:
Can you provide some evidence? "I don't think so" is not a strong argument. You claim that mainstream math is incorrect and reject the current proofs in the links provided. Is it maybe the very definitions of the math of probabilities you question? That would make discussions about existing proofs not very fruitful.

I am not looking for a fruitful discussion about it. I am looking for answers. I don't think the law of large numbers has been proven from the basis of considering the probabilities of events occurring in a row.

I am wondering if I have already answered my own question, but I don't understand why n^r doesn't seem to work with the coin flip problem. It seems like the permutations with replacement should be r^n. Then you could get 2^(#flips), which would give you the total possible number of outcomes. If my math was correct, then 1/(r!) could be the average probability. Then you could say that the average probability of getting heads or tails would just be 1/(2!) = 1/2. Then n^r didn't seem to work in this example...
uncool Posted September 14, 2019

On 9/14/2019 at 6:09 PM, Conjurer said:
That is the proper notation for the probability of getting a heads or tails. That should be one of the first things you learn about probability... It shows where you are having this problem of it just looking like I am quasi-randomly putting formulae together...

You are making my point. Why are you looking at the probability of getting a heads or tails (which will be 1), when what I asked you for was "the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010"?
Conjurer (Author) Posted September 14, 2019

On 9/14/2019 at 6:30 PM, uncool said:
You are making my point. Why are you looking at the probability of getting a heads or tails (which will be 1), when what I asked you for was "the probability that after 2000 flips of a fair coin, the number of heads will be between 990 and 1010"?

The probability of getting heads or tails is 1/2!

P(H or T) = 1/2

You have 1 out of two possible outcomes to get either a heads or a tails. The numerator is the number of outcomes for a success, and the denominator is the number of possible outcomes.

What you are asking for doesn't seem to make sense. I don't think the mathematical tools to describe what you are asking have even been discovered yet. You are just asking me to discover this. I don't know how you could incorporate a probability into the law of large numbers to show that. I gave you my best guess, but maybe you would have to subtract the probability from an average probability instead...