Shadow Posted February 1, 2011 Say I have an ideal coin, and I keep throwing it until I get heads. What would be the average number of times I would have to throw the coin? I'd say the answer is one; I'm not familiar with the terminology so I'll have to use my own words, but since the chance of getting either heads or tails is 50%, the number of times I get heads should equal the number of times I get tails (after an unlimited number of throws), which means that I can think of the outcomes of the throws as alternating between heads and tails. Or so I think.
timo Posted February 1, 2011 Sometimes you have to throw only once (the half of the time where you immediately get heads), sometimes you have to throw more often. So obviously the average is more than one. The proper way to calculate a statistical average is to sum up all possible results weighted by their probability. For example: let's assume you flip the coin twice and want to know the average number of heads showing up. Each possible combination (heads-heads, heads-tails, tails-heads, tails-tails) has the same probability, 1/4. The number of heads in these combinations is 2, 1, 1 and 0, respectively. The average number of heads is [math]\bar h = 2\cdot\frac 14 + 1 \cdot\frac 14 + 1 \cdot\frac 14 + 0 \cdot\frac 14 = 1[/math]. Getting the average number of coin flips is analogous, only slightly more complicated because the number of relevant combinations is infinite (but the result will still be finite).
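Timo's two-flip example can be checked by brute-force enumeration; here is a minimal Python sketch (the variable names are mine):

```python
from itertools import product

# Enumerate the four equally likely outcomes of two fair coin flips.
outcomes = list(product("HT", repeat=2))
probability = 1 / len(outcomes)  # each combination has probability 1/4

# Weight the number of heads in each outcome by its probability and sum.
average_heads = sum(outcome.count("H") * probability for outcome in outcomes)
print(average_heads)  # 1.0
```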
Mr Skeptic Posted February 1, 2011 All you have to do to get your answer is sum up an infinite series. 1/2 of the time you get heads on the first flip. 1/2 of the time you get heads on the second flip, but you only have a 1/2 chance of getting to that step ... So the sum is: 1/2 of the time one flip + 1/4 of the time two flips + 1/8 of the time three flips + ... + 1/2^n of the time n flips. To convert the fractions into actual counts you would multiply by 2^n, but to get the average you would then divide by 2^n again, so both factors cancel and can be neglected. Then take the limit as n goes to infinity. So, [math]\lim_{n \to \infty} \sum_{i=0}^{n} \frac{i}{2^i}[/math] Eh, watch out for errors. I'll check up on this later, I have to go now. Hm, that would be an interesting result, although it might be infinite.
michel123456 Posted February 1, 2011 (edited) And the result is... Edited February 1, 2011 by michel123456
ajb Posted February 1, 2011 And the result is... Taking the limit that Mr Skeptic suggests gives......wait for it..... drum roll please......2. I think we could have had an intuitive guess at that.
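That answer is easy to sanity-check with a quick Monte Carlo run; a sketch (the seed and trial count are arbitrary choices of mine):

```python
import random

def flips_until_heads():
    """Flip a fair coin until heads shows; return the number of flips."""
    flips = 1
    while random.random() < 0.5:  # treat this branch as tails
        flips += 1
    return flips

random.seed(0)
trials = 100_000
average = sum(flips_until_heads() for _ in range(trials)) / trials
print(average)  # lands close to 2
```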
ajb Posted February 1, 2011 First you need to prove that [math]\sum_{i=0}^{n}\frac{i}{2^{i}} = 2 - 2^{-n}(2+n)[/math], then take the limit.
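Before proving it, the claimed closed form can be checked numerically against direct partial sums; a short sketch:

```python
def partial_sum(n):
    """Compute sum_{i=0}^{n} i / 2^i directly."""
    return sum(i / 2**i for i in range(n + 1))

def closed_form(n):
    """The claimed identity: 2 - 2^{-n} * (2 + n)."""
    return 2 - (2 + n) / 2**n

# The two expressions agree for every n tried, and the correction
# term 2^{-n}(2+n) shrinks quickly, leaving the limit 2.
for n in (0, 1, 5, 10, 30):
    assert abs(partial_sum(n) - closed_form(n)) < 1e-12
print(closed_form(30))  # already very close to 2
```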
TonyMcC Posted February 1, 2011 How about a simple, almost non-mathematical, look at the problem. Imagine a large number of people, each of whom spins a coin. About half will get a "head". Record that number and tell them to drop out. Those left spin again and half of them get a "head". Record this number and tell them to drop out. It would seem that if you continue in this way until nobody is left, you can make the required calculation. However, consider the case of an infinite number of people. You could expect one person to never get a "head". This will prevent you being able to make the required calculation (in my non-mathematical opinion).
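TonyMcC's thought experiment is easy to run with a finite crowd; a sketch (the crowd size and seed are arbitrary, and with a finite crowd the last stragglers do get heads eventually):

```python
import random

random.seed(1)
crowd = 1_000_000          # number of people who start flipping
remaining = crowd
total_flips = 0
round_number = 0

while remaining:
    round_number += 1
    # Everyone still in flips once; roughly half get heads and drop out.
    heads = sum(1 for _ in range(remaining) if random.random() < 0.5)
    total_flips += heads * round_number  # each of them took round_number flips
    remaining -= heads

print(total_flips / crowd)  # close to 2
```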
Mr Skeptic Posted February 2, 2011 The problem of course is that after infinitely many flips you might get someone with infinitely many tails. It turns out that 2^infinity is infinitely bigger than infinity, but even that is no guarantee when you compare to the series 1/2+1/3+1/4+1/5+... ajb basically gave you the answer. It's 2 minus something that gets smaller and smaller the farther you go, so if you go until that other term is infinitesimally small you are left with 2. As he said, though, no one has proved it is actually true. (Well, I did, but I'll let someone else do it if they want to. After all, many of you have probably never proved something to be true for infinitely many things, right? It's easier than it looks!) OK, this is the claim that needs proving: [math]\sum_{i=0}^{n}\frac{i}{2^{i}} = 2 - 2^{-n}(2+n)[/math] The way this is usually done is to prove that if the claim were to hold for some arbitrary number n=k, then it would also hold for n=k+1. If you do this, remember that you can assume the claim for n=k, and that you should write the k in the final answer in parentheses as (k+1). Then prove that the claim does actually hold for some starting n, typically n=1.
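Written out, the induction that recipe describes goes like this (taking the base case at n=0): the left side is [math]\frac{0}{2^0}=0[/math] and the right side is [math]2-2^{0}(2+0)=0[/math], so the claim holds there. For the inductive step, assume [math]\sum_{i=0}^{k}\frac{i}{2^{i}} = 2 - 2^{-k}(2+k)[/math]. Adding the next term, [math]\sum_{i=0}^{k+1}\frac{i}{2^{i}} = 2 - 2^{-k}(2+k) + \frac{k+1}{2^{k+1}} = 2 - 2^{-(k+1)}\left(2(2+k)-(k+1)\right) = 2 - 2^{-(k+1)}\left(2+(k+1)\right)[/math], which is exactly the claim with n replaced by (k+1). By induction the identity holds for all n.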
michel123456 Posted February 2, 2011 All you have to do to get your answer is sum up an infinite series. 1/2 the time when you flip it once you get a heads. 1/2 the time when you flip it the second time you get a heads, but you only have 1/2 chance of getting to this step ... So the sum is 1/2 the time once + 1/4 the time twice + 1/8 the time 3 times + ... + 1/2^n th time n times. (...) This sum is equal to 1, isn't it?
ajb Posted February 2, 2011 As he said though, no one proved it is actually true. (well I did but I'll let someone else do it if they want to. after all, many of you probably never proved something to be true for infinitely many things, right? It's easier than it looks!) As you suggest, a simple induction argument will prove it.
michel123456 Posted February 5, 2011 (edited) ajb basically gave you the answer. It's 2 minus something that gets smaller and smaller the farther you go, so if you go til that other term is infinitesimally small then you get left with 2. (...) "An infinite number of mathematicians walk into a bar. The first orders one beer. The second orders half of a beer. The third orders a quarter of a beer. The fourth orders an eighth of a beer. The bartender says, 'Screw you!' and pours two beers." Thanks to Swansont's blog. Edited February 5, 2011 by michel123456
D H Posted February 5, 2011 First you need to prove that [math]\sum_{i=0}^{n}\frac{i}{2^{i}} = 2 - 2^{-n}(2+n)[/math], then take the limit. Or just attack the infinite sum directly. First note that the n=0 term contributes nothing to the sum: [math]\sum_{n=0}^{\infty}\frac{n}{2^n} = \sum_{n=1}^{\infty}\frac{n}{2^n}[/math] Now use the fact that [math]n=1+(n-1)[/math]: [math]\sum_{n=1}^{\infty}\frac{n}{2^n} = \sum_{n=1}^{\infty}\frac{1+(n-1)}{2^n} = \sum_{n=1}^{\infty}\frac{1}{2^n} + \sum_{n=1}^{\infty}\frac{n-1}{2^n}[/math] The first sum on the right is well known to be 1. The n=1 term of the second sum on the right is zero. Using this and reindexing that second sum, [math]\sum_{n=1}^{\infty}\frac{n-1}{2^n} = \sum_{n=2}^{\infty}\frac{n-1}{2^n} = \sum_{n=1}^{\infty}\frac{n}{2^{n+1}} = \frac 1 2 \sum_{n=1}^{\infty}\frac{n}{2^n}[/math] Thus [math]\sum_{n=1}^{\infty}\frac{n}{2^n} = 1 + \frac 1 2 \sum_{n=1}^{\infty}\frac{n}{2^n}[/math] or [math]\sum_{n=1}^{\infty}\frac{n}{2^n} = 2[/math]
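The self-similarity relation at the end, [math]S = 1 + \tfrac 12 S[/math] and hence S = 2, can be sanity-checked on truncated sums; a small sketch:

```python
def S(n):
    """Truncated sum: sum_{k=1}^{n} k / 2^k."""
    return sum(k / 2**k for k in range(1, n + 1))

# For large n the truncation error is tiny, so the fixed-point
# relation S = 1 + S/2 holds to machine precision, forcing S = 2.
lhs = S(60)
rhs = 1 + S(60) / 2
print(lhs, rhs)  # both essentially 2
```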