
Posted

I had a really easy conditional probability problem in class the other day, but for some reason I just could not do it. I could not define the events properly, and even though the answer is of no use to me now, it has really been bothering me lately, so I was wondering if anyone could help me out?

 

It went something like this: A short sighted man likes to throw a basketball at a hoop in his yard, but he has to rely on his son to report if he scored. The probability that he scores a goal is 0,6 and the probability that his son reports it correctly is 0,8. If his son reports that the father has made a goal, what is the probability that he actually did so?

Posted

I am willing to help you but I don't understand your question.

 

You wrote that the probability that the man scores is 0,6

 

 

That makes no sense, that is not proper mathematical notation. What do you mean?

Posted

It looks to me like the probability that he actually made the goal that his son reported would be the same as the probability that his son reported accurately--0,8.

Posted
I am willing to help you but I don't understand your question.

 

You wrote that the probability that the man scores is 0,6

 

 

That makes no sense, that is not proper mathematical notation. What do you mean?

 

It was a word problem, and it was just written like that. I did not have the paper with me before, because I was not intending to post this question, but anyway here is the exact wording:

 

A short sighted neighbour enjoys tossing a basketball at a basket, but must rely on his son to report whether or not he makes a goal. His probability of making a goal is 0,6 and the probability that his son will report correctly is 0,8. If his son reports a goal, what is the probability that the neighbour did make a goal?

 

Hope that helps.

 

You know, let D denote the event that the man makes a goal, so P(D)=0,6, whereas the probability that he does not make a goal, i.e. the complement of D (since making a goal and not making a goal are mutually exclusive), denoted D^c, is 1-P(D)=0,4, and similarly for the son reporting a goal, I guess.

 

I'm pretty sure that they want you to use the partition theorem, but then I just don't see how.

 

P.S. thanks for your help

Posted

I don't understand the notation.

 

Suppose you have a six sided die, and you are going to roll it.

 

Here is the set of all possible outcomes:

 

[math] S = \{1,2,3,4,5,6\} [/math]

 

Now, consider a specific roll of the die, say the one which occurs at a moment in time [math]t_n[/math].

 

In order for all of these to be possible outcomes, there have to be at least six possible futures, relative to the moment in time [math]t_n[/math].

 

So we are assuming that once the die is released, the laws of physics (which we stipulate are constant in time) permit there to be multiple outcomes of this one specific throw.

 

Let us assume that each of the six outcomes in the possibility space is a possible outcome of this one throw. It will be the only throw of this particular die, in the history of time.

 

Let us presume that each of the six possible outcomes is equally probable.

 

Here is the standard notation for probability:

 

[math] P(\text{face shows 1}) = \frac{1}{6} [/math]

[math] P(\text{face shows 2}) = \frac{1}{6} [/math]

[math] P(\text{face shows 3}) = \frac{1}{6} [/math]

[math] P(\text{face shows 4}) = \frac{1}{6} [/math]

[math] P(\text{face shows 5}) = \frac{1}{6} [/math]

[math] P(\text{face shows 6}) = \frac{1}{6} [/math]
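If you want to convince yourself of those numbers, here is a minimal Python sketch (just an illustration of the equal-probability assumption above, not part of the original example) that simulates many rolls and estimates the probability of each face; every estimate should land close to 1/6.

[code]
# Minimal sketch: simulate rolls of a fair six-sided die and estimate
# the probability of each face (each should be close to 1/6 = 0.1667).
import random

rolls = [random.randint(1, 6) for _ in range(100_000)]
for face in range(1, 7):
    print(face, rolls.count(face) / len(rolls))
[/code]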

Posted

Euhm... I believe 0,6 is equivalent to 0.6 - in some European countries, a comma is used instead of a dot to indicate a decimal number.

Posted
Euhm... I believe 0,6 is equivalent to 0.6 - in some European countries, a comma is used instead of a dot to indicate a decimal number.

 

oh great, now you tell me :cool:

Posted
I had a really easy conditional probability problem in class the other day, but for some reason I just could not do it. I could not define the events properly, and even though the answer is of no use to me now, it has really been bothering me lately, so I was wondering if anyone could help me out?

 

It went something like this: A short sighted man likes to throw a basketball at a hoop in his yard, but he has to rely on his son to report if he scored. The probability that he scores a goal is 0,6 and the probability that his son reports it correctly is 0,8. If his son reports that the father has made a goal, what is the probability that he actually did so?

 

Start from the definition of conditional probability, and work forwards.

 

Here is the article on probability at wolfram:

 

Mathworld: Probability

 

Look down, and you will see the following definition of conditional probability:

 

[math] P(E|F) = \frac{P(E \cap F)}{P(F)} [/math]
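To see that formula on a concrete toy case first, here is a small Python sketch (my own illustration with the die from the earlier post, not your basketball problem): take E = "face shows 2" and F = "face is even", so P(E|F) = (1/6)/(1/2) = 1/3.

[code]
# Minimal sketch of P(E|F) = P(E and F) / P(F) on a fair die,
# with E = "face shows 2" and F = "face is even".
from fractions import Fraction

S = set(range(1, 7))                     # sample space {1, ..., 6}
E = {s for s in S if s == 2}
F = {s for s in S if s % 2 == 0}

P = lambda A: Fraction(len(A), len(S))   # equally likely outcomes
print(P(E & F) / P(F))                   # prints 1/3
[/code]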

 

 

You are told:

1. The probability the man scores is 0.6, in European notation 0,6

2. The probability the son reports the answer correctly is 0.8, or in European notation 0,8

 

 

And you are asked to compute the probability that the man scored, given that the son reported that he scored.

 

Let E denote the event that the man scored.

Let F denote the event that the son makes no error in the report.

 

By design of this problem, we have:

 

[math] P(F) = .8 [/math]

[math] P(E) = .6 [/math]

 

In order to use the LHS of the formula above, we need to compute

[math] P(E \cap F) [/math]

 

But these are independent events; they have nothing to do with one another. Whether or not the man makes the basket does not influence whether or not the son reports the result correctly. At least that's the normal-minded assumption.

 

Therefore:

 

[math] E \cap F = \emptyset [/math]

[math] P(\emptyset)=0 [/math]

 

[math]P(E \cap F) = P(\emptyset) =0 [/math]

 

[math] P(E|F) = \frac{P(E \cap F)}{P(F)} [/math]

 

So...

[math] P(E|F) P(F) = P(E \cap F) [/math]

[math] P(E|F) P(F) = 0 [/math]

 

Therefore, either P(F)=0 or P(E|F) =0.

 

There is some kind of logical problem with the way I've set this up. I need to give it more thought.

 

I need to have a look at Bayes' theorem again.

Posted

 

But these are independent events; they have nothing to do with one another. Whether or not the man makes the basket does not influence whether or not the son reports the result correctly. At least that's the normal-minded assumption.

 

Therefore:

 

[math] E \cap F = \emptyset [/math]

[math] P(\emptyset)=0 [/math]

 

[math]P(E \cap F) = P(\emptyset) =0 [/math]

 

You stated the definition for two mutually exclusive events, not independent events.

 

Therefore, either P(F)=0 or P(E|F)=0.

 

There is some kind of logical problem with the way i've set this up. I need to give it more thought.

 

I need to have a look at Bayes' theorem again.

 

You are confusing independent events with mutually exclusive events. You are right, though, that they are independent, but the definition of independent events is that

 

P(E ∩ F)=P(E).P(F) [where a "." denotes multiplication ;)]

 

But remember that:

 

P(F)= the probability that a goal was reported correctly and

P(E)= the probability that a goal was scored

 

so P(E|F)= the probability that a goal was scored given that it was reported correctly

 

but we do not know whether it was reported correctly or not; we just know it was reported. We do know that the probability of reporting a correct goal is 0.8, i.e. event F.

 

so maybe we should write P(E|F|E), but I don't know if that is possible or if it works (I have never come across the conditional probability of a conditional probability before :)). So I don't know.

 

BTW does anyone know where I can find out how to use this "LaTeX" thing?

 

P.S. I never knew that a comma could cause such a stir :D. Thanks anyway.

Posted
BTW does anyone know where I can find out how to use this "LaTeX" thing?

 

There's a thread in the General Mathematics section, stickied at the top.

Posted
You stated the definition for two mutually exclusive events, not independent events.

 

Yes, let me see if I remember, not using google on this

 

If two events E,F are independent then:

 

[math] P(E \cap F) = P(E)P(F) [/math]

 

Now let me see if this is right...

 

Yes, that is right, it's equation 11 at Wolfram on probability theory.
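As a quick numerical check of that identity, here is a toy Python sketch (my own illustration with two independent fair dice, nothing to do with the basketball problem): take E = "first die shows 6" and F = "second die shows 6", so P(E ∩ F) = 1/36 = P(E)P(F).

[code]
# Minimal sketch: for two independent fair dice, P(E and F) = P(E) * P(F),
# with E = "first die shows 6" and F = "second die shows 6".
from fractions import Fraction
from itertools import product

S = set(product(range(1, 7), repeat=2))  # 36 equally likely ordered pairs
P = lambda A: Fraction(len(A), len(S))

E = {s for s in S if s[0] == 6}
F = {s for s in S if s[1] == 6}

print(P(E & F), P(E) * P(F))             # both print 1/36
[/code]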

 

Ok so thank you for that, now let me read the rest of your response.

 

Let E,F denote events. Two events are mutually exclusive if and only if

 

[math] E \cap F = \emptyset [/math]

 

Where we have used set theory to define the events.

 

I need to make sure I have the logic right...

 

Definition: Let E,F denote two events.

 

Let S denote the sample space.

 

[math] E \cap F \equiv \{ s \in S \mid s \in E \wedge s \in F \} [/math]

 

 

[math] P(E) \equiv \text{probability E occurs} [/math]

[math] P(F) \equiv \text{probability F occurs} [/math]

 

 

[math] P(E|F) \equiv \frac{P(E \cap F)}{P(F)} \equiv \text{probability of E given F} [/math]

 

Let me ask you a question, what is the sample space in your particular problem?

 

Looking back to your first post, I see that you admit you couldn't define the events properly. That should give you a clue as to where the problem lies. You have formulas which are defined using a sample space, and you have a question whose sample space is impossible to define.

 

Regards

 

PS: But thank you anyway, at least I got to look at the formulas again. I used to use a tree diagram to do Bayes' formula, and I never bothered to remember the formula for total probability, but your post has made me want to go back and recall it again, so thanks.

Posted

Let S be the event the father scores and the let R be the event the son reports correctly.

 

The probability of the father scoring is [math]P(S)=0.6[/math] and the probability of the son correctly reporting is [math]P(R)=0.8[/math].

 

I think we need to find the probability that the father scores given that the son reports correctly, which is

 

[math]P(S|R)=\frac{P(S \cap R)}{P(R)}[/math]

 

and since S and R are independent

 

[math]P(S|R)=\frac{P(S)P(R)}{P(R)}=P(S)=0.6[/math].

 

This right??

Posted

That's what I got a couple of days ago. I was just scared to post because I haven't done probability in aaaages ;)

 

PS: You can use \cap for the intersect symbol - [math]\cap[/math]

Posted

Looks like there are multiple ways to understand this... let me present the usual way (at least for me) of solving elementary examples of conditional probabilities:

 

Let [math]S[/math] denote the event that the father scores, and let [math]R[/math] denote the event that the son reports scoring. The given probabilities are [math]P(S)=0.6[/math] and [math]P(R | S)=0.8[/math], and from the formulation of the problem it follows that [math]P( \bar R | \bar S)=0.8[/math] and [math]P(R | \bar S)=0.2[/math] (where [math]\bar A[/math] denotes the complement of [math]A[/math]).

 

Now we can use Bayes formula + law of total probability to compute what we want:

 

[math]P(S | R)=\frac{P(S \cap R)}{P(R)}=[/math]

 

[math]=\frac{P(R|S) P(S)}{P(R|S)P(S)+P(R|\bar S)P(\bar S)}=\frac{0.8\cdot0.6}{0.8\cdot0.6+0.2\cdot 0.4}\approx 0.857 [/math].

 

Notice that the probability goes up from 0.6 because we add information: this is the main idea of Bayesian thinking. A priori (before information from the son) we have more uncertainty about scoring, and afterwards, a posteriori, we are more certain.
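If anyone wants to double-check this numerically, here is a small Python sketch (my own, under the same assumptions: P(S)=0.6, and the son reports correctly with probability 0.8 whether or not the father scores) that evaluates the formula and also estimates P(S|R) by straightforward simulation.

[code]
# Minimal sketch of the Bayes / total-probability computation above.
# Assumptions: P(S) = 0.6, and the son is correct with probability 0.8
# regardless of whether the father scores.
import random

p_S, p_correct = 0.6, 0.8

# Direct formula: P(S|R) = P(R|S)P(S) / (P(R|S)P(S) + P(R|~S)P(~S))
p_S_given_R = (p_correct * p_S) / (p_correct * p_S + (1 - p_correct) * (1 - p_S))
print(p_S_given_R)  # 0.857...

# Monte Carlo check: among trials where the son reports a goal,
# how often did the father actually score?
reported = scored_and_reported = 0
for _ in range(200_000):
    scored = random.random() < p_S
    correct = random.random() < p_correct
    reports_goal = scored if correct else not scored
    if reports_goal:
        reported += 1
        scored_and_reported += scored
print(scored_and_reported / reported)  # close to 0.857
[/code]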

 

Cheers,

 

Tuomas

Posted
Looks like there are multiple ways to understand this... let me present the usual way (at least for me) of solving elementary examples of conditional probabilities:

 

Let [math]S[/math] denote the event that the father scores, and let [math]R[/math] denote the event that the son reports scoring. The given probabilities are [math]P(S)=0.6[/math] and [math]P(R | S)=0.8[/math], and from the formulation of the problem it follows that [math]P( \bar R | \bar S)=0.8[/math] and [math]P(R | \bar S)=0.2[/math] (where [math]\bar A[/math] denotes the complement of [math]A[/math]).

 

Now we can use Bayes formula + law of total probability to compute what we want:

 

[math]P(S | R)=\frac{P(S \cap R)}{P(R)}=[/math]

 

[math]=\frac{P(R|S) P(S)}{P(R|S)P(S)+P(R|\bar S)P(\bar S)}=\frac{0.8\cdot0.6}{0.8\cdot0.6+0.2\cdot 0.4}\approx 0.857 [/math].

 

Notice that the probability goes up from 0.6 because we add information: this is the main idea of Bayesian thinking. A priori (before information from the son) we have more uncertainty about scoring, and afterwards, a posteriori, we are more certain.

 

Cheers,

 

Tuomas

 

Yes, that is exactly right, thank you very much. Of course :) you define it in terms of the conditional probability that the son reports correctly. I tried that, but I just never realised that when you do that, [math]P(\bar R | \bar S)=0.8[/math] as well. Thank you, it won't bother me any longer, I can sleep now :rolleyes:.
