Probability and Stochastic Processes

A Friendly Introduction for Electrical and Computer Engineers

SECOND EDITION

Problem Solutions September 28, 2005 Draft

Roy D. Yates, David J. Goodman, David Famolari

September 28, 2005

This solution manual remains under construction. At present, 678 of the 687 problems have solutions. The unsolved problems are

12.1.7, 12.1.8, 12.5.8, 12.5.9, 12.11.5 – 12.11.9.

If you volunteer a solution for one of those problems, we’ll be happy to include it . . . and, of course, “your wildest dreams will come true.”

Of course, the correctness of every single solution remains unconfirmed. If you find errors or have suggestions or comments, please send email: ryates@winlab.rutgers.edu.

If you need to make solution sets for your class, you might like the Solution Set Constructor at the instructors site www.winlab.rutgers.edu/probsolns. If you need access, send email: ryates@winlab.rutgers.edu.

Matlab functions written as solutions to homework problems can be found in the archive matsoln.zip (available to instructors) or in the directory matsoln. Other Matlab functions used in the text or in these homework solutions can be found in the archive matcode.zip or directory matcode. The .m files in matcode are available for download from the Wiley website. Two other documents of interest are also available for download:

• A manual, probmatlab.pdf, describing the matcode .m functions.

• The quiz solutions manual quizsol.pdf.

A web-based solution set constructor for the second edition is available to instructors at http://www.winlab.rutgers.edu/probsolns

The next update of this solution manual is likely to occur in January, 2006.


Problem Solutions – Chapter 1

Problem 1.1.1 Solution Based on the Venn diagram (with regions M, O, and T), the answers are fairly straightforward:

(a) Since T ∩M ≠ φ, T and M are not mutually exclusive.

(b) Every pizza is either Regular (R) or Tuscan (T). Hence R ∪ T = S, so that R and T are collectively exhaustive. Thus it's also (trivially) true that R ∪ T ∪M = S. That is, R, T and M are also collectively exhaustive.

(c) From the Venn diagram, T and O are mutually exclusive. In words, this means that Tuscan pizzas never have onions or pizzas with onions are never Tuscan. As an aside, “Tuscan” is a fake pizza designation; one shouldn’t conclude that people from Tuscany actually dislike onions.

(d) From the Venn diagram, M ∩T and O are mutually exclusive. Thus Gerlanda’s doesn’t make Tuscan pizza with mushrooms and onions.

(e) Yes. In terms of the Venn diagram, these pizzas are in the set (T ∪M ∪O)c.

Problem 1.1.2 Solution Based on the Venn diagram (with regions M, O, and T), the complete Gerlanda's pizza menu is:

• Regular without toppings
• Regular with mushrooms
• Regular with onions
• Regular with mushrooms and onions
• Tuscan without toppings
• Tuscan with mushrooms

Problem 1.2.1 Solution

(a) An outcome specifies whether the fax is high (h), medium (m), or low (l) speed, and whether the fax has two (t) pages or four (f) pages. The sample space is

S = {ht, hf,mt,mf, lt, lf} . (1)


(b) The event that the fax is medium speed is A1 = {mt, mf}.

(c) The event that a fax has two pages is A2 = {ht, mt, lt}.

(d) The event that a fax is either high speed or low speed is A3 = {ht, hf, lt, lf}.

(e) Since A1 ∩A2 = {mt} is not empty, A1, A2, and A3 are not mutually exclusive.

(f) Since

A1 ∪A2 ∪A3 = {ht, hf, mt, mf, lt, lf} = S, (2)

the collection A1, A2, A3 is collectively exhaustive.

Problem 1.2.2 Solution

(a) The sample space of the experiment is

S = {aaa, aaf, afa, faa, ffa, faf, aff, fff} . (1)

(b) The event that the circuit from Z fails is

ZF = {aaf, aff, faf, fff} . (2)

The event that the circuit from X is acceptable is

XA = {aaa, aaf, afa, aff} . (3)

(c) Since ZF ∩XA = {aaf, aff} ≠ φ, ZF and XA are not mutually exclusive.

(d) Since ZF ∪XA = {aaa, aaf, afa, aff, faf, fff} ≠ S, ZF and XA are not collectively exhaustive.

(e) The event that more than one circuit is acceptable is

C = {aaa, aaf, afa, faa} . (4)

The event that at least two circuits fail is

D = {ffa, faf, aff, fff} . (5)

(f) Inspection shows that C ∩D = φ, so C and D are mutually exclusive.

(g) Since C ∪D = S, C and D are collectively exhaustive.
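As an aside, these set relationships are easy to verify mechanically. The following Python sketch (illustrative only, separate from the text's Matlab archive) encodes the outcomes of Problem 1.2.2 as strings and checks mutual exclusivity and collective exhaustiveness directly:

# Illustrative check of the event relationships in Problem 1.2.2 using Python sets.
S = {"aaa", "aaf", "afa", "faa", "ffa", "faf", "aff", "fff"}   # sample space

ZF = {s for s in S if s[2] == "f"}          # circuit from Z fails
XA = {s for s in S if s[0] == "a"}          # circuit from X is acceptable
C  = {s for s in S if s.count("a") >= 2}    # more than one circuit acceptable
D  = {s for s in S if s.count("f") >= 2}    # at least two circuits fail

def mutually_exclusive(A, B):
    return not (A & B)

def collectively_exhaustive(A, B):
    return A | B == S

print(mutually_exclusive(ZF, XA), collectively_exhaustive(ZF, XA))  # False False
print(mutually_exclusive(C, D), collectively_exhaustive(C, D))      # True True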

Problem 1.2.3 Solution The sample space is

S = {A♣, . . . ,K♣, A♦, . . . ,K♦, A♥, . . . ,K♥, A♠, . . . ,K♠} . (1)

The event H is the set H = {A♥, . . . ,K♥} . (2)


Problem 1.2.4 Solution The sample space is

S = { 1/1 . . . 1/31, 2/1 . . . 2/29, 3/1 . . . 3/31, 4/1 . . . 4/30, 5/1 . . . 5/31, 6/1 . . . 6/30, 7/1 . . . 7/31, 8/1 . . . 8/31, 9/1 . . . 9/30, 10/1 . . . 10/31, 11/1 . . . 11/30, 12/1 . . . 12/31 }. (1)

The event H of a July birthday is described by the following 31 sample points.

H = {7/1, 7/2, . . . , 7/31} . (2)

Problem 1.2.5 Solution Of course, there are many answers to this problem. Here are four event spaces.

1. We can divide students into engineers or non-engineers. Let A1 equal the set of engineering students and A2 the non-engineers. The pair {A1, A2} is an event space.

2. We can also separate students by GPA. Let Bi denote the subset of students with GPAs G satisfying i − 1 ≤ G < i. At Rutgers, {B1, B2, . . . , B5} is an event space. Note that B5 is the set of all students with perfect 4.0 GPAs. Of course, other schools use different scales for GPA.

3. We can also divide the students by age. Let Ci denote the subset of students of age i in years. At most universities, {C10, C11, . . . , C100} would be an event space. Since a university may have prodigies either under 10 or over 100, we note that {C0, C1, . . .} is always an event space.

4. Lastly, we can categorize students by attendance. Let D0 denote the number of students who have missed zero lectures and let D1 denote all other students. Although it is likely that D0 is an empty set, {D0, D1} is a well defined event space.

Problem 1.2.6 Solution Let R1 and R2 denote the measured resistances. The pair (R1, R2) is an outcome of the experiment. Some event spaces include

1. If we need to check that neither resistance is too high, an event space is

A1 = {R1 < 100, R2 < 100} , A2 = {either R1 ≥ 100 or R2 ≥ 100} . (1)

2. If we need to check whether the first resistance exceeds the second resistance, an event space is

B1 = {R1 > R2} , B2 = {R1 ≤ R2} . (2)

3. If we need to check whether each resistance doesn’t fall below a minimum value (in this case 50 ohms for R1 and 100 ohms for R2), an event space is

C1 = {R1 < 50, R2 < 100} , C2 = {R1 < 50, R2 ≥ 100} , (3)
C3 = {R1 ≥ 50, R2 < 100} , C4 = {R1 ≥ 50, R2 ≥ 100} . (4)

4. If we want to check whether the resistors in parallel are within an acceptable range of 90 to 110 ohms, an event space is

D1 = { (1/R1 + 1/R2)⁻¹ < 90 } , (5)
D2 = { 90 ≤ (1/R1 + 1/R2)⁻¹ ≤ 110 } , (6)
D3 = { 110 < (1/R1 + 1/R2)⁻¹ } . (7)


Problem 1.3.1 Solution The sample space of the experiment is

S = {LF,BF,LW,BW} . (1)

From the problem statement, we know that P [LF ] = 0.5, P [BF ] = 0.2 and P [BW ] = 0.2. This implies P [LW ] = 1 − 0.5 − 0.2 − 0.2 = 0.1. The questions can be answered using Theorem 1.5.

(a) The probability that a program is slow is

P [W ] = P [LW ] + P [BW ] = 0.1 + 0.2 = 0.3. (2)

(b) The probability that a program is big is

P [B] = P [BF ] + P [BW ] = 0.2 + 0.2 = 0.4. (3)

(c) The probability that a program is slow or big is

P [W ∪B] = P [W ] + P [B]− P [BW ] = 0.3 + 0.4 − 0.2 = 0.5. (4)

Problem 1.3.2 Solution A sample outcome indicates whether the cell phone is handheld (H) or mobile (M) and whether the speed is fast (F ) or slow (W ). The sample space is

S = {HF,HW,MF,MW} . (1)

The problem statement tells us that P [HF ] = 0.2, P [MW ] = 0.1 and P [F ] = 0.5. We can use these facts to find the probabilities of the other outcomes. In particular,

P [F ] = P [HF ] + P [MF ] . (2)

This implies P [MF ] = P [F ]− P [HF ] = 0.5 − 0.2 = 0.3. (3)

Also, since the probabilities must sum to 1,

P [HW ] = 1− P [HF ]− P [MF ]− P [MW ] = 1 − 0.2 − 0.3 − 0.1 = 0.4. (4)

Now that we have found the probabilities of the outcomes, finding any other probability is easy.

(a) The probability a cell phone is slow is

P [W ] = P [HW ] + P [MW ] = 0.4 + 0.1 = 0.5. (5)

(b) The probability that a cell phone is mobile and fast is P [MF ] = 0.3.

(c) The probability that a cell phone is handheld is

P [H] = P [HF ] + P [HW ] = 0.2 + 0.4 = 0.6. (6)
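The same bookkeeping can be reproduced in a few lines of Python (an illustrative sketch, not the text's Matlab code), which is a handy way to catch arithmetic slips:

# Outcome probabilities for Problem 1.3.2, computed from the given facts.
P_HF, P_MW, P_F = 0.2, 0.1, 0.5       # given: handheld-fast, mobile-slow, fast

P_MF = P_F - P_HF                     # since P[F] = P[HF] + P[MF]
P_HW = 1 - P_HF - P_MF - P_MW         # outcome probabilities sum to 1

P_W = P_HW + P_MW                     # slow
P_H = P_HF + P_HW                     # handheld
print(P_MF, P_HW, P_W, P_H)           # 0.3 0.4 0.5 0.6 (up to float roundoff)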


Problem 1.3.3 Solution A reasonable probability model that is consistent with the notion of a shuffled deck is that each card in the deck is equally likely to be the first card. Let Hi denote the event that the first card drawn is the ith heart where the first heart is the ace, the second heart is the deuce and so on. In that case, P [Hi] = 1/52 for 1 ≤ i ≤ 13. The event H that the first card is a heart can be written as the disjoint union

H = H1 ∪H2 ∪ · · · ∪H13. (1)

Using Theorem 1.1, we have

P [H] = ∑_{i=1}^{13} P [Hi] = 13/52. (2)

This is the answer you would expect since 13 out of 52 cards are hearts. The point to keep in mind is that this is not just the common sense answer but is the result of a probability model for a shuffled deck and the axioms of probability.

Problem 1.3.4 Solution Let si denote the outcome that the down face has i dots. The sample space is S = {s1, . . . , s6}. The probability of each sample outcome is P [si] = 1/6. From Theorem 1.1, the probability of the event E that the roll is even is

P [E] = P [s2] + P [s4] + P [s6] = 3/6. (1)

Problem 1.3.5 Solution Let si equal the outcome of the student’s quiz. The sample space is then composed of all the possible grades that she can receive.

S = {0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10} . (1) Since each of the 11 possible outcomes is equally likely, the probability of receiving a grade of i, for each i = 0, 1, . . . , 10 is P [si] = 1/11. The probability that the student gets an A is the probability that she gets a score of 9 or higher. That is

P [Grade of A] = P [9] + P [10] = 1/11 + 1/11 = 2/11. (2)

The probability of failing requires the student to get a grade less than 4.

P [Failing] = P [3] + P [2] + P [1] + P [0] = 1/11 + 1/11 + 1/11 + 1/11 = 4/11. (3)

Problem 1.4.1 Solution From the table we look to add all the disjoint events that contain H0 to express the probability that a caller makes no hand-offs as

P [H0] = P [LH0] + P [BH0] = 0.1 + 0.4 = 0.5. (1)

In a similar fashion we can express the probability that a call is brief by

P [B] = P [BH0] + P [BH1] + P [BH2] = 0.4 + 0.1 + 0.1 = 0.6. (2)

The probability that a call is long or makes at least two hand-offs is

P [L ∪H2] = P [LH0] + P [LH1] + P [LH2] + P [BH2] (3) = 0.1 + 0.1 + 0.2 + 0.1 = 0.5. (4)


Problem 1.4.2 Solution

(a) From the given probability distribution of billed minutes, M , the probability that a call is billed for more than 3 minutes is

P [L] = 1− P [3 or fewer billed minutes] (1)
= 1− P [B1]− P [B2]− P [B3] (2)
= 1− α− α(1− α)− α(1− α)² (3)
= (1− α)³ = 0.57. (4)

(b) The probability that a call will be billed for 9 minutes or less is

P [9 minutes or less] = ∑_{i=1}^{9} α(1− α)^{i−1} = 1 − (0.57)³. (5)
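Numerically, part (a) fixes α and part (b) follows by summing the geometric terms. A short Python sketch (illustrative only) confirms both values:

# Numerical check of Problem 1.4.2: P[Bi] = alpha*(1-alpha)**(i-1).
alpha = 1 - 0.57 ** (1 / 3)           # from (1 - alpha)**3 = 0.57

p_more_than_3 = (1 - alpha) ** 3
p_9_or_less = sum(alpha * (1 - alpha) ** (i - 1) for i in range(1, 10))

print(alpha)                          # about 0.171
print(p_more_than_3)                  # 0.57
print(p_9_or_less, 1 - 0.57 ** 3)     # both about 0.815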

Problem 1.4.3 Solution The first generation consists of two plants each with genotype yg or gy. They are crossed to produce the following second generation genotypes, S = {yy, yg, gy, gg}. Each genotype is just as likely as any other so the probability of each genotype is consequently 1/4. A pea plant has yellow seeds if it possesses at least one dominant y gene. The set of pea plants with yellow seeds is

Y = {yy, yg, gy} . (1)

So the probability of a pea plant with yellow seeds is

P [Y ] = P [yy] + P [yg] + P [gy] = 3/4. (2)

Problem 1.4.4 Solution Each statement is a consequence of part 4 of Theorem 1.4.

(a) Since A ⊂ A ∪B, P [A] ≤ P [A ∪B].

(b) Since B ⊂ A ∪B, P [B] ≤ P [A ∪B].

(c) Since A ∩B ⊂ A, P [A ∩B] ≤ P [A].

(d) Since A ∩B ⊂ B, P [A ∩B] ≤ P [B].

Problem 1.4.5 Solution Specifically, we will use Theorem 1.7(c) which states that for any events A and B,

P [A ∪B] = P [A] + P [B]− P [A ∩B] . (1)

To prove the union bound by induction, we first prove the theorem for the case of n = 2 events. In this case, by Theorem 1.7(c),

P [A1 ∪A2] = P [A1] + P [A2]− P [A1 ∩A2] . (2)


By the first axiom of probability, P [A1 ∩A2] ≥ 0. Thus,

P [A1 ∪A2] ≤ P [A1] + P [A2] . (3)

which proves the union bound for the case n = 2. Now we make our induction hypothesis that the union-bound holds for any collection of n − 1 subsets. In this case, given subsets A1, . . . , An, we define

A = A1 ∪A2 ∪ · · · ∪An−1, B = An. (4) By our induction hypothesis,

P [A] = P [A1 ∪A2 ∪ · · · ∪An−1] ≤ P [A1] + · · ·+ P [An−1] . (5)

This permits us to write

P [A1 ∪ · · · ∪An] = P [A ∪B] (6)
≤ P [A] + P [B] (by the union bound for n = 2) (7)
= P [A1 ∪ · · · ∪An−1] + P [An] (8)
≤ P [A1] + · · ·+ P [An−1] + P [An], (9)

which completes the inductive proof.

Problem 1.4.6 Solution

(a) For convenience, let pi = P [FHi] and qi = P [V Hi]. Using this shorthand, the six unknowns p0, p1, p2, q0, q1, q2 fill the table as

    H0  H1  H2
F   p0  p1  p2
V   q0  q1  q2        (1)

However, we are given a number of facts:

p0 + q0 = 1/3, p1 + q1 = 1/3, (2) p2 + q2 = 1/3, p0 + p1 + p2 = 5/12. (3)

Other facts, such as q0 + q1 + q2 = 7/12, can be derived from these facts. Thus, we have four equations and six unknowns; choosing p0 and p1 will specify the other unknowns. Unfortunately, arbitrary choices for either p0 or p1 will lead to negative values for the other probabilities. In terms of p0 and p1, the other unknowns are

q0 = 1/3− p0, p2 = 5/12 − (p0 + p1), (4)
q1 = 1/3− p1, q2 = p0 + p1 − 1/12. (5)

Because the probabilities must be nonnegative, we see that

0 ≤ p0 ≤ 1/3, (6)
0 ≤ p1 ≤ 1/3, (7)
1/12 ≤ p0 + p1 ≤ 5/12. (8)


Although there are an infinite number of solutions, three possible solutions are:

p0 = 1/3, p1 = 1/12, p2 = 0, (9) q0 = 0, q1 = 1/4, q2 = 1/3. (10)

and

p0 = 1/4, p1 = 1/12, p2 = 1/12, (11) q0 = 1/12, q1 = 3/12, q2 = 3/12. (12)

and

p0 = 0, p1 = 1/12, p2 = 1/3, (13) q0 = 1/3, q1 = 3/12, q2 = 0. (14)

(b) In terms of the pi, qi notation, the new facts are p0 = 1/4 and q1 = 1/6. These extra facts uniquely specify the probabilities. In this case,

p0 = 1/4, p1 = 1/6, p2 = 0, (15) q0 = 1/12, q1 = 1/6, q2 = 1/3. (16)
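As a quick consistency check (an illustrative Python sketch, not part of the solution), the part (b) values can be tested against all of the stated constraints using exact fractions:

# Verify that the Problem 1.4.6(b) probabilities satisfy every given constraint.
from fractions import Fraction as F

p0, p1, p2 = F(1, 4), F(1, 6), F(0)
q0, q1, q2 = F(1, 12), F(1, 6), F(1, 3)

assert p0 + q0 == p1 + q1 == p2 + q2 == F(1, 3)    # column sums (handoff counts)
assert p0 + p1 + p2 == F(5, 12)                    # row F
assert q0 + q1 + q2 == F(7, 12)                    # row V
assert min(p0, p1, p2, q0, q1, q2) >= 0            # nonnegativity
print("all constraints satisfied")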

Problem 1.4.7 Solution It is tempting to use the following proof:

Since S and φ are mutually exclusive, and since S = S ∪ φ,

1 = P [S ∪ φ] = P [S] + P [φ] . (1)

Since P [S] = 1, we must have P [φ] = 0.

The above “proof” used the property that for mutually exclusive sets A1 and A2,

P [A1 ∪A2] = P [A1] + P [A2] . (2)

The problem is that this property is a consequence of the three axioms, and thus must be proven. For a proof that uses just the three axioms, let A1 be an arbitrary set and for n = 2, 3, . . ., let An = φ. Since A1 = ∪∞i=1Ai, we can use Axiom 3 to write

P [A1] = P [∪_{i=1}^{∞} Ai] = P [A1] + P [A2] + ∑_{i=3}^{∞} P [Ai] . (3)

By subtracting P [A1] from both sides, the fact that A2 = φ permits us to write

P [φ] + ∑_{i=3}^{∞} P [Ai] = 0. (4)

By Axiom 1, P [Ai] ≥ 0 for all i. Thus, ∑_{i=3}^{∞} P [Ai] ≥ 0. This implies P [φ] ≤ 0. Since Axiom 1 requires P [φ] ≥ 0, we must have P [φ] = 0.


Problem 1.4.8 Solution Following the hint, we define the set of events {Ai | i = 1, 2, . . .} such that for i = 1, . . . , m, Ai = Bi, and for i > m, Ai = φ. By construction, ∪_{i=1}^{m} Bi = ∪_{i=1}^{∞} Ai. Axiom 3 then implies

P [∪_{i=1}^{m} Bi] = P [∪_{i=1}^{∞} Ai] = ∑_{i=1}^{∞} P [Ai] . (1)

For i > m, P [Ai] = P [φ] = 0, yielding the claim P [∪_{i=1}^{m} Bi] = ∑_{i=1}^{m} P [Ai] = ∑_{i=1}^{m} P [Bi].

Note that the fact that P [φ] = 0 follows from Axioms 1 and 2. This problem is more challenging if you just use Axiom 3. We start by observing

P [∪_{i=1}^{m} Bi] = ∑_{i=1}^{m−1} P [Bi] + ∑_{i=m}^{∞} P [Ai] . (2)

Now, we use Axiom 3 again on the countably infinite sequence Am, Am+1, . . . to write

∑_{i=m}^{∞} P [Ai] = P [Am ∪Am+1 ∪ · · ·] = P [Bm] . (3)

Thus, we have used just Axiom 3 to prove Theorem 1.4: P [∪_{i=1}^{m} Bi] = ∑_{i=1}^{m} P [Bi].

Problem 1.4.9 Solution Each claim in Theorem 1.7 requires a proof from which we can check which axioms are used. However, the problem is somewhat hard because there may still be a simpler proof that uses fewer axioms. Still, the proof of each part will need Theorem 1.4 which we now prove.

For the mutually exclusive events B1, . . . , Bm, let Ai = Bi for i = 1, . . . ,m and let Ai = φ for i > m. In that case, by Axiom 3,

P [B1 ∪B2 ∪ · · · ∪Bm] = P [A1 ∪A2 ∪ · · ·] (1)
= ∑_{i=1}^{m−1} P [Ai] + ∑_{i=m}^{∞} P [Ai] (2)
= ∑_{i=1}^{m−1} P [Bi] + ∑_{i=m}^{∞} P [Ai] . (3)

Now, we use Axiom 3 again on Am, Am+1, . . . to write

∑_{i=m}^{∞} P [Ai] = P [Am ∪Am+1 ∪ · · ·] = P [Bm] . (4)

Thus, we have used just Axiom 3 to prove Theorem 1.4:

P [B1 ∪B2 ∪ · · · ∪Bm] = ∑_{i=1}^{m} P [Bi] . (5)

(a) To show P [φ] = 0, let B1 = S and let B2 = φ. Thus by Theorem 1.4,

P [S] = P [B1 ∪B2] = P [B1] + P [B2] = P [S] + P [φ] . (6) Thus, P [φ] = 0. Note that this proof uses only Theorem 1.4 which uses only Axiom 3.


(b) Using Theorem 1.4 with B1 = A and B2 = Ac, we have

P [S] = P [A ∪Ac] = P [A] + P [Ac] . (7) Since Axiom 2 says P [S] = 1, P [Ac] = 1− P [A]. This proof uses Axioms 2 and 3.

(c) By Theorem 1.2, we can write both A and B as unions of disjoint events:

A = (AB) ∪ (ABc), B = (AB) ∪ (AcB). (8) Now we apply Theorem 1.4 to write

P [A] = P [AB] + P [ABc] , P [B] = P [AB] + P [AcB] . (9)

We can rewrite these facts as

P [ABc] = P [A]− P [AB], P [AcB] = P [B]− P [AB]. (10) Note that so far we have used only Axiom 3. Finally, we observe that A ∪B can be written as the union of mutually exclusive events

A ∪B = (AB) ∪ (ABc) ∪ (AcB). (11) Once again, using Theorem 1.4, we have

P [A ∪B] = P [AB] + P [ABc] + P [AcB] (12) Substituting the results of Equation (10) into Equation (12) yields

P [A ∪B] = P [AB] + P [A]− P [AB] + P [B]− P [AB] , (13) which completes the proof. Note that this claim required only Axiom 3.

(d) Observe that since A ⊂ B, we can write B as the disjoint union B = A ∪ (AcB). By Theorem 1.4 (which uses Axiom 3),

P [B] = P [A] + P [AcB] . (14)

By Axiom 1, P [AcB] ≥ 0, which implies P [A] ≤ P [B]. This proof uses Axioms 1 and 3.

Problem 1.5.1 Solution Each question requests a conditional probability.

(a) Note that the probability a call is brief is

P [B] = P [H0B] + P [H1B] + P [H2B] = 0.6. (1)

The probability a brief call will have no handoffs is

P [H0|B] = P [H0B] / P [B] = 0.4/0.6 = 2/3. (2)

(b) The probability of one handoff is P [H1] = P [H1B] + P [H1L] = 0.2. The probability that a call with one handoff will be long is

P [L|H1] = P [H1L] / P [H1] = 0.1/0.2 = 1/2. (3)

(c) The probability a call is long is P [L] = 1− P [B] = 0.4. The probability that a long call will have one or more handoffs is

P [H1 ∪H2|L] = P [H1L ∪H2L] / P [L] = (P [H1L] + P [H2L]) / P [L] = (0.1 + 0.2)/0.4 = 3/4. (4)


Problem 1.5.2 Solution Let si denote the outcome that the roll is i. So, for 1 ≤ i ≤ 6, Ri = {si}. Similarly, Gj = {sj+1, . . . , s6}.

(a) Since G1 = {s2, s3, s4, s5, s6} and all outcomes have probability 1/6, P [G1] = 5/6. The event R3G1 = {s3} and P [R3G1] = 1/6 so that

P [R3|G1] = P [R3G1] / P [G1] = 1/5. (1)

(b) The conditional probability that 6 is rolled given that the roll is greater than 3 is

P [R6|G3] = P [R6G3] / P [G3] = P [s6] / P [s4, s5, s6] = (1/6)/(3/6). (2)

(c) The event E that the roll is even is E = {s2, s4, s6} and has probability 3/6. The joint probability of G3 and E is

P [G3E] = P [s4, s6] = 1/3. (3)

The conditional probability of G3 given E is

P [G3|E] = P [G3E] / P [E] = (1/3)/(1/2) = 2/3. (4)

(d) The conditional probability that the roll is even given that it’s greater than 3 is

P [E|G3] = P [EG3] / P [G3] = (1/3)/(1/2) = 2/3. (5)
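Since the sample space is small and equiprobable, each of these conditional probabilities is just a ratio of outcome counts. The Python sketch below (illustrative only) computes them by counting:

# Counting-based check of the conditional probabilities in Problem 1.5.2.
from fractions import Fraction as F

S = range(1, 7)                                      # fair six-sided die

def prob(event):                                     # equiprobable outcomes
    return F(sum(1 for s in S if event(s)), len(S))

def cond(A, B):                                      # P[A|B] = P[AB]/P[B]
    return prob(lambda s: A(s) and B(s)) / prob(B)

print(cond(lambda s: s == 3, lambda s: s > 1))       # P[R3|G1] = 1/5
print(cond(lambda s: s == 6, lambda s: s > 3))       # P[R6|G3] = 1/3
print(cond(lambda s: s % 2 == 0, lambda s: s > 3))   # P[E|G3] = 2/3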

Problem 1.5.3 Solution Since the 2 of clubs is an even numbered card, C2 ⊂ E so that P [C2E] = P [C2] = 1/3. Since P [E] = 2/3,

P [C2|E] = P [C2E] / P [E] = (1/3)/(2/3) = 1/2. (1)

The probability that an even numbered card is picked given that the 2 is picked is

P [E|C2] = P [C2E] / P [C2] = (1/3)/(1/3) = 1. (2)

Problem 1.5.4 Solution Define D as the event that a pea plant has two dominant y genes. To find the conditional probability of D given the event Y , corresponding to a plant having yellow seeds, we look to evaluate

P [D|Y ] = P [DY ] / P [Y ] . (1)

Note that P [DY ] is just the probability of the genotype yy. From Problem 1.4.3, we found that with respect to the color of the peas, the genotypes yy, yg, gy, and gg were all equally likely. This implies

P [DY ] = P [yy] = 1/4 P [Y ] = P [yy, gy, yg] = 3/4. (2)

Thus, the conditional probability can be expressed as

P [D|Y ] = P [DY ] / P [Y ] = (1/4)/(3/4) = 1/3. (3)


Problem 1.5.5 Solution The sample outcomes can be written ijk where the first card drawn is i, the second is j and the third is k. The sample space is

S = {234, 243, 324, 342, 423, 432} . (1) and each of the six outcomes has probability 1/6. The events E1, E2, E3, O1, O2, O3 are

E1 = {234, 243, 423, 432} , O1 = {324, 342} , (2) E2 = {243, 324, 342, 423} , O2 = {234, 432} , (3) E3 = {234, 324, 342, 432} , O3 = {243, 423} . (4)

(a) The conditional probability the second card is even given that the first card is even is

P [E2|E1] = P [E2E1] / P [E1] = P [243, 423] / P [234, 243, 423, 432] = (2/6)/(4/6) = 1/2. (5)

(b) The conditional probability the first card is even given that the second card is even is

P [E1|E2] = P [E1E2] / P [E2] = P [243, 423] / P [243, 324, 342, 423] = (2/6)/(4/6) = 1/2. (6)

(c) The probability the first two cards are even given the third card is even is

P [E1E2|E3] = P [E1E2E3] / P [E3] = 0. (7)

(d) The conditional probability the second card is even given that the first card is odd is

P [E2|O1] = P [O1E2] / P [O1] = P [O1] / P [O1] = 1. (8)

(e) The conditional probability the second card is odd given that the first card is odd is

P [O2|O1] = P [O1O2] / P [O1] = 0. (9)

Problem 1.5.6 Solution The problem statement yields the obvious facts that P [L] = 0.16 and P [H] = 0.10. The words “10% of the ticks that had either Lyme disease or HGE carried both diseases” can be written as

P [LH|L ∪H] = 0.10. (1)

(a) Since LH ⊂ L ∪H,

P [LH|L ∪H] = P [LH ∩ (L ∪H)] / P [L ∪H] = P [LH] / P [L ∪H] = 0.10. (2)

Thus, P [LH] = 0.10P [L ∪H] = 0.10 (P [L] + P [H]− P [LH]) . (3)

Since P [L] = 0.16 and P [H] = 0.10,

P [LH] = 0.10 (0.16 + 0.10) / 1.1 = 0.0236. (4)


(b) The conditional probability that a tick has HGE given that it has Lyme disease is

P [H|L] = P [LH] / P [L] = 0.0236/0.16 = 0.1475. (5)
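Part (a) reduces to solving a single linear equation for P [LH]; a short Python sketch (illustrative only) reproduces both numbers:

# Solving Problem 1.5.6 numerically.
P_L, P_H = 0.16, 0.10

# P[LH] = 0.10*(P[L] + P[H] - P[LH])  =>  1.1*P[LH] = 0.10*(P[L] + P[H])
P_LH = 0.10 * (P_L + P_H) / 1.1
print(P_LH)              # about 0.0236
print(P_LH / P_L)        # about 0.148 (0.1475 if P[LH] is first rounded to 0.0236)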

Problem 1.6.1 Solution This problem asks whether A and B can be independent events and yet satisfy A = B. By definition, events A and B are independent if and only if P [AB] = P [A]P [B]. We can see that if A = B, that is, they are the same set, then

P [AB] = P [AA] = P [A] = P [B] . (1)

Thus, for A and B to be the same set and also independent,

P [A] = P [AB] = P [A]P [B] = (P [A])2 . (2)

There are two ways that this requirement can be satisfied:

• P [A] = 1, implying A = B = S.
• P [A] = 0, implying A = B = φ.

Problem 1.6.2 Solution

[Venn diagram: overlapping events A and B]

In the Venn diagram, assume the sample space has area 1 corresponding to probability 1. As drawn, both A and B have area 1/4 so that P [A] = P [B] = 1/4. Moreover, the intersection AB has area 1/16 and covers 1/4 of A and 1/4 of B. That is, A and B are independent since

P [AB] = P [A]P [B] . (1)

Problem 1.6.3 Solution

(a) Since A and B are disjoint, P [A ∩B] = 0. Since P [A ∩B] = 0,

P [A ∪B] = P [A] + P [B]− P [A ∩B] = 3/8. (1)

A Venn diagram should convince you that A ⊂ Bc so that A ∩Bc = A. This implies

P [A ∩Bc] = P [A] = 1/4. (2)

It also follows that P [A ∪Bc] = P [Bc] = 1 − 1/8 = 7/8.

(b) Events A and B are dependent since P [AB] ≠ P [A]P [B].


(c) Since C and D are independent,

P [C ∩D] = P [C]P [D] = 15/64. (3) The next few items are a little trickier. From Venn diagrams, we see

P [C ∩Dc] = P [C]− P [C ∩D] = 5/8 − 15/64 = 25/64. (4)

It follows that

P [C ∪Dc] = P [C] + P [Dc]− P [C ∩Dc] (5)
= 5/8 + (1 − 3/8) − 25/64 = 55/64. (6)

Using DeMorgan’s law, we have

P [Cc ∩Dc] = P [(C ∪D)c] = 1− P [C ∪D] = 15/64. (7)

(d) Since P [CcDc] = P [Cc]P [Dc], Cc and Dc are independent.

Problem 1.6.4 Solution

(a) Since A ∩B = φ, P [A ∩B] = 0. To find P [B], we can write

P [A ∪B] = P [A] + P [B]− P [A ∩B] (1)
5/8 = 3/8 + P [B]− 0. (2)

Thus, P [B] = 1/4. Since A is a subset of Bc, P [A ∩Bc] = P [A] = 3/8. Furthermore, since A is a subset of Bc, P [A ∪Bc] = P [Bc] = 3/4.

(b) The events A and B are dependent because

P [AB] = 0 ≠ 3/32 = P [A]P [B] . (3)

(c) Since C and D are independent, P [CD] = P [C]P [D]. So

P [D] = P [CD] / P [C] = (1/3)/(1/2) = 2/3. (4)

In addition, P [C ∩Dc] = P [C] − P [C ∩D] = 1/2 − 1/3 = 1/6. To find P [Cc ∩Dc], we first observe that

P [C ∪D] = P [C] + P [D]− P [C ∩D] = 1/2 + 2/3 − 1/3 = 5/6. (5)

By De Morgan's Law, Cc ∩Dc = (C ∪D)c. This implies

P [Cc ∩Dc] = P [(C ∪D)c] = 1− P [C ∪D] = 1/6. (6) Note that a second way to find P [Cc ∩Dc] is to use the fact that if C and D are independent, then Cc and Dc are independent. Thus

P [Cc ∩Dc] = P [Cc]P [Dc] = (1− P [C])(1− P [D]) = 1/6. (7) Finally, since C and D are independent events, P [C|D] = P [C] = 1/2.

(d) Note that we found P [C ∪D] = 5/6. We can also use the earlier results to show

P [C ∪Dc] = P [C] + P [Dc]− P [C ∩Dc] = 1/2 + (1 − 2/3) − 1/6 = 2/3. (8)

(e) By Definition 1.7, events C and Dc are independent because

P [C ∩Dc] = 1/6 = (1/2)(1/3) = P [C]P [Dc] . (9)


Problem 1.6.5 Solution For a sample space S = {1, 2, 3, 4} with equiprobable outcomes, consider the events

A1 = {1, 2} A2 = {2, 3} A3 = {3, 1} . (1)

Each event Ai has probability 1/2. Moreover, each pair of events is independent since

P [A1A2] = P [A2A3] = P [A3A1] = 1/4. (2)

However, the three events A1, A2, A3 are not independent since

P [A1A2A3] = 0 ≠ P [A1]P [A2]P [A3] . (3)

Problem 1.6.6 Solution There are 16 distinct equally likely outcomes for the second generation of pea plants based on a first generation of {rwyg, rwgy, wryg, wrgy}. They are listed below

rryy rryg rrgy rrgg
rwyy rwyg rwgy rwgg
wryy wryg wrgy wrgg
wwyy wwyg wwgy wwgg        (1)

A plant has yellow seeds, that is event Y occurs, if a plant has at least one dominant y gene. Except for the four outcomes with a pair of recessive g genes, the remaining 12 outcomes have yellow seeds. From the above, we see that

P [Y ] = 12/16 = 3/4 (2)

and P [R] = 12/16 = 3/4. (3)

To find the conditional probabilities P [R|Y ] and P [Y |R], we first must find P [RY ]. Note that RY , the event that a plant has rounded yellow seeds, is the set of outcomes

RY = {rryy, rryg, rrgy, rwyy, rwyg, rwgy, wryy, wryg, wrgy} . (4)

Since P [RY ] = 9/16,

P [Y |R] = P [RY ] / P [R] = (9/16)/(3/4) = 3/4 (5)

and

P [R|Y ] = P [RY ] / P [Y ] = (9/16)/(3/4) = 3/4. (6)

Thus P [R|Y ] = P [R] and P [Y |R] = P [Y ] and R and Y are independent events. There are four visibly different pea plants, corresponding to whether the peas are round (R) or not (Rc), or yellow (Y ) or not (Y c). These four visible events have probabilities

P [RY ] = 9/16 P [RY c] = 3/16, (7) P [RcY ] = 3/16 P [RcY c] = 1/16. (8)
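The 16 genotypes and the independence of R and Y can also be checked by brute-force enumeration, as in the following Python sketch (illustrative only):

# Enumerate the second-generation genotypes of Problem 1.6.6 and test independence.
from fractions import Fraction as F
from itertools import product

genotypes = ["".join(g) for g in product("rw", "rw", "yg", "yg")]   # 16 outcomes

R = {g for g in genotypes if "r" in g[:2]}    # round: at least one dominant r gene
Y = {g for g in genotypes if "y" in g[2:]}    # yellow: at least one dominant y gene

def P(A):
    return F(len(A), len(genotypes))

print(P(R), P(Y), P(R & Y))                   # 3/4 3/4 9/16
print(P(R & Y) == P(R) * P(Y))                # True, so R and Y are independent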


Problem 1.6.7 Solution

(a) For any events A and B, we can write the law of total probability in the form of

P [A] = P [AB] + P [ABc] . (1)

Since A and B are independent, P [AB] = P [A]P [B]. This implies

P [ABc] = P [A]− P [A]P [B] = P [A] (1− P [B]) = P [A]P [Bc] . (2)

Thus A and Bc are independent.

(b) Proving that Ac and B are independent is not really necessary. Since A and B are arbitrary labels, it is really the same claim as in part (a). That is, simply reversing the labels of A and B proves the claim. Alternatively, one can construct exactly the same proof as in part (a) with the labels A and B reversed.

(c) To prove that Ac and Bc are independent, we apply the result of part (a) to the sets A and Bc. Since we know from part (a) that A and Bc are independent, part (b) says that Ac and Bc are independent.

Problem 1.6.8 Solution

[Venn diagram: events A, B, and C with pairwise regions AB, AC, BC and three-way region ABC]

In the Venn diagram, assume the sample space has area 1 corresponding to probability 1. As drawn, A, B, and C each have area 1/2 and thus probability 1/2. Moreover, the three-way intersection ABC has probability 1/8. Thus A, B, and C are mutually independent since

P [ABC] = P [A]P [B]P [C] . (1)

Problem 1.6.9 Solution

[Venn diagram: events A, B, and C with pairwise regions AB, AC, BC and no three-way overlap]

In the Venn diagram, assume the sample space has area 1 corresponding to probability 1. As drawn, A, B, and C each have area 1/3 and thus probability 1/3. The three-way intersection ABC has zero probability, implying A, B, and C are not mutually independent since

P [ABC] = 0 ≠ P [A]P [B]P [C] . (1)

However, AB, BC, and AC each has area 1/9. As a result, each pair of events is independent since

P [AB] = P [A]P [B] , P [BC] = P [B]P [C] , P [AC] = P [A]P [C] . (2)


Problem 1.7.1 Solution A sequential sample space for this experiment is

[Tree diagram: first flip H1 (1/4) or T1 (3/4); second flip H2 (1/4) or T2 (3/4). Outcomes: H1H2 1/16, H1T2 3/16, T1H2 3/16, T1T2 9/16.]

(a) From the tree, we observe

P [H2] = P [H1H2] + P [T1H2] = 1/4. (1)

This implies

P [H1|H2] = P [H1H2] / P [H2] = (1/16)/(1/4) = 1/4. (2)

(b) The probability that the first flip is heads and the second flip is tails is P [H1T2] = 3/16.

Problem 1.7.2 Solution The tree with adjusted probabilities is

[Tree diagram: G1 (1/2) or R1 (1/2); given G1, G2 (3/4) or R2 (1/4); given R1, G2 (1/4) or R2 (3/4). Outcomes: G1G2 3/8, G1R2 1/8, R1G2 1/8, R1R2 3/8.]

From the tree, the probability the second light is green is

P [G2] = P [G1G2] + P [R1G2] = 3/8 + 1/8 = 1/2. (1)

The conditional probability that the first light was green given the second light was green is

P [G1|G2] = P [G1G2] / P [G2] = P [G2|G1]P [G1] / P [G2] = 3/4. (2)

Finally, from the tree diagram, we can directly read that P [G2|G1] = 3/4.

Problem 1.7.3 Solution Let Gi and Bi denote events indicating whether free throw i was good (Gi) or bad (Bi). The tree for the free throw experiment is

[Tree diagram: G1 (1/2) or B1 (1/2); given G1, G2 (3/4) or B2 (1/4); given B1, G2 (1/4) or B2 (3/4). Outcomes: G1G2 3/8, G1B2 1/8, B1G2 1/8, B1B2 3/8.]

The game goes into overtime if exactly one free throw is made. This event has probability

P [O] = P [G1B2] + P [B1G2] = 1/8 + 1/8 = 1/4. (1)

Problem 1.7.4 Solution The tree for this experiment is

[Tree diagram: coin A (1/2) or B (1/2); given A, H (1/4) or T (3/4); given B, H (3/4) or T (1/4). Outcomes: AH 1/8, AT 3/8, BH 3/8, BT 1/8.]

The probability that you guess correctly is

P [C] = P [AT ] + P [BH] = 3/8 + 3/8 = 3/4. (1)

Problem 1.7.5 Solution P [−|H] is the probability that a person who has HIV tests negative for the disease. This is referred to as a false-negative result. The case where a person who does not have HIV tests positive for the disease is called a false-positive result and has probability P [+|Hc]. Since the test is correct 99% of the time,

P [−|H] = P [+|Hc] = 0.01. (1) Now the probability that a person who has tested positive for HIV actually has the disease is

P [H|+] = P [+, H] / P [+] = P [+, H] / (P [+, H] + P [+, Hc]). (2)

We can use Bayes’ formula to evaluate these joint probabilities.

P [H|+] = P [+|H]P [H] / (P [+|H]P [H] + P [+|Hc]P [Hc]) (3)
= (0.99)(0.0002) / ((0.99)(0.0002) + (0.01)(0.9998)) (4)
= 0.0194. (5)

Thus, even though the test is correct 99% of the time, the probability that a random person who tests positive actually has HIV is less than 0.02. The reason this probability is so low is that the a priori probability that a person has HIV is very small.
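The arithmetic in Equations (3)-(5) is worth experimenting with, since the answer is dominated by the a priori probability. The following Python sketch (illustrative only) reproduces it and makes it easy to vary the prior:

# Bayes' rule computation for Problem 1.7.5.
def posterior(prior, p_pos_given_H=0.99, p_pos_given_Hc=0.01):
    p_pos = p_pos_given_H * prior + p_pos_given_Hc * (1 - prior)
    return p_pos_given_H * prior / p_pos

print(posterior(0.0002))     # about 0.0194, as in Equation (5)
print(posterior(0.01))       # 0.50: a larger prior changes the answer dramatically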


Problem 1.7.6 Solution Let Ai and Di indicate whether the ith photodetector is acceptable or defective.

[Tree diagram: A1 (3/5) or D1 (2/5); given A1, A2 (4/5) or D2 (1/5); given D1, A2 (2/5) or D2 (3/5). Outcomes: A1A2 12/25, A1D2 3/25, D1A2 4/25, D1D2 6/25.]

(a) We wish to find the probability P [E1] that exactly one photodetector is acceptable. From the tree, we have

P [E1] = P [A1D2] + P [D1A2] = 3/25 + 4/25 = 7/25. (1)

(b) The probability that both photodetectors are defective is P [D1D2] = 6/25.

Problem 1.7.7 Solution The tree for this experiment is

[Tree diagram: A1 (1/2) or B1 (1/2). Given A1: H1 (1/4) or T1 (3/4), then H2 (3/4) or T2 (1/4). Given B1: H1 (3/4) or T1 (1/4), then H2 (1/4) or T2 (3/4). Outcomes: A1H1H2 3/32, A1H1T2 1/32, A1T1H2 9/32, A1T1T2 3/32, B1H1H2 3/32, B1H1T2 9/32, B1T1H2 1/32, B1T1T2 3/32.]

The event H1H2 that heads occurs on both flips has probability

P [H1H2] = P [A1H1H2] + P [B1H1H2] = 6/32. (1)

The probability of H1 is

P [H1] = P [A1H1H2] + P [A1H1T2] + P [B1H1H2] + P [B1H1T2] = 1/2. (2)

Similarly,

P [H2] = P [A1H1H2] + P [A1T1H2] + P [B1H1H2] + P [B1T1H2] = 1/2. (3)

Thus P [H1H2] ≠ P [H1]P [H2], implying H1 and H2 are not independent. This result should not be surprising since if the first flip is heads, it is likely that coin B was picked first. In this case, the second flip is less likely to be heads since it becomes more likely that the second coin flipped was coin A.


Problem 1.7.8 Solution

(a) The primary difficulty in this problem is translating the words into the correct tree diagram. The tree for this problem is shown below.

[Tree diagram (every branch has probability 1/2): H1 ends the experiment. After T1H2, flip 3 gives H3 (end) or T3, which is followed by flip 4 (H4 or T4). After T1T2, flip 3 gives H3, which is followed by flip 4 (H4 or T4), or T3 (end). Outcomes: H1 1/2, T1H2H3 1/8, T1H2T3H4 1/16, T1H2T3T4 1/16, T1T2H3H4 1/16, T1T2H3T4 1/16, T1T2T3 1/8.]

(b) From the tree,

P [H3] = P [T1H2H3] + P [T1T2H3H4] + P [T1T2H3T4] (1)
= 1/8 + 1/16 + 1/16 = 1/4. (2)

Similarly,

P [T3] = P [T1H2T3H4] + P [T1H2T3T4] + P [T1T2T3] (3)
= 1/16 + 1/16 + 1/8 = 1/4. (4)

(c) The event that Dagwood must diet is

D = (T1H2T3T4) ∪ (T1T2H3T4) ∪ (T1T2T3). (5)

The probability that Dagwood must diet is

P [D] = P [T1H2T3T4] + P [T1T2H3T4] + P [T1T2T3] (6) = 1/16 + 1/16 + 1/8 = 1/4. (7)

The conditional probability of heads on flip 1 given that Dagwood must diet is

P [H1|D] = P [H1D] P [D]

= 0. (8)

Remember, if there was heads on flip 1, then Dagwood always postpones his diet.

(d) From part (b), we found that P [H3] = 1/4. To check independence, we calculate

P [H2] = P [T1H2H3] + P [T1H2T3H4] + P [T1H2T3T4] = 1/4, (9)
P [H2H3] = P [T1H2H3] = 1/8. (10)

Now we find that P [H2H3] = 1/8 ≠ P [H2]P [H3] . (11)

Hence, H2 and H3 are dependent events. In fact, P [H3|H2] = 1/2 while P [H3] = 1/4. The reason for the dependence is that given H2 occurred, then we know there will be a third flip which may result in H3. That is, knowledge of H2 tells us that the experiment didn’t end after the first flip.


Problem 1.7.9 Solution

(a) We wish to know the probability that we find no good photodiodes in n pairs of diodes. Testing each pair of diodes is an independent trial such that with probability p, both diodes of a pair are bad. From Problem 1.7.6, we can easily calculate p.

p = P [both diodes are defective] = P [D1D2] = 6/25. (1)

The probability of Zn, the event of zero acceptable diodes out of n pairs of diodes, is pⁿ because on each test of a pair of diodes, both must be defective:

P [Zn] = ∏_{i=1}^{n} p = pⁿ = (6/25)ⁿ. (2)

(b) Another way to phrase this question is to ask how many pairs must we test until P [Zn] ≤ 0.01. Since P [Zn] = (6/25)ⁿ, we require

(6/25)ⁿ ≤ 0.01  ⇒  n ≥ ln 0.01 / ln(6/25) = 3.23. (3)

Since n must be an integer, n = 4 pairs must be tested.
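The same answer follows from a direct search, as in this illustrative Python sketch:

# Smallest n with P[Zn] = (6/25)**n <= 0.01 (Problem 1.7.9(b)).
import math

p = 6 / 25
n = 1
while p ** n > 0.01:
    n += 1

print(n)                                # 4
print(math.log(0.01) / math.log(p))     # about 3.23, matching Equation (3)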

Problem 1.7.10 Solution The experiment ends as soon as a fish is caught. The tree resembles

[Tree diagram: C1 (p) or C1c (1 − p); given C1c, C2 (p) or C2c (1 − p); given C2c, C3 (p) or C3c (1 − p); and so on.]

From the tree, P [C1] = p and P [C2] = (1− p)p. Finally, a fish is caught on the nth cast if no fish were caught on the previous n− 1 casts. Thus,

P [Cn] = (1− p)^{n−1} p. (1)

Problem 1.8.1 Solution There are 2⁵ = 32 different binary codes with 5 bits. The number of codes with exactly 3 zeros equals the number of ways of choosing the bits in which those zeros occur. Therefore there are (5 choose 3) = 10 codes with exactly 3 zeros.

Problem 1.8.2 Solution Since each letter can take on any one of the 4 possible letters in the alphabet, the number of 3-letter words that can be formed is 4³ = 64. If we allow each letter to appear only once, then we have 4 choices for the first letter, 3 choices for the second, and 2 choices for the third letter. Therefore, there are a total of 4 · 3 · 2 = 24 possible codes.


Problem 1.8.3 Solution

(a) The experiment of picking two cards and recording them in the order in which they were selected can be modeled by two sub-experiments. The first is to pick the first card and record it; the second sub-experiment is to pick the second card without replacing the first and record it. For the first sub-experiment we can have any one of the possible 52 cards, for a total of 52 possibilities. The second sub-experiment consists of all the cards minus the one that was picked first (because we are sampling without replacement), for a total of 51 possible outcomes. So the total number of outcomes is the product of the number of outcomes for each sub-experiment:

52 · 51 = 2652 outcomes. (1)

(b) To have the same card but different suit we can make the following sub-experiments. First we need to pick one of the 52 cards. Then we need to pick one of the 3 remaining cards that are of the same type but different suit out of the remaining 51 cards. So the total number outcomes is

52 · 3 = 156 outcomes. (2)

(c) The probability that the two cards are of the same type but different suit is the number of outcomes that are of the same type but different suit divided by the total number of outcomes involved in picking two cards at random from a deck of 52 cards.

P [same type, different suit] = 156/2652 = 1/17. (3)

(d) Now we are not concerned with the ordering of the cards. So before, the outcomes (K♥, 8♦) and (8♦, K♥) were distinct. Now, those two outcomes are not distinct and are only considered to be the single outcome that a King of hearts and 8 of diamonds were selected. So every pair of outcomes before collapses to a single outcome when we disregard ordering. So we can redo parts (a) and (b) above by halving the corresponding values found in parts (a) and (b). The probability, however, does not change because both the numerator and the denominator have been reduced by an equal factor of 2, which does not change their ratio.
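These counts are small enough to verify by exhaustive enumeration. The Python sketch below (illustrative only) checks Problems 1.8.2 and 1.8.3 directly:

# Enumeration check for Problems 1.8.2 and 1.8.3.
from itertools import permutations, product

# Problem 1.8.2: 3-letter words from a 4-letter alphabet.
alphabet = "abcd"
print(len(list(product(alphabet, repeat=3))))        # 64 words
print(len(list(permutations(alphabet, 3))))          # 24 words with distinct letters

# Problem 1.8.3: ordered draws of two cards from a 52-card deck.
deck = [(rank, suit) for rank in range(13) for suit in range(4)]
pairs = list(permutations(deck, 2))                  # 52*51 ordered outcomes
same_rank = [p for p in pairs if p[0][0] == p[1][0]] # same type, different suit
print(len(pairs), len(same_rank))                    # 2652 156
print(len(same_rank) / len(pairs))                   # 1/17, about 0.0588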

Problem 1.8.4 Solution We can break down the experiment of choosing a starting lineup into a sequence of subexperiments:

1. Choose 1 of the 10 pitchers. There are N1 = (10 choose 1) = 10 ways to do this.

2. Choose 1 of the 15 field players to be the designated hitter (DH). There are N2 = (15 choose 1) = 15 ways to do this.

3. Of the remaining 14 field players, choose 8 for the remaining field positions. There are N3 = (14 choose 8) ways to do this.

4. For the 9 batters (consisting of the 8 field players and the designated hitter), choose a batting lineup. There are N4 = 9! ways to do this.


So the total number of different starting lineups when the DH is selected among the field players is

N = N1N2N3N4 = (10)(15)(14 choose 8)9! = 163,459,296,000. (1)

Note that this overestimates the number of combinations the manager must really consider because most field players can play only one or two positions. Although these constraints on the manager reduce the number of possible lineups, it typically makes the manager's job more difficult. As for the counting, we note that our count did not need to specify the positions played by the field players. Although this is an important consideration for the manager, it is not part of our counting of different lineups. In fact, the 8 nonpitching field players are allowed to switch positions at any time in the field. For example, the shortstop and second baseman could trade positions in the middle of an inning. Although the DH can go play the field, there are some complicated rules about this. Here is an excerpt from Major League Baseball Rule 6.10:

The Designated Hitter may be used defensively, continuing to bat in the same position in the batting order, but the pitcher must then bat in the place of the substituted defensive player, unless more than one substitution is made, and the manager then must designate their spots in the batting order.

If you’re curious, you can find the complete rule on the web.

Problem 1.8.5 Solution When the DH can be chosen among all the players, including the pitchers, there are two cases:

The DH is a field player. In this case, the number of possible lineups, NF , is given in Problem 1.8.4. In this case, the designated hitter must be chosen from the 15 field players. We repeat the solution of Problem 1.8.4 here: We can break down the experiment of choosing a starting lineup into a sequence of subexperiments:

1. Choose 1 of the 10 pitchers. There are N1 = (10 choose 1) = 10 ways to do this.

2. Choose 1 of the 15 field players to be the designated hitter (DH). There are N2 = (15 choose 1) = 15 ways to do this.

3. Of the remaining 14 field players, choose 8 for the remaining field positions. There are N3 = (14 choose 8) ways to do this.

4. For the 9 batters (consisting of the 8 field players and the designated hitter), choose a batting lineup. There are N4 = 9! ways to do this.

So the total number of different starting lineups when the DH is selected among the field players is

N = N1N2N3N4 = (10)(15)(14 choose 8)9! = 163,459,296,000. (1)

The DH is a pitcher. In this case, there are 10 choices for the pitcher, 10 choices for the DH among the pitchers (including the pitcher batting for himself), (15 choose 8) choices for the field players, and 9! ways of ordering the batters into a lineup. The number of possible lineups is

N′ = (10)(10)(15 choose 8)9! = 233,513,280,000. (2)

The total number of ways of choosing a lineup is N +N ′ = 396,972,576,000.
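Both counts are quick to confirm with Python's math.comb and math.factorial (an illustrative check, not part of the solution):

# Verify the lineup counts of Problems 1.8.4 and 1.8.5.
from math import comb, factorial

N_field = 10 * 15 * comb(14, 8) * factorial(9)    # DH chosen among the field players
N_pitch = 10 * 10 * comb(15, 8) * factorial(9)    # DH chosen among the pitchers

print(N_field)               # 163459296000
print(N_pitch)               # 233513280000
print(N_field + N_pitch)     # 396972576000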


Problem 1.8.6 Solution

(a) We can find the number of valid starting lineups by noticing that the swingman presents three situations: (1) the swingman plays guard, (2) the swingman plays forward, and (3) the swingman doesn’t play. The first situation is when the swingman can be chosen to play the guard position, and the second where the swingman can only be chosen to play the forward position. Let Ni denote the number of lineups corresponding to case i. Then we can write the total number of lineups as N1 +N2 +N3. In the first situation, we have to choose 1 out of 3 centers, 2 out of 4 forwards, and 1 out of 4 guards so that

N1 = (3 choose 1)(4 choose 2)(4 choose 1) = 72. (1)

In the second case, we need to choose 1 out of 3 centers, 1 out of 4 forwards and 2 out of 4 guards, yielding

N2 = (3 choose 1)(4 choose 1)(4 choose 2) = 72. (2)

Finally, with the swingman on the bench, we choose 1 out of 3 centers, 2 out of 4 forwards, and 2 out of 4 guards. This implies

N3 = (3 choose 1)(4 choose 2)(4 choose 2) = 108, (3)

and the total number of lineups is N1 +N2 +N3 = 252.

Problem 1.8.7 Solution What our design must specify is the number of boxes on the ticket, and the number of specially marked boxes. Suppose each ticket has n boxes and 5+ k specially marked boxes. Note that when k > 0, a winning ticket will still have k unscratched boxes with the special mark. A ticket is a winner if each time a box is scratched off, the box has the special mark. Assuming the boxes are scratched off randomly, the first box scratched off has the mark with probability (5 + k)/n since there are 5 + k marked boxes out of n boxes. Moreover, if the first scratched box has the mark, then there are 4 + k marked boxes out of n − 1 remaining boxes. Continuing this argument, the probability that a ticket is a winner is

p = (5 + k)/n · (4 + k)/(n − 1) · (3 + k)/(n − 2) · (2 + k)/(n − 3) · (1 + k)/(n − 4) = (k + 5)!(n − 5)! / (k! n!). (1)

By careful choice of n and k, we can choose p close to 0.01. For example,

n    9       11      14      17
k    0       1       2       3
p    0.0079  0.012   0.0105  0.0090        (2)

A gamecard with n = 14 boxes and 5 + k = 7 shaded boxes would be quite reasonable.
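The table above is easy to regenerate by evaluating Equation (1) over candidate designs. The Python sketch below (illustrative only) does the sweep; the parameters n and k are the same symbols used in the solution:

# Winning probability for a ticket with n boxes and 5 + k specially marked boxes.
from math import factorial

def win_prob(n, k):
    return factorial(k + 5) * factorial(n - 5) / (factorial(k) * factorial(n))

for n, k in [(9, 0), (11, 1), (14, 2), (17, 3)]:
    print(n, k, win_prob(n, k))     # roughly 0.008, 0.013, 0.010, 0.009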

Problem 1.9.1 Solution

