Notes 3

1 Lecture Plan

• Experiments, Outcomes and Events
• The Axioms of Probability
• Axiom Consequences
• Finite Sample Space

2 Experiments, Outcomes, Sample Space, and Events

Example:
• Experiment: Toss a coin three times.
• Outcomes: The possible outcomes are hhh, hht, hth, htt, thh, tht, tth, ttt.
• Sample space: The set of all outcomes: S = {hhh, hht, hth, htt, thh, tht, tth, ttt}.
• Events: Subsets of S to which we will assign probabilities. Examples of events: A, at least one head: A = {hhh, hht, hth, htt, thh, tht, tth}; B, the first two tosses are tails: B = {tth, ttt}.

An event A is said to have occurred if any outcome ω ∈ A occurs when an experiment is conducted. Suppose an experiment is done with outcome ω = tth; then all events (subsets of S) containing ω occur, so in particular both A and B above occur.
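
A concrete way to see these definitions is to enumerate the sample space and test membership of an observed outcome. The following sketch (Python; the variable names S, A, B, omega are just illustrations of the notation above) builds the eight outcomes, forms the two events, and checks that both occur when ω = tth.

```python
from itertools import product

# Sample space for three coin tosses: all strings of h/t of length 3.
S = {"".join(toss) for toss in product("ht", repeat=3)}
# S == {'hhh', 'hht', 'hth', 'htt', 'thh', 'tht', 'tth', 'ttt'}

# Events are subsets of S.
A = {w for w in S if "h" in w}            # at least one head
B = {w for w in S if w.startswith("tt")}  # first two tosses are tails

# An event occurs if the observed outcome belongs to it.
omega = "tth"
print(omega in A)  # True: A occurred
print(omega in B)  # True: B occurred
```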

2.1 The Axioms of Probability

See Section 2.4, pages 69-74. In probability we are interested in assigning numbers in [0, 1] to certain subsets of S (called events). The following axioms allow us to do this in a consistent way.

A1 If A ⊂ S is an event then P(A) ≥ 0.

A2 P(S) = 1.

A3 If A1, A2, . . . are mutually exclusive events then P(∪_{i=1}^{n} Ai) = Σ_{i=1}^{n} P(Ai) for all n = 1, 2, . . . , ∞.

Note: In order for a set of outcomes to be an event we need to be able to assign a probability to the set.

The set S is known as the sample space. It contains all the possible outcomes of an experiment. In general an outcome of an experiment is denoted by ω. An experiment can have a finite number of outcomes, a countable number of outcomes, e.g. S = {ω1, ω2, ω3, . . .}, or an uncountable number of outcomes, e.g. S = {ω : 0 ≤ ω ≤ 1}.

Before discussing some of the implications of these axioms, let us verify that they indeed hold true for a couple of simple examples:

1. Suppose S = {h, t}. This is the sample space corresponding to the experiment of tossing a coin, where h represents heads and t represents tails. The relevant subsets of S are ∅, {h}, {t}, {h, t}. If the coin is fair we would have P({h}) = P({t}) = 0.5, P(∅) = 0, P({h, t}) = 1, which agrees with A1, A2, A3 (see the sketch at the end of this subsection).

2. Suppose S = {(x, y) : 0 ≤ x ≤ 1, 0 ≤ y ≤ 1} and for each A = {(x, y) : x1 ≤ x ≤ x2, y1 ≤ y ≤ y2} ⊂ S let P(A) = (x2 − x1)(y2 − y1). Notice that P(S) = 1. For each z ∈ [0, 1] let Az = {(x, y) : x = z, 0 ≤ y ≤ 1} and notice that P(Az) = 0 for all z ∈ [0, 1]. Also, S = ∪_{z ∈ [0,1]} Az. Because Ax ∩ Az = ∅ for all x ≠ z, the sets Az, z ∈ [0, 1], are mutually exclusive. Consequently, P(∪_{z ∈ [0,1]} Az) = 1, while Σ_{z ∈ [0,1]} P(Az) = 0. This example seems to contradict A3, except for the fact that A3 is only valid for countable collections of mutually exclusive subsets.

You can think of A1, A2, A3 as some fundamental properties (or axioms) that are satisfied by all probability models. These axioms have been carefully chosen and they have many profound implications. The axioms do not tell us how to assign probabilities to events; in practice this is done by experience (experiments) or by assumptions. A few consequences of the axioms are collected in the next section.
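
Example 1 above can also be checked mechanically: with only two outcomes there are just four events, so the axioms can be verified by brute force. A minimal sketch, assuming a fair coin and using the illustrative names point_mass and P (not from the notes):

```python
from itertools import chain, combinations

# Fair-coin model from example 1: S = {h, t}, each outcome has probability 0.5.
S = {"h", "t"}
point_mass = {"h": 0.5, "t": 0.5}

def P(event):
    """Probability of an event (a subset of S) under this model."""
    return sum(point_mass[w] for w in event)

# All events (subsets of S): {}, {h}, {t}, {h, t}.
events = [set(c) for c in chain.from_iterable(
    combinations(S, r) for r in range(len(S) + 1))]

# A1: every event has non-negative probability.
assert all(P(A) >= 0 for A in events)
# A2: the whole sample space has probability 1.
assert P(S) == 1
# A3, for the two mutually exclusive events {h} and {t}: additivity.
assert P({"h"} | {"t"}) == P({"h"}) + P({"t"})
print("A1, A2, A3 hold for the fair-coin model")
```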

3 Axiom Consequences

P(A′) = 1 − P(A), so P(∅) = 0 and P(A) ≤ 1.

P(A ∪ B) = P(A) + P(B) − P(A ∩ B), so P(A ∪ B) ≤ P(A) + P(B).

Example: Toss two fair coins. Let A be the event that the first toss is a head and B the event that the second toss is a head. Find A ∩ B, and P(A), P(B), P(A ∩ B), P(A ∪ B).
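
The exercise can be answered by listing the four equally likely outcomes of the two tosses. A brute-force sketch (Python; the helper P and the sets A, B mirror the notation above but are only an illustration):

```python
from itertools import product
from fractions import Fraction

# Two fair coin tosses: four equally likely outcomes.
S = list(product("ht", repeat=2))

def P(event):
    """Probability of an event under equally likely outcomes: |event| / |S|."""
    return Fraction(len(event), len(S))

A = {w for w in S if w[0] == "h"}   # first toss is a head
B = {w for w in S if w[1] == "h"}   # second toss is a head

print(A & B)                               # {('h', 'h')}
print(P(A), P(B), P(A & B), P(A | B))      # 1/2 1/2 1/4 3/4
print(P(A | B) == P(A) + P(B) - P(A & B))  # True: inclusion-exclusion
```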

4 Finite Sample Space

Let S = {w1, w2, . . . , wN}, let Ai = {wi}, i = 1, . . . , N, and let pi = P(Ai) = P({wi}), i = 1, . . . , N. By axiom A1 we have pi ≥ 0, and by axioms A2 and A3 applied to A1, . . . , AN we have Σ_{i=1}^{N} pi = 1. From A3, for any subset A of S we have P(A) = Σ_{i: wi ∈ A} pi.

Example: Toss a coin two times and record the number of heads. S = {0, 1, 2}. Suppose the probabilities of these outcomes are p0 = (1 − p)², p1 = 2p(1 − p), and p2 = p², where p is a number between zero and one. If A = {1, 2} then P(A) = p1 + p2 = 2p(1 − p) + p².

What if all the outcomes are equally likely? Then pi = p for every i = 1, . . . , N and 1 = Σ_{i=1}^{N} pi = Np, so p = 1/N, and P(A) is equal to the number of elements of A divided by N.

Example: A black urn has five red and six green balls, and a white urn has three red and four green balls. Select an urn and then a ball from the selected urn. What strategy maximizes the probability of getting a red ball? Answer: 5/11 > 3/7, so pick the black urn.
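
The coin example above can be reproduced numerically. The sketch below is only an illustration: the helper name heads_count_probs and the value p = 0.3 are made up, and it assumes the two tosses are independent, which is what the formulas p0 = (1 − p)², p1 = 2p(1 − p), p2 = p² encode.

```python
from itertools import product

def heads_count_probs(p):
    """P(0 heads), P(1 head), P(2 heads) for two independent tosses of a
    coin that lands heads with probability p."""
    probs = {0: 0.0, 1: 0.0, 2: 0.0}
    for toss in product("ht", repeat=2):
        prob = 1.0
        for t in toss:
            prob *= p if t == "h" else (1 - p)
        probs[toss.count("h")] += prob
    return probs

p = 0.3                      # illustrative value, not from the notes
probs = heads_count_probs(p)
print(probs)                 # p0 = (1 - p)^2, p1 = 2p(1 - p), p2 = p^2
# P(A) for A = {1, 2}, i.e. at least one head, matches 2p(1 - p) + p^2.
print(probs[1] + probs[2], 2 * p * (1 - p) + p**2)
```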


Example: Now consider another game in which a second black urn contains six red and three green balls, and a second white urn contains nine red and five green balls. What strategy maximizes the probability of getting a red ball? Answer: 6/9 > 9/14, so select the black urn again.

Example: Finally, mix the contents of the black urns and the contents of the white urns. Which urn should you choose? Answer: 11/20 < 12/21, so now select the white urn.

Example: Suppose that a course has five sections. If three students are equally likely to select one section from a group of five sections, what is the probability that they all select different sections? To answer this question it is helpful to know about counting techniques. Section 2.5 of the textbook discusses counting techniques. I will not cover these in class, but students should go over them with care; I will merely write down the definitions and equations. The first homework includes a couple of problems involving counting techniques.
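
The urn answers, and the section-selection question, can be checked by brute force. In the sketch below (Python) the helper p_red is an illustrative name, and the 12/25 answer to the section problem is obtained by enumerating all 125 possible choices rather than by the counting formulas of Section 2.5.

```python
from fractions import Fraction
from itertools import product

def p_red(red, green):
    """Probability of drawing a red ball from an urn with the given counts."""
    return Fraction(red, red + green)

# Game 1: black 5 red / 6 green vs white 3 red / 4 green -> pick black.
print(p_red(5, 6), p_red(3, 4), p_red(5, 6) > p_red(3, 4))
# Game 2: black 6 red / 3 green vs white 9 red / 5 green -> pick black.
print(p_red(6, 3), p_red(9, 5), p_red(6, 3) > p_red(9, 5))
# Mixed urns: black 11 red / 9 green vs white 12 red / 9 green -> pick white.
print(p_red(11, 9), p_red(12, 9), p_red(11, 9) < p_red(12, 9))

# Section example: three students each pick one of five sections uniformly at
# random; probability that all three choices are distinct, by enumeration.
choices = list(product(range(5), repeat=3))
distinct = sum(1 for c in choices if len(set(c)) == 3)
print(Fraction(distinct, len(choices)))  # 12/25
```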
