Probability

Dr Kishor Bhanushali, Faculty Member, IBS Ahmedabad, [email protected]

Probability means Chance, Possibility, Probably, Likely The event is not certain to take place Uncertainty about the happening of an event in

question Layman - Uncertainty about the happening of an event - Belief - Wishful thinking

Statistics/Mathematics - Condition under which sensible statements can be made - Application of numerical methods to compute numerical values of probability and expectations

Origin of Probability
Probability originated in games of chance: gambling, tossing coins and dice, and drawing cards.

Probability ------------ Statistics Sample Interpretation of statistical results Random variations Conclusions Foundation of statistical inferences

Probability ------- Defined Likelihood or chance of occurrence of an events Value lies between zero to one (0 – 1) Classical probability Relative frequency theory of probability Subjective approach to probability Axiomatic approach to probability

Classical Probability Oldest and Simple Eighteenth century Origin in games of chance Assumption of equally likely events Probability is the ratio of the number of favorable cases

to the total number of equally likely cases Pr (A) =a/n Pr (not A) = b/n a/n + b/n = 1 a + b = n Real life situation is different- cases are not equally likely A priori concept
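A minimal sketch of the classical (a priori) calculation, using a fair six-sided die as an assumed illustration:

# Classical probability: favourable cases / total equally likely cases
# Assumed example: probability of rolling an even number on a fair die
outcomes = [1, 2, 3, 4, 5, 6]                        # total equally likely cases, n = 6
favourable = [x for x in outcomes if x % 2 == 0]     # favourable cases, a = 3
p_even = len(favourable) / len(outcomes)
p_not_even = 1 - p_even
print(p_even, p_not_even)                            # 0.5 0.5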

Relative Frequency Theory of Probability
The probability of an event is defined as the relative frequency with which the event occurs in an indefinitely large number of experiments or trials:
P(A) = lim (n → ∞) a/n
where a is the number of times A occurs in n trials. Theoretically we can never obtain the probability of an event as given by this limit; in practice we can only try to obtain a close estimate of P(A) based on a large number of observations. This is an a posteriori concept.
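A minimal simulation sketch of the relative-frequency idea, assuming a fair coin as the repeated experiment:

import random

# Relative frequency estimate of P(heads) for an assumed fair coin
n = 100_000                                          # number of trials
a = sum(random.random() < 0.5 for _ in range(n))     # number of times the event occurs
print(a / n)                                         # approaches 0.5 as n grows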

Subjective Approach to Probability
Introduced by Frank Ramsey in 1926. Subjective probability is defined as the probability assigned to an event by an individual based on whatever evidence is available; it reflects the belief of the individual making the judgement. The approach is very broad and highly flexible, and is useful in business decisions.

Axiomatic Approach to Probability
Put forward by A. N. Kolmogorov in 1933 in "Foundations of the Theory of Probability". It gives no precise definition of probability; probability calculations are based on a set of postulates for a finite sample space S:
0 ≤ P(A) ≤ 1 for every event A
P(S) = 1, the probability of the entire sample space
P(A ∪ B) = P(A) + P(B) for mutually exclusive events A and B

Probability ---- Some terms Two events are said to be mutually exclusive or in

compatible when both cannot happen simultaneously in a single trial or in other words the occurrence of any one of the event precludes the occurrence of the other event. If A and B are mutually exclusive, that P(AB) = 0 Two events are said to be independent when the outcome of one event does not affect or is affected by the outcome of another event Events are said to be equally likely when one events does not occur more often than the other.

Probability ---- Some terms In the case of simple events we consider the

probability of the happening or not happening of single events while in the case of compound events we consider the joint occurrence of two or more events Events are said to be exhaustive when their totality include all the possible outcome of a random experiments (sample space) Events A is called complementary event of B is A and B are mutually exclusive and exhaustive. It is written as AB

Additional Theorem P (A or B) = P (A) + P (B) for mutually exclusive

events (Two events are mutually exclusive if they cannot occur at the same time) P(A and B) = 0 P (A or B) = P (A) + P (B) – P (A and B) for mutually

non exclusive events (n events which aren't mutually exclusive, there is some overlap) When P(A) and P(B) are added, the probability of the intersection (and) is added twice. To compensate for that double addition, the intersection needs to be subtracted.

Addition Theorem: Example

Planned to Purchase | Actually Purchased: Yes | Actually Purchased: No | Total
Yes                 | 200                     | 50                     | 250
No                  | 100                     | 650                    | 750
Total               | 300                     | 700                    | 1,000
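A short sketch applying the addition rule to this table (the variable names below are just labels for the table's rows and columns):

# Counts from the table: rows = planned to purchase, columns = actually purchased
total = 1000
p_planned = 250 / total            # P(planned) = 0.25
p_purchased = 300 / total          # P(purchased) = 0.30
p_both = 200 / total               # P(planned and purchased) = 0.20

# Addition rule for events that are not mutually exclusive
p_planned_or_purchased = p_planned + p_purchased - p_both
print(p_planned_or_purchased)      # 0.35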

Multiplication Theorem If two events A and B are independent, the probability

that both will occur is equal to the product of their individual probabilities Prob A and B = Prob A * Prob B (independent events)
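As an illustration (assuming two independent tosses of a fair coin), P(head on the first toss and head on the second) = 0.5 × 0.5 = 0.25.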

Conditional Probability When computing the probability of a particular

event A, given information about the occurrence of event B, this probability is called conditional probability P(A|B) The probability of A given B is probability of A and B divided by the probability of B P(A|B) = P (A and B) P(B)
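Continuing the purchase table above as a sketch, the conditional probability of actually purchasing given that a purchase was planned:

# Conditional probability from the contingency table above
p_planned = 250 / 1000
p_planned_and_purchased = 200 / 1000
p_purchased_given_planned = p_planned_and_purchased / p_planned
print(p_purchased_given_planned)   # 0.8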

Bayes' Theorem
Bayes' theorem is a theorem of probability theory originally stated by the Reverend Thomas Bayes. It can be seen as a way of understanding how the probability that a theory is true is affected by a new piece of evidence.

Example
Marie is getting married tomorrow at an outdoor ceremony in the desert. In recent years, it has rained only 5 days each year. Unfortunately, the weatherman has predicted rain for tomorrow. When it actually rains, the weatherman correctly forecasts rain 90% of the time. When it doesn't rain, he incorrectly forecasts rain 10% of the time. What is the probability that it will rain on the day of Marie's wedding?

Solution
The sample space is defined by two mutually exclusive events: it rains or it does not rain. Additionally, a third event occurs when the weatherman predicts rain. Notation for these events appears below.
Event A1: it rains on Marie's wedding.
Event A2: it does not rain on Marie's wedding.
Event B: the weatherman predicts rain.
In terms of probabilities, we know the following:
P(A1) = 5/365 = 0.0136985 (it rains 5 days out of the year)
P(A2) = 360/365 = 0.9863014 (it does not rain 360 days out of the year)
P(B | A1) = 0.9 (when it rains, the weatherman predicts rain 90% of the time)
P(B | A2) = 0.1 (when it does not rain, the weatherman predicts rain 10% of the time)
We want to know P(A1 | B), the probability that it will rain on the day of Marie's wedding given a forecast of rain by the weatherman. The answer follows from Bayes' theorem:
P(A1 | B) = P(A1) P(B | A1) / [ P(A1) P(B | A1) + P(A2) P(B | A2) ]
          = (0.014)(0.9) / [ (0.014)(0.9) + (0.986)(0.1) ]
          = 0.111
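A minimal sketch of the same calculation in Python, using the probabilities stated above:

# Bayes' theorem for the wedding example
p_rain = 5 / 365                   # P(A1)
p_no_rain = 360 / 365              # P(A2)
p_forecast_given_rain = 0.9        # P(B | A1)
p_forecast_given_no_rain = 0.1     # P(B | A2)

p_rain_given_forecast = (p_rain * p_forecast_given_rain) / (
    p_rain * p_forecast_given_rain + p_no_rain * p_forecast_given_no_rain
)
print(round(p_rain_given_forecast, 3))   # about 0.111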

Probability Distribution Frequency distribution is a listing of all the outcomes

of an experiments that actually occurred when the experiment was done The probability distribution is a listing of all the probabilities of all the possible outcomes that could result if the experiment were done Discrete probability distribution can take only limited number of values which can be listed Under continuous probability distribution variable can take any value within a given range so that we cannot list all possible outcome

Probability Distribution of a Discrete Random Variable
A probability distribution for a discrete random variable is a mutually exclusive listing of all possible numerical outcomes for that variable, such that a particular probability of occurrence is associated with each outcome. Related quantities include:
Expected value of a random variable
Variance of a random variable
Standard deviation of a random variable
Covariance and its application in finance
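A minimal sketch of the expected value, variance, and standard deviation of a discrete random variable; the values and probabilities below are assumed purely for illustration:

import math

# Assumed discrete distribution: values of X and their probabilities (must sum to 1)
values = [0, 1, 2, 3]
probs  = [0.1, 0.3, 0.4, 0.2]

expected = sum(x * p for x, p in zip(values, probs))                    # E[X]
variance = sum((x - expected) ** 2 * p for x, p in zip(values, probs))  # Var(X)
std_dev = math.sqrt(variance)                                           # standard deviation
print(expected, variance, std_dev)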

Binomial Distribution Frequency distribution where only two (mutually

exclusive) outcomes are possible, such as better or worse, gain or loss, head or tail, rise or fall, success or failure, yes or no. Therefore, if the probability of success in any given trial is known, binomial distributions can be employed to compute a given number of successes in a given number of trials.

Binomial Distribution
P(X) = [ n! / ( X! (n − X)! ) ] p^X (1 − p)^(n − X)
P(X) = probability of X successes, given n and p
n = number of observations (trials)
p = probability of success
1 − p = probability of failure
X = number of successes in the sample
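A minimal sketch of this formula in Python; the values n = 10 and p = 0.5 are assumed purely for illustration:

import math

def binomial_pmf(x, n, p):
    # P(X) = n! / (X!(n - X)!) * p^X * (1 - p)^(n - X)
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

print(binomial_pmf(3, 10, 0.5))   # probability of exactly 3 successes in 10 trials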

Poisson Distribution The Poisson distribution is used to describe a

number of processes, including distribution of telephone calls going through the switchboard system, the demand of patient for services at a health institution, the arrivals of trucks and cars at the tollbooth, and the number of accidents at an intersection

Poisson Distribution
P(X) = ( e^(−λ) λ^X ) / X!
P(X) = probability of X events in an area of opportunity
λ = expected number of events
e = mathematical constant, approximately 2.71828
X = number of events
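A minimal sketch of the Poisson formula, assuming λ = 3 expected events as an illustrative value:

import math

def poisson_pmf(x, lam):
    # P(X) = e^(-lambda) * lambda^X / X!
    return math.exp(-lam) * lam**x / math.factorial(x)

print(poisson_pmf(2, 3.0))   # probability of exactly 2 events when 3 are expected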

Continuous Probability Distributions
Continuous probability density function
Normal distribution
Uniform (rectangular) distribution
Exponential distribution

Normal Distribution Normal distribution is represented by the classical

bell shape In the normal distribution you can calculate the probability that various values occur within certain range or intervals Exact probability of a particular value from a continuous distribution such as normal distribution is zero

Normal Probability Density Function
f(X) = ( 1 / ( σ √(2π) ) ) e^( −(1/2) ((X − µ)/σ)² )
e = the mathematical constant, approximately 2.71828
π = the mathematical constant, approximately 3.14159
µ = the mean
σ = the standard deviation
X = any value of the continuous variable, where −∞ < X < ∞

Normal Distribution Continuous random variable There is no single normal curve, but the family of

normal curves. To define a particular normal probability distribution, we need only two parameters: the mean and standard deviation Transformation formula X −µ Z= σ
