Markov Chain Analysis


Presented by: Aditi Misra

Markov Chains

A Markov chain is a stochastic process with the following properties: 1. discrete state space, 2. Markovian property, 3. one-step stationary transition probabilities.

Stochastic Process A stochastic process is defined by a family or set of random variables {Xt}, where t is a parameter (index) from a given set T.

Discrete State Space The state space (S) is the sample space of all possible values of Xt. When the state space contains discrete values, it is called a discrete state space. If the discrete state space has a finite number of states, the chain is a finite-state Markov chain.

Markovian Property The probability of a state on the next trial depends only on the state in the current trial and not on the states prior to the present.

Transition Probability A transition probability is the conditional probability that the process will be in a specific future state given its most recent state. These probabilities are also called one-step transition probabilities, since they describe the system between t and t+1. They may be presented in tabular form as a transition probability matrix.
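As a rough sketch (not from the slides; the three states and all the numbers below are invented purely for illustration), the Markovian property and the one-step transition probabilities can be seen in a tiny simulation where the next state is drawn using only the current state:

```python
import random

# Hypothetical 3-state chain; states and probabilities are made up for illustration.
states = ["A", "B", "C"]
P = {
    "A": {"A": 0.7, "B": 0.2, "C": 0.1},
    "B": {"A": 0.3, "B": 0.4, "C": 0.3},
    "C": {"A": 0.0, "B": 0.5, "C": 0.5},
}

def step(current):
    # Markovian property: the next state depends only on the current state,
    # through the one-step transition probabilities in P[current].
    return random.choices(states, weights=[P[current][s] for s in states])[0]

state = "A"
for t in range(5):
    state = step(state)
    print(t + 1, state)
```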

Let's consider an example:

Brand Switching Example: Suppose that there are two brands of detergent, D1 and D2, selling in the market at the beginning of a year, when a third one, D3, is introduced into the market. The market is then observed continuously, month after month, for changes in brand loyalty. Let us say that the rates of brand switching have settled over time as follows:

Example continues: [Diagram of monthly brand-switching rates among D1, D2 and D3 — from D1: 60% stay with D1, 30% switch to D2, 10% switch to D3; from D2: 20% to D1, 50% stay with D2, 30% to D3; from D3: 80% stay with D3, with the remaining 15% and 5% switching to D1 and D2.]

Now, given these conditions about brand switching, assuming no further entry or exit, and given further that the market shares for the brands on a certain date, say March 1, are 30%, 45% and 25% for brands D1, D2 and D3 respectively, we can predict how the market shares will evolve.

Markov Analysis: Input & Output

Markov analysis provides the following predictions: the probability of the system being in a given state at a given future time, and the steady-state probabilities.

Input
Transition probabilities: the transition probabilities for this problem are the brand-switching rates shown above, arranged as a matrix P whose entry pij gives the probability of moving from brand i this month to brand j next month. The entries must satisfy two properties: 0 ≤ pij ≤ 1 for all i, j, and each row must sum to 1 (Σj pij = 1).
The initial conditions: the initial conditions describe the situation the system is presently in. Here, the initial condition is that the market share for the brands is 30%, 45% and 25% for brands D1, D2 and D3 respectively, written as the row matrix [0.30 0.45 0.25].
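As a minimal sketch (not from the slides), the two inputs can be written down in Python. The D1 and D2 rows follow the switching rates above; splitting D3's remaining 20% as 15% to D1 and 5% to D2 is an assumption, since the diagram only gives the two percentages.

```python
brands = ["D1", "D2", "D3"]

# One-step transition matrix P: row = brand this month, column = brand next month.
P = [
    [0.60, 0.30, 0.10],  # from D1
    [0.20, 0.50, 0.30],  # from D2
    [0.15, 0.05, 0.80],  # from D3 (15%/5% split to D1/D2 is an assumption)
]

# Initial condition Q(0): market shares of D1, D2, D3 on March 1.
Q0 = [0.30, 0.45, 0.25]

# Both inputs must be valid probability distributions:
# every entry lies in [0, 1] and every row (and Q0) sums to 1.
for row in P + [Q0]:
    assert all(0.0 <= p <= 1.0 for p in row)
    assert abs(sum(row) - 1.0) < 1e-9
```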

Output Specific-state

Probabilities:It is for calculating the probabilities for the system in specific states. The probability distribution of the system being in a certain state(i) in a certain period(k) may be expressed as a row matrix: Q(k) = [q1(k) q2(k) q3(k)……….. qn(k)]

For this example, Q(0) = [qD1(0) qD2(0) qD3(0)] = [0.30 0.45 0.25]. For calculating the market share for the next month: Q(next) = Q(current) × P
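A sketch of that update in plain Python, using the same assumed P as above: multiplying the row matrix Q(0) by P gives the market shares one month later.

```python
P = [
    [0.60, 0.30, 0.10],  # from D1
    [0.20, 0.50, 0.30],  # from D2
    [0.15, 0.05, 0.80],  # from D3 (15%/5% split assumed)
]
Q0 = [0.30, 0.45, 0.25]  # Q(0): shares of D1, D2, D3 on March 1

# Q(next) = Q(current) x P: a row vector times the transition matrix.
Q1 = [sum(Q0[i] * P[i][j] for i in range(3)) for j in range(3)]
print(Q1)  # roughly [0.3075, 0.3275, 0.365] with the assumed P
```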

To calculate the probability that a customer buys D3 two months hence, given that his latest purchase has been D2:

Path (t = 1, then t = 2)        Probability
D2 to D1, then D1 to D3         0.20 × 0.10 = 0.02
D2 to D2, then D2 to D3         0.50 × 0.30 = 0.15
D2 to D3, then D3 to D3         0.30 × 0.80 = 0.24
Total                           0.41
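The same two-step calculation can be checked in code by summing over the intermediate brand at t = 1; only the transition values quoted on this slide are needed.

```python
# P(buys D3 two months hence | latest purchase was D2)
from_D2 = {"D1": 0.20, "D2": 0.50, "D3": 0.30}  # one-step probabilities out of D2
into_D3 = {"D1": 0.10, "D2": 0.30, "D3": 0.80}  # one-step probabilities into D3

# Sum over the three possible brands bought at t = 1.
prob = sum(from_D2[b] * into_D3[b] for b in ("D1", "D2", "D3"))
print(round(prob, 2))  # 0.41
```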

Steady-state probability: a stabilised system is said to be in a steady state or in equilibrium, i.e. Q(k) = Q(k-1). Combined with Q(k) = Q(k-1) × P, this means the steady-state row matrix Q satisfies Q = Q × P.
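A sketch of finding the steady state numerically, by applying Q(k) = Q(k-1) × P repeatedly until the shares stop changing (the D3 row split in P is again an assumption):

```python
P = [
    [0.60, 0.30, 0.10],
    [0.20, 0.50, 0.30],
    [0.15, 0.05, 0.80],  # D3 row: 15%/5% split to D1/D2 is an assumption
]
Q = [0.30, 0.45, 0.25]  # starting shares; the steady state does not depend on these

# Iterate Q <- Q x P until Q(k) = Q(k-1) to within a small tolerance.
while True:
    Q_next = [sum(Q[i] * P[i][j] for i in range(3)) for j in range(3)]
    if max(abs(a - b) for a, b in zip(Q, Q_next)) < 1e-12:
        break
    Q = Q_next

print([round(q, 4) for q in Q])  # long-run market shares of D1, D2, D3
```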
