A terminating Markov chain is a Markov chain in which all states are transient except one, which is absorbing. Reordering the states, the transition probability matrix of a terminating Markov chain with $m$ transient states is

$$P = \begin{bmatrix} T & \mathbf{T}^0 \\ \mathbf{0}^\top & 1 \end{bmatrix},$$

where $T$ is an $m \times m$ matrix, $\mathbf{T}^0$ and $\mathbf{0}$ are column vectors with $m$ entries, and $\mathbf{T}^0 + T\mathbf{1} = \mathbf{1}$. The transition matrix is characterized entirely by its upper-left block $T$.
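This block structure can be sketched numerically. In the following, the block $T$ and its values are illustrative assumptions, not taken from the text; the exit vector $\mathbf{T}^0$ is recovered from the constraint $\mathbf{T}^0 + T\mathbf{1} = \mathbf{1}$:

```python
# Sketch: building the transition matrix of a terminating Markov chain
# with m = 2 transient states. T is illustrative, not from the text.
import numpy as np

T = np.array([[0.4, 0.3],
              [0.2, 0.5]])          # upper-left block: transient -> transient
T0 = 1.0 - T.sum(axis=1)           # exit vector, since T0 + T·1 = 1

# assemble the full (m+1) x (m+1) transition matrix
P = np.block([[T, T0[:, None]],
              [np.zeros((1, 2)), np.ones((1, 1))]])

# every row of a transition matrix sums to 1
assert np.allclose(P.sum(axis=1), 1.0)
```

The last row encodes the absorbing state: once entered, the chain stays there with probability 1.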
Definition. A distribution on $\{0, 1, 2, \dots\}$ is a discrete phase-type distribution if it is the distribution of the first passage time to the absorbing state of a terminating Markov chain with finitely many states.
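The definition lends itself to simulation: run the chain until absorption and record the number of steps. A minimal sketch, with an assumed 2-transient-state chain (all numeric values are illustrative):

```python
# Sketch: sampling first passage times to the absorbing state of a
# terminating Markov chain. The matrix P is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)

# states 0 and 1 are transient, state 2 is absorbing
P = np.array([[0.4, 0.3, 0.3],
              [0.2, 0.5, 0.3],
              [0.0, 0.0, 1.0]])

def first_passage_time(start=0):
    """Number of steps until the chain first hits the absorbing state."""
    state, steps = start, 0
    while state != 2:
        state = rng.choice(3, p=P[state])
        steps += 1
    return steps

samples = [first_passage_time() for _ in range(10_000)]
```

By the definition above, the empirical distribution of `samples` approximates a discrete phase-type distribution.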
Fix a terminating Markov chain, denote by $T$ the upper-left block of its transition matrix, and let $\boldsymbol{\tau}$ be the initial distribution over the transient states. The distribution of the first passage time to the absorbing state is then denoted $\mathrm{PH}_d(\boldsymbol{\tau}, T)$ or $\mathrm{DPH}(\boldsymbol{\tau}, T)$.
Its cumulative distribution function is

$$F(k) = 1 - \boldsymbol{\tau} T^{k} \mathbf{1}$$

for $k = 1, 2, \dots$, and its density function is

$$f(k) = \boldsymbol{\tau} T^{k-1} \mathbf{T}^0$$

for $k = 1, 2, \dots$. It is assumed that the probability of the process starting in the absorbing state is zero. The factorial moments of the distribution are given by
$$E[K(K-1)\cdots(K-n+1)] = n! \, \boldsymbol{\tau} \, (I - T)^{-n} \, T^{n-1} \mathbf{1},$$

where $I$ is the identity matrix of the appropriate dimension.
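These formulas are straightforward to evaluate with matrix powers. A minimal sketch, with illustrative values of $\boldsymbol{\tau}$ and $T$ (assumptions, not from the text), checking that the density is the increment of the cumulative distribution function and that the first factorial moment matches a direct summation of $k\,f(k)$:

```python
# Sketch: evaluating the cdf, density, and factorial moments of DPH(tau, T).
# tau and T are illustrative assumptions.
import numpy as np
from numpy.linalg import matrix_power, inv
from math import factorial

tau = np.array([0.6, 0.4])          # initial distribution over transient states
T = np.array([[0.4, 0.3],
              [0.2, 0.5]])
T0 = 1.0 - T.sum(axis=1)            # exit vector, T0 + T·1 = 1
one = np.ones(2)
I = np.eye(2)

def cdf(k):
    # F(k) = 1 - tau T^k 1
    return 1.0 - tau @ matrix_power(T, k) @ one

def pmf(k):
    # f(k) = tau T^{k-1} T^0
    return tau @ matrix_power(T, k - 1) @ T0

def factorial_moment(n):
    # E[K(K-1)...(K-n+1)] = n! tau (I - T)^{-n} T^{n-1} 1
    return factorial(n) * tau @ matrix_power(inv(I - T), n) \
                        @ matrix_power(T, n - 1) @ one

# f(k) = F(k) - F(k-1)
assert np.isclose(pmf(3), cdf(3) - cdf(2))

# the mean (n = 1) agrees with a truncated direct sum of k * f(k)
mean_direct = sum(k * pmf(k) for k in range(1, 500))
assert np.isclose(factorial_moment(1), mean_direct)
```

Because all eigenvalues of $T$ lie strictly inside the unit circle for a terminating chain, $T^k \to 0$ and the truncated sum converges quickly.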
Just as the continuous-time phase-type distribution generalises the exponential distribution, the discrete-time distribution generalises the geometric distribution, for example:
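The geometric special case can be made concrete: a chain with a single transient state that survives each step with probability $1 - p$ gives $\boldsymbol{\tau} = (1)$, $T = (1-p)$, and the density reduces to the geometric pmf. A minimal check, with an illustrative value of $p$:

```python
# Sketch: the geometric distribution on {1, 2, ...} as a one-state DPH.
# tau = (1), T = (1 - p): the chain stays transient with probability 1 - p.
import numpy as np
from numpy.linalg import matrix_power

p = 0.3                        # illustrative success probability
tau = np.array([1.0])
T = np.array([[1.0 - p]])
T0 = np.array([p])

def pmf(k):
    # f(k) = tau T^{k-1} T^0 = (1 - p)^{k-1} p
    return float(tau @ matrix_power(T, k - 1) @ T0)

assert all(np.isclose(pmf(k), (1 - p) ** (k - 1) * p) for k in range(1, 10))
```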