Markov-chain based reliability analysis for distributed systems
Authors: Jin-Long Wang
Affiliation: Information and Telecommunications Engineering Department, Ming Chuan University, Taipei 11120, Taiwan, ROC
Abstract: In a typical distributed computing system (DCS), nodes consist of processing elements, memory units, shared resources, data files, and programs. For a distributed application, programs and data files are distributed among many processing elements that may exchange data and control information via communication links. The reliability of a DCS can be expressed through the analysis of distributed program reliability (DPR) and distributed system reliability (DSR). In this paper, two reliability measures, Markov-chain distributed program reliability (MDPR) and Markov-chain distributed system reliability (MDSR), are introduced to model the reliability of a DCS accurately. A discrete-time Markov chain with one absorbing state is constructed for this problem. The transition probability matrix represents the probability of moving from one state to another in a unit of time. In addition to the mathematical evaluation of MDPR and MDSR, simulation results are presented to verify their correctness.
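The absorbing-chain formulation described in the abstract can be sketched briefly: starting from an initial state distribution, repeated multiplication by the transition probability matrix gives the probability of having reached the absorbing (failure) state after a given number of time units, and one minus that probability is the reliability. The three-state chain and its transition probabilities below are illustrative assumptions for the sketch, not values from the paper.

```python
def mat_vec(P, v):
    """Multiply row vector v by transition matrix P (one time step: v' = v P)."""
    n = len(P)
    return [sum(v[i] * P[i][j] for i in range(n)) for j in range(n)]

# Hypothetical 3-state chain: state 0 = fully operational,
# state 1 = degraded, state 2 = failed (the single absorbing state).
# Each row sums to 1; the numbers are made up for illustration.
P = [
    [0.90, 0.08, 0.02],  # transitions out of the operational state
    [0.00, 0.85, 0.15],  # degraded state cannot return to operational here
    [0.00, 0.00, 1.00],  # absorbing failure state
]

def reliability_at(P, steps, start=0):
    """Probability the chain has NOT been absorbed after `steps` time units."""
    v = [0.0] * len(P)
    v[start] = 1.0          # start with all probability mass in `start`
    for _ in range(steps):
        v = mat_vec(P, v)   # advance the distribution one unit of time
    return 1.0 - v[-1]      # 1 - P(in the failed state)

print(round(reliability_at(P, 10), 4))
```

Because the failure state is absorbing, its probability mass can only grow, so the reliability computed this way is non-increasing in the number of steps.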
Keywords: Distributed computing system; Distributed systems; Reliability
|