[English MeSH Dictionary] Chain, Markov
[Entry Term] Chain, Markov
[MeSH Heading] Markov Chains
[English Definition] A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
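In standard notation (an illustrative addition, not part of the original entry), this Markov property for a discrete-time chain X_0, X_1, X_2, ... can be written, for every state x and every time step n, as:

P(X_{n+1} = x | X_n = x_n, X_{n-1} = x_{n-1}, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n)

That is, conditioning on the entire past history yields the same distribution for the next state as conditioning on the present state alone.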