Word | Process, Markov |
Definition | Process, Markov (English MeSH dictionary) [Entry term] Process, Markov [Subject heading] Markov Chains [English definition] A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system. |
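The defining property above — the next state depends only on the present state, never on the earlier history — can be sketched as a small simulation. The two weather states and their transition probabilities below are illustrative assumptions, not part of the dictionary entry:

```python
import random

# Illustrative transition table: from each state, the probabilities of
# each possible next state. The states and numbers are assumed for this
# sketch; any row summing to 1.0 would do.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def next_state(current, rng=random):
    """Sample the next state using ONLY the current state (no history)."""
    r = rng.random()
    cumulative = 0.0
    for state, p in TRANSITIONS[current]:
        cumulative += p
        if r < cumulative:
            return state
    return state  # guard against floating-point rounding at the boundary

def simulate(start, steps, seed=0):
    """Generate a state path of length steps + 1 from a seeded RNG."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Note that `next_state` receives only `path[-1]`: however the chain arrived at the current state, the distribution over the next state is the same, which is exactly the memorylessness the definition describes.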