Bayesian network
A Bayesian network is a form of probabilistic graphical model, also known as Bayesian belief network or just belief network.
A Bayesian network can be represented by a graph (as in graph theory) with probabilities attached. Thus, a Bayesian network represents a set of variables together with a joint probability distribution and explicit independence assumptions.
- 1 Definition
A Bayesian network is a directed acyclic graph whose
- nodes represent variables, and
- arcs represent statistical dependence relations among the variables, together with local probability distributions for each variable given values of its parents.
Nodes can represent any kind of variable, be it a measured parameter, a latent variable or a hypothesis. They are not restricted to representing random variables; this is what is "Bayesian" about a Bayesian network.
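The structure described above can be sketched in code. The following is a minimal illustration, not a full library: a hypothetical rain/sprinkler/wet-grass network represented as a map from each node to its parents, with a depth-first check that the graph is in fact acyclic. All node names are invented for the example.

```python
# Hypothetical Bayesian network structure: each node maps to its parents.
parents = {
    "Rain": [],
    "Sprinkler": ["Rain"],
    "WetGrass": ["Rain", "Sprinkler"],
}

def is_acyclic(parents):
    """Check that the parent map describes a directed acyclic graph."""
    state = {}  # node -> "visiting" (on current path) or "done"
    def visit(node):
        if state.get(node) == "done":
            return True
        if state.get(node) == "visiting":
            return False  # back edge along the current path: a cycle
        state[node] = "visiting"
        if not all(visit(p) for p in parents[node]):
            return False
        state[node] = "done"
        return True
    return all(visit(n) for n in parents)

print(is_acyclic(parents))  # → True
```

A cyclic parent map such as `{"A": ["B"], "B": ["A"]}` would fail the check, which is why "acyclic" is part of the definition.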
If there is an arc from node A to another node B, then variable B depends directly on variable A, and A is called a parent of B. If for each variable Xi, i = 1, ..., n, the set of parent variables is denoted by parents(Xi), then the joint distribution of the variables is the product of the local distributions:

P(X1, ..., Xn) = P(X1 | parents(X1)) × ... × P(Xn | parents(Xn))
If Xi has no parents, its local probability distribution is said to be unconditional, otherwise it is conditional. If the variable represented by a node is observed, then the node is said to be an evidence node.
Questions about dependence and independence among variables can be answered by studying the graph alone. It can be shown that conditional independence is represented in the graph by the graphical property of d-separation: nodes X and Y are d-separated in the graph, given specified evidence nodes, if and only if variables X and Y are independent given the corresponding evidence variables. The minimal set of other nodes that shields node X from the rest of the network, namely its parents, its children, and its children's other parents, is X's Markov blanket.
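A node's Markov blanket (its parents, its children, and its children's other parents) can be read directly off the parent map. The sketch below uses a hypothetical rain/sprinkler/wet-grass network for illustration.

```python
# Hypothetical network: each node maps to a tuple of its parents.
parents = {"Rain": (), "Sprinkler": ("Rain",), "WetGrass": ("Rain", "Sprinkler")}

def markov_blanket(node):
    """Parents, children, and children's other parents of `node`."""
    children = [n for n, pa in parents.items() if node in pa]
    blanket = set(parents[node]) | set(children)
    for c in children:
        blanket |= set(parents[c])  # co-parents of each child
    blanket.discard(node)
    return blanket

print(sorted(markov_blanket("Sprinkler")))  # → ['Rain', 'WetGrass']
```

Given its Markov blanket, a node is conditionally independent of every other variable in the network, which is what makes the blanket useful in inference algorithms such as Gibbs sampling.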
One advantage of Bayesian networks is that it is intuitively easier for a human to understand direct dependencies and local distributions than a complete joint distribution.
- 2 Example
- 3 Causal Bayesian networks
- 4 Structure learning
- 5 Parameter learning
- 6 Inference
- 7 Applications
- 8 See also
- 9 Links and software
- 10 References