Bayesian networks form a very flexible class of models in data mining and applied statistics. They can be used to infer probabilistic dependencies between variables and, in contrast to a standard prediction model such as a decision tree, can also predict composite targets. In Bayesian networks, dependencies between variables are usually modeled through graph structures, while the exact form of each dependency is specified through conditional probability distributions.
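As a minimal sketch of this idea (with entirely hypothetical probabilities), consider a two-node network Rain → WetGrass: the graph structure says WetGrass depends on Rain, and the conditional probability tables fix the form of that dependency, so the joint distribution factorizes as P(R, W) = P(R) · P(W | R).

```python
# Hypothetical example: two-node Bayesian network Rain -> WetGrass.
P_rain = {True: 0.2, False: 0.8}                  # P(R)
P_wet_given_rain = {                              # P(W | R), one row per parent value
    True:  {True: 0.9, False: 0.1},
    False: {True: 0.1, False: 0.9},
}

def joint(rain, wet):
    """P(R = rain, W = wet) via the chain-rule factorization of the network."""
    return P_rain[rain] * P_wet_given_rain[rain][wet]

# Marginal P(W = True), obtained by summing out the parent R:
p_wet = sum(joint(r, True) for r in (True, False))
```

The same factorization generalizes to larger networks: the joint is the product of one conditional distribution per node, given its parents in the graph.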
The lecture gives an introduction to Bayesian networks. Starting from the basic modelling of effects and conditional probabilities, we will cover algorithms for exact and approximate inference (propagation of evidence), the analysis of Bayesian networks, and the learning of both parameters and structures.
Algorithms for inference and learning in Bayesian networks are usually built on algorithms for graph structures, ranging from well-known methods such as topological ordering and connectedness checks to less common ones such as the ordering of cliques. For the sake of generality and accessibility, all required algorithms will also be presented during the lecture.
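To illustrate one of the graph algorithms mentioned above, here is a short sketch of topological ordering using Kahn's algorithm (the node and edge names are made up for the example): nodes with no remaining incoming edges are emitted repeatedly, which yields an order in which every node appears after all of its parents.

```python
from collections import deque

def topological_order(nodes, edges):
    """Kahn's algorithm: return the nodes so that for every edge (u, v),
    u appears before v. Raises ValueError if the graph contains a cycle."""
    indeg = {n: 0 for n in nodes}
    succ = {n: [] for n in nodes}
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    queue = deque(n for n in nodes if indeg[n] == 0)  # nodes with no parents
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in succ[u]:                             # remove u's outgoing edges
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    if len(order) != len(nodes):
        raise ValueError("graph contains a cycle")
    return order

# Example DAG: A -> B, A -> C, B -> D, C -> D
order = topological_order(["A", "B", "C", "D"],
                          [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")])
```

Such an ordering is useful, for instance, when sampling from a Bayesian network: each node can be sampled once the values of all its parents are available.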
Lecturer: Prof. Dr. Dr. Lars Schmidt-Thieme
Trainer: Nicolas Schilling