Scott Davies
Fast Factorized Density Estimation and Compression with Bayesian Networks
Degree Type: Ph.D. in Computer Science
Advisor(s): Andrew Moore
Graduated: December 2002

Abstract: Many important data analysis tasks can be addressed by formulating them as probability estimation problems. For example, a popular general approach to automatic classification problems is to learn a probabilistic model of each class from data in which the classes are known, and then use Bayes's rule with these models to predict the correct classes of other data for which they are not known. Anomaly detection and scientific discovery tasks can often be addressed by learning probability models over possible events and then looking for events to which these models assign low probabilities. Many data compression algorithms, such as Huffman coding and arithmetic coding, rely on probabilistic models of the data stream in order to achieve high compression rates.

In this thesis we examine several aspects of probability estimation algorithms. In particular, we focus on the automatic learning and use of probability models based on Bayesian networks, a convenient formalism in which the probability estimation task is split into many simpler subtasks. We also emphasize computational efficiency.

First, we provide Bayesian network-based algorithms for losslessly compressing large discrete datasets. We show that these algorithms can produce compression ratios dramatically higher than those achieved by popular compression programs such as gzip or bzip2, yet still maintain megabyte-per-second decoding speeds on well-aged conventional PCs. Next, we provide algorithms for quickly learning Bayesian network-based probability models over domains with both discrete and continuous variables.
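The "many simpler subtasks" the abstract refers to is the chain-rule factorization at the heart of Bayesian networks: the joint density is a product of one small conditional distribution per variable. A minimal sketch of that idea (a hypothetical three-variable chain A → B → C, not code or data from the thesis):

```python
# Hypothetical Bayesian network over binary variables A -> B -> C.
# The joint factorizes into one simple conditional per variable:
#   P(A, B, C) = P(A) * P(B | A) * P(C | B)

p_a = {0: 0.6, 1: 0.4}                    # P(A)
p_b_given_a = {0: {0: 0.7, 1: 0.3},       # P(B | A=0)
               1: {0: 0.2, 1: 0.8}}       # P(B | A=1)
p_c_given_b = {0: {0: 0.9, 1: 0.1},       # P(C | B=0)
               1: {0: 0.5, 1: 0.5}}       # P(C | B=1)

def joint(a, b, c):
    """Joint probability as a product of the per-variable factors."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# Because each factor is a proper distribution, the joint sums to 1.
total = sum(joint(a, b, c)
            for a in (0, 1) for b in (0, 1) for c in (0, 1))
```

Each factor involves only a variable and its parents, which is what makes both learning and evaluation tractable compared with modeling the full joint table directly.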
We show how recently developed methods for quickly learning Gaussian mixture models from data [Moore 99] can be used to learn Bayesian networks modeling complex nonlinear relationships over dozens of variables from thousands of datapoints in a practical amount of time. Finally, we explore a large space of tree-based density learning algorithms, and show that they can be used to quickly learn Bayesian networks that provide accurate density estimates and are fast to evaluate.

Thesis Committee:
Andrew Moore (Chair)
Christos Faloutsos
John Lafferty
Tom Mitchell
Nir Friedman (Hebrew University)

Randy Bryant, Head, Computer Science Department
James Morris, Dean, School of Computer Science

Keywords: Machine learning, density estimation, Bayesian networks, graphical models, Gaussian mixture models, compression, interpolating density trees, conditional density estimation

CMU-CS-02-138.pdf (4.86 MB, 181 pages)
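The classification-via-density-estimation approach described at the start of the abstract (one probabilistic model per class, combined with Bayes's rule) can be illustrated with a minimal sketch. The class names, priors, and one-dimensional Gaussian class models here are hypothetical stand-ins for the learned Bayesian networks the thesis uses:

```python
import math

# Hypothetical class priors P(c) and per-class log-density models
# log P(x | c); in the thesis these would be learned Bayesian networks.
priors = {"spam": 0.3, "ham": 0.7}
means = {"spam": 5.0, "ham": 1.0}

def log_p_x_given_class(x, c):
    """Unit-variance Gaussian log-density for class c (illustrative only)."""
    return -0.5 * (x - means[c]) ** 2 - 0.5 * math.log(2 * math.pi)

def classify(x):
    """Bayes's rule: pick the class c maximizing P(c) * P(x | c),
    done in log space for numerical stability."""
    return max(priors,
               key=lambda c: math.log(priors[c]) + log_p_x_given_class(x, c))
```

Working in log space turns the product of prior and likelihood into a sum, which avoids underflow when the per-class models are products of many factors, as in a Bayesian network.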