Probability and Computing

Course ID 15559

Description
Probability theory is indispensable in computer science today. In areas such as artificial intelligence and computer science theory, probabilistic reasoning and randomization are central. Within networks and systems, probability is used to model uncertainty and queuing latency. This course gives an introduction to probability as it is used in computer science theory and practice, drawing on applications and current research developments as motivation. The course has three parts: Part I is an introduction to probability, including discrete and continuous random variables, heavy tails, simulation, Laplace transforms, z-transforms, and applications of generating functions. Part II is an in-depth treatment of concentration inequalities, such as Chernoff bounds and bounds based on the strong law of large numbers (SLLN), and their use in randomized algorithms. Part III covers Markov chains (both discrete-time and continuous-time) and stochastic processes, and their application to performance modeling of queuing systems. This is a fast-paced class that covers more material, and in greater depth, than the other probability options.
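
As a small illustration of the generating-function material in Part I (a minimal sketch, not drawn from the course itself; the choice of the geometric distribution and the use of the sympy library are assumptions for this example), the z-transform of a Geometric(p) random variable can be differentiated symbolically to recover its mean and variance:

    # Recover moments of X ~ Geometric(p) on {1, 2, ...} from its z-transform
    # (probability generating function) G(z) = E[z^X] = p*z / (1 - (1-p)*z).
    # Illustrative sketch only; uses sympy for symbolic differentiation.
    import sympy as sp

    z, p = sp.symbols('z p', positive=True)
    G = p * z / (1 - (1 - p) * z)                         # z-transform of X

    EX  = sp.simplify(sp.diff(G, z, 1).subs(z, 1))        # E[X] = G'(1)
    EX2 = sp.simplify(sp.diff(G, z, 2).subs(z, 1) + EX)   # E[X^2] = G''(1) + G'(1)
    Var = sp.simplify(EX2 - EX**2)

    print(EX)   # expected: 1/p
    print(Var)  # expected: (1 - p)/p**2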

Key Topics
Probability on events, Discrete random variables, Continuous random variables, Moment generating functions (z-transforms and Laplace transforms), Heavy-tailed distributions, Poisson process, Event-driven simulation, Statistical estimation, Statistical inference, Bayesian inference
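
Several of these topics meet in even a tiny simulation. The sketch below (an illustration under arbitrarily chosen parameter values, not code provided by the course) generates Poisson-process arrivals by summing Exponential(lam) inter-arrival times, the standard building block of an event-driven simulator, and checks that the average number of arrivals in [0, T] is close to lam * T:

    # Generate one sample path of a rate-lam Poisson process on [0, T]
    # by summing i.i.d. Exponential(lam) inter-arrival times.
    # Illustrative sketch; lam, T, and runs are arbitrary example values.
    import random

    def poisson_arrivals(lam, T, rng=random):
        """Return the arrival times in [0, T] of a rate-lam Poisson process."""
        t, arrivals = 0.0, []
        while True:
            t += rng.expovariate(lam)   # next inter-arrival time ~ Exp(lam)
            if t > T:
                return arrivals
            arrivals.append(t)

    random.seed(0)
    lam, T, runs = 2.0, 10.0, 10_000
    avg_count = sum(len(poisson_arrivals(lam, T)) for _ in range(runs)) / runs
    print(avg_count)   # close to lam * T = 20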

Required Background Knowledge
15-559 assumes NO PRIOR PROBABILITY/STATS classes, and will satisfy the Computer Science Probability/Statistics requirement. The course DOES assume that you have taken calculus (and still remember how to integrate, differentiate, and do Taylor-series expansions). 3D Calculus is a co-requisite for the class. Graduate students may register for either 15-559 or 15-659.

Course Goals
Analyze probabilities and expectations using tools such as conditioning, independence, and linearity of expectation. Compute the expectation and variance of common discrete and continuous random variables. Apply z-transforms and Laplace transforms to derive higher moments of random variables. Prove elementary theorems on probability. Analyze tail probabilities via the Markov and Chebyshev concentration inequalities. Understand the basics of event-driven simulation, including generating random variables for simulation and the Poisson arrival process. Derive statistical estimators, including maximum likelihood estimators and MAP estimators. Understand the application of probability to problems in machine learning, theoretical computer science, networking, cloud computing, and operations research.
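
As a concrete instance of the tail-probability goal (a hedged sketch with assumed parameters n, p, a, and trial count; not an assignment from the course), Chebyshev's inequality bounds the tail of a Binomial(n, p) sum, and a quick simulation shows how loose that guarantee can be:

    # Compare Chebyshev's bound P(|X - E[X]| >= a) <= Var(X)/a^2 with the
    # empirical tail probability for X ~ Binomial(n, p).
    # Illustrative sketch; n, p, a, and trials are arbitrary example values.
    import random

    random.seed(1)
    n, p, a, trials = 100, 0.5, 10, 50_000
    mean, var = n * p, n * p * (1 - p)

    def binomial_sample(n, p, rng=random):
        """Sum of n independent Bernoulli(p) coin flips."""
        return sum(1 for _ in range(n) if rng.random() < p)

    empirical = sum(abs(binomial_sample(n, p) - mean) >= a for _ in range(trials)) / trials
    chebyshev = var / a**2

    print(empirical)   # roughly 0.06
    print(chebyshev)   # 0.25 -- a much weaker guarantee than the observed tail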

Assessment Structure
Weekly homework -- worth 35% combined. Midterm 1 -- worth 20%. Midterm 2 -- worth 20%. Final -- worth 20%. Occasional unannounced quizzes -- worth 5%. Class participation will allow you to move up a grade if you are on the borderline.