Professional Course

Computational Probability and Inference

edX, Online
12 weeks
49 USD
Next course start
Inquire for more information
Virtual Classroom

Course description

Computational Probability and Inference

Probability and inference are used everywhere. For example, they help us figure out which of your emails are spam, what results to show you when you search on Google, how a self-driving car should navigate its environment, or even how a computer can beat the best Jeopardy and Go players! What do all of these examples have in common? They are all situations in which a computer program can carry out inferences in the face of uncertainty at a speed and accuracy that far exceed what we could do in our heads or on a piece of paper.

In this data analysis and computer programming course, you will learn the principles of probability and inference. We will put these mathematical concepts to work in code that solves problems people care about. You will learn about different data structures for storing probability distributions, such as probabilistic graphical models, and build efficient algorithms for reasoning with these data structures.

By the end of this course, you will know how to model real-world problems with probability, and how to use the resulting models for inference.

You don’t need to have prior experience in either probability or inference, but you should be comfortable with basic Python programming and calculus.

Upcoming start dates

1 start date available

Inquire for more information

  • Virtual Classroom
  • Online
  • English

Who should attend?

Prerequisites:
  • Basic Python programming
  • Calculus (specifically differentiation to find the maximum or minimum of a function)
  • Some comfort with mathematical notation is helpful

Training content

Introduction to probability and computation

A first look at basic discrete probability, how to interpret it, what probability spaces and random variables are, and how to code these up and do basic simulations and visualizations.
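To give a flavor of this unit, here is a minimal sketch (our illustration, not course code) of a finite probability space stored as a Python dict mapping outcomes to probabilities, with a simple sampler for running basic simulations:

```python
import random

def sample(space, rng):
    """Draw one outcome from a dict {outcome: probability}."""
    r = rng.random()
    cumulative = 0.0
    for outcome, p in space.items():
        cumulative += p
        if r < cumulative:
            return outcome
    return outcome  # guard against floating-point round-off

rng = random.Random(0)  # seeded for reproducibility
fair_die = {face: 1 / 6 for face in range(1, 7)}
rolls = [sample(fair_die, rng) for _ in range(10_000)]
freq = {face: rolls.count(face) / len(rolls) for face in fair_die}
# Each empirical frequency should land near 1/6.
```

Visualizing `freq` as a bar chart is then a natural first exercise.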

Incorporating observations

Incorporating observations using jointly distributed random variables and using events. Three classic probability puzzles help elucidate how to interpret probability: Simpson's paradox, the Monty Hall problem, and the boy-or-girl paradox.
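The Monty Hall puzzle in particular rewards simulation. As a hedged illustration (our sketch, not course material), the following estimates the win probability for the stay and switch strategies:

```python
import random

def monty_hall_trial(switch, rng):
    """One game: returns True if the contestant wins the prize."""
    doors = [0, 1, 2]
    prize = rng.choice(doors)
    pick = rng.choice(doors)
    # The host opens a door that is neither the pick nor the prize.
    opened = rng.choice([d for d in doors if d != pick and d != prize])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

rng = random.Random(0)
trials = 20_000
win_stay = sum(monty_hall_trial(False, rng) for _ in range(trials)) / trials
win_switch = sum(monty_hall_trial(True, rng) for _ in range(trials)) / trials
# Switching wins about 2/3 of the time; staying wins about 1/3.
```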

Introduction to inference, and to structure in distributions

The product rule and inference with Bayes' theorem. Independence as a structure in distributions. Measures of randomness: entropy and information divergence. Mini-project: movie recommendations.
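A small sketch of what Bayes' theorem looks like in code (toy numbers of our own, not the course's): a posterior over which of two coins produced an observed heads, followed by the entropy of that posterior in bits.

```python
import math

prior = {"A": 0.5, "B": 0.5}             # coin A is fair, coin B is biased
likelihood_heads = {"A": 0.5, "B": 0.9}  # p(heads | coin)

# Product rule, then normalize: posterior ∝ prior × likelihood.
unnormalized = {c: prior[c] * likelihood_heads[c] for c in prior}
z = sum(unnormalized.values())
posterior = {c: w / z for c, w in unnormalized.items()}
# posterior["B"] = 0.45 / 0.70 ≈ 0.643

# Entropy of the posterior: how much uncertainty remains, in bits.
entropy = -sum(p * math.log2(p) for p in posterior.values())
```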

Expectations, and driving to infinity in modeling uncertainty

Expected values of random variables. Classic puzzle: the two-envelope problem. Probability spaces and random variables that take on a countably infinite number of values, and inference with these random variables.
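Two tiny sketches of the ideas in this unit (our examples, not course material): an expected value as a probability-weighted sum, and a countably infinite sum approximated by truncation.

```python
# Expected value of a fair die roll: sum of outcome × probability.
fair_die = {face: 1 / 6 for face in range(1, 7)}
expected_value = sum(x * p for x, p in fair_die.items())  # ≈ 3.5

# A geometric random variable takes countably many values; truncating
# its mean E[X] = sum over k >= 1 of k * (1-p)^(k-1) * p converges to 1/p.
p = 0.5
approx_mean = sum(k * (1 - p) ** (k - 1) * p for k in range(1, 200))  # ≈ 2
```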

Efficient representations of probability distributions on a computer

Introduction to undirected graphical models as a data structure for representing probability distributions and the benefits/drawbacks of these graphical models. Incorporating observations with graphical models.
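As a hedged sketch of the data structure (toy potentials of our own, not the course's), here is a two-variable undirected model stored as node and edge potential tables, with the joint recovered by brute force:

```python
from itertools import product

# Tiny undirected graphical model X1 - X2 over binary variables.
node_pot = {"X1": [1.0, 2.0], "X2": [3.0, 1.0]}
edge_pot = [[10.0, 1.0], [1.0, 10.0]]  # strongly favors X1 == X2

# The joint distribution is the normalized product of all potentials.
unnorm = {(a, b): node_pot["X1"][a] * node_pot["X2"][b] * edge_pot[a][b]
          for a, b in product((0, 1), repeat=2)}
z = sum(unnorm.values())  # the partition function
joint = {cfg: w / z for cfg, w in unnorm.items()}
```

Brute force is only viable for toy models; the next two units are about doing this efficiently.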

Inference with graphical models, part I

Computing marginal distributions in undirected graphical models, including hidden Markov models. Mini-project: robot localization, part I.
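A sketch of marginal computation in a toy 2-state hidden Markov model via the forward-backward recursions (illustrative numbers of our own, not course code):

```python
initial = [0.5, 0.5]
transition = [[0.7, 0.3], [0.3, 0.7]]  # transition[u][s] = p(next=s | cur=u)
obs_lik = [[0.9, 0.2], [0.8, 0.3], [0.1, 0.7]]  # obs_lik[t][s] = p(obs_t | s)

T, S = len(obs_lik), len(initial)
forward = [[0.0] * S for _ in range(T)]
backward = [[1.0] * S for _ in range(T)]
for s in range(S):
    forward[0][s] = initial[s] * obs_lik[0][s]
for t in range(1, T):
    for s in range(S):
        forward[t][s] = obs_lik[t][s] * sum(
            forward[t - 1][u] * transition[u][s] for u in range(S))
for t in range(T - 2, -1, -1):
    for s in range(S):
        backward[t][s] = sum(
            transition[s][u] * obs_lik[t + 1][u] * backward[t + 1][u]
            for u in range(S))

# Marginal p(state_t | all observations) ∝ forward[t] * backward[t].
marginals = []
for t in range(T):
    unnorm = [forward[t][s] * backward[t][s] for s in range(S)]
    z = sum(unnorm)
    marginals.append([w / z for w in unnorm])
```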

Inference with graphical models, part II

Computing most probable configurations in graphical models, including hidden Markov models. Mini-project: robot localization, part II.
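For the most probable configuration, the analogous sketch (same style of toy 2-state HMM, numbers our own) replaces the sums with maximizations, which is the Viterbi recursion:

```python
initial = [0.5, 0.5]
transition = [[0.7, 0.3], [0.3, 0.7]]
obs_lik = [[0.9, 0.2], [0.8, 0.3], [0.1, 0.7]]

T, S = len(obs_lik), len(initial)
best = [[0.0] * S for _ in range(T)]       # best[t][s]: max prob ending in s
backpointer = [[0] * S for _ in range(T)]  # which predecessor achieved it
for s in range(S):
    best[0][s] = initial[s] * obs_lik[0][s]
for t in range(1, T):
    for s in range(S):
        scores = [best[t - 1][u] * transition[u][s] for u in range(S)]
        backpointer[t][s] = max(range(S), key=lambda u: scores[u])
        best[t][s] = obs_lik[t][s] * scores[backpointer[t][s]]

# Trace back the most probable state sequence.
path = [max(range(S), key=lambda s: best[T - 1][s])]
for t in range(T - 1, 0, -1):
    path.append(backpointer[t][path[-1]])
path.reverse()
# path is the single most probable joint configuration of hidden states.
```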

Introduction to learning probability distributions

Learning an underlying unknown probability distribution from observations using maximum likelihood. Three examples: estimating the bias of a coin, the German tank problem, and email spam detection.
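For the coin-bias example, the maximum likelihood estimate of a Bernoulli parameter is the sample mean; the short grid search below (our sketch, made-up data) just makes the "maximize the likelihood" step explicit:

```python
import math

flips = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # 1 = heads (made-up observations)

def log_likelihood(theta, data):
    """Log p(data | theta) for i.i.d. coin flips with bias theta."""
    return sum(math.log(theta if x == 1 else 1 - theta) for x in data)

grid = [i / 1000 for i in range(1, 1000)]
theta_hat = max(grid, key=lambda t: log_likelihood(t, flips))
# theta_hat matches the closed form sum(flips) / len(flips) = 0.7
```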

Parameter estimation in graphical models

Given the graph structure of an undirected graphical model, we examine how to estimate all the tables associated with the graphical model.
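A hedged sketch of the simplest case (toy samples of our own, not course material): with the graph structure fixed, an empirical pairwise table for one edge is estimated by counting co-occurrences in the data:

```python
from collections import Counter

# Observed joint samples of two binary variables sharing an edge.
samples = [(0, 0), (0, 0), (1, 1), (1, 1), (1, 1), (0, 1), (1, 0), (0, 0)]
counts = Counter(samples)
n = len(samples)
pairwise = {cfg: counts[cfg] / n for cfg in [(0, 0), (0, 1), (1, 0), (1, 1)]}
# pairwise holds the empirical probability of each joint configuration.
```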

Model selection with information theory

Learning both the graph structure and the tables of an undirected graphical model with the help of information theory. Mutual information of random variables.
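Mutual information from a joint table is a short computation. As an illustration (toy numbers of our own): I(X; Y) sums p(x, y) log2 of the ratio between the joint and the product of marginals, and is zero exactly when X and Y are independent.

```python
import math

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

mi = sum(p * math.log2(p / (px[x] * py[y]))
         for (x, y), p in joint.items() if p > 0)
# mi > 0 here because the joint favors matching values of X and Y.
```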

Final project part I

Mystery project

Final project part II

Mystery project, cont’d

Course delivery details

This course is offered through the Massachusetts Institute of Technology, a partner institution of edX.

4-6 hours per week


  • Verified Track - $49
  • Audit Track - Free

Certification / Credits

What you'll learn

  • Basic discrete probability theory
  • Graphical models as a data structure for representing probability distributions
  • Algorithms for prediction and inference
  • How to model real-world problems in terms of probabilistic inference

Contact this provider

Fill out your details to find out more about Computational Probability and Inference.
141 Portland Street
Cambridge, Massachusetts 02139

