Computational Probability and Inference

Learn fundamentals of probabilistic analysis and inference. Build computer programs that reason with uncertainty and make predictions. Tackle machine learning problems, from recommending movies to spam filtering to robot navigation.

Massachusetts Institute of Technology (MIT)
edX
  • Estimated time to complete: about 12 weeks
  • Intermediate
  • English
Note: Start dates may change due to factors on the hosting platform; the dates shown are for reference only.

What you will learn

Basic discrete probability theory

Graphical models as a data structure for representing probability distributions

Algorithms for prediction and inference

How to model real-world problems in terms of probabilistic inference

Course overview

Probability and inference are used everywhere. For example, they help us figure out which of your emails are spam, what results to show you when you search on Google, how a self-driving car should navigate its environment, or even how a computer can beat the best Jeopardy and Go players! What do all of these examples have in common? They are all situations in which a computer program can carry out inferences in the face of uncertainty at a speed and accuracy that far exceed what we could do in our heads or on a piece of paper.

In this data analysis and computer programming course, you will learn the principles of probability and inference. We will put these mathematical concepts to work in code that solves problems people care about. You will learn about different data structures for storing probability distributions, such as probabilistic graphical models, and build efficient algorithms for reasoning with these data structures.

By the end of this course, you will know how to model real-world problems with probability, and how to use the resulting models for inference.

You don’t need to have prior experience in either probability or inference, but you should be comfortable with basic Python programming and calculus.

“I love that you can do so much with the material, from programming a robot to move in an unfamiliar environment, to segmenting foreground/background of an image, to classifying tweets on Twitter—all homework examples taken from the class!” – a previous student in the residential version of this course.

Syllabus

Week 1: Introduction to probability and computation
A first look at basic discrete probability, how to interpret it, what probability spaces and random variables are, and how to code these up and do basic simulations and visualizations.
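
As a rough illustration of what "coding these up" can look like, here is a minimal Python sketch (not taken from the course materials) that stores a finite probability space as a dictionary and estimates a probability by simulation; the coin and its probabilities are made up.

    import random

    # A finite probability space for a (made-up) biased coin, stored as a
    # dict mapping outcomes to probabilities.
    prob_space = {'heads': 0.6, 'tails': 0.4}

    def sample(space):
        """Draw one outcome according to the given probabilities."""
        outcomes = list(space.keys())
        weights = list(space.values())
        return random.choices(outcomes, weights=weights, k=1)[0]

    # Estimate P(heads) by simulation and compare with the model value 0.6.
    draws = [sample(prob_space) for _ in range(10000)]
    print(draws.count('heads') / len(draws))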

Week 2: Incorporating observations
Incorporating observations using jointly distributed random variables and using events. Three classic probability puzzles are presented to help elucidate how to interpret probability: Simpson’s paradox, the Monty Hall problem, and the boy-or-girl paradox.
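
For a flavor of what incorporating an observation via an event can look like in code, here is a minimal sketch with a made-up joint distribution over two variables: conditioning restricts the outcomes to the observed event and renormalizes. The variable names and probabilities are illustrative only.

    # Joint distribution over (W, X), stored as a dict mapping pairs to probabilities.
    joint = {('sunny', 'walk'): 0.3, ('sunny', 'read'): 0.2,
             ('rainy', 'walk'): 0.1, ('rainy', 'read'): 0.4}

    def condition_on(joint, event):
        """Keep only outcomes in the event, then renormalize."""
        restricted = {o: p for o, p in joint.items() if o in event}
        total = sum(restricted.values())
        return {o: p / total for o, p in restricted.items()}

    # Observe the event "W = rainy":
    event = {o for o in joint if o[0] == 'rainy'}
    print(condition_on(joint, event))  # {('rainy', 'walk'): 0.2, ('rainy', 'read'): 0.8}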

Week 3: Introduction to inference, and to structure in distributions
The product rule and inference with Bayes' theorem. Independence: a structure in distributions. Measures of randomness: entropy and information divergence. Mini-project: movie recommendations.
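
Here is a minimal sketch of inference with Bayes' theorem: the posterior is proportional to the prior times the likelihood, then normalized. The spam-filtering numbers below are made up for illustration and are not from the course.

    # p(x | y) is proportional to p(x) * p(y | x), then normalized over x.
    prior = {'spam': 0.3, 'ham': 0.7}        # p(x), made-up values
    likelihood = {'spam': 0.8, 'ham': 0.1}   # p(observation | x), made-up values

    unnormalized = {x: prior[x] * likelihood[x] for x in prior}
    normalizer = sum(unnormalized.values())
    posterior = {x: p / normalizer for x, p in unnormalized.items()}
    print(posterior)  # roughly {'spam': 0.774, 'ham': 0.226}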

Week 4: Expectations, and driving to infinity in modeling uncertainty
Expected values of random variables. Classic puzzle: the two envelope problem. Probability spaces and random variables that take on a countably infinite number of values and inference with these random variables.
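
For a finite random variable, the expected value is just a probability-weighted sum, which is a one-liner in code (the values and probabilities below are made up):

    # E[X] = sum over x of x * p(x), for a made-up pmf.
    pmf = {1: 0.2, 2: 0.5, 10: 0.3}
    expected_value = sum(x * p for x, p in pmf.items())
    print(expected_value)  # 0.2 + 1.0 + 3.0 = 4.2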

Week 5: Efficient representations of probability distributions on a computer
Introduction to undirected graphical models as a data structure for representing probability distributions and the benefits/drawbacks of these graphical models. Incorporating observations with graphical models.
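
A tiny illustration of the data-structure idea: store node and edge potential tables as dictionaries, multiply them to get an unnormalized joint distribution, and normalize. The potentials below are made up, and a real model would not enumerate every assignment this way.

    # Two binary variables A and B with (made-up) node and edge potentials.
    node_potential = {'A': {0: 1.0, 1: 2.0}, 'B': {0: 1.0, 1: 1.0}}
    edge_potential = {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}

    # The joint probability is proportional to the product of the potentials.
    unnormalized = {(a, b): node_potential['A'][a] * node_potential['B'][b]
                            * edge_potential[(a, b)]
                    for a in (0, 1) for b in (0, 1)}
    Z = sum(unnormalized.values())  # normalization constant
    joint = {ab: v / Z for ab, v in unnormalized.items()}
    print(joint)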

Week 6: Inference with graphical models, part I
Computing marginal distributions in undirected graphical models, including hidden Markov models. Mini-project: robot localization, part I.
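
One standard way to get such a marginal for a hidden Markov model is a forward (filtering) pass, sketched below with made-up transition and emission tables; the course's own algorithms and notation may differ.

    import numpy as np

    # Made-up two-state HMM: states, initial distribution, transition and
    # emission tables; observations are column indices into the emission table.
    states = ['rainy', 'sunny']
    initial = np.array([0.5, 0.5])
    transition = np.array([[0.7, 0.3],
                           [0.3, 0.7]])
    emission = np.array([[0.9, 0.1],
                         [0.2, 0.8]])
    observations = [0, 0, 1]

    # Alternate "predict" (sum over previous states) and "update" (weight by
    # the observation likelihood), renormalizing at each step.
    belief = initial * emission[:, observations[0]]
    belief /= belief.sum()
    for obs in observations[1:]:
        belief = (transition.T @ belief) * emission[:, obs]
        belief /= belief.sum()
    print(dict(zip(states, belief)))  # marginal of the final hidden state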

Week 7: Inference with graphical models, part II
Computing most probable configurations in graphical models, including hidden Markov models. Mini-project: robot localization, part II.
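
A most probable configuration of a hidden Markov model can be found with a Viterbi-style max-product pass; the sketch below reuses the made-up model from the Week 6 sketch and is not the course's own code.

    import numpy as np

    initial = np.array([0.5, 0.5])
    transition = np.array([[0.7, 0.3], [0.3, 0.7]])
    emission = np.array([[0.9, 0.1], [0.2, 0.8]])
    observations = [0, 0, 1]

    # Forward pass: keep the best score for each state and back-pointers.
    scores = initial * emission[:, observations[0]]
    backpointers = []
    for obs in observations[1:]:
        candidate = scores[:, None] * transition   # (previous state, current state)
        backpointers.append(candidate.argmax(axis=0))
        scores = candidate.max(axis=0) * emission[:, obs]

    # Trace back the most probable state sequence.
    best = [int(scores.argmax())]
    for bp in reversed(backpointers):
        best.append(int(bp[best[-1]]))
    best.reverse()
    print(best)  # most probable configuration, as state indices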

Week 8: Introduction to learning probability distributions
Learning an underlying unknown probability distribution from observations using maximum likelihood. Three examples: estimating the bias of a coin, the German tank problem, and email spam detection.
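
For the coin example, the maximum likelihood estimate of the bias is simply the observed fraction of heads; with made-up flips:

    flips = ['H', 'T', 'H', 'H', 'T', 'H', 'H', 'T']
    theta_hat = flips.count('H') / len(flips)  # ML estimate of P(heads)
    print(theta_hat)  # 5/8 = 0.625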

Week 9: Parameter estimation in graphical models
Given the graph structure of an undirected graphical model, we examine how to estimate all the tables associated with the graphical model.
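
One simple way to estimate such a table is from the empirical frequencies of the observed assignments, as in the made-up sketch below; the course may use a different estimator.

    from collections import Counter

    # Made-up observations of a pair of binary variables on one edge.
    data = [(0, 0), (0, 0), (0, 1), (1, 1), (1, 1), (1, 0), (1, 1), (0, 0)]
    counts = Counter(data)
    table = {pair: c / len(data) for pair, c in counts.items()}
    print(table)  # empirical estimate of the pairwise table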

Week 10: Model selection with information theory
Learning both the graph structure and the tables of an undirected graphical model with the help of information theory. Mutual information of random variables.
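
Mutual information can be computed directly from a joint distribution and its marginals; here is a minimal sketch with a made-up joint pmf.

    import math

    # I(X; Y) = sum over (x, y) of p(x, y) * log( p(x, y) / (p(x) * p(y)) ).
    joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
    p_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
    p_y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

    mi = sum(p * math.log2(p / (p_x[x] * p_y[y])) for (x, y), p in joint.items())
    print(mi)  # in bits; 0 would mean X and Y are independent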

Week 11: Final project part I
Mystery project

Week 12: Final project part II
Mystery project, cont’d

Prerequisites

Basic Python programming
Calculus (specifically differentiation to find the maximum or minimum of a function)
Some comfort with mathematical notation is helpful
