
Learner Reviews & Feedback for Probabilistic Graphical Models 2: Inference by Stanford University

4.6 stars (485 ratings)

About the Course

Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of random variables that interact with each other. These representations sit at the intersection of statistics and computer science, relying on concepts from probability theory, graph algorithms, machine learning, and more. They are the basis for the state-of-the-art methods in a wide variety of applications, such as medical diagnosis, image understanding, speech recognition, natural language processing, and many, many more. They are also a foundational tool in formulating many machine learning problems. This course is the second in a sequence of three. Following the first course, which focused on representation, this course addresses the question of probabilistic inference: how a PGM can be used to answer questions. Even though a PGM generally describes a very high dimensional distribution, its structure is designed so as to allow questions to be answered efficiently. The course presents both exact and approximate algorithms for different types of inference tasks, and discusses where each could best be applied. The (highly recommended) honors track contains two hands-on programming assignments, in which key routines of the most commonly used exact and approximate algorithms are implemented and applied to a real-world problem.
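As a quick preview of the exact inference the description mentions, here is a toy sketch (not course code; all numbers invented) of variable elimination on a three-node chain A → B → C, computing the marginal P(C) by summing out A and then B:

```python
# Toy variable elimination on a binary chain A -> B -> C.
# Factors are dicts mapping assignment tuples to probabilities.
p_a = {(0,): 0.6, (1,): 0.4}                      # P(A)
p_b_given_a = {(0, 0): 0.7, (0, 1): 0.3,          # P(B | A), key = (a, b)
               (1, 0): 0.2, (1, 1): 0.8}
p_c_given_b = {(0, 0): 0.9, (0, 1): 0.1,          # P(C | B), key = (b, c)
               (1, 0): 0.4, (1, 1): 0.6}

# Eliminate A: tau1(b) = sum_a P(a) * P(b | a)
tau1 = {b: sum(p_a[(a,)] * p_b_given_a[(a, b)] for a in (0, 1)) for b in (0, 1)}

# Eliminate B: P(c) = sum_b tau1(b) * P(c | b)
p_c = {c: sum(tau1[b] * p_c_given_b[(b, c)] for b in (0, 1)) for c in (0, 1)}

print(p_c)  # the two values sum to 1
```

The point of the course's exact-inference algorithms is that this sum-out ordering avoids ever materializing the full joint over all variables.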

Top reviews

AT

Aug 22, 2019

Just like the first course of the specialization, this course is really good. It is well organized and taught in the best way which really helped me to implement similar ideas for my projects.

AL

Aug 19, 2019

I have clearly learnt a lot during this course. Even though some things should be updated and maybe completed, I would definitely recommend it to anyone whose interest lies in PGMs.


1 - 25 of 78 Reviews for Probabilistic Graphical Models 2: Inference

By AlexanderV

•

Mar 9, 2020

Great course, except that the programming assignments are in Matlab rather than Python

By Shi Y

•

Dec 16, 2018

It's an absolutely very hard but extremely interesting course! The code assignments always have a lot of small bugs, and it cost me lots of time to find them, but, hey! It's the same in offline school; nothing is going to be perfect. The sampling part is the most difficult stuff to learn so far, and after I reviewed it again and again, combined with other online material, I got it done! The only drawback of this course is that not many people are active in the forum (including the TAs), but maybe that's just because only a small number of people enrolled in this course. In short, worth learning!

By Jonathan H

•

Aug 3, 2017

Pretty good course, albeit very dense compared to the first one (which was certainly not trivial). I would give it 5 stars just based on the content, but the programming assignments don't work without significant extra effort. I completed the honors track for the first course, but gave up after spending 4 hours trying to fix HW bugs that were reported 8 months ago.

Would have also been nice to have more practical examples to work on. Some of the material is very theoretical, and I find it hard to build intuitions without applying the algorithms in practice.

By Anurag S

•

Nov 8, 2017

Great introduction to inference. Requires some extra reading from the textbook.

By Tianyi X

•

Feb 23, 2018

Not very clear at the top-down level.

By Jiaxing L

•

Nov 27, 2016

I am kind of disappointed that you have to pay for the course before you can submit solutions to the problem sets. However, that is not the main issue of this course, as I fully understand that financial profit is very important for the lecturer. The main issue of this course is the chaotic notation in the second programming assignment; the lecturer cannot even maintain self-consistency in the symbols used. The statement of everything in both PA1 and PA2 is also very confusing.

By Hunter J

•

May 2, 2017

The lectures are fine and the book is great, but the assignments have a lot of technical problems. I spent most of my effort trying to solve trivial issues with the sample code and dealing with the auto grader.

By Kuan-Cheng L

•

Jul 23, 2020

The content is good, but the course is totally unmaintained, especially the assignments.

By Deleted A

•

Nov 18, 2018

This course seems to have been abandoned by Coursera. Mentors never reply to discussion forum posts (if there is any active mentor at all). Many assignments and tests are confusing and misleading. There are numerous materials you can find online to learn about graphical models rather than spending time and money on this.

By Anthony L

•

Aug 20, 2019

I have clearly learnt a lot during this course. Even though some things should be updated and maybe completed, I would definitely recommend it to anyone whose interest lies in PGMs.

By Michael K

•

Dec 24, 2016

The course lectures are even better than PGM I, as it appears that Professor Koller has recorded some material recently that helps fill in small holes from the previously recorded lectures. Hopefully she'll have time to clean up PGM I in the near future for future students.

This course is another tour de force for debugging, though it definitely made me a better programmer (I'm intermediate). I wish the discussion boards were more active, and it's a shame that the mentors were missing in action. Also, the programming instructions were sometimes a bit vague, which made the assignments less like assignments and more like research projects. For these two reasons, the course is 4-star rather than 5-star.

Still, it's a lot better than trying to learn this out of the book by oneself. Some say enrollment has dropped off since they began charging for getting access to Quizzes and Programming Assignments. Or it may be attrition, as these are pretty challenging (and well taught) courses. I'm very happy to support this course financially, as it's loads cheaper than what I'd be paying if I were back at Stanford.

Like PGM I, I strongly recommend doing the Honors Programming Assignments, as it's really the way to learn the material well.

By george v

•

Nov 28, 2017

Great course, though really advanced. Would like a few more examples, especially regarding the coding. Worth it overall.

By Kaixuan Z

•

Dec 4, 2018

I hope to get some feedback about the homework or exams.

By Michel S

•

Jul 14, 2018

Good course, but the material really needs a refresh!

By Mahmoud S

•

Feb 22, 2019

The honors assignments contain code mistakes and are difficult to do! You are sifting through mistakes in the instructions along with mistakes in the supplied code!

By Sergey S

•

Sep 24, 2020

Again 5 stars! As a telecommunication and signal processing engineer, I was surprised how simply one can explain the Belief Propagation algorithm in terms of PGMs. I am very thankful to Daphne Koller that she also dedicated time to explaining BP in clique trees, which is the key to understanding BCJR (MIN-SUM or MAP or forward-backward algorithms). A few simple steps and one can rediscover how Kalman filters work (just follow the 2001 article "Factor graphs and the sum-product algorithm" by Kschischang, Frey and Loeliger) and dive deep into modern probabilistic decoding methods for error-correcting codes. This module was awesome!
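The reduction this review describes, sum-product belief propagation on a chain collapsing into the forward-backward algorithm, can be sketched in a few lines (a toy two-state, three-step example with invented numbers; not course code):

```python
# Forward-backward as sum-product message passing on a two-state chain.
trans = [[0.9, 0.1], [0.2, 0.8]]          # P(x_t | x_{t-1})
prior = [0.5, 0.5]                        # P(x_0)
evid = [[0.8, 0.1], [0.7, 0.2], [0.1, 0.9]]  # P(y_t | x_t), 3 observations
n = len(evid)

fwd = [[0.0, 0.0] for _ in range(n)]      # forward messages (alpha)
bwd = [[1.0, 1.0] for _ in range(n)]      # backward messages (beta)

fwd[0] = [prior[s] * evid[0][s] for s in range(2)]
for t in range(1, n):                     # forward (sum-product) pass
    fwd[t] = [sum(fwd[t - 1][u] * trans[u][s] for u in range(2)) * evid[t][s]
              for s in range(2)]
for t in range(n - 2, -1, -1):            # backward (sum-product) pass
    bwd[t] = [sum(trans[s][u] * evid[t + 1][u] * bwd[t + 1][u] for u in range(2))
              for s in range(2)]

# Posterior marginals P(x_t | y_1..y_n): normalized products of messages.
post = []
for t in range(n):
    unnorm = [fwd[t][s] * bwd[t][s] for s in range(2)]
    z = sum(unnorm)
    post.append([v / z for v in unnorm])

print(post)  # each row sums to 1
```

On a chain, the clique tree is itself a chain, so calibrating it with two message-passing sweeps is exactly the forward and backward recursions above.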

By Chan-Se-Yeun

•

Jan 30, 2018

I kind of like the teacher. She can always explain complicated things in a simple way, though the notes she writes on the slides are all in free style. Loopy belief propagation and dual decomposition are the best things I've learned in this course. I had met them before in some papers, but I found them extremely hard to understand then. Now I have some significant intuition about them and I'm ready to do further exploration. Anyway, I'll keep on learning course 3 to achieve my first little goal on Coursera.

By Rishi C

•

Oct 28, 2017

Perhaps the best introduction to AI/ML - especially for those who think "the future ain't what it used to be"; the mathematical techniques covered by the course form a toolkit which can be easily thought of as "core", i.e. a locus of strength which enables a wide universe of thinking about complex problems (many of which were correctly not thought to be tractable in practice until very recently!)...

By Dat N

•

Nov 20, 2019

The lectures are well detailed and the lecturer clearly explains many topics. The programming assignments are helpful in applying the learned concepts, but sometimes it takes a long time to figure out what the instructions really mean and how the code is structured. It was hard work, but after all I would like to give thanks for a great course, because I have learned a lot.

By satish p

•

Aug 28, 2020

Quite an enriching experience indeed. The material was relatively more dense when compared to course 1. The assignments were reasonably challenging, requiring some critical thinking and imagination to extend the concepts to scenarios that were unthought of while going through the videos and course content.

By Alfred D

•

Jul 29, 2020

Very good course, learnt a lot, but some of the videos were very long (avg. 23 mins); those were really taxing on the mind and had to be watched many times over longer breaks. But thanks to Prof. Daphne, she really got to the point for most of the topics discussed and also on the quiz questions.

By Ayush T

•

Aug 23, 2019

Just like the first course of the specialization, this course is really good. It is well organized and taught in the best way which really helped me to implement similar ideas for my projects.

By Lik M C

•

Feb 3, 2019

A great course! I have learned a lot of things. The lectures, quizzes and assignments clear up all the key concepts. Especially the assignments are wonderful!

By Orlando D

•

Mar 12, 2017

Thanks a lot to Professor D.K. for a great course on the PGM inference part. Really a very good starting point for PGM models and preparation for the learning part.

By Yang P

•

May 29, 2017

I learned a great deal from this course. It answered my questions from the representation course, and also deepened my understanding of PGMs.