Probabilistic Deep Learning with TensorFlow 2

Imperial College London via Coursera

Go to Course: https://www.coursera.org/learn/probabilistic-deep-learning-with-tensorflow2

Introduction

### Course Review: Probabilistic Deep Learning with TensorFlow 2

In the ever-evolving field of machine learning, a deep understanding of probabilistic approaches to deep learning is indispensable, particularly when it comes to quantifying uncertainty in real-world datasets. Coursera's course, **Probabilistic Deep Learning with TensorFlow 2**, caters to those eager to deepen their knowledge while applying cutting-edge concepts in a practical environment. If you are looking to build on foundational skills in TensorFlow and delve into the nuances of probabilistic modelling, this course is an essential step on your learning journey.

#### Course Overview

The course builds on the skills gained from earlier courses in the specialisation, turning foundational knowledge into practical applications of probabilistic methods. Given the increasing importance of handling uncertainty in deep learning applications, especially in critical areas like autonomous driving and medical diagnosis, the course is timely and relevant.

#### Syllabus Breakdown

1. **TensorFlow Distributions**
   The journey begins with the TensorFlow Probability (TFP) library, which provides powerful tools for probabilistic modelling. In this week you will dive into Distribution objects, learning to sample from various distributions and compute probabilities. A hands-on programming assignment on the Iris dataset solidifies these concepts by implementing a Naive Bayes classifier, introducing the core principles of probabilistic reasoning in machine learning.

2. **Probabilistic Layers and Bayesian Neural Networks**
   This week explores the importance of accounting for uncertainty within deep learning models. While traditional deep learning models often overlook uncertainty, this module focuses on training Bayesian neural networks using TFP's probabilistic layers. The programming assignment challenges you to develop a Bayesian CNN for the MNIST and MNIST-C datasets, enhancing your ability to recognise patterns while quantifying prediction uncertainty, which is essential in high-stakes scenarios.

3. **Bijectors and Normalising Flows**
   Building on prior knowledge, the course introduces normalising flows, which transform a simple base distribution into a complex data distribution through bijective transformations. This week equips you to implement and inspect these transformations using bijector objects from TFP. You will apply your skills in a programming assignment that involves developing a RealNVP normalising flow model on the LSUN bedroom dataset, deepening your grasp of generative modelling.

4. **Variational Autoencoders (VAEs)**
   One of the most important parts of the course, this week focuses on VAEs, which are instrumental in generative modelling. You will learn how to use TFP to jointly train an encoder and a decoder. The programming assignment tasks you with building a variational autoencoder for a celebrity faces dataset, giving you an understanding of both latent-space compression and data generation, a key capability in many machine learning applications.

5. **Capstone Project**
   The course culminates in a capstone project that challenges you to integrate the techniques learned throughout the weeks. You will create a synthetic image dataset using normalising flows and train a variational autoencoder on it. This project not only consolidates your understanding but also provides end-to-end experience in developing and implementing probabilistic models from scratch.

#### Recommendations

I highly recommend **Probabilistic Deep Learning with TensorFlow 2** to anyone looking to deepen their understanding of deep learning and of how uncertainty can drastically affect predictions in real-world applications. The course is structured logically, taking you gradually from foundational concepts to complex applications, making it suitable for intermediate learners who are comfortable with TensorFlow basics. The blend of theoretical background, practical programming assignments, and a capstone project offers both depth and breadth. Moreover, the relevance of the topics covered means that participants will be well equipped for roles in data science, machine learning engineering, and research positions focused on deep learning in uncertain environments. In conclusion, if you are ready to confront the challenges of uncertainty in data while building practical skills in TensorFlow and probabilistic deep learning, this course promises a rewarding and comprehensive educational experience. Enroll today and take a giant leap toward mastering the art of probabilistic deep learning!

Syllabus

TensorFlow Distributions

Probabilistic modelling is a powerful and principled approach that provides a framework in which to take account of uncertainty in the data. The TensorFlow Probability (TFP) library provides tools for developing probabilistic models that extend the capability of TensorFlow. In this first week of the course, you will learn how to use the Distribution objects in TFP, and the key methods to sample from and compute probabilities from these distributions. You will also learn how to make these distributions trainable. The programming assignment for this week will put these techniques into practice by implementing a Naive Bayes classifier on the Iris dataset.
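To give a feel for the week's core idea, here is a minimal Gaussian Naive Bayes classifier written in plain NumPy rather than with TFP's Distribution objects; the two-class toy data stands in for the Iris features, and all function names are illustrative, not from the course materials:

```python
import numpy as np

def gaussian_logpdf(x, mu, sigma):
    """Log-density of a univariate normal; conceptually, what a TFP
    Distribution's log_prob method returns."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

def fit_naive_bayes(X, y):
    """Per-class feature means/stds plus log class priors."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), Xc.std(axis=0) + 1e-6,
                     np.log(len(Xc) / len(X)))
    return params

def predict(X, params):
    """Pick the class with the highest joint log-probability per example."""
    scores = [gaussian_logpdf(X, mu, sigma).sum(axis=1) + log_prior
              for mu, sigma, log_prior in params.values()]
    return np.array(list(params.keys()))[np.argmax(scores, axis=0)]

# Two well-separated toy classes standing in for Iris features.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(5.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
params = fit_naive_bayes(X, y)
print(predict(X, params)[:5])  # almost all of the first block is class 0
```

Working in log space avoids numerical underflow when many per-feature probabilities are multiplied, which is also why TFP exposes `log_prob` directly.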

Probabilistic layers and Bayesian neural networks

Accounting for sources of uncertainty is an important aspect of the modelling process, especially for safety-critical applications such as medical diagnoses. Most standard deep learning models do not quantify the uncertainty in their predictions. In this week you will learn how to use probabilistic layers from TensorFlow Probability to develop deep learning models that are able to provide measures of uncertainty in both the data and the model itself. In the programming assignment for this week, you will develop a Bayesian CNN for the MNIST and MNIST-C datasets.
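A rough sketch of how a Bayesian layer yields uncertainty estimates, in plain NumPy rather than with TFP's probabilistic layers: the weights of a linear layer are treated as a distribution (here an assumed, hand-set Gaussian posterior; in the course it would come from training), and predictions are averaged over weight samples. The spread across samples is the model-uncertainty signal. All values below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Assumed Gaussian posterior over a single linear layer's weights
# (a trained Bayesian layer would learn these means and stds).
w_mean = np.array([[2.0, -2.0], [-2.0, 2.0]])  # (features, classes)
w_std = 0.3 * np.ones_like(w_mean)

def predict_bayesian(x, n_samples=200):
    """Monte Carlo average of predictions over sampled weights."""
    probs = []
    for _ in range(n_samples):
        w = rng.normal(w_mean, w_std)             # one posterior sample
        probs.append(softmax(x @ w))
    probs = np.stack(probs)                       # (samples, batch, classes)
    return probs.mean(axis=0), probs.std(axis=0)  # prediction + spread

x = np.array([[1.0, 0.0],    # clearly class 0
              [0.5, 0.5]])   # ambiguous: lies between the classes
mean, std = predict_bayesian(x)
print(np.round(mean, 2))
print(np.round(std, 2))  # larger spread flags higher model uncertainty
```

The ambiguous input produces a noticeably larger standard deviation across weight samples than the clear-cut one, which is exactly the kind of signal a standard point-estimate network cannot provide.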

Bijectors and normalising flows

Normalising flows are a powerful class of generative models that aim to model the underlying data distribution by transforming a simple base distribution through a series of bijective transformations. In this week you will learn how to use bijector objects from the TensorFlow Probability library to implement these transformations, and learn a complex transformed distribution from data. These models can be used to generate new samples, as well as to evaluate the likelihood of data examples. In the programming assignment for this week, you will develop a RealNVP normalising flow model for the LSUN bedroom dataset.
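The machinery underneath every bijector is the change-of-variables formula: if x = f(z) for an invertible f, then log p(x) = log p_base(f⁻¹(x)) + log|det J of f⁻¹|. Here is a minimal NumPy sketch using the exp bijector, which turns a standard normal base into a log-normal; RealNVP chains many such invertible maps (affine coupling layers) with learned parameters. The function names are illustrative:

```python
import numpy as np

def base_log_prob(z):
    """Standard normal base distribution (log-density)."""
    return -0.5 * np.log(2 * np.pi) - 0.5 * z**2

# A bijector is an invertible map with a tractable Jacobian. Here:
# forward f(z) = exp(z), inverse f^-1(x) = log(x),
# log|det Jacobian of the inverse| = -log(x).
def transformed_log_prob(x):
    """Change-of-variables formula used by normalising flows:
    log p(x) = log p_base(f^-1(x)) + log|det d f^-1/dx|."""
    z = np.log(x)
    log_det_inv = -np.log(x)
    return base_log_prob(z) + log_det_inv

def sample(n, rng):
    """Sampling pushes base samples through the forward map."""
    return np.exp(rng.standard_normal(n))

rng = np.random.default_rng(2)
xs = sample(5, rng)
print(transformed_log_prob(xs))  # exact log-likelihoods of the samples
```

This is why flows can both generate data (push samples forward) and score data (pull examples back and add the log-determinant correction), the two capabilities the paragraph above mentions.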

Variational autoencoders

Variational autoencoders are one of the most popular types of likelihood-based generative deep learning models. In the VAE algorithm two networks are jointly learned: an encoder (inference) network and a decoder (generative) network. In this week you will learn how to implement the VAE using the TensorFlow Probability library. You will then use the trained networks to encode data examples into a compressed latent space, as well as generate new samples from the prior distribution and the decoder. In the programming assignment for this week, you will develop a variational autoencoder for an image dataset of celebrity faces.
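Two pieces of math sit at the heart of VAE training: the reparameterisation trick, which keeps sampling differentiable, and the closed-form KL divergence between a diagonal-Gaussian encoder and the standard normal prior. A minimal NumPy sketch of both (a real VAE would use the decoder's likelihood as the reconstruction term; squared error is a stand-in here, and the toy encoder outputs are made up):

```python
import numpy as np

rng = np.random.default_rng(3)

def kl_diag_gaussian(mu, log_var):
    """Closed-form KL(q(z|x) || N(0, I)) for a diagonal-Gaussian encoder,
    the regulariser in the VAE objective."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)

def reparameterise(mu, log_var):
    """z = mu + sigma * eps keeps sampling differentiable in mu, log_var."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def negative_elbo(x, x_recon, mu, log_var):
    """Per-example loss: reconstruction term (squared error as a stand-in
    for the decoder likelihood) plus the KL term."""
    recon = np.sum((x - x_recon) ** 2, axis=-1)
    return recon + kl_diag_gaussian(mu, log_var)

# Toy encoder outputs for a batch of 2 examples, latent dimension 3.
mu = np.array([[0.0, 0.0, 0.0], [1.0, -1.0, 0.5]])
log_var = np.zeros_like(mu)
z = reparameterise(mu, log_var)
print(kl_diag_gaussian(mu, log_var))  # [0.    1.125]: KL is 0 only at the prior
```

Minimising the negative ELBO therefore trades off reconstruction quality against keeping the latent code close to the prior, which is what makes sampling new data from the prior plus decoder work.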

Capstone Project

In this course you have learned how to develop probabilistic deep learning models using tools and concepts from the TensorFlow Probability library such as Distribution objects, probabilistic layers, bijectors, and KL divergence optimisation. The Capstone Project brings many of these concepts together with a task to create a synthetic image dataset using normalising flows, and train a variational autoencoder on the dataset.

Overview

Welcome to this course on Probabilistic Deep Learning with TensorFlow! This course builds on the foundational concepts and skills for TensorFlow taught in the first two courses in this specialisation, and focuses on the probabilistic approach to deep learning. This is an increasingly important area of deep learning that aims to quantify the noise and uncertainty that is often present in real world datasets. This is a crucial aspect when using deep learning models in applications such as autonomous driving.

Skills

Generative Model, Tensorflow, Probabilistic Programming Language (PRPL), Deep Learning, Probabilistic Neural Network

Reviews

A really valuable learning experience. With these courses, I now feel confident that I can apply the skills from the Deep Learning Specialization in a practical setting.

Very easy understanding, great for getting practice on TF probability

Really interesting and well thought out. I wish there were more advanced courses like that

Really good course touching some really recent research in deep learning.

Very good. Liked this course a lot, even though I recognize I should have had a better background before taking it.