![Machine Learning Street Talk (MLST) artwork](https://is4-ssl.mzstatic.com/image/thumb/Podcasts123/v4/93/38/03/933803c5-1b89-d74b-7186-27ed93801348/mza_5670221076398063789.jpg/100x100bb.jpg)
#65 Prof. PEDRO DOMINGOS [Unplugged]
Machine Learning Street Talk (MLST)
English - February 26, 2022 00:27 - 1 hour - 121 MB - Technology
Note: no politics are discussed in this show, and please do not interpret it as any kind of political statement from us. We have decided not to discuss politics on MLST anymore due to its divisive nature.
Patreon: https://www.patreon.com/mlst
Discord: https://discord.gg/HNnAwSduud
[00:00:00] Intro
[00:01:36] What we all need to understand about machine learning
[00:06:05] The Master Algorithm Target Audience
[00:09:50] Deeply Connected Algorithms seen from Divergent Frames of Reference
[00:12:49] There is a Master Algorithm; and it's mine!
[00:14:59] The Tribe of Evolution
[00:17:17] Biological Inspirations and Predictive Coding
[00:22:09] Shoe-Horning Gradient Descent
[00:27:12] Sparsity at Training Time vs Prediction Time
[00:30:00] World Models and Predictive Coding
[00:33:24] The Cartoons of System 1 and System 2
[00:40:37] AlphaGo Searching vs Learning
[00:45:56] Discriminative Models evolve into Generative Models
[00:50:36] Generative Models, Predictive Coding, GFlowNets
[00:55:50] Sympathy for a Thousand Brains
[00:59:05] A Spectrum of Tribes
[01:04:29] Causal Structure and Modelling
[01:09:39] Entropy and The Duality of Past vs Future, Knowledge vs Control
[01:16:14] A Discrete Universe?
[01:19:49] And yet continuous models work so well
[01:23:31] Finding a Discretised Theory of Everything