This week we had a super insightful conversation with Jordan Edwards, Principal Program Manager for the AzureML team! Jordan is at the coalface of turning machine learning software engineering into a reality for some of Microsoft's largest customers.


ML DevOps is all about increasing the velocity of, and orchestrating the non-interactive phases of, software deployments for ML. We cover ML DevOps and Microsoft Azure ML. We discuss model governance, testing, interpretability, and tooling. We cover the age-old dichotomy between science and engineering and how you can bridge the gap with ML DevOps. We also cover Jordan's maturity model for ML DevOps.


We also cover some of the exciting ML announcements from the recent Microsoft Build conference, e.g. FairLearn, InterpretML, SEAL, WhiteNoise, OpenAI code generation, and OpenAI GPT-3.


00:00:04 Introduction to ML DevOps and Microsoft Build ML Announcements


00:10:29 Main show kick-off


00:11:06 Jordan's story


00:14:36 Typical ML DevOps workflow


00:17:38 Tim's articulation of ML DevOps


00:19:31 Interpretability / Fairness


00:24:31 Testing / Robustness


00:28:10 Using GANs to generate testing data


00:30:26 Gratuitous DL?


00:33:46 Challenges of making an ML DevOps framework / IaaS


00:38:48 Cultural battles in ML DevOps


00:43:04 Maturity Model for ML DevOps


00:49:19 The "Machine Learning: The High-Interest Credit Card of Technical Debt" paper


00:50:19 ML Engineering at Microsoft


01:01:20 MLflow


01:03:05 Company-wide governance 


01:08:15 What's coming next


01:12:10 Jordan's hilarious piece of advice for his younger self




Super happy with how this turned out; this is not one to miss, folks!


#deeplearning #machinelearning #devops #mldevops