Batch Inference using Azure Machine Learning | AI Show
Channel 9
English - November 21, 2019 18:00 - 9 minutes - 9.06 MB - ★★★★ - 38 ratings
In this episode we cover a quick overview of the new batch inference capability, which allows Azure Machine Learning users to run inference on large-scale datasets in a secure, scalable, performant, and cost-effective way by fully leveraging the power of the cloud.
[00:21] Context on Inference
[02:00] Handling High Volume Workloads
[03:05] ParallelRunStep Intro
[03:53] Support for Structured and Unstructured data
[04:14] Demo walkthrough
[06:17] ParallelRunStep Config
[07:40] Pre and Post Processing
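The ParallelRunStep capability discussed in the episode fans a large dataset out across workers as mini-batches, each handled by a user-supplied entry script with an init()/run() contract: init() runs once per worker (typically to load the model), and run() is called once per mini-batch. The sketch below illustrates that contract only; the "model", file handling, and pre/post-processing shown are illustrative assumptions, not code from the episode.

```python
# Minimal sketch of a ParallelRunStep-style entry script.
# For a file dataset, each mini-batch is a list of file paths,
# and run() returns one result per input item.

def init():
    # Load the model once per worker process. Placeholder "model":
    # scores a file by the length of its stripped contents.
    global model
    model = lambda text: len(text)

def run(mini_batch):
    results = []
    for path in mini_batch:
        with open(path) as f:
            text = f.read().strip()        # pre-processing
        score = model(text)                # inference
        results.append(f"{path},{score}")  # post-processing: one row per file
    return results
```

In a real pipeline, the script name, mini-batch size, and output handling are set through the step's configuration (covered at [06:17] in the episode) rather than in the script itself.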
Learn More:
Batch Inference Documentation
Batch Inference Notebooks
The AI Show's Favorite links:
Don't miss new episodes, subscribe to the AI Show
Create a Free account (Azure)
AI Blog
Fast ML
MIT News | AI
Medium | Francesca Lazzeri
Deep Learning vs. Machine Learning
Get Started with Machine Learning