![The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence) artwork](https://is1-ssl.mzstatic.com/image/thumb/Podcasts113/v4/39/58/c6/3958c6ce-86e4-3b80-bfb9-840e1dfd7e4b/mza_491361902049110775.png/100x100bb.jpg)
Creating Robust Language Representations with Jamie Macbeth - #477
The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)
English - April 21, 2021 21:11 - 40 minutes - ★★★★★ - 323 ratings
Today we’re joined by Jamie Macbeth, an assistant professor in the department of computer science at Smith College.
In our conversation with Jamie, we explore his work at the intersection of cognitive systems and natural language understanding, and how he uses AI as a vehicle for better understanding human intelligence. We discuss the tie that binds these domains together, whether his tasks are the same as traditional NLU tasks, and the specific questions he’s trying to gain deeper insight into.
One of the unique aspects of Jamie’s research is that he takes an “old-school AI” approach, and to that end, we discuss the models he handcrafts to generate language. Finally, we examine how he evaluates the performance of his representations given that he’s not playing the SOTA “game,” what he benchmarks against, how he identifies deficiencies in deep learning systems, and the exciting directions for his upcoming research.
The complete show notes for this episode can be found at https://twimlai.com/go/477.