Is GPT-3 Artificial Intelligence's new mind? Dr. Jared Kaplan is a theoretical physicist who has recently been working on Generative Pre-trained Transformer 3 (GPT-3), an autoregressive language model that uses deep learning to produce human-like text. We all use a limited version of this technology when we ask Google a question or when Gmail autocompletes our email. However, this text generator is vastly more powerful: it can write poetry, complete legal documents, and write computer code.

Jared Kaplan is a theoretical physicist with interests in quantum gravity, holography, and conformal field theory, as well as effective field theory, particle physics, and cosmology. He is also working on topics at the interface between physics and machine learning.

In the last few years he has also been collaborating with both physicists and computer scientists on Machine Learning research, including on scaling laws for neural models and the GPT-3 language model. His goal is to understand these systems and to help make them safe and beneficial.

We begin the podcast by discussing the difference between Newton's conception of gravity and Einstein's theory of general relativity. Then we delve into the subjects of quantum gravity and the curvature of space. We also discuss why theoretical physics is relevant to our understanding of reality.

In this podcast we ask Dr. Kaplan whether advances in AI technology could make GPT-3 Artificial Intelligence's new mind. In other words, could this lead to a kind of sentience for AI, giving it a brain much like ours? Dr. Kaplan has been working with OpenAI on scaling laws for neural models and the GPT-3 language model. OpenAI is an artificial intelligence research laboratory consisting of the for-profit corporation OpenAI LP and its parent company, the non-profit OpenAI Inc.