Double backwards
PyTorch Developer Podcast
English - July 07, 2021 13:00 - 16 minutes - 15.2 MB - ★★★★★ - 35 ratings
Tags: Technology, deep learning, machine learning, pytorch
Previous Episode: Functional modules
Next Episode: Intro to distributed
Double backwards is PyTorch's way of implementing higher order differentiation. Why might you want it? How does it work? What are some of the weird things that happen when you do this?
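For concreteness, here is a minimal sketch of what double backwards looks like from the user's side (the toy function y = x³ is my own illustration, not from the episode): passing `create_graph=True` to `torch.autograd.grad` records the first backward pass into the autograd graph, so the resulting gradient can itself be differentiated.

```python
import torch

# Toy example: compute the second derivative of y = x**3 at x = 3.
x = torch.tensor(3.0, requires_grad=True)
y = x ** 3

# First derivative: dy/dx = 3x^2. create_graph=True makes the backward
# pass itself differentiable, which is what enables double backwards.
(first,) = torch.autograd.grad(y, x, create_graph=True)
print(first)  # tensor(27., grad_fn=...) since 3 * 3**2 = 27

# Second derivative: d^2y/dx^2 = 6x, obtained by backpropagating
# through the graph built by the first backward pass.
(second,) = torch.autograd.grad(first, x)
print(second)  # tensor(18.) since 6 * 3 = 18
```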
Further reading:
The epic PR that initially added double backwards support for convolution: https://github.com/pytorch/pytorch/pull/1643