![PyTorch Developer Podcast artwork](https://is3-ssl.mzstatic.com/image/thumb/Podcasts115/v4/5d/4e/01/5d4e0127-9482-b8e3-6f59-59eaf50a21d9/mza_11274935194810526674.jpg/100x100bb.jpg)
__torch_function__
PyTorch Developer Podcast
English - June 02, 2021 13:00 - 17 minutes - 15.6 MB - ★★★★★ - 35 ratings
Tags: Technology, deep learning, machine learning, pytorch
Previous Episode: TensorIterator
Next Episode: Why is autograd so complicated
What is __torch_function__? Why would I want to use it? What does it have to do with keeping extra metadata on Tensors or torch.fx? How is it implemented? Why is __torch_function__ a really popular way of extending functionality in PyTorch? What makes it different from the dispatcher extensibility mechanism? What are some downsides of it being written this way? What are we doing about it?
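As a taste of the mechanism discussed in the episode, here is a minimal sketch of using `__torch_function__` to carry extra metadata on a tensor-like object, following the wrapper-class pattern from the "Extending PyTorch" docs linked below. The class name `MetadataTensor` and the metadata-propagation rule (keep the first wrapped argument's metadata) are illustrative choices, not anything prescribed by PyTorch:

```python
import torch

# Sketch: a wrapper type that intercepts torch.* calls via __torch_function__
# so extra metadata rides along with the underlying tensor data.
class MetadataTensor:
    def __init__(self, data, metadata=None):
        self._t = torch.as_tensor(data)
        self.metadata = metadata

    @classmethod
    def __torch_function__(cls, func, types, args=(), kwargs=None):
        if kwargs is None:
            kwargs = {}
        # Collect metadata, then unwrap so func sees plain Tensors.
        metadatas = [a.metadata for a in args if isinstance(a, MetadataTensor)]
        args = [a._t if isinstance(a, MetadataTensor) else a for a in args]
        ret = func(*args, **kwargs)
        # Re-wrap the result, keeping the first argument's metadata
        # (an arbitrary propagation policy chosen for this example).
        return MetadataTensor(ret, metadata=metadatas[0] if metadatas else None)

x = MetadataTensor([1.0, 2.0], metadata={"source": "episode demo"})
y = torch.add(x, torch.ones(2))
print(y.metadata)  # the metadata survives the torch.add call
```

Because `torch.add` consults the `__torch_function__` protocol on its arguments, the call is routed through `MetadataTensor.__torch_function__` even though `MetadataTensor` is not a `Tensor` subclass; this is the hook that makes the approach popular for layering functionality on top of the existing `torch` API.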
Further reading:
- `__torch_function__` RFC: https://github.com/pytorch/rfcs/blob/master/RFC-0001-torch-function-for-methods.md
- One of the original GitHub issues tracking the overall design discussion: https://github.com/pytorch/pytorch/issues/24015
- Documentation for using `__torch_function__`: https://pytorch.org/docs/stable/notes/extending.html#extending-torch