Previous Episode: Code generation
Next Episode: Mobile selective build

What goes into the implementation of torch.nn? Why do NN modules exist in the first place? What's the function of Parameter? How do modules actually track all the parameters in question? What is all of the goop in the top-level NN module class? What are some new developments in torch.nn modules? What are some open problems with our modules?

Further reading:

- Implementation of nn.Module: https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/module.py
- nn.Module is complicated, and that means it's sometimes a bit slow. Some analysis at https://dev-discuss.pytorch.org/t/overhead-in-nn-module-causing-massive-slowdowns-compared-to-raw-cublas-or-torchscript/110
- Lazy modules PR https://github.com/pytorch/pytorch/pull/44538 and factory kwargs PR https://github.com/pytorch/pytorch/pull/54508

Liner notes:

- python for hackability (the C++ frontend is a reimplementation)
- parameters: parameter collection (for optimization)
- buffers: not considered optimizable
- modules: functorial operation (_apply)
- jit script: staged computation (__init__ is not scripted)
- __call__ to forward (extra instrumentation)
- serialization / state_dict
- new stuff: device kwarg (Joel Schlosser)
- new stuff: lazy modules (emcastillo)
- open problems: parameter initialization

Sketches of several of these mechanisms follow below.
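Parameter collection, in a minimal sketch (this is not the real nn.Linear, which also does proper initialization and reset_parameters): assigning an nn.Parameter attribute is intercepted by Module.__setattr__ and recorded in the module's _parameters dict, which is what .parameters() and named_parameters() walk when you hand a module to an optimizer.

    import torch
    import torch.nn as nn

    class MyLinear(nn.Module):
        def __init__(self, in_features, out_features):
            super().__init__()
            # Plain tensors would be invisible to .parameters(); wrapping in
            # nn.Parameter makes __setattr__ stash them in self._parameters.
            self.weight = nn.Parameter(torch.randn(out_features, in_features))
            self.bias = nn.Parameter(torch.zeros(out_features))

        def forward(self, x):
            return x @ self.weight.t() + self.bias

    m = MyLinear(3, 2)
    print([name for name, _ in m.named_parameters()])  # ['weight', 'bias']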
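Buffers ride along with the module (they show up in state_dict and get moved by .to()), but .parameters() does not report them, so the optimizer never touches them. A toy running-mean module, in the same spirit as BatchNorm's running statistics:

    import torch
    import torch.nn as nn

    class RunningMean(nn.Module):
        def __init__(self, dim):
            super().__init__()
            self.register_buffer("mean", torch.zeros(dim))

        def forward(self, x):
            # Buffers are updated by hand, not by autograd/optimizer.
            with torch.no_grad():
                self.mean.lerp_(x.mean(0), 0.1)
            return x - self.mean

    m = RunningMean(4)
    m(torch.randn(8, 4))
    print(list(m.parameters()))  # [] -- nothing for the optimizer
    print(list(m.state_dict()))  # ['mean'] -- but it is serialized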
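The functorial flavor of _apply: methods like .to(), .cuda(), and .double() are thin wrappers that map a function over every parameter and buffer, recursing into child modules. A hand-rolled equivalent of .double() (note _apply is private and can change between releases; the public entry points are .to() and friends):

    import torch
    import torch.nn as nn

    def to_double(module: nn.Module) -> nn.Module:
        # _apply maps fn over parameters/buffers and recurses into children.
        return module._apply(lambda t: t.double() if t.is_floating_point() else t)

    m = nn.Sequential(nn.Linear(3, 3), nn.ReLU())
    to_double(m)
    print(next(m.parameters()).dtype)  # torch.float64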
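m(x) is not quite m.forward(x): Module.__call__ wraps forward with extra instrumentation, most visibly the hook machinery (this per-call bookkeeping is also part of the overhead discussed in the dev-discuss thread above):

    import torch
    import torch.nn as nn

    m = nn.Linear(3, 2)

    def log_shapes(module, inputs, output):
        print(f"{type(module).__name__}: {tuple(inputs[0].shape)} -> {tuple(output.shape)}")

    # __call__ runs registered hooks around forward; calling forward directly would not.
    handle = m.register_forward_hook(log_shapes)
    m(torch.randn(5, 3))  # prints: Linear: (5, 3) -> (5, 2)
    handle.remove()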
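Serialization goes through state_dict, a flat name-to-tensor mapping; the usual pattern is to save that rather than pickling the module object itself:

    import torch
    import torch.nn as nn

    m = nn.Linear(3, 2)
    torch.save(m.state_dict(), "linear.pt")       # just the tensors, keyed by name

    m2 = nn.Linear(3, 2)                          # rebuild the structure in code...
    m2.load_state_dict(torch.load("linear.pt"))   # ...then restore the state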
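The two "new stuff" items, assuming a recent PyTorch (lazy modules arrived around 1.8, factory kwargs around 1.9): the device/dtype factory kwargs construct parameters directly where you want them instead of allocating on CPU and moving afterwards, and lazy modules defer shape inference to the first forward call:

    import torch
    import torch.nn as nn

    # Factory kwargs: parameters are created directly with this device/dtype.
    m = nn.Linear(3, 2, device="cpu", dtype=torch.float64)

    # Lazy module: in_features is unknown until the first batch arrives.
    lazy = nn.LazyLinear(out_features=2)
    lazy(torch.randn(5, 3))      # first call materializes the weight as (2, 3)
    print(lazy.weight.shape)     # torch.Size([2, 3])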