Matrix operations with PyTorch – optimizer – addendum
This blog post is an addendum to a three-post miniseries [1]. Here we present two notebooks. 02a—SVD-with-pytorch-optimizer-SGD.ipynb is a drop-in replacement for the stochastic gradient + momentum method shown earlier [2], but using the built-in PyTorch SGD optimizer. 02b—SVD-with-pytorch-optimizer-adam.ipynb uses the built-in PyTorch Adam optimizer rather than SGD …
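As a rough illustration of what those notebooks do (not their exact code), the sketch below approximates a matrix with a low-rank factorization by minimizing the reconstruction error with torch.optim.SGD plus momentum; swapping in torch.optim.Adam is a one-line change. The matrix sizes, rank, learning rates, and loss are assumptions made for this example.

```python
import torch

# Minimal sketch: factor A ≈ U @ V.T by gradient descent on the
# squared reconstruction error. Shapes and hyperparameters are
# illustrative assumptions, not the notebooks' actual settings.
torch.manual_seed(0)
m, n, k = 100, 60, 10
A = torch.randn(m, n)

U = torch.randn(m, k, requires_grad=True)
V = torch.randn(n, k, requires_grad=True)

# Built-in SGD optimizer with momentum (as in 02a);
# for the Adam variant (as in 02b), swap in:
# opt = torch.optim.Adam([U, V], lr=1e-2)
opt = torch.optim.SGD([U, V], lr=1e-3, momentum=0.9)

for step in range(1000):
    opt.zero_grad()
    loss = ((U @ V.T - A) ** 2).mean()  # mean squared reconstruction error
    loss.backward()
    opt.step()

print("final reconstruction loss:", loss.item())
```

The point of the two notebooks is exactly this swap: once the loss and parameters are set up, the hand-rolled momentum update can be replaced by PyTorch's optimizer objects without changing the rest of the training loop.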