“The first duty in life is to be as artificial as possible. What the second duty is no one has as yet discovered.” Oscar Wilde
 
Matrix operations with pytorch – optimizer – addendum

This blog post is an addendum to a 3-part miniseries [1].

Here we present 2 notebooks:

  • 02a—SVD-with-pytorch-optimizer-SGD.ipynb

This is a drop-in replacement for the stochastic gradient + momentum method shown earlier [2], but it uses the built-in pytorch SGD optimizer.

  • 02b—SVD-with-pytorch-optimizer-adam.ipynb

This uses the built-in pytorch Adam optimizer rather than the SGD optimizer. As reported in the literature, Adam often gives better results [3].

So here's the notebook using the built-in pytorch SGD optimizer.
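To give the flavour of that notebook, here is a minimal sketch of the core idea: approximating the SVD of a matrix by minimizing the reconstruction error with torch.optim.SGD. The matrix size, factor shapes, and hyperparameters (lr, momentum, step count) are assumptions on my part, not values taken from the notebook; and without orthogonality constraints this yields a low-rank factorization in the spirit of the SVD rather than the canonical decomposition.

    import torch

    torch.manual_seed(0)
    m, n, k = 6, 4, 4                          # assumed matrix size and rank
    M = torch.randn(m, n)                      # target matrix to factorize

    # trainable factors: U (m x k), S (k), V (n x k)
    U = torch.randn(m, k, requires_grad=True)
    S = torch.randn(k, requires_grad=True)
    V = torch.randn(n, k, requires_grad=True)

    # built-in SGD optimizer with momentum, replacing the hand-rolled update
    optimizer = torch.optim.SGD([U, S, V], lr=0.02, momentum=0.9)

    for step in range(5000):
        optimizer.zero_grad()                  # clear accumulated gradients
        M_hat = U @ torch.diag(S) @ V.T        # reconstruction from the factors
        loss = ((M - M_hat) ** 2).mean()       # mean squared reconstruction error
        loss.backward()                        # backpropagate
        optimizer.step()                       # update U, S, V

    print(f"final reconstruction error: {loss.item():.6f}")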

And here's the notebook using the built-in pytorch Adam optimizer.
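For comparison, a corresponding minimal sketch with the built-in Adam optimizer. It is identical to the SGD sketch above except for the optimizer construction; lr=0.01 and the step count are again assumptions, not values from the notebook.

    import torch

    torch.manual_seed(0)
    M = torch.randn(6, 4)
    U = torch.randn(6, 4, requires_grad=True)
    S = torch.randn(4, requires_grad=True)
    V = torch.randn(4, 4, requires_grad=True)

    # the only change: Adam adapts a per-parameter learning rate
    optimizer = torch.optim.Adam([U, S, V], lr=0.01)

    for step in range(5000):
        optimizer.zero_grad()
        loss = ((M - U @ torch.diag(S) @ V.T) ** 2).mean()
        loss.backward()
        optimizer.step()

    print(f"final reconstruction error: {loss.item():.6f}")

Because Adam rescales each parameter's update by running estimates of the gradient moments, it is far less sensitive to the choice of learning rate than plain SGD with momentum, which is one reason it tends to converge in fewer steps on this kind of problem.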

  1. Matrix operations with pytorch – optimizer – part 1. Arthought. https://arthought.com/matrix-operations-with-pytorch-optimizer-part-1/. Published February 7, 2020. Accessed February 7, 2020.
  2. Matrix operations with pytorch – optimizer – part 3. Arthought. https://arthought.com/matrix-operations-with-pytorch-optimizer-part-3/. Published February 7, 2020. Accessed February 7, 2020.
  3. Stochastic gradient descent. Wikipedia. https://en.wikipedia.org/wiki/Stochastic_gradient_descent. Published February 6, 2020. Accessed February 6, 2020.