“The first duty in life is to be as artificial as possible. What the second duty is no one has as yet discovered.” Oscar Wilde
 
Matrix operations with pytorch – optimizer – part 3

SVD with pytorch optimizer

This blog post is part of a three-post miniseries.

Today's post covers SVD with the pytorch optimizer.

The point of the entire miniseries is to reproduce matrix operations such as the matrix inverse and SVD using pytorch's automatic differentiation capability.

These algorithms are already implemented in pytorch itself and in other libraries such as scikit-learn. However, we will solve these problems in a general way using gradient descent. We hope that this will provide an understanding of the power of the gradient method in general and the capabilities of pytorch in particular.
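To illustrate the general idea before diving into the notebook, here is a minimal sketch (not code from the notebook; the matrix `A` is an arbitrary well-conditioned example) of how the matrix inverse from part 2 can be recovered by gradient descent: we treat the candidate inverse `X` as a learnable tensor and minimise the squared reconstruction error ‖AX − I‖².

```python
import torch

torch.manual_seed(0)

# Illustrative, well-conditioned 3x3 matrix (not taken from the notebook).
A = torch.randn(3, 3) + 3 * torch.eye(3)
I = torch.eye(3)

# X is our candidate inverse, found by minimising ||A X - I||^2.
X = torch.zeros(3, 3, requires_grad=True)
optimizer = torch.optim.Adam([X], lr=0.05)

for step in range(3000):
    optimizer.zero_grad()
    loss = ((A @ X - I) ** 2).sum()
    loss.backward()
    optimizer.step()

# After training, A @ X should be close to the identity matrix.
print(loss.item())
```

Because the loss is a convex quadratic in `X`, gradient descent reliably drives it towards zero for an invertible, well-conditioned `A`.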

To avoid reader fatigue, we present the material in 3 posts:

  • An introductory section: pytorch – playing with tensors demonstrates some basic tensor usage [1]. This notebook also shows how to calculate various derivatives.
  • A main section: pytorch – matrix inverse with pytorch optimizer shows how to calculate the matrix inverse [2] using gradient descent.
  • An advanced section: SVD with pytorch optimizer shows how to do singular value decomposition [3,4] with gradient descent.

Some background information on gradient descent can be found here [5,6]. A post with a similar, albeit slightly more mathematical, character can be found here [7]. Some more advanced material can be found here [8].
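The same gradient-descent recipe carries over to today's topic. As a minimal sketch (again not code from the notebook; the matrix, penalty weights, and learning rate are illustrative choices), one can parameterise the factors U, S, and V directly and minimise the reconstruction error, with penalty terms nudging U and V towards orthogonality:

```python
import torch

torch.manual_seed(0)

# Illustrative 3x3 target matrix (not taken from the notebook).
A = torch.randn(3, 3)

# Factors to learn: U and V (should become orthogonal) and the diagonal of S.
U = torch.randn(3, 3, requires_grad=True)
S = torch.randn(3, requires_grad=True)
V = torch.randn(3, 3, requires_grad=True)

optimizer = torch.optim.Adam([U, S, V], lr=0.02)
I = torch.eye(3)

for step in range(5000):
    optimizer.zero_grad()
    # Reconstruction error plus penalties pushing U and V towards orthogonality.
    loss = ((U @ torch.diag(S) @ V.T - A) ** 2).sum() \
         + ((U.T @ U - I) ** 2).sum() \
         + ((V.T @ V - I) ** 2).sum()
    loss.backward()
    optimizer.step()

# The learned factors reconstruct A. Unlike a library SVD routine, the
# entries of S are not guaranteed to come out sorted or non-negative.
print(loss.item())
```

Unlike the inverse problem, this loss is non-convex, but for small matrices the penalty formulation typically converges to a valid decomposition up to sign and ordering of the singular values.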

Requirements

Hardware

  • A computer with at least 4 GB RAM

Software

  • The computer can run on Linux, macOS, or Windows

Wetware

  • Familiarity with Python and basic linear algebra

Let’s get started

The code of this post is provided in a Jupyter notebook on GitHub:

https://github.com/hfwittmann/matrix-operations-with-pytorch/blob/master/Matrix-operations-with-pytorch-optimizer/02—SVD-with-pytorch-optimizer.ipynb

Remark: the following part of the post is written directly in a Jupyter notebook. It is displayed via a very nice WordPress plugin, nbconvert [9].

  1. Matrix operations with pytorch – optimizer – part 1. Arthought. https://arthought.com/matrix-operations-with-pytorch-optimizer-part-1/. Published February 7, 2020. Accessed February 7, 2020.
  2. Invertible matrix. Wikipedia. https://en.wikipedia.org/wiki/Invertible_matrix. Published February 6, 2020. Accessed February 6, 2020.
  3. Singular value decomposition. Wikipedia. https://en.wikipedia.org/wiki/Singular_value_decomposition. Published February 6, 2020. Accessed February 6, 2020.
  4. Mills P. Singular Value Decomposition (SVD) Tutorial: Applications, Examples, Exercises. statsbot. https://blog.statsbot.co/. Published October 5, 2017. Accessed February 8, 2020.
  5. Gradient descent. Wikipedia. https://en.wikipedia.org/wiki/Gradient_descent. Published February 8, 2020. Accessed February 8, 2020.
  6. Gradient descent. ML glossary. https://ml-cheatsheet.readthedocs.io/en/latest/gradient_descent.html. Published February 8, 2020. Accessed February 8, 2020.
  7. PyTorch Basics: Solving the Ax=b matrix equation with gradient descent. bytepawn. http://bytepawn.com/pytorch-basics-solving-the-axb-matrix-equation-with-gradient-descent.html. Published February 8, 2019. Accessed February 8, 2020.
  8. PyTorch for Scientific Computing – Quantum Mechanics Example Part 3) Code Optimizations – Batched Matrix Operations, Cholesky Decomposition and Inverse. pugetsystems. https://www.pugetsystems.com/. Published August 31, 2018. Accessed February 8, 2020.
  9. Challis A. PHP: nbconvert – A wordpress plugin for Jupyter notebooks. https://www.andrewchallis.co.uk/. Published May 1, 2019. Accessed February 6, 2020.