At its core, linear algebra is the study of systems of linear equations like

$$ \begin{cases} a_{1,1}x_1 + a_{1,2}x_2 + \cdots + a_{1,n}x_n = b_1 \\ a_{2,1}x_1 + a_{2,2}x_2 + \cdots + a_{2,n}x_n = b_2 \\ \quad\vdots \\ a_{m,1}x_1 + a_{m,2}x_2 + \cdots + a_{m,n}x_n = b_m \end{cases} $$

These kinds of problems have important applications across engineering, and especially in machine learning, where building and optimizing models leans heavily on matrix algebra. Although I took linear algebra a few years ago in high school through the UMTYMP math program at the University of Minnesota, I felt rusty, so I decided this would be a good place to start.
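To make the system above concrete, here's a minimal sketch (the 2Γ—2 system and its coefficients are my own illustrative example, not from the course) of solving $A\mathbf{x} = \mathbf{b}$ numerically with NumPy:

```python
import numpy as np

# Coefficient matrix A and right-hand side b for a small 2x2 system:
#   3*x1 + 1*x2 = 9
#   1*x1 + 2*x2 = 8
# (an invented example, just to show the shape of the problem)
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

# Solve the linear system A x = b
x = np.linalg.solve(A, b)
print(x)  # [2. 3.]
```

Solving systems like this by hand, and understanding when a solution exists and is unique, is exactly what the lectures below build toward.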

Here are my notes:

Lec 9: Invertible Matrices

Lec 10: Determinants

Lec 11: Subspaces, Kernels, and Ranges

Lec 12: Spanning, Linear Independence, and Dimension

Lec 13: Rank and Nullity

Lec 14: Bases and Coordinate Systems

Lec 15: Eigenvectors and Eigenvalues

Lec 16: Diagonalization

My alma mater, UC Berkeley, has a great math department, and Math 54 is the linear algebra course required across many majors. Dr. Alexander Paulin is known to be a great professor (with a very pleasing accent πŸ˜„), so I decided to work through his curriculum by following his Fall 2018 offering of the course.

Alexander Paulin