The review will refresh your knowledge of the fundamentals of vectors and of matrix theory, how to perform operations on matrices, and how to solve systems of equations. After the review, you should be able to understand complex numbers from both algebraic and geometric viewpoints, up to the fundamental theorem of algebra.

Then we will study vector spaces: real, complex, and abstract (i.e., a vector space of dimension N over an arbitrary field K), together with the linear transformations between them. Vector spaces are structures formed by a collection of vectors and are characterized by their dimension. We will then introduce an additional structure on vector spaces: an inner product. Inner products allow us to introduce geometric notions, such as the length of a vector, and to define orthogonality between vectors. In this context, we will study the geometric aspects of linear algebra, using Euclidean spaces as a guide.

Next, we will focus on eigenvalues and eigenvectors. Today, these have applications in fields as diverse as computer science (Google’s PageRank algorithm), physics (quantum mechanics, vibration analysis, etc.), economics (equilibrium states of Markov models), and more. We will end with the spectral theorem, which provides a decomposition of the vector space on which an operator acts, and the singular value decomposition, which generalizes the spectral theorem to arbitrary matrices.

If you encounter a theorem that seems difficult or unintuitive, try to study it in the simplest possible case and then move on to more abstract ones. For example, if you are uncomfortable with abstract vector spaces (V) over an arbitrary field (K), you can fall back on intuition from such spaces as R and C (the real and complex numbers). Alternatively, you can reduce the dimension of the vector spaces involved, as many notions can be understood in the two-dimensional case.
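To make the spectral theorem and the singular value decomposition concrete, here is a minimal sketch (using NumPy, an assumption not stated in the course description) on a small symmetric matrix chosen purely for illustration. It checks that the eigenvectors of a real symmetric matrix are orthonormal, that the matrix is recovered from its spectral decomposition, and that SVD does the same for a non-square matrix.

```python
import numpy as np

# A real symmetric matrix: the spectral theorem guarantees real
# eigenvalues and an orthonormal basis of eigenvectors.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is NumPy's routine for symmetric/Hermitian matrices.
eigenvalues, Q = np.linalg.eigh(A)

# Columns of Q are orthonormal eigenvectors: the inner product of two
# distinct columns is 0 (orthogonality), so Q.T @ Q is the identity.
print(np.allclose(Q.T @ Q, np.eye(2)))                  # True

# Spectral decomposition: A = Q diag(lambda) Q^T.
print(np.allclose(Q @ np.diag(eigenvalues) @ Q.T, A))   # True

# SVD generalizes this to arbitrary (even non-square) matrices:
# B = U diag(sigma) V^T with orthonormal columns in U and V.
B = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])
U, sigma, Vt = np.linalg.svd(B, full_matrices=False)
print(np.allclose(U @ np.diag(sigma) @ Vt, B))          # True
```

This is only a two-dimensional illustration, in the spirit of the study advice above: verify a theorem in the smallest case first, then move to more abstract settings.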
Prerequisite: Have taken Linear Algebra.