From last lecture, we know that for an $n \times n$ matrix $A$, the characteristic polynomial has degree $n$, so $A$ has at most $n$ distinct eigenvalues.
Fact: Eigenvectors corresponding to distinct eigenvalues are linearly independent.
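A quick numerical illustration of this fact (the matrix below is a made-up example, not one from lecture): a triangular matrix with three distinct diagonal entries has three distinct eigenvalues, and its eigenvectors form a full-rank set.

```python
import numpy as np

# Hypothetical 3x3 matrix; being upper triangular, its eigenvalues
# are the diagonal entries 2, 3, 5 -- all distinct.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

lam, V = np.linalg.eig(A)   # columns of V are eigenvectors of A

# Three distinct eigenvalues...
assert len(set(np.round(lam, 8))) == 3

# ...so the eigenvectors are linearly independent: V has full rank.
rank = np.linalg.matrix_rank(V)
```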
Now, we ask the question: can we find $n$ linearly independent eigenvectors of $A$ that form a basis of $\R^n$?
A practical application of this is in discrete dynamical systems.
A discrete dynamical system is a system which changes its state at discrete time intervals.
$$ \vec x_{k + 1} = A\vec x_k $$
How does this system change over time?
$$ \vec x_0, A\vec x_0, A^2 \vec x_0, \ldots, A^k\vec x_0, \ldots $$
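The trajectory above can be sketched directly by repeated multiplication (the matrix and initial state here are hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical system: a 2x2 matrix A and an initial state x0.
A = np.array([[0.5, 0.25],
              [0.25, 0.5]])
x0 = np.array([1.0, 0.0])

def trajectory(A, x0, k):
    """Return [x_0, x_1, ..., x_k], where x_{i+1} = A x_i."""
    xs = [x0]
    for _ in range(k):
        xs.append(A @ xs[-1])
    return xs

xs = trajectory(A, x0, 3)   # xs[i] equals A^i x_0
```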
Computing high powers of a matrix gets computationally intensive: each additional step costs another matrix multiplication, which adds up quickly for large $k$. We need a better way to do this.
How does this problem relate to the earlier question? Let’s say that $\{\vec b_1, \ldots, \vec b_n \}$ is a basis of eigenvectors of $A$ with corresponding eigenvalues $\lambda_1, \ldots, \lambda_n$. That means that for some scalars $\theta_1, \ldots, \theta_n \in \R$, we can express
$$ \begin{align*} \vec x_0 &= \theta_1 \vec b_1 + \ldots + \theta_n \vec b_n \\ \vec x_1 &= \theta_1 \lambda_1 \vec b_1 + \ldots + \theta_n \lambda_n \vec b_n \\ &\;\;\vdots \\ \vec x_k &= \theta_1 \lambda_1^k\vec b_1 + \ldots + \theta_n \lambda_n^k \vec b_n \end{align*} $$
Clearly, this is much easier to compute: each $\vec x_k$ now requires only scalar powers $\lambda_i^k$ rather than matrix powers. So how do we know if we can find a basis of $\R^n$ using eigenvectors of $A$?
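A minimal sketch of this eigenvector-based shortcut, reusing a hypothetical $2 \times 2$ system (not from lecture): solve for the coordinates $\theta_i$ of $\vec x_0$ in the eigenvector basis once, then each $\vec x_k$ costs only scalar powers.

```python
import numpy as np

# Hypothetical example matrix and initial state.
A = np.array([[0.5, 0.25],
              [0.25, 0.5]])
x0 = np.array([1.0, 0.0])

# Eigendecomposition: lam[i] is lambda_i, column B[:, i] is b_i.
lam, B = np.linalg.eig(A)

# Coordinates theta of x0 in the eigenvector basis: B theta = x0.
theta = np.linalg.solve(B, x0)

def x_k(k):
    """x_k = sum_i theta_i * lambda_i^k * b_i -- scalar powers only."""
    return B @ (theta * lam**k)
```

This agrees with computing $A^k \vec x_0$ directly, but replaces $k$ matrix multiplications with $n$ scalar exponentiations.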
Let $\{ \vec b_1, \ldots , \vec b_n \}$ be a basis of $\R^n$. We know that $P = [ \vec b_1, \ldots , \vec b_n ]$ is invertible: a basis is, by definition, a linearly independent spanning set of its vector space, and these are exactly the two conditions for invertibility (linear independence of the columns makes the map one-to-one, and spanning makes it onto). That means $P^{-1}$ exists.
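This can be checked numerically (again with a hypothetical matrix): packing an eigenvector basis into $P$ gives an invertible matrix, and $P^{-1}AP$ comes out diagonal, with the eigenvalues on the diagonal.

```python
import numpy as np

# Hypothetical symmetric matrix with a full eigenvector basis.
A = np.array([[0.5, 0.25],
              [0.25, 0.5]])

lam, P = np.linalg.eig(A)   # columns of P are eigenvectors of A

# P is invertible: its determinant is nonzero.
assert abs(np.linalg.det(P)) > 1e-12

# P^{-1} A P is the diagonal matrix of eigenvalues.
D = np.linalg.inv(P) @ A @ P
```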