Suppose the $n$ by $n$ matrix $A$ has $n$ linearly independent eigenvectors $x_1, \ldots, x_n$. Put them into the columns of an eigenvector matrix $X$. Then $X^{-1}AX$ is the eigenvalue matrix $\Lambda$:
$$ X^{-1}AX = \Lambda = \begin{bmatrix}\lambda_1 & & \\ & \ddots & \\ & & \lambda_n\end{bmatrix} $$
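This factorization can be checked numerically. A minimal sketch with numpy, using a small matrix chosen only for illustration (not from the text):

```python
import numpy as np

# A hypothetical diagonalizable 2x2 matrix (illustrative choice)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of X are the eigenvectors of A; lam holds the eigenvalues
lam, X = np.linalg.eig(A)

# X^{-1} A X should be the diagonal eigenvalue matrix Lambda
Lambda = np.linalg.inv(X) @ A @ X
print(np.round(Lambda, 10))
```

The off-diagonal entries of `Lambda` come out as zero (up to round-off), with the eigenvalues on the diagonal.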
Eigenvectors $x_1, \ldots, x_j$ that correspond to distinct (all different) eigenvalues are linearly independent. An $n$ by $n$ matrix that has $n$ different eigenvalues (no repeated $\lambda$'s) must be diagonalizable.
Suppose $c_1x_1 + c_2x_2 = 0$. Multiply by $A$ to find $c_1\lambda_1x_1 + c_2\lambda_2x_2 = 0$. Multiply instead by $\lambda_2$ to find $c_1\lambda_2x_1 + c_2\lambda_2x_2 = 0$. Now subtract one equation from the other: subtraction leaves $(\lambda_1-\lambda_2)c_1x_1 = 0$.
Since the $\lambda$'s are different and $x_1 \neq 0$, we are forced to the conclusion that $c_1 = 0$. Similarly $c_2 = 0$. Only the combination with $c_1 = c_2 = 0$ gives $c_1x_1 + c_2x_2 = 0$. So the eigenvectors $x_1$ and $x_2$ must be independent.
The same argument works in higher dimensions: we reach $(\lambda_1-\lambda_2)\cdots(\lambda_1-\lambda_j)c_1x_1 = 0$, which forces $c_1 = 0$. Similarly every $c_i = 0$.
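The conclusion can be checked numerically: when the eigenvalues are distinct, the eigenvector matrix $X$ has independent columns, i.e. full rank. A sketch with a hypothetical $3 \times 3$ example (the matrix is illustrative, not from the text):

```python
import numpy as np

# Upper-triangular, so the eigenvalues 2, 3, 5 sit on the diagonal (all distinct)
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

lam, X = np.linalg.eig(A)

# Independent eigenvectors <=> X is invertible <=> full rank, nonzero determinant
print(np.linalg.matrix_rank(X), np.linalg.det(X))
```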
Introduction: how we find $f_{100}$
Let $u_k=\begin{bmatrix} f_{k+1}\\f_k\end{bmatrix}$ and use the rule $\begin{matrix}f_{k+2}=f_{k+1}+f_k\\f_{k+1}=f_{k+1}\end{matrix}$, so that
$$ u_{k+1}=\begin{bmatrix}1 & 1\\ 1 & 0\end{bmatrix}u_k
$$
Then we can find the eigenvalues $\lambda_1, \lambda_2$ of the matrix $A=\begin{bmatrix}1 & 1\\ 1 & 0\end{bmatrix}$, write $u_0=(1,0)$ as a combination of the eigenvectors, and multiply $u_0$ by $A^{100}$ to find $u_{100}$, using
$$A^k = X\Lambda^k X^{-1}, \qquad u_{100} = A^{100}u_0 = X\Lambda^{100}X^{-1}u_0$$
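Putting the steps together, a sketch of the whole computation in floating point (the exact integer $f_{100}$ has more digits than a double carries, so the eigenvalue route gives $f_{100}$ only approximately; an exact iterative count is included for comparison):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0]])
lam, X = np.linalg.eig(A)   # lambda_1,2 = (1 +- sqrt(5)) / 2

# u_100 = A^100 u_0 = X Lambda^100 X^{-1} u_0, with u_0 = (1, 0)
u0 = np.array([1.0, 0.0])
u100 = X @ np.diag(lam**100) @ np.linalg.inv(X) @ u0
f100_approx = u100[1]       # second entry of u_k is f_k

# Exact f_100 by iterating f_{k+2} = f_{k+1} + f_k with Python integers
a, b = 1, 0                 # a = f_1, b = f_0
for _ in range(100):
    a, b = a + b, a         # advance one step: b becomes f_k
f100_exact = b
print(f100_approx, f100_exact)
```

The two values agree to the roughly 16 significant digits that floating point can represent.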