
EssenceOfLinearAlgebra - 04


09-Change of Basis
#

All the cases we discussed earlier were set in the coordinate system defined by the standard basis vectors $\hat{i} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$ and $\hat{j} = \begin{bmatrix} 0 \\ 1 \end{bmatrix}$.

In different coordinate systems, the coordinates describing the same vector are different, because the basis vectors we choose have changed.

Example
#

Suppose there is another set of basis vectors $\hat{i}' = \begin{bmatrix} 2 \\ 1 \end{bmatrix}$ and $\hat{j}' = \begin{bmatrix} -1 \\ 1 \end{bmatrix}$.

The vector $\begin{bmatrix} 5/3 \\ 1/3 \end{bmatrix}$ in this coordinate system is represented as $\begin{bmatrix} 3 \\ 2 \end{bmatrix}$ in our standard coordinate system.

Coordinate System Transformation
#

(figure: BasicChange)

  • We can place the new basis vectors as the columns of a matrix $A = \begin{bmatrix} 2 & -1 \\ 1 & 1 \end{bmatrix}$.
  • Multiplying a vector in another coordinate system by this matrix gives its representation in the standard coordinate system.

$$\begin{bmatrix} 2 & -1 \\ 1 & 1 \end{bmatrix}\begin{bmatrix} 5/3 \\ 1/3 \end{bmatrix} = \begin{bmatrix} 3 \\ 2 \end{bmatrix}$$
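A quick numerical check of this product (a NumPy sketch; the variable names are mine):

```python
import numpy as np

# Columns of A are the new basis vectors, expressed in standard coordinates.
A = np.array([[2.0, -1.0],
              [1.0,  1.0]])

v_alt = np.array([5/3, 1/3])   # coordinates in the alternate basis
v_std = A @ v_alt              # the same vector in standard coordinates

print(v_std)  # → [3. 2.]
```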

Inverse Coordinate System Transformation
#

  • Simply take the inverse matrix $A^{-1}$ of the above matrix $A$.
  • Multiplying a vector in the standard coordinate system by $A^{-1}$ gives its representation in another coordinate system.
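The inverse direction can be checked the same way (a NumPy sketch; names are mine):

```python
import numpy as np

A = np.array([[2.0, -1.0],
              [1.0,  1.0]])
A_inv = np.linalg.inv(A)       # equals [[1/3, 1/3], [-1/3, 2/3]]

v_std = np.array([3.0, 2.0])   # vector in standard coordinates
v_alt = A_inv @ v_std          # the same vector in the alternate basis

print(v_alt)  # → [5/3, 1/3] as floats
```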

Case Study
#

How do we express the same transformation (e.g., a 90° counterclockwise rotation) in two different coordinate systems?

(figure: InvertBasic)

  • Suppose $M$ is the transformation matrix in the standard coordinate system (rotation by 90°: $M = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}$).
  • Suppose $A$ is the transformation matrix from another coordinate system to the standard coordinate system.
  • The matrix to implement the same transformation in another coordinate system is $A^{-1}MA$.

The intuitive understanding of this operation, reading $A^{-1}MA$ from right to left, is:

  1. $A$: Transform the vector from another coordinate system to the standard coordinate system.
  2. $M$: Perform rotation in the standard coordinate system.
  3. $A^{-1}$: Transform the result back to the original coordinate system.

For the coordinate system with basis vectors $\begin{bmatrix} 2 \\ 1 \end{bmatrix}$ and $\begin{bmatrix} -1 \\ 1 \end{bmatrix}$, the matrix for a 90° counterclockwise rotation is: $A^{-1}MA = \begin{bmatrix} 1/3 & 1/3 \\ -1/3 & 2/3 \end{bmatrix} \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} 2 & -1 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} 1/3 & -2/3 \\ 5/3 & -1/3 \end{bmatrix}$
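This can be verified numerically (a NumPy sketch; note that the product is applied right to left, so $A$ sits on the right):

```python
import numpy as np

A = np.array([[2.0, -1.0],
              [1.0,  1.0]])   # alternate basis -> standard coordinates
M = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # 90° counterclockwise rotation, standard basis

# The same rotation, expressed in the alternate coordinate system:
M_alt = np.linalg.inv(A) @ M @ A
print(M_alt)  # → [[1/3, -2/3], [5/3, -1/3]] as floats
```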

10-Eigenvectors and Eigenvalues
#

Eigenvector
#

A non-zero vector that stays on the line it spans when undergoing the linear transformation described by a matrix: it is only scaled (possibly flipped), never rotated off its span.

Eigenvalue
#

The scaling factor by which the eigenvector is stretched or compressed during the transformation.

Geometrically, if a 3D rotation can be viewed as occurring around an axis, then the direction vector of that axis is an eigenvector, with its corresponding eigenvalue being 1 (because it is not stretched during rotation). This understanding is much more intuitive than a 3×3 matrix.
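This is easy to check numerically. A sketch for a 90° rotation about the z-axis (the matrix and names are my own choices):

```python
import numpy as np

# 90° rotation about the z-axis
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(R)
# Pick the eigenvalue closest to 1; its eigenvector is the rotation axis.
k = np.argmin(np.abs(eigvals - 1.0))
axis = np.real(eigvecs[:, k])
print(eigvals[k], axis)  # eigenvalue ≈ 1, axis ∝ [0, 0, 1]
```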

Calculation Method
#

Formula: $A\vec{v} = \lambda\vec{v}$

  • $A$: transformation matrix
  • $\vec{v}$: eigenvector
  • $\lambda$: eigenvalue

To solve, we rearrange the formula: $$A\vec{v} - \lambda\vec{v} = \vec{0} \;\Longrightarrow\; A\vec{v} - \lambda I\vec{v} = \vec{0} \;\Longrightarrow\; (A - \lambda I)\vec{v} = \vec{0}$$

  • This result shows that the eigenvector $\vec{v}$ is sent to the zero vector by the transformation $(A - \lambda I)$.
  • Since $\vec{v}$ is non-zero, $(A - \lambda I)$ must be a dimension-reducing transformation, so its determinant must be zero.
  • $\det(A - \lambda I) = 0$

Example
#

(figure: EigenvectorDemo)

For matrix $A = \begin{bmatrix} 3 & 1 \\ 0 & 2 \end{bmatrix}$:

$$\det(A - \lambda I) = \det \left( \begin{bmatrix} 3 & 1 \\ 0 & 2 \end{bmatrix} - \lambda \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \right) = \det \begin{bmatrix} 3-\lambda & 1 \\ 0 & 2-\lambda \end{bmatrix} = (3-\lambda)(2-\lambda) = 0$$

Solving gives eigenvalues $\lambda = 2$ or $\lambda = 3$.

  • When $\lambda = 2$, we solve $(A - 2I)\vec{v} = \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$. All solutions lie in the space spanned by the vector $\begin{bmatrix} -1 \\ 1 \end{bmatrix}$.

  • When $\lambda = 3$, solving $(A - 3I)\vec{v} = \vec{0}$ gives the line spanned by $\begin{bmatrix} 1 \\ 0 \end{bmatrix}$; the matrix $\begin{bmatrix} 3 & 1 \\ 0 & 2 \end{bmatrix}$ stretches these eigenvectors to 3 times their original length.
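Both results can be confirmed with NumPy's `np.linalg.eig` (a quick sketch; the order in which the eigenvalues come back is not guaranteed):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(np.sort(eigvals))  # → [2. 3.]

# Each column of eigvecs is a unit-length eigenvector: A v = λ v
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))  # True for both
```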

Properties and Applications
#

Transformations without Eigenvectors
#

A counterclockwise rotation by 90° in 2D space has no real eigenvectors.
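NumPy makes the reason visible: the eigenvalues of the rotation matrix come out purely imaginary, so no real eigenvector exists (a sketch):

```python
import numpy as np

# 90° counterclockwise rotation in 2D
M = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigvals, _ = np.linalg.eig(M)
print(eigvals)  # purely imaginary: 0+1j and 0-1j
```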

Transformations with Multiple Eigenvectors
#

Scaling matrices, e.g. $2I$: every non-zero vector is an eigenvector, all with the same eigenvalue (here 2).

Diagonal Matrices
#

If the basis vectors of a transformation are all eigenvectors, then the matrix describing this transformation is a diagonal matrix, and the diagonal elements of the matrix are the eigenvalues corresponding to these basis vectors.

Simplifying High-Power Calculations
#

Power calculations for diagonal matrices are very simple: just raise each diagonal element (eigenvalue) to the corresponding power. When we need to compute a high power of a non-diagonal matrix $M$, if it has enough eigenvectors to form a basis (an eigenbasis), we can:

  1. Use the eigenvectors as new basis vectors to form the basis transformation matrix $A$.
  2. Convert $M$ to a diagonal matrix $D$ through $A^{-1}MA$.
  3. Calculate $D^n$.
  4. Convert the result back to the original coordinate system through $A D^n A^{-1}$, i.e., $M^n = A D^n A^{-1}$.
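The four steps above can be sketched in NumPy (variable names are mine; as in the change-of-basis section, `A` holds the eigenvectors as its columns):

```python
import numpy as np

M = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# Step 1: columns of A are the eigenvectors of M (the eigenbasis).
eigvals, A = np.linalg.eig(M)
# Step 2: in that basis, M becomes the diagonal matrix D of eigenvalues.
D = np.diag(eigvals)
assert np.allclose(np.linalg.inv(A) @ M @ A, D)
# Steps 3 & 4: raise the diagonal entries to the n-th power, convert back.
n = 10
Mn = A @ np.diag(eigvals ** n) @ np.linalg.inv(A)
print(np.allclose(Mn, np.linalg.matrix_power(M, n)))  # → True
```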

11-Abstract Vector Spaces
#

Vectors and functions have much in common: both can be added together and scaled by numbers.

Functions as Vectors
#

Linear Properties
#

Many operations on functions (such as differentiation) are linear, satisfying:

  • Additivity: $L(f + g) = L(f) + L(g)$

  • Homogeneity (scaling): $L(cf) = cL(f)$

  • Vector representation of polynomials: We can view polynomials as infinite-dimensional vectors.

    • Take a set of basis functions $b_0(x)=1, b_1(x)=x, b_2(x)=x^2, \dots$ as an example.
    • $1 \cdot x^2 + 3 \cdot x + 5 \cdot 1$ can be viewed as the vector $\begin{bmatrix} 5 & 3 & 1 & 0 & \cdots \end{bmatrix}^T$.
    • $4x^7 - 5x^2$ can be viewed as the vector $\begin{bmatrix} 0 & 0 & -5 & 0 & 0 & 0 & 0 & 4 & \cdots \end{bmatrix}^T$.
  • Matrix representation of differentiation: The differentiation transformation can also be described by a matrix.

    • $\frac{d}{dx}(a_n x^n + \dots + a_1 x + a_0) = n a_n x^{n-1} + \dots + a_1$
    • This transformation acting on the vector corresponding to the polynomial is like a matrix acting on a vector.
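As a sketch, differentiation on polynomials of degree at most 3 becomes an ordinary matrix-vector product (the truncation to four basis functions $1, x, x^2, x^3$ and the names are my own choices):

```python
import numpy as np

# Differentiation matrix in the basis 1, x, x^2, x^3.
# Column j holds d/dx(x^j) expressed in that basis.
D = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 2.0, 0.0],
              [0.0, 0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0, 0.0]])

p = np.array([5.0, 3.0, 1.0, 0.0])  # coefficients of 5 + 3x + x^2
print(D @ p)  # → [3. 2. 0. 0.], i.e. 3 + 2x
```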

Eight Axioms
#

This is because both vector arithmetic and function arithmetic conform to the same eight axioms, which define a vector space:

  1. Vector addition associativity: $U + (V + W) = (U + V) + W$
  2. Vector addition commutativity: $V + W = W + V$
  3. Additive identity exists: There exists a zero vector $0$ such that $0 + V = V$
  4. Additive inverse exists: For any vector $V$, there exists $-V$ such that $V + (-V) = 0$
  5. Scalar multiplication compatible with field multiplication: $a(bV) = (ab)V$
  6. Scalar multiplication identity exists: $1V = V$
  7. Scalar multiplication distributes over vector addition: $a(V + W) = aV + aW$
  8. Scalar multiplication distributes over field addition: $(a + b)V = aV + bV$
