• It’s one of the best linear algebra tutorials I’ve looked at

    http://peterbloem.nl/blog/pca

  • PCA has similarities with an autoencoder

  • Minimizing reconstruction error == maximizing variance (by Pythagoras, a^2 = b^2 + c^2: each point’s fixed squared norm splits into projected variance plus squared residual)

    • Best fit == Perpendicular Projection
    • Combined (not unique) solution vs. greedy iterative solution
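
    The variance/reconstruction-error equivalence can be checked numerically; a minimal sketch with synthetic centered data (`split_variance` is a hypothetical helper name):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic centered 2-D data (any centered data works here).
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [1.0, 1.0]])
X -= X.mean(axis=0)

def split_variance(X, w):
    """For a unit direction w, the mean squared norm of the data splits
    into variance along w plus mean squared perpendicular residual
    (the Pythagorean identity a^2 = b^2 + c^2, per data point)."""
    w = w / np.linalg.norm(w)
    proj = X @ w                    # scalar coordinates along w
    resid = X - np.outer(proj, w)   # perpendicular residuals
    total = np.mean(np.sum(X**2, axis=1))
    along = np.mean(proj**2)
    perp = np.mean(np.sum(resid**2, axis=1))
    return total, along, perp

total, along, perp = split_variance(X, np.array([1.0, 0.3]))
# The total is fixed, so minimizing the residual == maximizing the variance.
assert np.isclose(total, along + perp)
```
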
  • An orthogonal (orthonormal-columns) matrix is a rotation/flip-only transformation

    • Data normalization and basis transformations
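
    A small check of the rotation/flip claim (a sketch; `Q` here is just some orthogonal matrix built via QR):

```python
import numpy as np

rng = np.random.default_rng(1)
# Build an orthogonal matrix Q by QR-decomposing a random matrix.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))

assert np.allclose(Q.T @ Q, np.eye(3))   # columns are orthonormal
v = rng.normal(size=3)
# Lengths (and angles) are preserved: Q can only rotate and/or flip.
assert np.isclose(np.linalg.norm(Q @ v), np.linalg.norm(v))
assert np.isclose(abs(np.linalg.det(Q)), 1.0)  # +1 rotation, -1 with a flip
```
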
  • The great human-face dataset example (eigenfaces)

  • Spectral theorem

  • We then looked at eigenvectors, and saw that the eigenvectors of the data covariance 𝐒 arise naturally when we imagine that our data was originally decorrelated, with unit variance in all directions. To me, this provides some intuition for why PCA works so well when it does: we can imagine that our data was constructed by sampling independent latent variables 𝐳 and then mixing them up linearly.
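    This intuition can be checked numerically; a sketch with a made-up mixing matrix `A`:

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 10_000, 2
Z = rng.normal(size=(n, d))              # independent latents, unit variance
A = np.array([[2.0, 0.5], [0.0, 1.0]])   # arbitrary linear "mixing" (made up)
X = Z @ A.T                              # observed, correlated data

S = np.cov(X.T)                          # sample covariance, approx. A @ A.T
vals, vecs = np.linalg.eigh(S)
# Rotating onto the eigenvectors and rescaling by 1/sqrt(eigenvalue)
# recovers decorrelated, unit-variance coordinates.
X_white = X @ (vecs / np.sqrt(vals))
assert np.allclose(np.cov(X_white.T), np.eye(d), atol=0.1)
```
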

  • SVD
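    PCA can be computed from the covariance eigendecomposition or directly from the SVD of the centered data; a sketch of the equivalence on random synthetic data:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 4))
X -= X.mean(axis=0)

# Eigendecomposition route: principal directions = eigenvectors of covariance.
S = X.T @ X / (len(X) - 1)
evals, evecs = np.linalg.eigh(S)        # eigenvalues in ascending order

# SVD route: right singular vectors of X are the same directions, and the
# squared singular values / (n - 1) are the same variances.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
assert np.allclose(sorted(s**2 / (len(X) - 1)), sorted(evals))
# Leading directions match up to sign:
assert np.allclose(np.abs(Vt[0]), np.abs(evecs[:, -1]))
```
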

  • Part 3 - Proof of the Spectral Theorem

    A matrix is orthogonally diagonalizable iff it’s symmetric
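
    One direction of this can be checked numerically (a sketch; `np.linalg.eigh` assumes a symmetric input and returns an orthogonal eigenvector matrix):

```python
import numpy as np

rng = np.random.default_rng(4)
M = rng.normal(size=(4, 4))
A = (M + M.T) / 2                        # symmetrize a random matrix

vals, Q = np.linalg.eigh(A)              # eigh is for symmetric matrices
# Orthogonal diagonalization: A = Q diag(vals) Q^T with Q orthogonal.
assert np.allclose(Q.T @ Q, np.eye(4))
assert np.allclose(Q @ np.diag(vals) @ Q.T, A)
```
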

  • Determinant - Inflation

    abs(det) is the factor by which hyper-volume changes under the matrix transformation.

    If that factor is zero, at least one dimension has been squashed flat, so the matrix is non-invertible.
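
    A minimal sketch of both claims (the matrices are made up for illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
# |det A| = factor by which A scales areas (hyper-volumes in general):
# the unit square maps to a parallelogram of area 6.
assert np.isclose(abs(np.linalg.det(A)), 6.0)

# A singular matrix squashes at least one dimension to zero volume:
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])              # second row is 2x the first
assert np.isclose(np.linalg.det(B), 0.0)
# np.linalg.inv(B) would raise LinAlgError: B is non-invertible.
```
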

  • this reminds me of the video “Understanding Relativity” (理解相对论)

    https://www.bilibili.com/video/BV17P4y1V7BX?from=search&seid=11798032574058936658&spm_id_from=333.337.0.0

  • Non-trivial null space of A − λI

    We can apply a whole new set of tools from the analysis of polynomials to the study of eigenvectors. We never have to work out the characteristic polynomial explicitly; we can just use the knowledge that the determinant is a polynomial, and use what we know about polynomials to help us further along towards the spectral theorem.
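
    A sketch of the connection (`np.poly` returns the coefficients of the characteristic polynomial; the 2×2 matrix here is made up):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
# np.poly(A) gives the coefficients of det(tI - A), the characteristic
# polynomial; its roots are exactly the eigenvalues.
coeffs = np.poly(A)                      # here: t^2 - 4t + 3
roots = np.roots(coeffs)
assert np.allclose(np.sort(roots.real), [1.0, 3.0])
assert np.allclose(np.sort(np.linalg.eigvals(A).real), [1.0, 3.0])
```
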

  • Complex Number and Multiplication

    root finding of polynomials

    Multiplication of complex numbers in polar notation

    Just remember, a complex number is a single value. It just so happens that there are ways to represent it by two real values, which can help with our intuition. When we start thinking about complex matrices and vectors, however, it may hurt our intuition, so it’s best to think of complex numbers as just that: single numbers.
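
    A minimal sketch of the polar-form multiplication rule (multiply moduli, add angles), using Python’s built-in complex type:

```python
import cmath

z1 = complex(1, 1)    # modulus sqrt(2), angle 45 degrees
z2 = complex(0, 2)    # modulus 2,       angle 90 degrees

r1, phi1 = cmath.polar(z1)
r2, phi2 = cmath.polar(z2)

# In polar notation, multiplication multiplies moduli and adds angles.
product = cmath.rect(r1 * r2, phi1 + phi2)
assert cmath.isclose(product, z1 * z2)
```
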