10_svd.pdf (uploaded by Bryce Colson, November 2019)
Singular Value Decomposition

Diagonalizing Symmetric Matrices

• Symmetric matrices are always diagonalizable
• For a symmetric matrix A:
  – A has real eigenvalues
  – For any given λ, the geometric multiplicity equals the algebraic multiplicity
  – The eigenspaces are mutually orthogonal
  – A is orthogonally diagonalizable: A = U Λ U^T

Outer Product

• For an m × 1 vector x and an n × 1 vector y, a product of the form x y^T = A is an outer product.
• The output of an outer product is an m × n matrix.
• The output matrix has rank 1.

Spectral Decomposition

• For an n × n symmetric matrix A = U Λ U^T,

  A = λ1 u1 u1^T + λ2 u2 u2^T + ... + λn un un^T

Singular Value Decomposition

• For an m × n matrix A, there exists a matrix factorization A = U Σ V^T, called the singular value decomposition, or SVD:
  – Σ is an m × n pseudo-diagonal matrix
    ∗ Its diagonal entries σi are called the singular values
    ∗ The singular values are ordered from largest to smallest, left to right
    ∗ If A has rank r, the first r singular values are positive (the rest are zero)
  – U is an m × m orthogonal matrix
    ∗ The columns of U are called the left singular vectors
  – V is an n × n orthogonal matrix
    ∗ The columns of V are called the right singular vectors
• To find the SVD, we look at A^T A. Let {v1, v2, ..., vn} be the orthonormal eigenvectors of A^T A and λ1, λ2, ..., λn the corresponding eigenvalues. We define the singular values along the diagonal of Σ as σi = √λi, and we take {v1, v2, ..., vn} as the right singular vectors, i.e., the columns of V.
• To form U, the column vectors {u1, u2, ..., um} of U must be orthonormal, so we need ||ui|| = 1. This gives ui = (1/σi) A vi: the ith column of U equals A times the ith column of V, scaled by 1 over the ith singular value. Equivalently, U is an orthogonal matrix whose columns are eigenvectors of A A^T.
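The recipe above (shapes of U, Σ, V; σi = √λi for eigenvalues of A^T A; and the rank-1 outer-product expansion) can be checked numerically. The following is a small sketch using NumPy's np.linalg.svd; the particular 2 × 3 matrix is just an illustrative choice, not from the notes.

```python
import numpy as np

# A small non-square matrix to illustrate the shapes in A = U Sigma V^T.
A = np.array([[ 3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])  # m = 2, n = 3

# full_matrices=True returns U as m x m and Vt as n x n; s holds the
# singular values in decreasing order, as described above.
U, s, Vt = np.linalg.svd(A, full_matrices=True)
assert U.shape == (2, 2) and Vt.shape == (3, 3)

# Build the m x n pseudo-diagonal Sigma and reconstruct A.
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)
assert np.allclose(U @ Sigma @ Vt, A)

# Singular values are square roots of the eigenvalues of A^T A
# (eigvalsh is used because A^T A is symmetric).
eigvals = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
assert np.allclose(s, np.sqrt(eigvals[:len(s)]))

# SVD as a sum of rank-1 outer products: A = sum_i sigma_i u_i v_i^T,
# mirroring the spectral decomposition above.
approx = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))
assert np.allclose(approx, A)
```

Each assert passing confirms the corresponding claim in the bullet list: the shapes, the σi = √λi relation, and the outer-product expansion.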


• Example: let

        [4 0 0]
    A = [0 6 0]
        [0 0 1]

  Then

            [16  0 0]
    A^T A = [ 0 36 0]
            [ 0  0 1]

  with eigenvalues λ1 = 36, λ2 = 16, λ3 = 1, so the singular values are
  σ1 = 6, σ2 = 4, σ3 = 1 and

        [6 0 0]
    Σ = [0 4 0]
        [0 0 1]

  The eigenspaces of A^T A give the right singular vectors:

                                   [−20 0   0]     [0]
    E1 = null(A^T A − 36I) = null( [  0 0   0] ) = [1]
                                   [  0 0 −35]     [0]

                                   [0  0   0]     [1]
    E2 = null(A^T A − 16I) = null( [0 20   0] ) = [0]
                                   [0  0 −15]     [0]

                                  [15  0 0]     [0]
    E3 = null(A^T A − I)  = null( [ 0 35 0] ) = [0]
                                  [ 0  0 0]     [1]

                       [0 1 0]
    V = [v1 v2 v3] = [1 0 0]
                       [0 0 1]

  Similarly,

            [16  0 0]
    A A^T = [ 0 36 0]
            [ 0  0 1]

  with the same eigenvalues λ1 = 36, λ2 = 16, λ3 = 1, giving
  u1 = (0, 1, 0)^T, u2 = (1, 0, 0)^T, u3 = (0, 0, 1)^T and

                       [0 1 0]
    U = [u1 u2 u3] = [1 0 0]
                       [0 0 1]

  Check:

                [0 1 0] [6 0 0] [0 1 0]^T   [4 0 0]
    U Σ V^T = [1 0 0] [0 4 0] [1 0 0]   = [0 6 0] = A
                [0 0 1] [0 0 1] [0 0 1]     [0 0 1]
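The hand computation above can be sanity-checked with NumPy. Note that np.linalg.svd may flip the sign of a paired (ui, vi), so the comparison of the singular-vector matrices is done up to sign; that sign freedom does not affect U Σ V^T.

```python
import numpy as np

# The diagonal matrix from the worked example above.
A = np.array([[4.0, 0.0, 0.0],
              [0.0, 6.0, 0.0],
              [0.0, 0.0, 1.0]])

U, s, Vt = np.linalg.svd(A)

# Singular values come out largest first: 6, 4, 1.
assert np.allclose(s, [6.0, 4.0, 1.0])

# The permutation structure found by hand: the first left/right singular
# vector is e2 (for sigma = 6), the second is e1, the third is e3.
# Compare up to sign, since each (u_i, v_i) pair may be negated.
expected = np.array([[0.0, 1.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [0.0, 0.0, 1.0]])
assert np.allclose(np.abs(U), expected)
assert np.allclose(np.abs(Vt.T), expected)

# Reconstruction recovers A.
assert np.allclose(U @ np.diag(s) @ Vt, A)
```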
