Pca Paras Joshi.docx

  • Uploaded by: anirudh chaudhary
  • April 2020


PCA

1. Principal Component Analysis.
2. The aim is to reduce the dimensionality of the predictor variables.
3. Suppose the dataset has n features; the aim is to reduce the number of features to k such that k <= n.

(Figure: projecting data from 2 dimensions to 1 dimension.)

4. Mathematically, we project the n-dimensional data onto a linear subspace of k dimensions.
5. Before applying PCA:
   - Perform mean normalization of all features so that each feature's mean becomes zero (mandatory).
   - Do feature scaling by dividing the normalized features by either the range or the standard deviation of the corresponding feature.
6. Algorithm:
   - Compute the covariance matrix.
   - Compute the eigenvectors of the covariance matrix.
   - Form the U_reduce matrix, whose columns are the n directions onto which the data can be projected, and keep only the first k columns.
   - Project any example: z_i = (U_reduce)^T x_i.
7. Reconstruction from the compressed representation: x_approx_i = U_reduce z_i.
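The preprocessing, projection, and reconstruction steps above can be sketched in NumPy. This is a minimal illustration: the data values, the choice k = 2, and all variable names are assumptions made for the example, not taken from the notes.

```python
import numpy as np

# Toy data: 5 examples, 3 features (made-up values for illustration).
X = np.array([[2.5, 2.4, 1.0],
              [0.5, 0.7, 0.2],
              [2.2, 2.9, 1.1],
              [1.9, 2.2, 0.9],
              [3.1, 3.0, 1.4]])

# Mean normalization (mandatory) and feature scaling by the standard deviation.
X_norm = (X - X.mean(axis=0)) / X.std(axis=0)

# Covariance matrix of the normalized features (n x n).
Sigma = np.cov(X_norm, rowvar=False)

# Eigenvectors of the covariance matrix, sorted by descending eigenvalue.
eigvals, eigvecs = np.linalg.eigh(Sigma)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Keep the first k directions as U_reduce (n x k).
k = 2
U_reduce = eigvecs[:, :k]

# Projection z_i = (U_reduce)^T x_i, vectorized over all rows of X_norm.
Z = X_norm @ U_reduce          # m x k compressed representation

# Reconstruction x_approx_i = U_reduce z_i.
X_approx = Z @ U_reduce.T      # m x n approximation of X_norm
```

Because the columns of U_reduce are eigenvectors of the covariance matrix, the projected features in Z are uncorrelated with each other.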

8. To select the best possible k, ensure that the variance retained with k dimensions is at least 99% of the variance with all n dimensions:
   (variance with k dimensions) / (variance with n dimensions) >= 0.99
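Since the eigenvalues of the covariance matrix measure the variance along each direction, this check reduces to a cumulative sum over sorted eigenvalues. A small sketch, using hypothetical eigenvalues chosen for the example:

```python
import numpy as np

# Hypothetical eigenvalues of the covariance matrix, sorted descending.
eigvals = np.array([4.2, 2.1, 0.5, 0.15, 0.05])

# Fraction of total variance retained by the first k components.
retained = np.cumsum(eigvals) / eigvals.sum()

# Smallest k whose retained variance is at least 0.99.
k = int(np.searchsorted(retained, 0.99) + 1)
```

Here `retained` is non-decreasing, so `np.searchsorted` finds the first component count that crosses the 99% threshold.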

9. Applications of PCA:
   - Reduces the memory/disk space required to store the data.
   - Speeds up the learning algorithm.
   - Makes visualization easier.
10. Using PCA to prevent overfitting is a bad use of PCA, since some information is lost in the process; use regularization instead.
