Kernel PCA

Non-linear form of principal component analysis (PCA) that better exploits the complicated spatial structure of high-dimensional features.

Kernel principal component analysis (kernel PCA for short) is an extension of principal component analysis (PCA), a popular tool for linear dimensionality reduction and feature extraction.



Kernel PCA is the nonlinear form of PCA, which makes it better suited to exploiting the complicated spatial structure of high-dimensional features. In other words, kernel PCA is useful for machine learning problems whose data have more complicated structure that cannot be represented in a linear subspace.
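A minimal sketch of this difference, using scikit-learn's `KernelPCA` on a toy dataset of two concentric circles (a structure no linear subspace can separate); the dataset, parameters, and RBF kernel choice here are illustrative assumptions, not part of the original text:

```python
import numpy as np
from sklearn.datasets import make_circles
from sklearn.decomposition import PCA, KernelPCA

# Two concentric circles: nonlinear structure that no linear
# projection can untangle.
X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

# Linear PCA can only rotate and project, so the circles stay entangled.
X_pca = PCA(n_components=2).fit_transform(X)

# Kernel PCA with an RBF kernel implicitly maps the points into a
# feature space where the two circles become (nearly) linearly separable.
X_kpca = KernelPCA(n_components=2, kernel="rbf", gamma=15).fit_transform(X)

print(X_pca.shape, X_kpca.shape)
```

Plotting the first kernel principal component against the class labels typically shows the inner and outer circles projected to distinct regions, which is exactly what linear PCA cannot achieve on this data.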



Generally speaking, a "kernel" is a continuous function that takes two inputs (e.g. real numbers, functions, vectors, etc.) and maps them to a real value independent of the order of the arguments, i.e. the function is symmetric. In kernel methods, the kernel value can be interpreted as an inner product between the two inputs after they are mapped into a (possibly much higher-dimensional) feature space, which is what allows kernel PCA to capture nonlinear structure without ever computing that mapping explicitly.
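As a small illustration, a hand-rolled Gaussian (RBF) kernel, one standard example of such a function; the helper name and parameters below are assumptions for demonstration only:

```python
import numpy as np

def rbf_kernel(x, z, sigma=1.0):
    """Gaussian (RBF) kernel k(x, z) = exp(-||x - z||^2 / (2 * sigma^2)).

    Takes two vector inputs and returns a real value; the value is
    the same regardless of argument order (symmetry).
    """
    diff = np.asarray(x, dtype=float) - np.asarray(z, dtype=float)
    return float(np.exp(-np.dot(diff, diff) / (2.0 * sigma**2)))

x = [1.0, 2.0]
z = [0.5, -1.0]

# Symmetric: independent of the order of the arguments.
print(rbf_kernel(x, z) == rbf_kernel(z, x))  # → True
```

The symmetry holds by construction, since the squared distance between the inputs does not depend on which argument comes first.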






Further reading

A tutorial on Kernel Principal Component Analysis, by Aleksei Tiulpin (Web)

Kernel PCA vs PCA vs ICA in Tensorflow/sklearn, by Jae Duk Seo (Web)

Kernel Principal Component Analysis and its Applications in Face Recognition and Active Shape Models, by Quan Wang (PDF)

