How would phi of the Gaussian RBF kernel map a 100-by-3 feature matrix?

Posted 2019-08-17 02:49

Question:

Would a 100-by-3 feature matrix be mapped into a 100-dimensional or into an infinite-dimensional feature space if the mapping were not bypassed by the Gaussian RBF kernel (i.e., if phi were computed explicitly)?


Following this reasoning (The RBF kernel of Support Vector Machine), I would tend to say the feature matrix would be mapped to an infinite-dimensional feature space. Here is a summary of the content:

Given an m-by-n feature matrix X, each n-dimensional instance x of X is used to define an n-dimensional normal distribution function N1, with its center equal to the n-dimensional point x. N1(p) gives a real number as output for every input point p. So an instance with a finite number of n dimensions is mapped to a function (I don't know how to assess the number of dimensions of this function).
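To make that mapping concrete, here is a minimal Python sketch (the value gamma = 0.5 and the sample instance are arbitrary choices of mine, not values from the question): each instance x becomes a whole function of t, a Gaussian bump centered at x, rather than another finite vector.

```python
import numpy as np

gamma = 0.5  # assumed kernel width, not specified in the question

def phi(x):
    """Map an n-dimensional instance x to a function of t: a Gaussian bump
    centred at x. The 'feature vector' is the entire function (one value for
    every possible t), which is why it is not finite-dimensional."""
    x = np.asarray(x, dtype=float)
    return lambda t: np.exp(-gamma * np.sum((np.asarray(t, dtype=float) - x) ** 2))

x_instance = [0.2, -1.0, 3.5]    # one hypothetical row of the 100-by-3 matrix X
bump = phi(x_instance)
print(bump(x_instance))          # 1.0 at the centre of the bump
print(bump([0.0, 0.0, 0.0]))     # decays as t moves away from x
```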

A kernel K(a,b) is a function capable of computing the dot product <phi(a), phi(b)>. The dot product of two functions is defined as the integral of their pointwise product, so <phi(a), phi(b)> = int phi(a)(t) * phi(b)(t) dt. This integral yields the Gaussian RBF kernel K(a,b) = exp(-gamma*||a-b||^2).
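As a sanity check on the "dot product of functions = integral" step, the sketch below numerically integrates the product of two such bumps in the 1-D case (scipy's quad, gamma = 0.5, and the points a and b are my own assumptions). The result matches the closed form sqrt(pi/(2*gamma)) * exp(-(gamma/2)*(a-b)^2), i.e. an RBF kernel in ||a-b|| up to a constant factor and a rescaled gamma; recovering exactly exp(-gamma*||a-b||^2) requires a suitably normalised feature map.

```python
import numpy as np
from scipy.integrate import quad

gamma = 0.5  # assumed kernel width

def phi(x):
    """1-D instance x -> Gaussian bump centred at x."""
    return lambda t: np.exp(-gamma * (t - x) ** 2)

def dot(f, g):
    """Dot product of two functions: the integral of their pointwise product."""
    return quad(lambda t: f(t) * g(t), -np.inf, np.inf)[0]

a, b = 1.0, 2.5                                    # two arbitrary 1-D instances
numeric = dot(phi(a), phi(b))
closed_form = np.sqrt(np.pi / (2 * gamma)) * np.exp(-(gamma / 2) * (a - b) ** 2)
print(numeric, closed_form)                        # agree to numerical precision
```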


Following this reasoning (Why Gaussian radial basis function maps the examples into an infinite-dimensional space?), I would say the feature matrix would be mapped to a 100-dimensional feature space. Here is a summary of the content:

If you have m distinct instances, then the Gaussian radial basis kernel makes the SVM operate in an m-dimensional space. We say that the radial basis kernel maps to a space of infinite dimension because you can make m as large as you want, and the space it operates in keeps growing without bound.
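A short sketch of that viewpoint (the random data and gamma = 0.5 are my own placeholders): with m = 100 instances, everything the kernelized SVM actually touches is the 100-by-100 Gram matrix K, one dimension per training instance, no matter how large the underlying feature space is.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))    # stand-in for the 100-by-3 feature matrix
gamma = 0.5                      # assumed kernel width

# Pairwise squared distances between all rows, then the Gaussian RBF Gram matrix.
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-gamma * sq_dists)

print(K.shape)                   # (100, 100): one dimension per distinct instance
```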