Support vector classifiers can be written as $b + \sum_i \alpha_i \, x^T x^{(i)}$, where

- $b$ is the bias,
- $\alpha_i$ is the parameter for the $i$-th observation,
- $x^{(i)}$ is the $i$-th observation.

The kernel trick is to replace the dot product with a kernel function $k:\mathbb{R}^n \times \mathbb{R}^n \to \mathbb{R}$:

$$b + \sum_i \alpha_i \, k(x, x^{(i)}).$$
It allows for non-linear decision boundaries.
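
To make the formula concrete, here is a minimal NumPy sketch (not from the original note) that evaluates $b + \sum_i \alpha_i \, k(x, x^{(i)})$ for two choices of $k$: the plain dot product, which recovers the linear classifier, and a Gaussian (RBF) kernel as one common non-linear choice. The values of `X_train`, `alpha`, and `b` are made up; in practice they come from fitting the classifier.

```python
import numpy as np

# Made-up stand-ins for quantities that would come from training:
# observations x^(i), parameters alpha_i, and the bias b.
X_train = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])  # rows are x^(i)
alpha = np.array([0.5, -0.3, 0.8])                        # alpha_i
b = 0.1                                                   # bias

def linear_kernel(x, z):
    """Plain dot product x^T z (recovers the linear classifier)."""
    return x @ z

def rbf_kernel(x, z, gamma=1.0):
    """Gaussian (RBF) kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

def decision_value(x, kernel):
    """b + sum_i alpha_i * k(x, x^(i)) -- the (kernelized) classifier."""
    return b + sum(a_i * kernel(x, x_i) for a_i, x_i in zip(alpha, X_train))

x_new = np.array([0.5, -0.2])
print(decision_value(x_new, linear_kernel))  # linear decision value
print(decision_value(x_new, rbf_kernel))     # non-linear decision value
```

Swapping `linear_kernel` for `rbf_kernel` changes nothing else in the computation, which is exactly the point of the kernel trick.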