For this lab, we compared kernelized and non-kernelized forms of the following algorithms on non-linearly separable datasets:
- PCA
- K-means
- Logistic Regression
using the following kernel functions:
- Linear
- Gaussian RBF
- Polynomial
- Laplacian
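As a minimal sketch, the four kernels can be written as scalar functions of two sample vectors. The parameter names and default values below (`gamma`, `degree`, `coef0`) are illustrative, not necessarily the values used in the lab:

```python
import numpy as np

def linear_kernel(x, y):
    # plain dot product; kernelized algorithm reduces to the standard one
    return x @ y

def rbf_kernel(x, y, gamma=1.0):
    # Gaussian RBF: exp(-gamma * ||x - y||^2)
    return np.exp(-gamma * np.sum((x - y) ** 2))

def polynomial_kernel(x, y, degree=3, coef0=1.0):
    # (x . y + coef0)^degree
    return (x @ y + coef0) ** degree

def laplacian_kernel(x, y, gamma=1.0):
    # exp(-gamma * ||x - y||_1), heavier-tailed than the RBF
    return np.exp(-gamma * np.sum(np.abs(x - y)))
```

Each of these satisfies k(x, x) > 0, and the RBF and Laplacian kernels equal 1 when x = y regardless of `gamma`.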
We compared these algorithms using the following benchmark datasets from scikit-learn:
- Half Moons
- Concentric Circles
- Swiss Rolls
- Classification
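The four datasets above correspond to generators in `sklearn.datasets`; a sketch of how they might be produced (the sample counts, noise levels, and random seed here are illustrative):

```python
from sklearn.datasets import (
    make_moons, make_circles, make_swiss_roll, make_classification,
)

# Half Moons: two interleaving crescents in 2-D
X_moons, y_moons = make_moons(n_samples=200, noise=0.05, random_state=0)

# Concentric Circles: an inner and an outer ring
X_circles, y_circles = make_circles(
    n_samples=200, noise=0.05, factor=0.5, random_state=0
)

# Swiss Roll: a 2-D manifold rolled up in 3-D (unsupervised; t is the
# position along the roll, not a class label)
X_swiss, t_swiss = make_swiss_roll(n_samples=200, noise=0.05, random_state=0)

# Classification: a generic labeled point cloud
X_clf, y_clf = make_classification(
    n_samples=200, n_features=2, n_informative=2, n_redundant=0,
    random_state=0,
)
```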
For each of the listed algorithms and datasets, we compared our own implementation against scikit-learn's. We compared both the quality of the results (such as the explained variance for PCA or the accuracy of logistic regression) and the run-time efficiency.
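The comparison described above can be sketched for the PCA case. The helper `pca_explained_variance` below is a hypothetical stand-in for the lab's own implementation, shown only to illustrate the quality-and-timing comparison against scikit-learn:

```python
import time
import numpy as np
from sklearn.datasets import make_moons
from sklearn.decomposition import PCA

def pca_explained_variance(X, n_components=2):
    # Plain (non-kernel) PCA: eigenvalues of the sample covariance,
    # largest first, match sklearn's explained_variance_.
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
    return eigvals[:n_components]

X, _ = make_moons(n_samples=500, noise=0.05, random_state=0)

t0 = time.perf_counter()
ours = pca_explained_variance(X)
t_ours = time.perf_counter() - t0

t0 = time.perf_counter()
skl = PCA(n_components=2).fit(X).explained_variance_
t_skl = time.perf_counter() - t0

# the explained variances from both routes should match closely
print(ours, skl, t_ours, t_skl)
```

The same pattern (run both, record quality metric and wall-clock time) applies to the K-means and logistic regression comparisons.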
In part 4 of the lab, we implement a one-class SVM and the Minimum Enclosing Ball in MATLAB.