
## Implement your SVM with CMSIS-DSP

Once the parameters of the SVM classifier have been dumped from the Python code, you can use them in your C code with CMSIS-DSP.

You can find the full code in `CMSIS/DSP/Examples/ARM/arm_svm_example/arm_svm_example_f32.c`.

This example reproduces the Python prediction by using the same test points.

The instance variable contains all the parameters that were dumped from Python. Some of those parameters are arrays, so their sizes must also be specified: for example, the number of support vectors and their dimension.

The following code declares the instance variable and defines some sizes, which will be useful later when creating the arrays:

```c
arm_svm_polynomial_instance_f32 params;

#define NB_SUPPORT_VECTORS 11
#define VECTOR_DIMENSION   2
```

The following code defines two arrays, the dual coefficients and the support vectors, filled with the values coming from Python. The classes 0 and 1 are also defined to ease the comparison with Python:

```c
const float32_t dualCoefficients[NB_SUPPORT_VECTORS] = {
    -0.01628988f, -0.0971605f,  -0.02707579f, 0.0249406f,
     0.00223095f,  0.04117345f,  0.0262687f,  0.00800358f,
     0.00581823f,  0.02346904f,  0.00862162f
}; /**< Dual coefficients */

const float32_t supportVectors[NB_SUPPORT_VECTORS * VECTOR_DIMENSION] = {
     1.2510991f,   0.47782799f,
    -0.32711859f, -1.49880648f,
    -0.08905047f,  1.31907242f,
     1.14059333f,  2.63443767f,
    -2.62561524f,  1.02120701f,
    -1.2361353f,  -2.53145187f,
     2.28308122f, -1.58185875f,
     2.73955981f,  0.35759327f,
     0.56662986f,  2.79702016f,
    -2.51380816f,  1.29295364f,
    -0.56658669f, -2.81944734f
}; /**< Support vectors */

const int32_t classes[2] = {0, 1};
```

The following code initializes the instance variable with all the parameters that come from Python, for example, the lengths, the above arrays, and the intercept, degree, coef0 and gamma parameters:

```c
arm_svm_polynomial_init_f32(&params,
    NB_SUPPORT_VECTORS,
    VECTOR_DIMENSION,
    -1.661719f,        /* Intercept */
    dualCoefficients,
    supportVectors,
    classes,
    3,                 /* degree */
    1.100000f,         /* Coef0 */
    0.500000f          /* Gamma */
);
```

Finally, for testing, an input vector is defined and classified using the polynomial SVM predictor.

The following code defines the input vector and applies the classifier:

```c
float32_t in[VECTOR_DIMENSION];
int32_t result;

in[0] = 0.4f;
in[1] = 0.1f;

arm_svm_polynomial_predict_f32(&params, in, &result);
```

The input vector is a point defined to be in the center cluster, which corresponds to class 0. It has the same coordinates as the point that was used in the Python code to test the classifier, so the result of the above code should be class 0.

An SVM classifier is a binary classifier. If you want to work with more classes, you need to create classifiers for each distinct pair of classes, and use a majority voting on the results to select the final class.

For instance, in the following example from scikit-learn, SVM is used to recognize digits. With ten digits, there are 45 pairs, which means that there are 45 SVM classifiers. Scikit-learn creates them automatically using the one-vs-one strategy: each class is compared with every other class.

In this case, the extraction of the parameters is more complex because scikit-learn returns matrices containing the parameters for all 45 classifiers. In CMSIS-DSP you need 45 instance variables, and you extract the values from the matrices to initialize all those instance variables.