This paper presents a multiclass classifier based on the analytical center of the feasible space (MACM). The classifier is formulated as a quadratically constrained linear optimization problem and does not require repeatedly constructing binary classifiers to separate a single class from all the others. Its generalization error upper bound is proved theoretically, and experiments on benchmark datasets validate its generalization performance.

Multiclass classification is an important and ongoing research subject in machine learning. Its applications are numerous, including machine vision [

The one-versus-all approach reduces the problem of classifying among
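As an illustration of the reduction (not the paper's method), one-versus-all can be sketched in a few lines of Python. The toy data, the centroid-based binary learner, and all function names below are hypothetical stand-ins for whichever binary classifier is actually used:

```python
# Sketch of one-versus-all, assuming a trivial centroid-based
# linear classifier as a stand-in for the binary learner.

def centroid_classifier(pos, neg):
    """Fit w.x + b = 0 through the midpoint of the two class means."""
    d = len(pos[0])
    mp = [sum(x[i] for x in pos) / len(pos) for i in range(d)]
    mn = [sum(x[i] for x in neg) / len(neg) for i in range(d)]
    w = [a - b for a, b in zip(mp, mn)]
    b = -sum(wi * (a + c) / 2 for wi, a, c in zip(w, mp, mn))
    return w, b

def train_one_vs_all(X, y, classes):
    # one binary problem per class: class c versus all the others
    return {c: centroid_classifier([x for x, t in zip(X, y) if t == c],
                                   [x for x, t in zip(X, y) if t != c])
            for c in classes}

def predict_one_vs_all(models, x):
    # the winning class is the one whose classifier scores highest
    def score(wb):
        w, b = wb
        return sum(wi * xi for wi, xi in zip(w, x)) + b
    return max(models, key=lambda c: score(models[c]))

X = [(0, 0), (0, 1), (5, 5), (5, 6), (10, 0), (10, 1)]
y = [0, 0, 1, 1, 2, 2]
models = train_one_vs_all(X, y, classes=[0, 1, 2])
print([predict_one_vs_all(models, x) for x in X])  # → [0, 0, 1, 1, 2, 2]
```

A k-class problem thus costs k binary training runs, each on the full dataset.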

For the all-versus-all method, a binary classifier is built to discriminate between each pair of classes, while discarding the rest of the classes. This requires building
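The pairwise construction can be sketched as follows; the centroid-based binary learner and the toy data are hypothetical stand-ins, shown only to make the k(k-1)/2 structure and the voting rule concrete:

```python
# Sketch of all-versus-all (one-versus-one) with a trivial
# centroid-based linear classifier standing in for the binary learner.
from itertools import combinations

def centroid_classifier(pos, neg):
    """Fit w.x + b = 0 through the midpoint of the two class means."""
    d = len(pos[0])
    mp = [sum(x[i] for x in pos) / len(pos) for i in range(d)]
    mn = [sum(x[i] for x in neg) / len(neg) for i in range(d)]
    w = [a - b for a, b in zip(mp, mn)]
    b = -sum(wi * (a + c) / 2 for wi, a, c in zip(w, mp, mn))
    return w, b

def train_all_vs_all(X, y, classes):
    # one classifier per unordered pair of classes; other classes are discarded
    return {(a, b): centroid_classifier([x for x, t in zip(X, y) if t == a],
                                        [x for x, t in zip(X, y) if t == b])
            for a, b in combinations(classes, 2)}

def predict_all_vs_all(models, x, classes):
    # each pairwise classifier casts one vote; the majority wins
    votes = {c: 0 for c in classes}
    for (a, b), (w, bias) in models.items():
        s = sum(wi * xi for wi, xi in zip(w, x)) + bias
        votes[a if s > 0 else b] += 1
    return max(votes, key=votes.get)

X = [(0, 0), (0, 1), (5, 5), (5, 6), (10, 0), (10, 1)]
y = [0, 0, 1, 1, 2, 2]
models = train_all_vs_all(X, y, classes=[0, 1, 2])
print([predict_all_vs_all(models, x, [0, 1, 2]) for x in X])  # → [0, 0, 1, 1, 2, 2]
```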

Error-correcting output coding works by training
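The output-coding idea can be sketched as follows. The code matrix here is a minimal code for three classes, and the centroid-based binary learner is a hypothetical stand-in; decoding picks the codeword nearest in Hamming distance to the vector of classifier signs:

```python
# Sketch of error-correcting output coding with a trivial
# centroid-based linear classifier per code-matrix column.

def centroid_classifier(pos, neg):
    """Fit w.x + b = 0 through the midpoint of the two class means."""
    d = len(pos[0])
    mp = [sum(x[i] for x in pos) / len(pos) for i in range(d)]
    mn = [sum(x[i] for x in neg) / len(neg) for i in range(d)]
    w = [a - b for a, b in zip(mp, mn)]
    b = -sum(wi * (a + c) / 2 for wi, a, c in zip(w, mp, mn))
    return w, b

# codewords: one row per class, one column per binary problem
CODE = {0: (+1, -1, -1), 1: (-1, +1, -1), 2: (-1, -1, +1)}

def train_ecoc(X, y):
    ncols = len(next(iter(CODE.values())))
    models = []
    for j in range(ncols):  # one binary learner per column
        pos = [x for x, t in zip(X, y) if CODE[t][j] == +1]
        neg = [x for x, t in zip(X, y) if CODE[t][j] == -1]
        models.append(centroid_classifier(pos, neg))
    return models

def predict_ecoc(models, x):
    # sign vector of the column classifiers, decoded by Hamming distance
    signs = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
             for w, b in models]
    return min(CODE, key=lambda c: sum(s != t for s, t in zip(signs, CODE[c])))

X = [(0, 0), (0, 1), (5, 5), (5, 6), (10, 0), (10, 1)]
y = [0, 0, 1, 1, 2, 2]
models = train_ecoc(X, y)
print([predict_ecoc(models, x) for x in X])  # → [0, 0, 1, 1, 2, 2]
```

With a longer code matrix, the Hamming decoding can correct individual binary classifiers' mistakes, which is the source of the "error-correcting" name.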

The multiclass classification algorithms above need to construct binary classifiers repeatedly to separate a single class from all the others for

To facilitate the discussion of the multiclass analytical center classifier, the following definitions are introduced.

A vector,

Let

Given the sample

The point sets

Assume

To simplify the notation for the formulation of the multiclass analytical center classifier, we consider an augmented weight space as follows.

Let

Inequality (

To further simplify the formulation of the multiclass analytical center classifier, we introduce the following notation:

After solving the optimization problem (

If the dataset is not piecewise linearly separable, a kernel function is used to map the data into a high-dimensional space in which it becomes linearly separable.
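The kernel trick can be illustrated numerically: a degree-2 polynomial kernel k(x, z) = (x·z + 1)² computes, without ever forming it, the inner product under an explicit feature map into a higher-dimensional space. The map φ below is the standard one for this kernel in two dimensions; the specific input vectors are arbitrary illustrations:

```python
# Numerical check that the polynomial kernel equals an explicit
# feature-space inner product (degree 2, 2-D inputs).
import math

def poly_kernel(x, z, degree=2):
    """(x . z + 1)^degree, evaluated directly in input space."""
    return (sum(a * b for a, b in zip(x, z)) + 1) ** degree

def phi(x):
    """Explicit degree-2 feature map: k(x, z) = phi(x) . phi(z)."""
    x1, x2 = x
    r2 = math.sqrt(2)
    return (1, r2 * x1, r2 * x2, x1 * x1, x2 * x2, r2 * x1 * x2)

x, z = (1.0, 2.0), (3.0, -1.0)
lhs = poly_kernel(x, z)                           # kernel in input space
rhs = sum(a * b for a, b in zip(phi(x), phi(z)))  # inner product in feature space
print(lhs, rhs)  # both 4.0
```

A linear classifier trained on φ(x) is therefore nonlinear in the original inputs, which is exactly what the kernelized formulation exploits.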

To analyze the generalization error bound theoretically, we introduce the definitions of the classification margin and the data radius and then deduce the margin-based generalization error bound of MACM.

Given the linear classifier

Given dataset

Define data radius of dataset

Consider the following:

Consider thresholding of a real-valued function

From Definition

The binary classification of sample

Assume that

Consider the classifiers’ set

Because the sample in

Event

In this section, we present computational results comparing the multiclass analytical center classifier (MACM) with the multiclass support vector machine (MSVM) [

Table

The generalization error of MACM and MSVM.

| Dataset | Classifier | Degree of polynomial = 1 | Degree of polynomial = 3 |
|---|---|---|---|
| Wine | M-ACM | 97.74 | 98.65 |
| Wine | M-SVM | 97.19 | 97.75 |
| Glass | M-ACM | 56.46 | 69.38 |
| Glass | M-SVM | 55.14 | 66.15 |

In this paper, a multiclass classifier based on the analytical center of the feasible space, which corresponds to a simple quadratically constrained linear optimization problem, has been proposed. To validate its generalization performance theoretically, its generalization error upper bound was formulated and proved. Experiments on the wine recognition and glass identification datasets show that the multiclass analytical center classifier outperforms the multiclass support vector machine in generalization error.

This work was supported in part by the National Natural Science Foundation of China under Grant nos. 61370096 and 61173012 and the Key Project of Natural Science Foundation of Hunan Province under Grant no. 12JJA005.