The support vector machine (SVM) is regarded as a powerful method for pattern classification. However, the solution of the primal SVM optimization model is susceptible to the class distribution and may therefore be nonrobust. To overcome this shortcoming, an improved model, the support vector machine with globality-locality preserving (GLPSVM), is proposed. It introduces globality-locality preserving into the standard SVM, which preserves the manifold structure of the data space. Extensive experiments on UCI machine learning data sets validate the effectiveness of the proposed model; in particular, on the Wine and Iris databases the recognition rate is above 97%, outperforming all the compared SVM-based algorithms.
Over the past decades, the support vector machine (SVM) [
It is well known that SVM is an optimization problem whose optimal solution can be found by solving a quadratic programming problem. Because the objective function is convex, a global minimum is guaranteed. However, the traditional SVM solution is susceptible to the class distribution, which makes it nonrobust to the data samples. To overcome this shortcoming, Zafeiriou et al. [
According to the above analysis, none of the mentioned methods takes the manifold structure of the data space into consideration, except MCLPV_SVM.
The remainder of this paper is organized as follows. Section
Given a set of pairwise samples
For the linearly separable case, the SVM model is as follows:
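The displayed model did not survive extraction; the standard linear hard-margin primal that the surrounding text describes is:

```latex
\min_{\mathbf{w},\,b}\quad \frac{1}{2}\,\mathbf{w}^{\top}\mathbf{w}
\qquad \text{s.t.}\quad y_i\left(\mathbf{w}^{\top}\mathbf{x}_i + b\right) \ge 1,\quad i = 1,\dots,n.
```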
By transforming this optimization problem into its corresponding dual problem, the optimal discriminant vectors can be found through
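The dual equation was also lost; the corresponding dual problem is the standard result

```latex
\max_{\boldsymbol{\alpha}}\quad \sum_{i=1}^{n}\alpha_i
  - \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i \alpha_j\, y_i y_j\, \mathbf{x}_i^{\top}\mathbf{x}_j
\qquad \text{s.t.}\quad \sum_{i=1}^{n}\alpha_i y_i = 0,\quad \alpha_i \ge 0,
```

with the optimal discriminant vector recovered as \(\mathbf{w} = \sum_{i}\alpha_i y_i \mathbf{x}_i\).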
Usually, in real-world applications, we need to deal with multiclass cases, such as face recognition [
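One common way to extend a binary classifier such as SVM to the multiclass case is the one-vs-rest reduction sketched below. This is an illustrative toy, not the authors' procedure: `fit_binary` here is a least-squares stand-in for a binary SVM trainer, and the data and parameters are invented for the example.

```python
import numpy as np

def fit_binary(Xb, y_pm):
    # Toy stand-in for a binary SVM trainer: least-squares fit of a
    # linear scorer to +-1 targets (Xb already carries a bias column).
    w, *_ = np.linalg.lstsq(Xb, y_pm, rcond=None)
    return w

def one_vs_rest_fit(Xb, y, classes):
    # Train one scorer per class: class c (+1) versus all others (-1).
    return {c: fit_binary(Xb, np.where(y == c, 1.0, -1.0)) for c in classes}

def one_vs_rest_predict(models, Xb, classes):
    # Predict the class whose scorer responds most strongly.
    scores = np.column_stack([Xb @ models[c] for c in classes])
    return np.asarray(classes)[scores.argmax(axis=1)]

# Three well-separated Gaussian clusters as synthetic 3-class data.
rng = np.random.default_rng(2)
means = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
X = np.vstack([rng.normal(m, 0.3, size=(30, 2)) for m in means])
y = np.repeat([0, 1, 2], 30)
Xb = np.hstack([X, np.ones((len(X), 1))])     # append bias column

models = one_vs_rest_fit(Xb, y, [0, 1, 2])
pred = one_vs_rest_predict(models, Xb, [0, 1, 2])
print((pred == y).mean())                     # training accuracy
```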
In this section, we propose a novel support vector classifier that takes the class distribution into consideration and is expected to yield a robust solution. First, we introduce the definition of globality-locality preserving.
Discriminant locality preserving projections (DLPP) [
Formula (
On the other hand, Huang et al. [
locality preserving matrix:
globality preserving matrix:
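The two matrix definitions were lost in extraction. A plausible reconstruction, following the standard constructions in the LPP/DLPP literature the text cites (the exact weights and scatter used by the authors may differ), is a locality preserving matrix built from a heat-kernel affinity graph and a globality (total-scatter) matrix:

```latex
M_L = X\,(D - W)\,X^{\top}, \qquad
W_{ij} = \exp\!\left(-\frac{\lVert \mathbf{x}_i - \mathbf{x}_j \rVert^2}{t}\right),\quad
D_{ii} = \sum_{j} W_{ij},
```

```latex
M_G = \sum_{i=1}^{n} \left(\mathbf{x}_i - \bar{\mathbf{x}}\right)\left(\mathbf{x}_i - \bar{\mathbf{x}}\right)^{\top}.
```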
Now, we give the proposed extension of SVM, called GLPSVM. For the linearly separable data, GLPSVM can be described as follows:
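The displayed model was lost. Consistent with the later remark that GLPSVM changes only the objective of classical SVM, a plausible reconstruction is

```latex
\min_{\mathbf{w},\,b}\quad \frac{1}{2}\,\mathbf{w}^{\top}\left(I + \beta M\right)\mathbf{w}
\qquad \text{s.t.}\quad y_i\left(\mathbf{w}^{\top}\mathbf{x}_i + b\right) \ge 1,\quad i = 1,\dots,n,
```

where \(M\) combines the locality- and globality-preserving matrices and \(\beta\) is the trade-off parameter; both symbols are our labels for quantities the extraction dropped.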
According to the model, the optimal discriminant directions are no longer the same as those of classical SVM, because the obtained GLP term is introduced into the SVM optimization model. The classification performance of the proposed method will be shown in Section
Similar to SVM, the proposed model can be viewed as a quadratic optimization problem, which we solve with Lagrange's method of undetermined multipliers. Suppose
Hence, we have the following dual problem:
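The dual equation did not survive extraction. Assuming a primal objective of the form \(\frac{1}{2}\mathbf{w}^{\top}A\mathbf{w}\) with a symmetric positive definite matrix \(A\) (the identity plus the weighted GLP matrix), setting the Lagrangian's gradient to zero gives \(\mathbf{w} = A^{-1}\sum_i \alpha_i y_i \mathbf{x}_i\), and the dual takes the standard SVM form with a modified inner product:

```latex
\max_{\boldsymbol{\alpha}}\quad \sum_{i=1}^{n}\alpha_i
  - \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i \alpha_j\, y_i y_j\, \mathbf{x}_i^{\top} A^{-1} \mathbf{x}_j
\qquad \text{s.t.}\quad \sum_{i=1}^{n}\alpha_i y_i = 0,\quad \alpha_i \ge 0.
```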
Suppose
So, the corresponding decision surface is
Finally, the corresponding optimal bias
As can be seen, in the linearly separable case GLPSVM is required to produce a completely accurate decision hyperplane. In real-world applications, however, the decision hyperplane no longer needs to be completely accurate, so we extend GLPSVM to the soft margin setting.
Reference [
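The soft-margin formulation itself was lost; by analogy with the standard soft-margin SVM, a plausible reconstruction adds slack variables \(\xi_i\) and a penalty weight \(C\) to the hard-margin model:

```latex
\min_{\mathbf{w},\,b,\,\boldsymbol{\xi}}\quad
\frac{1}{2}\,\mathbf{w}^{\top}\left(I + \beta M\right)\mathbf{w} + C\sum_{i=1}^{n}\xi_i
\qquad \text{s.t.}\quad y_i\left(\mathbf{w}^{\top}\mathbf{x}_i + b\right) \ge 1 - \xi_i,\quad \xi_i \ge 0.
```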
Note that the objective function of classical SVM is
We can therefore solve GLPSVM with a standard SVM software package, although the resulting optimal discriminant vectors differ. Since globality-locality preserving is introduced into SVM, the optimal discriminant directions of GLPSVM preserve the intrinsic manifold structure of the data in the low-dimensional feature space. Besides, matrices
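The reduction to a standard SVM can be sketched as follows. This is a minimal illustration under the assumption (stated above only as our reconstruction) that the primal objective is \(\frac{1}{2}\mathbf{w}^{\top}(I+\beta M)\mathbf{w}\); `beta` and the random `M` are placeholders, not the paper's values. Since \(A = I + \beta M\) is symmetric positive definite, a Cholesky factorization \(A = LL^{\top}\) and the substitution \(\mathbf{v} = L^{\top}\mathbf{w}\) turn the objective into \(\frac{1}{2}\mathbf{v}^{\top}\mathbf{v}\), i.e. a classical SVM on the transformed samples \(\tilde{\mathbf{x}} = L^{-1}\mathbf{x}\):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5
X = rng.standard_normal((20, d))          # 20 samples, 5 features

B = rng.standard_normal((d, d))
M = B @ B.T                               # any symmetric PSD "GLP" matrix (placeholder)
beta = 0.5                                # placeholder trade-off weight
A = np.eye(d) + beta * M                  # regularized metric, SPD

L = np.linalg.cholesky(A)                 # A = L @ L.T
X_tilde = np.linalg.solve(L, X.T).T       # transformed samples x_tilde = L^{-1} x

# Sanity check: inner products of the transformed samples reproduce the
# modified kernel x_i^T A^{-1} x_j, so a standard SVM run on X_tilde
# optimizes the GLP-regularized objective.
K_transformed = X_tilde @ X_tilde.T
K_modified = X @ np.linalg.solve(A, X.T)
print(np.allclose(K_transformed, K_modified))  # True
```

Any off-the-shelf linear SVM trainer applied to `X_tilde` then yields `v`, from which the original discriminant vector is recovered as `w = solve(L.T, v)`.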
The proposed model has six parameters in total: the neighborhood parameters
For the parameter settings, the regularization parameter
First, we set the trade-off parameter to 0.2 and use the same heat kernel parameter (
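The neighborhood and heat-kernel parameters enter through the affinity graph that locality-preserving methods build. The sketch below is an illustrative construction (not the authors' code) of a k-nearest-neighbor graph with heat-kernel weights and its Laplacian; `k` and `t` play the role of the neighborhood and heat-kernel parameters discussed above.

```python
import numpy as np

def heat_kernel_graph(X, k=3, t=1.0):
    """Affinity matrix W with W_ij = exp(-||x_i - x_j||^2 / t) on the
    k-nearest-neighbor graph, plus the graph Laplacian D - W."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared distances
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]                 # k nearest, skip self
        W[i, idx] = np.exp(-d2[i, idx] / t)
    W = np.maximum(W, W.T)                               # symmetrize the graph
    D = np.diag(W.sum(axis=1))
    return W, D - W                                      # affinity, Laplacian

rng = np.random.default_rng(1)
X = rng.standard_normal((10, 4))
W, Lap = heat_kernel_graph(X, k=3, t=1.0)
print(np.allclose(W, W.T), np.allclose(Lap.sum(axis=1), 0))  # True True
```

Larger `t` flattens the weights toward a binary graph, while `k` controls how local the preserved structure is, which is why these parameters need the tuning study reported below.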
The effects of parameter
0.8207  0.8048  0.8008  0.8008  0.7888  0.7769  0.7888  0.7809
0.8884  0.8884  0.8964  0.8805  0.8765  0.8008  0.8207  0.8048
0.9323  0.9482  0.9044  0.9203  0.9323  0.9203  0.9084  0.8884
0.9402  0.9283  0.9402  0.9442  0.9402  0.9323  0.9482  0.9363
Next, we explore the effect of the trade-off parameter
The effects of tradeoff parameter
0.8048  0.8048  0.7849  0.8008  0.8088  0.7928  0.7928  0.8008
0.9441  0.7849  0.8167  0.8127  0.7968  0.7968  0.8008  0.8008
0.9203  0.8964  0.9004  0.9203  0.8845  0.8964  0.8884  0.8645
0.9283  0.9442  0.9243  0.9482  0.9203  0.9323  0.9283  0.9163
In this subsection, comparative experiments are conducted to test the ability of the proposed GLPSVM. We compare it with SVM, SVM+LDA, MCVSVM, and MCLPV_SVM.
Database information for comparative analysis.
Database  Number of samples  Number of attributes  Number of classes

Breast  699  9  2 
Heart  270  13  2 
Pima  768  8  2 
New Thyroid (NT)  215  5  3 
Wine  178  13  3 
Iris  150  4  3 
Glass  214  9  6 
Table
Classification accuracy for comparative analysis.
Database  SVM  SVM + LDA  MCVSVM  MCLPV_SVM  GLPSVM 
Breast
Heart
Pima
NT
Wine
Iris
Glass
(The accuracy values of this table were not recovered.)
In this paper, a new extension of SVM, called the support vector machine with globality-locality preserving (GLPSVM), was proposed. It takes the intrinsic manifold structure of the data space into consideration. The soft margin GLPSVM was also presented. The solution algorithm showed that the model can be transformed into the standard SVM model and solved with a standard SVM software package, which greatly improves implementation efficiency. Finally, experimental results on real-world databases validated that the proposed method achieves better performance than SVM, SVM+LDA, MCVSVM, and MCLPV_SVM.
The authors declare that there is no conflict of interest regarding the publication of this paper.
This research has been supported by the National Natural Science Foundation under Grant (no. 61001200).