Research on Nonparallel Hyperplane Classifier Algorithms

Published: 2018-08-28 09:29
【Abstract】: Nonparallel hyperplane classifiers (NHC) are a class of machine learning methods developed from the traditional support vector machine (SVM). For binary classification, the traditional SVM seeks a single separating hyperplane under the large-margin criterion, whereas an NHC method typically seeks an optimal decision hyperplane for each class, i.e., a pair of nonparallel hyperplanes. In the linear case, NHC methods classify the XOR problem remarkably well. Because of these advantages, NHC has become a research hotspot in machine learning. However, as a relatively new family of methods, NHC is still immature and incomplete in many respects and needs further study and improvement. This thesis studies the existing NHC methods in depth and systematically, focusing on improving classification performance and training speed. The specific contributions are as follows:

1. A locality preserving twin support vector machine is studied. Existing NHC methods do not fully exploit the intrinsic local geometric structure of the training set or the classification information hidden in it, which may degrade classification performance. To address this, the idea of locality preserving projections (LPP) is introduced directly into the NHC framework, and a locality preserving twin SVM (LPTSVM) is proposed. To reduce the time complexity of solving the quadratic programs, LPTSVM builds the constraints of its optimization problems from a small number of boundary samples selected via a between-class nearest-neighbor graph. For the singularity problem that may arise in LPTSVM, a dimensionality-reduction method based on principal component analysis (PCA) is given theoretically.

2. A nonlinear least squares projection twin support vector machine and the corresponding recursive learning algorithm are studied. Since the linear least squares projection twin SVM (LSPTSVM) cannot handle nonlinear classification effectively, kernel mapping is used to map the training samples from the input space into a high-dimensional feature space, and on this basis a kernel-based LSPTSVM (KLSPTSVM) is proposed. To further improve its nonlinear classification performance, the recursive learning algorithm of the linear case is likewise extended to the nonlinear case via kernel mapping and combined with the KLSPTSVM classifier, yielding a recursive KLSPTSVM method for the nonlinear case.

3. A robust locally weighted twin support vector machine is studied. The weighted twin SVM with local information (WLTSVM) does not adequately characterize the similarity between intra-class samples, trains slowly, and is sensitive to noise; a robust WLTSVM (RWLTSVM) is therefore proposed. RWLTSVM uses a Gaussian kernel function to define the weight matrix of the intra-class nearest-neighbor graph and derives sample weights from it, which better characterizes each sample's contribution to the decision hyperplane. To reduce the time complexity of solving the optimization problems, RWLTSVM replaces the inequality constraints of WLTSVM with equality constraints and obtains analytical solutions by solving systems of linear equations. In addition, RWLTSVM incorporates the intra-class weights of the opposite-class samples into the equality constraints, making it more immune to noise.

4. A weighted projection twin support vector machine and its least squares version are studied. The projection twin SVM (PTSVM) does not consider the correlation among intra-class training samples in its optimization problems, so a weighted PTSVM (WPTSVM) is proposed. WPTSVM builds an intra-class nearest-neighbor graph and, on this basis, assigns each sample a specific weight to emphasize its contribution to the decision surface, thereby improving classification performance. WPTSVM also takes the sample weights into account in the inequality constraints of the optimization problem, so the algorithm is well immune to noise. To further reduce WPTSVM's training time complexity so that it can handle large-scale data, a least squares WPTSVM (LSWPTSVM) is proposed; LSWPTSVM obtains an analytical solution by solving systems of linear equations instead of the quadratic programs of WPTSVM.
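The pair-of-hyperplanes idea above can be made concrete with the primal problems of the standard twin SVM (TWSVM), shown here as a sketch of one representative NHC method; the variants studied in this thesis modify these problems with locality weights, projections, or least-squares losses. With $A$ and $B$ the data matrices of the two classes and $e_1, e_2$ vectors of ones, the first hyperplane $(w_1, b_1)$ solves

```latex
\min_{w_1,\, b_1,\, q}\; \frac{1}{2}\,\lVert A w_1 + e_1 b_1 \rVert^2 + c_1\, e_2^{\top} q
\quad \text{s.t.}\quad -(B w_1 + e_2 b_1) + q \ge e_2,\; q \ge 0,
```

and the second hyperplane $(w_2, b_2)$ solves the symmetric problem with the roles of $A$ and $B$ exchanged. A new sample $x$ is assigned to the class whose hyperplane is nearer, i.e., $\arg\min_{k}\, \lvert w_k^{\top} x + b_k \rvert / \lVert w_k \rVert$.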
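Contributions 3 and 4 both replace inequality constraints with equality constraints so that each hyperplane has a closed-form solution obtained from a linear system rather than a quadratic program. The following is a minimal sketch of that least-squares twin-SVM style solve; the function names and toy data are invented for illustration, and the thesis's weighted variants would additionally insert sample-weight matrices into these normal equations.

```python
import numpy as np

def lstsvm_fit(A, B, c=1.0, eps=1e-6):
    """Fit a pair of nonparallel hyperplanes, least-squares twin-SVM style.

    Each hyperplane z_k = (w_k, b_k) passes close to its own class while
    holding the other class at signed value -1 (resp. +1). A small ridge
    term eps*I keeps the linear systems well conditioned.
    """
    e_A = np.ones((A.shape[0], 1))
    e_B = np.ones((B.shape[0], 1))
    G = np.hstack([A, e_A])   # augmented class-1 matrix [A  e]
    H = np.hstack([B, e_B])   # augmented class-2 matrix [B  e]
    I = np.eye(G.shape[1])
    # Plane 1: min 0.5*||G z||^2 + 0.5*c*||H z + e||^2  -> normal equations
    z1 = np.linalg.solve(G.T @ G + c * H.T @ H + eps * I, -c * H.T @ e_B)
    # Plane 2: min 0.5*||H z||^2 + 0.5*c*||G z - e||^2
    z2 = np.linalg.solve(H.T @ H + c * G.T @ G + eps * I, c * G.T @ e_A)
    return z1.ravel(), z2.ravel()

def lstsvm_predict(X, z1, z2):
    """Assign each row of X to the class whose hyperplane is nearer."""
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])
    d1 = np.abs(Xa @ z1) / np.linalg.norm(z1[:-1])
    d2 = np.abs(Xa @ z2) / np.linalg.norm(z2[:-1])
    return np.where(d1 <= d2, 1, 2)

# Toy data: two roughly parallel point clouds.
A = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.1]])   # class 1, near y = x
B = np.array([[0.0, 2.0], [1.0, 3.0], [2.0, 4.1]])   # class 2, near y = x + 2
z1, z2 = lstsvm_fit(A, B)
print(lstsvm_predict(np.array([[0.5, 0.5], [0.5, 2.5]]), z1, z2))
```

The ridge term `eps * I` is one common way to guard against the kind of singularity the thesis instead addresses with PCA-based dimensionality reduction for LPTSVM.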
【Degree-granting institution】: China University of Mining and Technology
【Degree level】: Doctorate
【Year conferred】: 2015
【CLC number】: TP181; TP391.4


Article ID: 2208988


Link: https://www.wllwen.com/shoufeilunwen/xxkjbs/2208988.html

