Global Rank Preservation Learning Machine Based on Manifold-Based Discriminant Analysis
-
Abstract: Most mainstream classification methods cannot take both the global and the local characteristics of the samples into account when making classification decisions, and most algorithms focus only on the separability between classes while ignoring the relative relationships among samples. To address these problems, a global rank preservation learning machine based on manifold-based discriminant analysis is proposed. The method introduces manifold-based discriminant analysis to capture both the global and local characteristics of the samples; it preserves the ordering of all samples by keeping the relative relationships among the class centers unchanged; and, drawing on the theory and methods of core vector machines, it enables large-scale classification by establishing an equivalence between the dual form of the proposed method and that of the core vector machine. Comparative experiments on synthetic and benchmark datasets verify the effectiveness of the method.
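As a rough illustration of the manifold-based discriminant analysis idea described in the abstract — combining a global, LDA-style between-class scatter with a local, neighborhood-based within-class scatter — the sketch below builds both scatter matrices and solves the resulting generalized eigenproblem. This is a minimal sketch under stated assumptions, not the authors' implementation: the function name `mda_projection` and the parameters `k` (neighborhood size) and `alpha` (global/local trade-off) are illustrative choices.

```python
import numpy as np

def mda_projection(X, y, k=5, alpha=0.5, dim=1):
    """Hedged sketch of a manifold-style discriminant projection.

    Blends a global between-class scatter (class means vs. overall mean)
    with a within-class scatter that mixes a global term and a local,
    k-nearest-neighbor term capturing manifold structure inside each class.
    """
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sb = np.zeros((d, d))  # between-class scatter (global separability)
    Sw = np.zeros((d, d))  # within-class scatter (global + local terms)
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean_all)[:, None]
        Sb += len(Xc) * diff @ diff.T
        # Local term: penalize spread between each sample and its
        # k nearest neighbors within the same class (manifold structure).
        for i in range(len(Xc)):
            dists = np.linalg.norm(Xc - Xc[i], axis=1)
            nbrs = np.argsort(dists)[1:k + 1]  # skip the sample itself
            for j in nbrs:
                dv = (Xc[i] - Xc[j])[:, None]
                Sw += alpha * dv @ dv.T
        # Global term: ordinary within-class scatter around the class mean.
        gv = Xc - mc
        Sw += (1 - alpha) * gv.T @ gv
    # Maximize between-class over within-class scatter via the
    # generalized eigenproblem pinv(Sw) @ Sb.
    vals, vecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-vals.real)
    return vecs[:, order[:dim]].real
```

Projecting the data onto the returned directions then yields a representation in which both the global class structure and the local neighborhood geometry influence the discriminant axes; the rank-preservation and core-vector-machine components of the paper would be layered on top of such a projection.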
-