Abstract
Dimension reduction has long been a central problem in many applications of machine learning and pattern recognition. In this paper, scaling cut criterion-based supervised dimension reduction methods for data analysis are proposed. The scaling cut criterion removes the restrictive assumption that the data distribution of each class is a homoscedastic Gaussian. To obtain a more reasonable mapping matrix and reduce the computational complexity, local scaling cut criterion-based dimension reduction is proposed, which utilizes a localization strategy on the input data. A localized \(k\)-nearest neighbor graph is introduced, which relaxes the within-class variance and enlarges the between-class margin. Moreover, by kernelizing the scaling cut criterion and the local scaling cut criterion, both methods are extended to efficiently model nonlinear variability in the data. Furthermore, the optimal dimension scaling cut criterion is proposed, which automatically selects the optimal target dimension for the dimension reduction methods. The approaches have been tested on several datasets, and the results show better and more efficient performance compared with other linear and nonlinear dimension reduction techniques.
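The abstract's core idea, contrasting within-class and between-class pairwise scatter instead of assuming homoscedastic Gaussian classes, can be illustrated with a minimal sketch. This is not the paper's exact formulation: the function name, the regularization constant, and the choice to solve a generalized eigenproblem as a relaxation of the scaling cut ratio are all assumptions made for illustration.

```python
import numpy as np
from scipy.linalg import eigh


def scaling_cut_projection(X, y, d):
    """Illustrative sketch: project n x m data X onto d dimensions by
    maximizing a between-class / within-class pairwise-scatter ratio,
    relaxed to a generalized eigenproblem (a simplification of the
    scaling cut criterion described in the abstract)."""
    n, m = X.shape
    Sb = np.zeros((m, m))  # scatter between samples of different classes
    Sw = np.zeros((m, m))  # scatter between samples of the same class
    for i in range(n):
        for j in range(n):
            diff = (X[i] - X[j])[:, None]
            if y[i] == y[j]:
                Sw += diff @ diff.T
            else:
                Sb += diff @ diff.T
    # Small ridge term (an assumption) keeps Sw positive definite.
    Sw += 1e-6 * np.eye(m)
    # Largest generalized eigenvectors of Sb w = lambda Sw w give the
    # directions that separate classes while compressing within-class spread.
    vals, vecs = eigh(Sb, Sw)
    W = vecs[:, np.argsort(vals)[::-1][:d]]
    return X @ W
```

The localized variant described in the abstract would restrict the double loop to \(k\)-nearest neighbor pairs, and the kernelized variants would replace inner products with kernel evaluations; neither refinement is shown here.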
Acknowledgments
This work was supported by the National Basic Research Program of China (973 Program) (Grant 2013CB329402), the National Natural Science Foundation of China (Nos. 61272282, 61203303, and 61272279), the Program for New Century Excellent Talents in University (NCET-13-0948), and Fundamental Research Funds for the Central Universities (Grant K50511020011).
Author information
Authors and Affiliations
The Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, International Research Center for Intelligent Perception and Computation, Xidian University, Xi’an 710071, China
Xiangrong Zhang, Yudi He, Licheng Jiao, Ruochen Liu, Jie Feng & Sisi Zhou
Corresponding author
Correspondence to Xiangrong Zhang.
About this article
Cite this article
Zhang, X., He, Y., Jiao, L. et al. Scaling cut criterion-based discriminant analysis for supervised dimension reduction. Knowl Inf Syst 43, 633–655 (2015). https://doi.org/10.1007/s10115-014-0744-0