Abstract
Attribute reduction is capable of reducing the dimensionality of data and improving the performance of data mining. As a reasonable representation of the relationships between samples, the neighbor inconsistent pair focuses on measuring uncertainty in information systems. Nevertheless, classical attribute reduction methods are static and unsuitable for data that vary; in real-life scenarios data inevitably change, for example through an increase in the number of samples. It is therefore essential to find an efficient method that reduces the dimensionality of a dataset while preserving classification accuracy. Motivated by these deficiencies, we focus on developing effective and efficient incremental methods that employ the neighbor inconsistent pair selection strategy for decision tables with object variations. First, some concepts related to rough sets, simplified decision tables and neighbor inconsistent pairs are introduced. Then, heuristic attribute reduction algorithms based on neighbor inconsistent pairs are designed for dynamic decision tables whose object sets vary. Next, a novel feature selection procedure, which we refer to as incremental neighbor inconsistent pair selection, is proposed to update reducts for such dynamic decision tables. Finally, two incremental attribute reduction algorithms based on neighbor inconsistent pair selection are designed. Experiments on real datasets validate the effectiveness and benefits of the proposed incremental algorithms. The results indicate that our algorithms require minimal computing time while achieving the highest classification accuracy on at least ten of the thirteen datasets compared with the competing algorithms.
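As a rough illustration of the idea behind the measure, the following minimal Python sketch counts neighbor inconsistent pairs (samples that fall inside a candidate attribute subset's neighborhood of another sample yet carry a different decision label) and uses that count as a greedy selection criterion. The neighborhood radius `delta`, the Euclidean metric, and the function names `neighbor_inconsistent_pairs` and `greedy_reduct` are illustrative assumptions; the sketch does not reproduce the paper's simplified decision tables or its incremental update mechanism.

```python
import numpy as np

def neighbor_inconsistent_pairs(X, y, attrs, delta=0.15):
    """Count neighbor inconsistent pairs under attribute subset `attrs`.

    A pair (i, j) is counted when sample j lies inside the delta-neighborhood
    of sample i (Euclidean distance on the selected attributes) but the two
    samples have different decision labels.  Illustrative only: the radius
    and the metric are assumptions, not taken from the paper.
    """
    Xs = X[:, attrs]
    count = 0
    for i in range(len(Xs)):
        dist = np.linalg.norm(Xs - Xs[i], axis=1)   # distances to sample i
        neighbors = dist <= delta                   # delta-neighborhood of i
        neighbors[i] = False                        # exclude the sample itself
        count += np.sum(y[neighbors] != y[i])       # inconsistent neighbors
    return count

def greedy_reduct(X, y, delta=0.15):
    """Greedy forward selection: repeatedly add the attribute that removes
    the most neighbor inconsistent pairs, stopping when no pair remains or
    no attribute improves the count."""
    remaining = list(range(X.shape[1]))
    reduct, best = [], float("inf")
    while remaining:
        scores = {a: neighbor_inconsistent_pairs(X, y, reduct + [a], delta)
                  for a in remaining}
        a_star = min(scores, key=scores.get)
        if scores[a_star] >= best:
            break                                   # no further improvement
        best = scores[a_star]
        reduct.append(a_star)
        remaining.remove(a_star)
        if best == 0:
            break                                   # subset is consistent
    return reduct
```

In an incremental setting such as the one the abstract describes, one would update the pair counts only for the newly arrived objects rather than recomputing them from scratch, which is the general source of the time savings reported for incremental reduct-updating methods.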
Data availability
The data that support the findings of this study are available from the corresponding author upon reasonable request.
Acknowledgements
This work is supported by the National Natural Science Foundation of China (61976089), the Major Program of the National Social Science Foundation of China (20&ZD047), the Natural Science Foundation of Hunan Province (2021JJ30451, 2022JJ30397), and the Hunan Provincial Science & Technology Project Foundation (2018RS3065, 2018TP1018).
Author information
Authors and Affiliations
College of Information Science and Engineering, Hunan Normal University, Changsha, 410081, Hunan, China
Chucai Zhang, Hong Liu, Zhengxiang Lu & Jianhua Dai
Hunan Provincial Laboratory of Intelligent Computing and Language Information Processing, Hunan Normal University, Changsha, 410081, Hunan, China
Chucai Zhang & Jianhua Dai
Key Laboratory of Computing and Stochastic Mathematics (Ministry of Education), School of Mathematics and Statistics, Hunan Normal University, Changsha, 410081, Hunan, China
Jianhua Dai
Corresponding author
Correspondence to Jianhua Dai.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Zhang, C., Liu, H., Lu, Z. et al. Fast attribute reduction by neighbor inconsistent pair selection for dynamic decision tables. Int. J. Mach. Learn. & Cyber. 15, 739–756 (2024). https://doi.org/10.1007/s13042-023-01931-5