Abstract
In this paper, a novel hybrid ensemble method aimed at improving model accuracy in regression tasks is presented. The basic idea of the approach is the creation of an ensemble learner composed of a strong learner, trained on the whole training dataset, and a set of specialised weak learners, each trained on data coming from a limited region of the input space determined by means of a self-organising map (SOM) based clustering. In this context, different methods have been tested for the design of the learners, including a hierarchical approach. In the simulation phase, the strong and weak learners operate according to their punctual self-estimated reliabilities, so as to exploit their strengths and overcome their weaknesses. The method has been tested on literature and real-world datasets, achieving competitive results: it outperforms other ensemble methods on most of the tested datasets and reduces the average absolute error by up to 10%.
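The abstract's idea can be illustrated with a minimal sketch, not the authors' exact method: a strong regressor trained on the full training set is combined with weak regressors trained on input-space clusters, and each prediction is a reliability-weighted average. The paper uses SOM-based clustering and neural networks for reliability self-estimation; here KMeans stands in for the SOM and inverse validation RMSE stands in for the learned reliability, both hypothetical simplifications.

```python
# Sketch of a strong/weak hybrid ensemble for regression (simplified):
# KMeans replaces the paper's SOM clustering, and inverse validation RMSE
# replaces the paper's neural reliability self-estimation.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(600, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.1, 600)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

# Strong learner: trained on the whole training set.
strong = Ridge().fit(X_tr, y_tr)

# Weak learners: one per input-space cluster.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X_tr)
weak = {c: Ridge().fit(X_tr[km.labels_ == c], y_tr[km.labels_ == c])
        for c in range(km.n_clusters)}

def predict(Xq):
    """Reliability-weighted combination of strong and cluster-local
    weak predictions (reliability = inverse held-out RMSE)."""
    clusters = km.predict(Xq)
    val_clusters = km.predict(X_val)
    out = np.empty(len(Xq))
    r_strong = 1.0 / (np.sqrt(np.mean((strong.predict(X_val) - y_val) ** 2)) + 1e-9)
    for c in np.unique(clusters):
        mask = clusters == c
        v = val_clusters == c
        if v.any():
            err = np.sqrt(np.mean((weak[c].predict(X_val[v]) - y_val[v]) ** 2))
            r_weak = 1.0 / (err + 1e-9)
        else:
            r_weak = 0.0  # no validation data in this cluster: trust the strong learner
        combined = r_strong * strong.predict(Xq[mask]) + r_weak * weak[c].predict(Xq[mask])
        out[mask] = combined / (r_strong + r_weak)
    return out

preds = predict(X_val)
```

In this toy setup the weak learners capture local structure that a single global linear model misses, while the strong learner keeps predictions sensible in clusters where a weak learner is poorly trained; the paper's reliability networks play the analogous arbitration role pointwise.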
Author information
Authors and Affiliations
TeCIp Institute, Scuola Superiore Sant’Anna, Via Moruzzi, 1, 56124, Pisa, Italy
Marco Vannucci, Valentina Colla & Silvia Cateni
Corresponding author
Correspondence to Marco Vannucci.
About this article
Cite this article
Vannucci, M., Colla, V. & Cateni, S. Learners reliability estimated through neural networks applied to build a novel hybrid ensemble method. Neural Process Lett 46, 791–809 (2017). https://doi.org/10.1007/s11063-017-9586-6