|Wlodzislaw Duch, Norbert Jankowski and Krzysztof Grabczewski, Nicolaus Copernicus University, Poland
Title: Meta-Learning: towards universal learning paradigms
Abstract: Data mining systems contain a large and quickly growing number of machine learning methods based on neural, fuzzy, pattern recognition and statistical ideas. Despite significant progress in theory and applications, many problems remain unsolved: a comprehensive theory presenting a unified perspective on the various learning methods is still missing, and large component-based data mining packages now contain hundreds of learning methods, input transformations, and pre- and post-processing components that may be combined in more than 10 million ways. Although there is "no free lunch" (no single method is the best for all test problems), several methods that are close to optimal may be found through meta-learning based on heuristic search in the space of all possible learning models. Various model spaces are considered as the basis for meta-learning: 1) similarity-based algorithms that identify prototypes and optimize similarity measures; 2) heterogeneous systems that include neural, fuzzy, prototype-based and hierarchical partitioning algorithms; 3) general transformation-based systems.
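The idea of meta-learning as search in a space of candidate models can be illustrated with a minimal sketch. The transformation and learner names, and the scoring table, are hypothetical stand-ins: a real system would train each candidate and cross-validate it, and would search heuristically rather than exhaustively.

```python
# Minimal sketch: meta-learning as search over a model space built from
# (transformation, learner) pairs. All names and scores are illustrative.
from itertools import product

TRANSFORMS = ["none", "normalize", "pca"]
LEARNERS = ["knn", "tree", "svm"]

# Toy validation scores standing in for real cross-validation results.
SCORES = {
    ("none", "knn"): 0.81, ("none", "tree"): 0.78, ("none", "svm"): 0.84,
    ("normalize", "knn"): 0.88, ("normalize", "tree"): 0.79,
    ("normalize", "svm"): 0.90, ("pca", "knn"): 0.83,
    ("pca", "tree"): 0.76, ("pca", "svm"): 0.86,
}

def meta_search():
    """Score every (transform, learner) candidate and keep the best one."""
    best, best_score = None, -1.0
    for candidate in product(TRANSFORMS, LEARNERS):
        score = SCORES[candidate]
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score

print(meta_search())  # (('normalize', 'svm'), 0.9)
```

Even this toy version shows why the search view matters: with 3 transformations and 3 learners there are already 9 candidates, and real packages multiply hundreds of components into millions of combinations.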
The most general implementation of meta-learning is possible within the transformation-based learning paradigm, which unifies most of computational intelligence research and shows how to solve the "crisis of richness" by selecting optimal transformations that minimize the complexity and maximize the quality of the resulting data models. Meta-learning systems learn the simplest data models that many sophisticated methods miss, generate multi-resolution models whenever needed, and solve difficult, highly non-separable problems that are beyond the capabilities of current state-of-the-art algorithms, including neural networks and support vector machines. In contrast to backpropagation, which tries to achieve linear separability in one shot, additional criteria are defined after each transformation to create appropriate internal representations. Visualization of learning dynamics in transformation-based systems shows how to set simpler goals for learning, for example k-separability instead of linear separability.
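The k-separability goal mentioned above can be demonstrated on the classic hard case: n-bit parity is not linearly separable, yet a single linear projection with weights w = (1, ..., 1) places the two classes in k = n + 1 alternating intervals on a line. The sketch below (illustrative code, not the tutorial's implementation) checks this for 3-bit parity.

```python
# k-separability sketch: project n-bit parity data onto y = w.x with
# w = (1, ..., 1); each projected value then contains a single class,
# and the classes alternate along the line, giving n + 1 pure intervals.
from itertools import product

def parity_projection(n):
    """Map every n-bit input to its projection sum(x); return {y: classes}."""
    groups = {}
    for bits in product([0, 1], repeat=n):
        y = sum(bits)            # linear projection with unit weights
        label = sum(bits) % 2    # parity class of the input
        groups.setdefault(y, set()).add(label)
    return groups

groups = parity_projection(3)
# Each projected value is class-pure, so 3-bit parity is 4-separable
# even though no single hyperplane separates the two classes.
assert all(len(labels) == 1 for labels in groups.values())
print(sorted((y, min(labels)) for y, labels in groups.items()))
# [(0, 0), (1, 1), (2, 0), (3, 1)]
```

Setting k-separability as the learning target is thus a much easier goal than forcing full linear separability in one transformation.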
This tutorial will include:
1) Review of various approaches to meta-learning.
2) Meta-learning as search in the model space - general idea.
3) Model space based on similarity-based learning.
4) Model space based on composition of transformations creating internal representations.
5) Complexity control of the search process.
6) Some implementation details.
7) Examples and some lessons learned from the use of meta-learning.
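Outline items 2 and 5 fit together: candidate machines can be drawn from a priority queue keyed on an estimated complexity, so cheap, simple models are evaluated before expensive ones. The candidate names, complexity estimates and scores below are purely illustrative; a real complexity measure would combine model size and training cost.

```python
# Sketch of complexity-controlled search: evaluate candidate machines in
# order of estimated complexity and stop at the first acceptable model.
# All candidates and numbers are hypothetical examples.
import heapq

# (estimated_complexity, machine_name, toy_validation_score)
CANDIDATES = [
    (5.0, "neural-net", 0.93),
    (1.0, "naive-bayes", 0.85),
    (2.0, "decision-tree", 0.91),
    (4.0, "svm-rbf", 0.92),
]

def complexity_controlled_search(target=0.90):
    """Pop candidates cheapest-first; return the first model whose toy
    score reaches the target, plus the list of machines actually tried."""
    heap = list(CANDIDATES)
    heapq.heapify(heap)   # ordered by the first tuple field: complexity
    tried = []
    while heap:
        complexity, name, score = heapq.heappop(heap)
        tried.append(name)
        if score >= target:
            return name, tried
    return None, tried

print(complexity_controlled_search())
# ('decision-tree', ['naive-bayes', 'decision-tree'])
```

Note that the search never evaluates the two most expensive machines: an acceptable simple model is found first, which is the point of controlling the search by complexity.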
References:
1. Duch W, Towards comprehensive foundations of computational intelligence. In: Duch W, Mandziuk J (eds.), Challenges for Computational Intelligence. Springer Studies in Computational Intelligence, Vol. 63, pp. 261-316, 2007.
2. Duch W, Setiono R, Zurada J.M, Computational intelligence methods for understanding of data. Proceedings of the IEEE 92(5), pp. 771-805, 2004.
3. Duch W, Grudziński K, Meta-learning via search combined with parameter optimization. Intelligent Information Systems, Advances in Soft Computing, Physica Verlag (Springer), pp. 13-22, 2002.
4. Jankowski N, Grąbczewski K, Building meta-learning algorithms basing on search controlled by machine's complexity and machines generators. IEEE World Congress on Computational Intelligence, IEEE Press, pp. 3600-3607, 2008.
5. Jankowski N, Grąbczewski K, Increasing efficiency of metalearning machines with complexity control. Journal of Machine Learning Research (in prep.).
6. Maszczyk T, Grochowski M, Duch W, Discovering Data Structures using Meta-learning, Visualization and Constructive Neural Networks. Book chapter, Springer, 2009 (in print).
Bio Sketch: Wlodzislaw Duch heads the Department of Informatics, Nicolaus Copernicus University, Torun, Poland, and was recently a Visiting Professor at the School of Computer Engineering, Nanyang Technological University, Singapore (2003-07). He received his Ph.D. in quantum chemistry (1980), was a postdoc at the University of Southern California, Los Angeles (1980-82), and earned a D.Sc. in applied mathematics (1987); he has worked at the University of Florida; the Max-Planck-Institute, Munich, Germany; the Kyushu Institute of Technology; Meiji and Rikkyo Universities in Japan; and several other institutions. He is on the editorial boards of IEEE TNN, CPC, NIP-LR, the Journal of Mind and Behavior, and 8 other journals; co-founder & scientific editor of the “Polish Cognitive Science” journal; president of the European Neural Networks Society executive committee (2006-2008; second term 2009-11); member of the IEEE CIS Technical Committee; and an expert of the European Union science programs. He has published about 400 scientific and popular articles and 4 books, and has edited many others; his DuchSoft company makes the GhostMiner software package marketed by Fujitsu.