CS SEMINAR

Meta-Learning as a Unified Tool for Selecting, Tuning and Learning

Speaker
Andrey Filchenkov, Computer Technology Chair, IT and Programming Department, Head of Machine Learning Research Group, ITMO University, St. Petersburg, Russia
Chaired by
Dr CHUA Tat Seng, KITHCT Chair Professor, School of Computing
chuats@comp.nus.edu.sg

21 Jan 2016 Thursday, 02:45 PM to 04:00 PM

Executive Classroom, COM2-04-02

ABSTRACT:
Nowadays, machine learning theory offers a plethora of algorithms for solving various problems such as classification, regression, and clustering. Hundreds of novel algorithms that outperform state-of-the-art baselines in specific domains are published each year. Wolpert and Macready's "No Free Lunch" theorems have buried the hope that any one of these algorithms may outperform all the others on an arbitrary problem. This raises the following questions.
The first question is the well-studied problem of algorithm selection. Literally, it seeks the answer to the dilemma: "Which algorithm from a predefined set will perform best on a given problem?" The answer can be formulated in terms of meta-learning: one needs only to characterize the given problem with meta-features; selecting the best algorithm for a new problem then becomes a prediction task over those meta-features. In this approach, the problem of algorithm selection can effectively be reduced to the problem of finding a proper meta-feature set.
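
As a rough illustration of this reduction, a minimal Python sketch is given below, assuming scikit-learn is available. The particular meta-features, the random-forest meta-learner, and the function names (meta_features, select_algorithm) are illustrative assumptions, not the speaker's actual method.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def meta_features(X, y):
        # A few simple dataset meta-features: size, dimensionality,
        # number of classes, and class entropy (assumes integer labels).
        n_samples, n_features = X.shape
        counts = np.bincount(y)
        probs = counts[counts > 0] / n_samples
        class_entropy = float(-(probs * np.log2(probs)).sum())
        return [n_samples, n_features, len(probs), class_entropy]

    def select_algorithm(past_datasets, best_algorithm_idx, new_X, new_y):
        # Train a meta-learner on previously solved problems (each described
        # by its meta-features and the index of its best algorithm), then
        # predict which algorithm should perform best on the new problem.
        meta_X = np.array([meta_features(X, y) for X, y in past_datasets])
        meta_learner = RandomForestClassifier().fit(meta_X, best_algorithm_idx)
        return int(meta_learner.predict([meta_features(new_X, new_y)])[0])
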
Moreover, to achieve the best performance of these algorithms, their hyperparameters must be properly tuned. Unfortunately, these hyperparameters are usually defined on infinite sets, which complicates the selection of their values. Hyperparameter tuning is treated as a "black-box" optimization problem and is frequently solved within the active learning paradigm.
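
A minimal black-box tuning sketch, again assuming scikit-learn: it uses plain random search over an SVM's C and gamma, with cross-validated accuracy as the black-box objective. An active-learning method such as Bayesian optimization would replace the random sampling with a model that proposes the next point to evaluate; the function name tune_svm and the search ranges are illustrative.

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    def tune_svm(X, y, n_trials=30, seed=0):
        # Cross-validated accuracy is the black-box objective: it can only
        # be evaluated at chosen hyperparameter settings, not differentiated.
        rng = np.random.default_rng(seed)
        best_score, best_params = -np.inf, None
        for _ in range(n_trials):
            # Sample hyperparameters log-uniformly from (assumed) sensible ranges.
            params = {"C": 10 ** rng.uniform(-3, 3), "gamma": 10 ** rng.uniform(-4, 1)}
            score = cross_val_score(SVC(**params), X, y, cv=3).mean()
            if score > best_score:
                best_score, best_params = score, params
        return best_params, best_score
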
The second question is less studied than the first and can be formulated as: "How can we properly compare two algorithms if we know that they are almost identical?" Meta-learning can serve as a basis for such a comparison by reformulating the question: instead of comparing the algorithms themselves, one may compare the algorithms' domains of competence.
In this talk, we will briefly review research on algorithm selection, hyperparameter optimization, and algorithm comparison. We will describe our current research in these directions and outline future work.

BIODATA:
Andrey Filchenkov graduated from St. Petersburg State University (Russia) in 2010 and from International Banking University (St. Petersburg, Russia) in 2011. He received his PhD from St. Petersburg State University (Russia) in 2013. He is currently Associate Professor at the Computer Technology Chair and Head of the Machine Learning Research Group at the International Laboratory "Computer Technologies", ITMO University (St. Petersburg, Russia).
Andrey's research interests include meta-learning, algorithm comparison, hyperparameter optimization, ensemble learning, feature selection, structural learning, and social media analysis. He is the author of more than 100 publications in local and international journals and conference proceedings.