Understanding how intelligence works and how it can be emulated in machines is an age-old dream and arguably one of the biggest challenges in modern science. Learning, with its principles and computational implementations, is at the very core of this endeavor. Recently, for the first time, we have been able to develop artificial intelligence systems that solve complex tasks considered out of reach for decades: modern cameras recognize faces, smartphones understand voice commands, cars see and detect pedestrians, and ATMs automatically read checks. In most cases, at the root of these success stories are machine learning algorithms, that is, software that is trained rather than programmed to solve a task.

Among the variety of approaches to modern computational learning, we focus on regularization techniques, which are key to high-dimensional learning. Regularization methods make it possible to treat a huge class of diverse approaches in a unified way, while providing tools to design new ones. Starting from classical notions of smoothness, shrinkage, and margin, the course will cover state-of-the-art techniques based on the concepts of geometry (manifold learning) and sparsity, along with a variety of algorithms for supervised learning, feature selection, structured prediction, multitask learning, and model selection. Practical applications for high-dimensional problems, in particular in computational vision, will be discussed. The classes will focus on algorithmic and methodological aspects, while giving an idea of the underlying theory. Practical laboratory sessions will provide hands-on experience.
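To give a flavor of the core technique the course builds on, the snippet below is a minimal sketch of Tikhonov (ridge) regularization on synthetic high-dimensional data; the data, dimensions, and the value of the regularization parameter are illustrative assumptions, not course material. In practice the regularization parameter is chosen by model selection, itself one of the course topics.

    import numpy as np

    # Tikhonov (ridge) regularization on synthetic high-dimensional data.
    # All numbers below (dimensions, noise level, lam) are illustrative only.
    rng = np.random.default_rng(0)
    n, d = 50, 200                       # fewer samples than features
    X = rng.standard_normal((n, d))
    w_true = np.zeros(d)
    w_true[:5] = 1.0                     # only a few informative coordinates
    y = X @ w_true + 0.1 * rng.standard_normal(n)

    lam = 0.1                            # regularization parameter (tuned by model selection in practice)
    # Closed-form Tikhonov solution: w = (X^T X + n*lam*I)^{-1} X^T y
    w_hat = np.linalg.solve(X.T @ X + n * lam * np.eye(d), X.T @ y)
    print("training MSE:", np.mean((X @ w_hat - y) ** 2))

The penalty term n*lam*I shrinks the estimated coefficients and makes the linear system well-posed even when there are more features than samples, which is exactly the high-dimensional regime the course addresses.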
RegML is a 22-hour advanced machine learning course including theory classes and practical laboratory sessions. The course covers foundations as well as recent advances in machine learning, with an emphasis on high-dimensional data and a core set of techniques, namely regularization methods. In many respects, the course is a compressed version of the 9.520 course at MIT.
The course, first held in 2008, has seen increasing national and international attendance over the years, with a peak of over 90 participants in 2014.
NOTE: the course has no registration fee, but participants must arrange their own travel and accommodation -- see below for a list of hotels.
Notification of acceptance: To be announced.
The school will take place at Simula Research Laboratory, Martin Linges vei 25, 1364 Fornebu, Norway. Consult our page on directions and travelling information for details on how to get to Simula. All lectures will take place in the “Klasserommet” auditorium at Simula, and all project work in assigned workspaces at Simula (detailed later).
Several hotels and other accommodation options are available near Simula, and the city center is 25 minutes away by bus.
Here you can find a list of hotels near Simula.
Sandwiches will be provided at 12:30. Hot meals and other options are available in the canteen just outside Simula.
Organizer:
Lorenzo Rosasco
Università di Genova (also Istituto Italiano di Tecnologia and Massachusetts Institute of Technology)
lorenzo (dot) rosasco (at) unige (dot) it
Workshop speakers:
Gilles Blanchard, University of Potsdam
Arthur Gretton, University College London
Leonidas Lefakis, Zalando Research
Miguel Rodrigues, University College London
Alessandro Rudi, Istituto Italiano di Tecnologia
Dino Sejdinovic, University of Oxford
CLASS | DAY | TIME | SUBJECT | FILES |
1 | Tue 5/2 | 9:00 - 10:30 | Introduction to Machine Learning | Lect 1 |
2 | Tue 5/2 | 11:00 - 12:30 | Local Methods and Model Selection | Lect 2 |
3 | Tue 5/2 | 14:00 - 16:00 | Laboratory 1: Local Methods for Classification | Lab 1 |
4 | Wed 5/3 | 13:00 - 14:30 | Tikhonov Regularization and Kernels | Lect 3 |
5 | Wed 5/3 | 15:00 - 17:00 | Laboratory 2: Binary Classification and Model Selection | Lab 2 |
6 | Thu 5/4 | 9:00 - 10:30 | Early Stopping and Spectral Regularization | Lect 4 |
7 | Thu 5/4 | 11:00 - 12:30 | Regularization for Multi-task Learning | Lect 5 |
8 | Thu 5/4 | 14:00 - 16:00 | Laboratory 3: Spectral Filters and Multi-class Classification | Lab 3 |
9 | Fri 5/5 | 9:00 - 10:30 | Sparsity-Based Regularization | Lect 6 |
10 | Fri 5/5 | 11:00 - 12:30 | Structured Sparsity | Lect 7 |
11 | Fri 5/5 | 14:00 - 16:00 | Laboratory 4: Sparsity-Based Learning | Lab 4 |
Workshop (Sat 5/6):
SPEAKER | TIME | TITLE | FILES |
Arthur Gretton | 10:00 - 10:45 | Kernel Adaptive Hamiltonian Monte Carlo using the Infinite Exponential Family | Slides |
Gilles Blanchard | 10:45 - 11:30 | Random Moments for Sketched Statistical Learning | Slides |
Lunch Break | 11:30 - 12:30 | - | |
Miguel Rodrigues | 12:30 - 13:15 | Multi-modal Data Processing with Applications to Art Investigation and Beyond | Slides |
Leonidas Lefakis | 13:15 - 14:00 | Information Theory Approaches to Feature Selection: Joint Informativeness and Tractability | Slides |
Break | 14:00 - 14:30 | - | |
Dino Sejdinovic | 14:30 - 15:15 | Learning with Kernel Embeddings | Slides |
Alessandro Rudi | 15:15 - 16:00 | How to Scale up Kernel Machines for Large-Scale Machine Learning | Slides |
Q&A and Closing Remarks | 16:00 - 16:30 | - | |