Join us on
July 27-28, 2021
for an online Summer School on
Pattern Recognition
and Machine Learning
The AERFAI Summer School 2021 aims to offer a comprehensive overview of recent methods related to Pattern Recognition and Machine Learning. Each tutorial will be composed of a theoretical part followed by a hands-on practical session.
July 27-28. Talks from 9:30 to 18:30 (including breaks between talks).
Where
Online
Price
Free. Only for AERFAI members.
Not a member? Join AERFAI.
Since there is a limited number of attendees, priority will be given to AERFAI senior members and students.
Registration deadline is on July 22.
Organization: Alicia Fornés (afornes[at]cvc[dot]uab[dot]es) and Julián Fierrez (julian.fierrez[at]uam[dot]es)
Human Behavior Modeling with Machine Learning: Opportunities and Challenges
Human behavior is complex, multi-level, multimodal, and culturally and contextually shaped. Computer analysis of human behavior across its multiple scales and settings leads to a steady influx of new applications in diverse domains, including human-computer interaction, affective computing, social signal processing and computational social sciences, autonomous systems, smart healthcare, customer behavior analysis, urban computing and AI for social good. This tutorial will be partly based on the material we prepared with Nuria Oliver for an invited tutorial at NeurIPS 2019 with the same title. I will share our proposed taxonomy to understand, model and predict individual, dyadic and aggregate human behaviors from a variety of data sources using machine learning techniques. I will then illustrate this taxonomy through relevant examples from the literature and from my own research, and highlight existing open challenges and research directions that might inspire attendees to embark on the fascinating and promising area of computational human behavior modeling. While the applications of human behavior analysis are very broad, I will focus on those in the behavioral and clinical sciences, and give some high-level pointers to existing models, their strengths and their shortcomings.
A Tutorial on Gaussian Processes and Bayesian Optimization
In this tutorial, I will introduce Gaussian processes (GPs) as powerful regression models for the prediction of unknown functions. They are non-parametric methods and hence become more expressive as the size of the training set grows. Importantly, exact Bayesian inference is tractable under these models, which means they can be used to generate a predictive distribution that captures the uncertainty of the predictions made. Such an uncertainty estimate reveals what the model does not know and has several applications. A drawback of GPs is that they do not scale well with the size of the training set; however, approximate methods can make these models scale to datasets with millions of points. I will review these techniques in this tutorial. An important application of GPs is Bayesian optimization (BO). Many optimization problems are characterized by an objective function that is very expensive to evaluate, as it may involve carrying out a time-consuming experiment. The objective may also lack a closed-form expression and, moreover, the evaluation process can be noisy. Examples of these problems include tuning the hyper-parameters of a deep neural network, adjusting the parameters of the control system of a robot, or finding new materials for, e.g., solar energy production. In this tutorial, I will also present a general overview of BO, a collection of methods that can be used to efficiently solve problems with these characteristics. For this, BO methods often rely on a GP model of the objective to make, at each iteration, intelligent decisions about where to evaluate the objective next, so as to solve the optimization problem efficiently. In many practical problems this can save a lot of computational time.
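For a concrete flavour of these ideas, here is a minimal sketch (not the tutorial's official material) of GP regression and a Bayesian optimization loop: a scikit-learn GP is refit at each iteration and an Expected Improvement acquisition function decides where to evaluate a toy objective next. The objective f, kernel choice and all parameter values are illustrative assumptions.

```python
# Minimal sketch of GP regression + Bayesian optimization (Expected Improvement).
# Assumes numpy, scipy and scikit-learn are installed; f is a toy stand-in for an
# expensive black-box objective to be minimized.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):
    # Toy "expensive" objective.
    return np.sin(3 * x) + 0.1 * x ** 2

def expected_improvement(X_cand, gp, y_best):
    # EI trades off exploitation (low predicted mean) and exploration (high std).
    mu, std = gp.predict(X_cand, return_std=True)
    std = np.maximum(std, 1e-9)
    z = (y_best - mu) / std
    return (y_best - mu) * norm.cdf(z) + std * norm.pdf(z)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(3, 1))          # a few initial evaluations
y = f(X).ravel()
candidates = np.linspace(-2, 2, 200).reshape(-1, 1)

for _ in range(10):                           # BO iterations
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
    gp.fit(X, y)                              # exact GP posterior given the data so far
    ei = expected_improvement(candidates, gp, y.min())
    x_next = candidates[np.argmax(ei)]        # candidate with highest Expected Improvement
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next))

print("best x found:", X[np.argmin(y)].item(), "f(x):", y.min())
```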
Graph Neural Networks for Pattern Recognition
Many problems in Pattern Recognition involve data that is naturally represented as graphs. In particular, graph matching has been a core problem in Pattern Recognition when comparing objects. Despite the NP-hard nature of the problem, fast and accurate approximations have led to significant progress in a wide range of applications. Nowadays, recent advances in Deep Learning have provided new tools to tackle such problems. In this tutorial, we will give the audience an accessible introduction to graph neural network techniques. Starting from the basic notation, we will present the recent trends in graph representation learning and its applications to various pattern recognition problems, such as chemical molecule classification and community detection in graph representations of social networks. Finally, we will apply all this knowledge in a hands-on session with some toy examples.
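As a taste of what such a hands-on session might involve, the following is a minimal sketch (an illustrative assumption, not the tutorial's own code) of a single GCN-style graph convolution layer in PyTorch, applied to a toy 4-node graph; the graph, feature dimensions and layer sizes are made up for the example.

```python
# Minimal sketch of one graph convolution layer in the style of Kipf & Welling's GCN:
# H' = ReLU(D^-1/2 (A + I) D^-1/2 H W), applied to a toy 4-node graph. Assumes PyTorch.
import torch

def normalized_adjacency(A):
    # Add self-loops and symmetrically normalize the adjacency matrix.
    A_hat = A + torch.eye(A.shape[0])
    deg = A_hat.sum(dim=1)
    d_inv_sqrt = torch.diag(deg.pow(-0.5))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

class GCNLayer(torch.nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = torch.nn.Linear(in_dim, out_dim)

    def forward(self, A_norm, H):
        # Each node aggregates normalized neighbour features; a shared linear map
        # plus non-linearity produces the new node embeddings.
        return torch.relu(self.linear(A_norm @ H))

# Toy graph: 4 nodes with edges 0-1, 1-2, 2-3, and 3-dimensional node features.
A = torch.tensor([[0., 1., 0., 0.],
                  [1., 0., 1., 0.],
                  [0., 1., 0., 1.],
                  [0., 0., 1., 0.]])
H = torch.randn(4, 3)
layer = GCNLayer(in_dim=3, out_dim=8)
embeddings = layer(normalized_adjacency(A), H)   # shape (4, 8)
# A graph-level task (e.g. molecule classification) could mean-pool these node
# embeddings and feed the result to a classifier.
print(embeddings.shape)
```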
Deepdive into components of NLP pipeline
Have you ever wondered how machines understand natural language? How "Google Translate" works? How "Siri", a robotic voice, responds to your voice commands? Or how a piece of software understands a text document and produces an automatic summary or extracts the relevant context? The answer to all these questions lies in this tutorial, where we explore the astounding domain of Natural Language Processing (NLP). We will unveil the core concepts of NLP, building on your own intuition of how you understand natural human language. With tons of text data being produced every day, the domain of Natural Language Processing is gaining exponential attention alongside simultaneous advances in Machine Learning and Deep Learning. In this tutorial, we will present the nitty-gritty details and the pipeline process of NLP, along with hands-on exercises familiarizing the attendees with the NLP pipeline.
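To make the pipeline idea concrete, here is a minimal sketch of the classic NLP pipeline stages (tokenization, lemmatization, part-of-speech tagging, named-entity recognition) using spaCy. This is an illustrative example, not the tutorial's exercise material, and it assumes spaCy's small English model is installed (`python -m spacy download en_core_web_sm`).

```python
# Minimal sketch of a spaCy NLP pipeline: tokenization, lemmatization,
# part-of-speech tagging and named-entity recognition on one sentence.
import spacy

nlp = spacy.load("en_core_web_sm")   # loads tokenizer, tagger, parser and NER components
text = "Google Translate was launched in 2006 and now supports over 100 languages."
doc = nlp(text)

# Token-level annotations produced by the pipeline components.
for token in doc:
    print(f"{token.text:<10} lemma={token.lemma_:<10} pos={token.pos_}")

# Named entities recognized in the text.
for ent in doc.ents:
    print(ent.text, "->", ent.label_)
```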
Tues 27
09:30 - 11:00   T1: Human Behavior Modeling with Machine Learning (Albert Ali Salah)
11:00 - 11:30   Break
11:30 - 13:00   T1: Human Behavior Modeling (part 2)
13:15 - 15:00   Lunch Break
15:00 - 16:30   T2: A Tutorial on Gaussian Processes and Bayesian Optimization (Daniel Hernández Lobato)
16:30 - 17:00   Break
17:00 - 18:30   T2: Hands-on practical part

Wed 28
09:30 - 11:00   T3: Graph Neural Networks for Pattern Recognition (Pau Riba)
11:00 - 11:30   Coffee Break
11:30 - 13:00   T3: Hands-on practical part
13:15 - 15:00   Lunch Break
15:00 - 16:30   T4: Deepdive into components of NLP pipeline (Dhara Kotecha and Falak Shah)
16:30 - 17:00   Break
17:00 - 18:30   T4: Hands-on practical part