
A Transformer-based Framework for Multivariate Time Series Representation Learning


March 19, 2024, 14:00–14:45


Technische Universität Berlin, Marchstraße 23, 10587 Berlin, Room MAR 4.033


Georgios Zerveas, PhD

A Transformer-based Framework for Multivariate Time Series Representation Learning

Abstract: Multivariate time series (MTS) are ubiquitous in a wide variety of domains, including science, medicine, finance, engineering, and industrial applications. Despite the abundance of MTS data in the much-touted era of “Big Data”, labeled data in particular remain far scarcer: extensive data labeling is often prohibitively expensive or impractical, as it may require considerable time and effort, special infrastructure, or domain expertise. For this reason, there is great demand for methods that can offer high performance using only a limited amount of labeled data, or by leveraging the existing plethora of unlabeled data.

In his talk, Georgios Zerveas will introduce the first transformer-based framework for unsupervised representation learning of MTS. Pre-trained models can be used for a variety of downstream tasks, such as regression, classification, forecasting, and missing value imputation. This method constitutes the first unsupervised approach shown to push performance beyond what is attainable with contemporary supervised methods; it does so by a significant margin, even when the number of training samples is very limited, while remaining computationally efficient. Beyond the inherent advantages of the transformer architecture itself for processing MTS, experiments demonstrate that unsupervised pre-training of the transformer models offers a substantial performance benefit over fully supervised learning even without leveraging additional unlabeled data, i.e., simply by reusing the same data samples through the unsupervised objective.
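To make such an unsupervised objective concrete, the following is a minimal PyTorch sketch of masked-value pre-training for a generic transformer encoder over MTS: random input entries are hidden and the model is trained to reconstruct them, so the data itself supplies the supervision. All class names, hyperparameters, and the simple Bernoulli masking here are illustrative assumptions, not the exact architecture or masking scheme of the framework presented in the talk.

import torch
import torch.nn as nn

class MTSEncoder(nn.Module):
    # Projects each time step (a vector of n_vars channel values) to
    # d_model dimensions, applies a standard transformer encoder, and maps
    # the representations back to the input space for reconstruction.
    def __init__(self, n_vars, d_model=64, n_heads=4, n_layers=3, max_len=512):
        super().__init__()
        self.input_proj = nn.Linear(n_vars, d_model)
        self.pos_emb = nn.Parameter(torch.randn(max_len, d_model) * 0.02)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_vars)

    def forward(self, x):
        # x: (batch, seq_len, n_vars)
        z = self.input_proj(x) + self.pos_emb[: x.size(1)]
        return self.head(self.encoder(z))

def masked_reconstruction_loss(model, x, mask_ratio=0.15):
    # Hide a random subset of input values and compute MSE only on the
    # hidden entries; no labels are needed beyond the series itself.
    mask = torch.rand_like(x) < mask_ratio   # True = position to hide
    pred = model(x.masked_fill(mask, 0.0))   # zero out the hidden values
    return ((pred - x) ** 2)[mask].mean()

# Illustrative pre-training step on a batch of 8 series, 128 steps, 6 channels:
model = MTSEncoder(n_vars=6)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = masked_reconstruction_loss(model, torch.randn(8, 128, 6))
loss.backward()
opt.step()

After pre-training, the reconstruction head can be swapped for a task-specific output layer and the encoder fine-tuned, or its representations used directly, for downstream regression, classification, forecasting, or imputation.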

These results foreshadow a remarkable potential for transformer-based models to usher in an era of foundation models for time series, much as transformer-based foundation models have revolutionized the state of the art in Natural Language Processing.

Bio Sketch: Georgios Zerveas has just earned his PhD in Computer Science at Brown University, specializing in Natural Language Processing and Information Retrieval, and will be joining Microsoft as a Senior Applied Scientist. He has worked in various areas of Deep Learning, including time series and multi-modal representation learning. Prior to his PhD studies, his research portfolio extended to numerical optimization and mathematical modeling for computational physics, and he has worked in industry as a Machine Learning R&D engineer. His academic background includes an MSc in Information Technology and Electrical Engineering from ETH Zurich and a Diploma in Electrical and Computer Engineering from the National Technical University of Athens.