ELLIS / ELIZA Social, June 3rd 2024

Date: June 3rd, 17:30
Location: Building 80 Room 00-021

The unit invites everyone interested to the next social event on June 3rd! Get ready for an engaging gathering filled with insightful discussions. A great event to connect with the AI community in Freiburg!

This time, there are two ten-minute presentations.

  1. Tom Viering, Assistant Professor at TU Delft, who is currently visiting the ML lab, will present a talk on “Surprising Learning Curves and Where to Find Them.”
  2. Sai Prasanna will be presenting a talk on his recent paper accepted at the RLC conference, “Dreaming of Many Worlds: Learning Contextual World Models Aids Zero-Shot Generalization.”

The talks will be followed by a BBQ, providing the perfect networking opportunity! In case of bad weather, the BBQ will be replaced by a board game night at the ML lab.

2024 AutoML Conference

In 2022 and 2023, Prof. Dr. Frank Hutter, together with other postdocs and PhD students from the AutoML lab in the ELLIS Unit Freiburg, organized the AutoML Conference, the premier gathering of professionals focused on the progressive automation of machine learning (ML), which aims to develop automated methods for making ML more efficient, robust, trustworthy, and available to everyone. This year the conference will be held in Paris: https://2024.automl.cc/

[ELLIS Meetup] Multi-objective Differentiable Neural Architecture Search

Speaker: Arber Zela
Date: 29th April 2024

Pareto front profiling in multi-objective optimization (MOO), i.e. finding a diverse set of Pareto optimal solutions, is challenging, especially with expensive objectives like neural network training. Typically, in MOO neural architecture search (NAS), we aim to balance performance and hardware metrics across devices. Prior NAS approaches simplify this task by incorporating hardware constraints into the objective function, but profiling the Pareto front necessitates a computationally expensive search for each constraint. In this work, we propose a novel NAS algorithm that encodes user preferences for the trade-off between performance and hardware metrics, and yields representative and diverse architectures across multiple devices in just one search run. To this end, we parameterize the joint architectural distribution across devices and multiple objectives via a hypernetwork that can be conditioned on hardware features and preference vectors, enabling zero-shot transferability to new devices. Extensive experiments with up to 19 hardware devices and 3 objectives showcase the effectiveness and scalability of our method. Finally, we show that, without extra costs, our method outperforms existing MOO NAS methods across a broad range of qualitatively different search spaces and datasets, including MobileNetV3 on ImageNet-1k, an encoder-decoder transformer space for machine translation and a decoder-only transformer space for language modelling.
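
To make the core idea concrete, here is a minimal, hypothetical sketch in PyTorch (with made-up names, dimensions, and placeholder objectives; not the authors' implementation): a hypernetwork maps a preference vector over objectives plus a hardware-feature embedding to a distribution over architectural choices, and a preference-scalarized loss trains a single model that can later profile the Pareto front by sweeping the preference vector.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ArchHypernet(nn.Module):
    """Hypothetical preference-conditioned hypernetwork for MOO NAS."""
    def __init__(self, n_objectives=2, hw_dim=8, n_decisions=10, n_ops=5):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(n_objectives + hw_dim, 128), nn.ReLU(),
            nn.Linear(128, n_decisions * n_ops))
        self.shape = (n_decisions, n_ops)

    def forward(self, pref, hw):
        # Logits parameterize a distribution over operations per decision.
        logits = self.mlp(torch.cat([pref, hw], dim=-1))
        return logits.view(-1, *self.shape)

hypernet = ArchHypernet()
opt = torch.optim.Adam(hypernet.parameters(), lr=1e-3)
hw = torch.randn(1, 8)  # stand-in hardware feature embedding

for step in range(100):
    # Sample a random trade-off from the simplex each step.
    pref = torch.distributions.Dirichlet(torch.ones(2)).sample().unsqueeze(0)
    logits = hypernet(pref, hw)
    # Differentiable architecture sample via the Gumbel-softmax relaxation.
    arch = F.gumbel_softmax(logits, tau=1.0, hard=False)
    # Placeholder objectives standing in for validation loss and, e.g., latency.
    task_loss = arch.pow(2).mean()
    hw_cost = arch.abs().sum() / arch.numel()
    # Scalarize the objectives with the sampled preference and update.
    loss = pref[0, 0] * task_loss + pref[0, 1] * hw_cost
    opt.zero_grad()
    loss.backward()
    opt.step()

# At test time, sweeping `pref` (and `hw` for unseen devices) profiles the
# Pareto front with no further search runs.
```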

[ELLIS Meetup] Prior-data Fitted Networks (PFNs)

Speaker: Samuel Müller
Date: 9th September 2022

Location: Building 101, Room 00.036

Samuel Müller from the Machine Learning Lab will give a talk titled “Prior-data Fitted Networks (PFNs): Use neural networks for 100x faster Bayesian predictions”.

Abstract

Bayesian methods can be expensive and complicated to approximate with, e.g., Markov Chain Monte Carlo (MCMC) or Variational Inference (VI). Prior-data Fitted Networks (PFNs) are a new, cheap, and simple method to accurately approximate Bayesian predictions. I will explain how to build a PFN out of a Transformer by learning to model artificial data. I will present the results from our paper introducing PFNs, in which PFNs beat VI and MCMC on some standard tasks, as well as some more recent results with our new TabPFN, where we show that a simple PFN can replace a full AutoML tool on small datasets.
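
The training recipe can be illustrated with a toy sketch (a hedged approximation, not the paper's code: it assumes a simple linear-regression prior, uses plain MSE regression instead of the paper's discretized output distribution, and omits the attention masking that keeps query points independent of each other). The key move is to repeatedly sample whole synthetic datasets from the prior and train a Transformer to predict held-out labels from in-context examples.

```python
import torch
import torch.nn as nn

class TinyPFN(nn.Module):
    """Toy PFN: a Transformer that maps a context set (x, y) plus query
    inputs x to predictions for the query labels."""
    def __init__(self, d_in=1, d_model=64):
        super().__init__()
        self.embed_xy = nn.Linear(d_in + 1, d_model)   # context points: (x, y)
        self.embed_x = nn.Linear(d_in, d_model)        # query points: x only
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, 1)

    def forward(self, ctx_x, ctx_y, qry_x):
        ctx = self.embed_xy(torch.cat([ctx_x, ctx_y], dim=-1))
        qry = self.embed_x(qry_x)
        h = self.encoder(torch.cat([ctx, qry], dim=1))
        return self.head(h[:, ctx.size(1):])           # predictions for queries

def sample_prior_dataset(batch=16, n_ctx=20, n_qry=10, d_in=1):
    # Prior over tasks: random linear functions with Gaussian noise.
    w = torch.randn(batch, d_in, 1)
    x = torch.randn(batch, n_ctx + n_qry, d_in)
    y = x @ w + 0.1 * torch.randn(batch, n_ctx + n_qry, 1)
    return x[:, :n_ctx], y[:, :n_ctx], x[:, n_ctx:], y[:, n_ctx:]

model = TinyPFN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(1000):
    ctx_x, ctx_y, qry_x, qry_y = sample_prior_dataset()
    pred = model(ctx_x, ctx_y, qry_x)
    loss = ((pred - qry_y) ** 2).mean()  # learn to predict held-out labels
    opt.zero_grad()
    loss.backward()
    opt.step()

# After prior-fitting, a single forward pass with a real dataset as context
# approximates the posterior predictive for new inputs, with no MCMC or VI.
```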