
Speaker: André Biedenkapp
Date: 2022-08-26

Abstract

The performance of many algorithms in the fields of hard combinatorial problem-solving, machine learning, and AI in general depends on parameter tuning. Automated methods have been proposed to relieve users of the tedious and error-prone task of manually searching for performance-optimized configurations. However, there is still a lot of untapped potential. Existing solution approaches often neglect the non-stationarity of hyperparameters, where different hyperparameter values are optimal at different stages of an algorithm's run. Taking this non-stationarity into account in the optimization procedure promises much better performance but also poses many new challenges. In this talk we will discuss existing solution approaches to classical hyperparameter optimization and explore ways of tackling the non-stationarity of hyperparameters.
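To give a feel for what "non-stationarity of hyperparameters" means, here is a minimal, self-contained sketch (not from the talk itself; all names and values are illustrative). It minimizes a toy objective with gradient descent, once with a single static step size for the whole run and once with a step size that changes at a later stage of the run, which is the kind of within-run adaptation the abstract refers to.

```python
# Toy illustration of a static vs. a non-stationary (stage-dependent) hyperparameter.
# We minimise f(x) = x^2 with gradient descent; the only hyperparameter is the step size.

def run(step_size_for_iteration, iterations=50, x0=10.0):
    """Run gradient descent on f(x) = x^2.

    step_size_for_iteration: maps the iteration index t to the step size used at t.
    Returns the final objective value f(x).
    """
    x = x0
    for t in range(iterations):
        grad = 2 * x                          # gradient of x^2
        x -= step_size_for_iteration(t) * grad
    return x * x

# Static configuration: one value for the entire run.
static = run(lambda t: 0.01)

# Dynamic configuration: larger steps early in the run, smaller steps later.
dynamic = run(lambda t: 0.1 if t < 10 else 0.01)

print(f"final f(x) with static step size:   {static:.6f}")
print(f"final f(x) with dynamic schedule:   {dynamic:.6f}")
```

In this toy setting the stage-dependent schedule reaches a much lower objective value than the best single constant in the same budget, which is the basic motivation for optimizing hyperparameters over the course of a run rather than once up front.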

Categories: meetupnews