Publication

The Power of Training: How Different Neural Network Setups Influence the Energy Demand

Daniel Geißler; Bo Zhou; Mengxi Liu; Sungho Suh; Paul Lukowicz
In: AAAI-24 Workshop Program. AAAI Conference on Artificial Intelligence (AAAI), Workshop Sustainable AI, AAAI Conference and Symposium Proceedings, 2024.

Abstract

This work examines how variations in machine learning training regimes and learning paradigms affect the corresponding energy consumption. While increasing data availability and innovations in high-performance hardware fuel the training of ever more sophisticated models, they also push the resulting energy consumption and carbon emissions out of view. The goal of this work is therefore to raise awareness of the energy impact of common training parameters and processes, from learning rate and batch size to knowledge transfer. Multiple setups with different hyperparameter initializations are evaluated on two different hardware configurations to obtain meaningful results. Experiments on pretraining and multitask training are conducted on top of the baseline results to assess their potential for more sustainable machine learning.
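Energy measurements of the kind the abstract describes can be reproduced with off-the-shelf tooling. The sketch below is a minimal illustration, assuming PyTorch and the codecarbon package (neither is named in the abstract); it wraps a toy training loop in codecarbon's EmissionsTracker and sweeps learning rate and batch size. The model, data, and swept values are placeholders, not the paper's actual setups.

    from codecarbon import EmissionsTracker
    import torch
    import torch.nn as nn

    def train_once(lr, batch_size, steps=100):
        # Toy model and synthetic data stand in for the paper's setups.
        model = nn.Linear(32, 2)
        opt = torch.optim.SGD(model.parameters(), lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        for _ in range(steps):
            x = torch.randn(batch_size, 32)
            y = torch.randint(0, 2, (batch_size,))
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()

    # Sweep two hyperparameters, tracking estimated energy/emissions per run.
    for lr in (1e-3, 1e-2):
        for bs in (32, 256):
            tracker = EmissionsTracker(project_name=f"lr{lr}_bs{bs}", log_level="error")
            tracker.start()
            train_once(lr, bs)
            kg_co2 = tracker.stop()  # estimated emissions in kg CO2-eq
            print(f"lr={lr} bs={bs}: {kg_co2:.6f} kg CO2-eq")

A hardware-level alternative would be to poll NVML (e.g. via nvidia-smi) for GPU power draw during each run; the paper's two hardware configurations suggest such per-device measurement, but the exact instrumentation is not specified in the abstract.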
