This thesis investigates parameter-efficient fine-tuning (PEFT) of universal time series transformer models for energy forecasting, focusing on electrical load and photovoltaic generation. Using MOIRAI as a representative foundation model, the work explores PEFT methods such as LoRA, QLoRA, and DoRA to enable domain specialization at a fraction of the computational cost of full fine-tuning. The study systematically analyzes how data granularity, ranging from country and regional levels down to city and building scale, affects forecasting performance. Fine-tuned models are benchmarked against classical machine learning approaches, other foundation models, and non-fine-tuned baselines using both probabilistic and deterministic metrics. The results aim to clarify how efficient adaptation strategies can improve forecasting accuracy in the energy domain.
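The low-rank adaptation idea shared by the PEFT methods named above (LoRA, and by extension QLoRA and DoRA) can be illustrated with a minimal sketch. This is not MOIRAI's actual implementation; the dimensions, rank, and scaling factor below are hypothetical, and real fine-tuning would use a deep learning framework rather than NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r, alpha = 64, 64, 4, 8  # hypothetical layer size, LoRA rank, scaling

W = rng.normal(size=(d_out, d_in))     # frozen pretrained weight (not trained)
A = rng.normal(size=(r, d_in)) * 0.01  # trainable low-rank down-projection
B = np.zeros((d_out, r))               # trainable up-projection, zero-initialized

def lora_forward(x):
    # Base output plus a scaled low-rank correction B @ A @ x.
    # With B initialized to zero, this exactly reproduces the frozen
    # model's output, so adaptation starts from pretrained behavior.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=d_in)
assert np.allclose(lora_forward(x), W @ x)  # no change while B is zero

# Only the low-rank factors are trained, hence "parameter-efficient":
print(A.size + B.size, "trainable vs", W.size, "full")  # 512 vs 4096
```

QLoRA follows the same scheme but stores the frozen `W` in a quantized (e.g. 4-bit) format, and DoRA additionally decomposes the update into magnitude and direction components.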
Parameter-Efficient Fine-Tuning of Universal Time Series Transformers for Energy Forecasting
Type: MA thesis
Status: running
Date: March 1, 2026 - August 31, 2026
Supervisors: Julian Oelhaf, Siming Bayer, Andreas Maier, Jessica Deuschel (Siemens AG)