Publication

Task Prompt Vectors: Effective Initialization through Multi-Task Soft Prompt Transfer

Robert Belanec; Simon Ostermann; Ivan Srba; Marie Bielikova
In: Machine Learning and Knowledge Discovery in Databases. Research Track and Applied Data Science Track. European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD-2025), September 15-19, Porto, Portugal, Pages 77-94, Lecture Notes in Computer Science (LNAI), Vol. 16020, Springer, Berlin, Heidelberg, 10/2025.

Abstract

Prompt tuning is a parameter-efficient method for adapting large language models (LLMs), where only a small continuous soft prompt is fine-tuned. In recent work, soft prompts have usually been trained in a task-specific way, leaving their multi-task capabilities underexplored. Our work aims to make soft prompts more task-modular, building on recent research on task vectors, where arithmetic operations are applied to full model weights to achieve the desired multi-task performance. To this end, we introduce Task Prompt Vectors, created by taking the element-wise difference between the weights of tuned soft prompts and their random initialization. Experimental results on an extensive set of 19 datasets show that task prompt vectors can be used in low-resource settings to effectively initialize prompt tuning on similar tasks. In addition, we show that task prompt vectors are independent of the random initialization of prompt tuning …
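
A minimal sketch (in PyTorch) of the core operation the abstract describes — not the authors' released implementation. All tensor names and shapes are illustrative assumptions, and the "tuning" step is a stand-in for actual prompt tuning against a frozen LLM:

```python
import torch

# Hypothetical shapes: a soft prompt of 100 virtual tokens, embedding dim 768.
num_virtual_tokens, embed_dim = 100, 768

# Random soft-prompt initialization.
theta_init = torch.randn(num_virtual_tokens, embed_dim)

# Stand-in for prompt tuning on task A; in practice theta_tuned comes from
# optimizing the soft prompt on task-A data while the LLM stays frozen.
theta_tuned = theta_init + 0.01 * torch.randn_like(theta_init)

# Task prompt vector: element-wise difference between the tuned soft prompt
# and its random initialization.
tau_a = theta_tuned - theta_init

# Transfer: add the vector to a fresh random initialization to warm-start
# prompt tuning on a similar task.
theta_new_init = torch.randn(num_virtual_tokens, embed_dim)
theta_warm_start = theta_new_init + tau_a

# Multi-task combination by arithmetic addition of vectors from several tasks.
tau_b = torch.zeros_like(tau_a)  # placeholder for a second task's vector
theta_multi = theta_new_init + tau_a + tau_b
```

Because each vector is defined relative to an initialization, the initialization-independence reported in the abstract is what would let vectors obtained from different random seeds be combined by simple addition.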