
Publication

Learning Parameters of Linear Models in Compressed Parameter Space

Yohannes Kassahun; Hendrik Wöhrle; Alexander Fabisch; Marc Tabie
In: Alessandro E. Villa; Włodzisław Duch; Péter Érdi; Francesco Masulli; Günther Palm (Eds.): Artificial Neural Networks and Machine Learning – ICANN 2012, pp. 108-115, Lecture Notes in Computer Science (LNCS), Vol. 7553, ISBN 978-3-642-33265-4, Springer, Berlin Heidelberg, 2012.

Abstract

We present a novel method for reducing training time by learning the parameters of a model in a compressed parameter space. In compressed parameter space, the parameters of the model are represented by fewer parameters, so training can be faster. After training, the full parameters of the model can be generated from the compressed ones. We show that, for supervised learning, learning the parameters of a model in compressed parameter space is equivalent to learning them in compressed input space. We apply our method to a supervised learning domain and show that a solution can be obtained much faster than by learning in the uncompressed parameter space. For reinforcement learning, we show empirically that directly searching the parameters of a policy in compressed parameter space accelerates learning.
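The supervised-learning equivalence stated above can be illustrated with a minimal sketch: if the full weight vector of a linear model is written as w = Φα with far fewer compressed parameters α, then Xw = (XΦ)α, so fitting α against the projected inputs XΦ is the same least-squares problem posed in a lower-dimensional space. The NumPy example below is not from the paper; the cosine basis Φ, the dimensions, and the synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_features, n_compressed = 200, 1000, 20

# Synthetic regression problem with a smooth ground-truth weight vector.
X = rng.standard_normal((n_samples, n_features))
w_true = np.sin(np.linspace(0, 3 * np.pi, n_features))
y = X @ w_true + 0.01 * rng.standard_normal(n_samples)

# Fixed projection Phi mapping compressed parameters to full parameters.
# A DCT-like cosine basis represents smooth weight vectors compactly;
# this particular choice of basis is an assumption for the sketch.
t = np.arange(n_features)
Phi = np.stack(
    [np.cos(np.pi * k * (t + 0.5) / n_features) for k in range(n_compressed)],
    axis=1,
)

# Learning in compressed parameter space reduces to ordinary least squares
# on the compressed inputs X @ Phi: 20 unknowns instead of 1000, and the
# problem is well-posed even though n_samples < n_features.
alpha, *_ = np.linalg.lstsq(X @ Phi, y, rcond=None)

# Generate the full parameter vector from the compressed one.
w_hat = Phi @ alpha

print("relative reconstruction error:",
      np.linalg.norm(w_hat - w_true) / np.linalg.norm(w_true))
```

Because the compressed problem has only 20 unknowns, it is solvable from 200 samples, whereas least squares over all 1000 original weights would be underdetermined; this is one concrete sense in which training in the compressed space is cheaper.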
