Most likely heteroscedastic Gaussian process regression
Kristian Kersting; Christian Plagemann; Patrick Pfaff; Wolfram Burgard
In: Zoubin Ghahramani (Ed.), Machine Learning, Proceedings of the Twenty-Fourth International Conference on Machine Learning (ICML 2007), pages 393-400, ACM International Conference Proceeding Series, Vol. 227, ACM, 2007.
This paper presents a novel Gaussian process (GP) approach to regression with input-dependent noise rates. We follow Goldberg et al.'s approach and model the noise variance using a second GP in addition to the GP governing the noise-free output value. In contrast to Goldberg et al., however, we do not approximate the posterior noise variance with a Markov chain Monte Carlo method but instead use a most likely noise approach. The resulting model is easy to implement and can be used directly in combination with various existing extensions of standard GPs, such as sparse approximations. Extensive experiments on both synthetic and real-world data, including a challenging perception problem in robotics, show the effectiveness of most likely heteroscedastic GP regression.
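The abstract's two-GP idea can be sketched in a few lines of NumPy. The following is a minimal illustration, not the authors' implementation: a first GP fits the data, squared residuals at the training inputs serve as "most likely" empirical noise observations, a second GP smooths their logarithms over the input space, and the resulting input-dependent variances feed back into the first GP's kernel diagonal. Kernel choice, hyperparameters, iteration count, and the residual-based noise estimate are all simplifying assumptions made for this sketch.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel (an assumed choice for this sketch).
    d2 = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
          - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(X, y, Xs, noise_var):
    # Standard GP posterior mean with a per-point noise variance on the
    # kernel diagonal; a small jitter keeps the solve well conditioned.
    K = rbf_kernel(X, X) + np.diag(noise_var) + 1e-6 * np.eye(len(y))
    Ks = rbf_kernel(Xs, X)
    return Ks @ np.linalg.solve(K, y)

def most_likely_heteroscedastic_gp(X, y, Xs, n_iter=5):
    # Alternating scheme in the spirit of the paper's "most likely noise"
    # idea (details here are illustrative assumptions, not the paper's).
    n = len(y)
    noise_var = np.full(n, 0.1 * np.var(y))  # homoscedastic initialization
    for _ in range(n_iter):
        # GP1: posterior mean at the training inputs under current noise.
        mu = gp_predict(X, y, X, noise_var)
        # Empirical noise levels from squared residuals, in log space.
        log_r2 = np.log(np.maximum((y - mu) ** 2, 1e-8))
        # GP2: smooth the centered log noise levels over the input space.
        m = log_r2.mean()
        smooth = gp_predict(X, log_r2 - m, X, np.full(n, 1.0)) + m
        noise_var = np.exp(smooth)
    # Final heteroscedastic prediction at the test inputs.
    return gp_predict(X, y, Xs, noise_var)
```

On data whose noise grows with the input, the learned `noise_var` rises accordingly, so the fit is pulled more strongly toward observations in the low-noise region than a homoscedastic GP would be.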