Enhancing One-class Support Vector Machines for Unsupervised Anomaly Detection

Mennatallah Amer; Markus Goldstein; Slim Abdennadher
In: Proceedings of the ACM SIGKDD Workshop on Outlier Detection and Description (ODD). International Conference on Knowledge Discovery and Data Mining (KDD-2013), August 11-14, Chicago, IL, USA, Pages 8-15, ISBN 978-1-4503-2335-2, ACM, New York, NY, USA, 8/2013.


Support Vector Machines (SVMs) have been one of the most successful machine learning techniques for the past decade. For anomaly detection, a semi-supervised variant exists: the one-class SVM, which requires only normal data for training before anomalies can be detected. In theory, the one-class SVM could also be used in an unsupervised anomaly detection setup, where no prior training is conducted. Unfortunately, it turns out that the one-class SVM is sensitive to outliers in the data. In this work, we apply two modifications in order to make one-class SVMs more suitable for unsupervised anomaly detection: robust one-class SVMs and eta one-class SVMs. The key idea of both modifications is that outliers should contribute less to the decision boundary than normal instances. Experiments performed on datasets from the UCI machine learning repository show that our modifications are very promising: compared with other standard unsupervised anomaly detection algorithms, the enhanced one-class SVMs are superior on two out of four datasets. In particular, the proposed eta one-class SVM has shown the most promising results.
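To illustrate the unsupervised setup the abstract describes, the sketch below fits a standard one-class SVM directly on contaminated data (no clean training phase) using scikit-learn. Note this is the baseline whose outlier sensitivity the paper addresses, not the proposed robust or eta variants; the dataset, kernel, and parameter values here are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.RandomState(42)
# Synthetic data: a normal cluster plus a few scattered outliers,
# mixed together as in the unsupervised (no prior training) setting.
X_normal = rng.randn(100, 2)
X_outliers = rng.uniform(low=-6, high=6, size=(5, 2))
X = np.vstack([X_normal, X_outliers])

# nu upper-bounds the fraction of training points treated as outliers;
# kernel and gamma are illustrative choices.
clf = OneClassSVM(kernel="rbf", gamma=0.1, nu=0.1)
clf.fit(X)  # fit on the contaminated data itself

scores = clf.decision_function(X)  # lower score = more anomalous
labels = clf.predict(X)            # -1 = anomaly, +1 = normal
```

Because every point, including outliers, influences the decision boundary here, boundary placement degrades as contamination grows; the paper's modifications down-weight the contribution of likely outliers.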
