
Prototype selection based on set covering and large margins

Benjamin Paaßen; Thomas Villmann
In: Frank-Michael Schleif; Marika Kaden; Thomas Villmann (Eds.). Abstracts of the 13th Mittweida Workshop on Computational Intelligence - MiWoCI 2021. Mittweida Workshop on Computational Intelligence (MiWoCI-2021), Pages 35-42, Machine Learning Reports, Vol. 03/2021, University of Applied Sciences Mittweida, 10/2021.


Classification via nearest prototypes is a fast, interpretable, and flexible classification scheme [5]. The selection of prototypes ought to achieve two goals: minimizing classification errors and representing the classes well. In this paper, we explore two cost functions from the literature that incorporate these goals, namely the large margin nearest neighbor cost function of Weinberger and Saul [7] and the prototype selection scheme of Bien and Tibshirani [1]. We highlight similarities and differences of both, thus sharpening our understanding of the prototype selection problem.
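To make the underlying classification rule concrete, the following minimal sketch shows nearest-prototype classification with NumPy. The prototypes and data here are arbitrary toy values for illustration only; selecting good prototypes is precisely the problem the paper studies, and this sketch does not implement either of the two cost functions discussed.

```python
import numpy as np

# Hypothetical toy setup: two classes in 2D, one prototype per class.
# In practice, prototypes would be selected from the training data by
# optimizing a cost function such as those of [7] or [1].
prototypes = np.array([[0.0, 0.0],
                       [5.0, 5.0]])
prototype_labels = np.array([0, 1])

def classify(x, prototypes, prototype_labels):
    """Assign x the label of its nearest prototype (Euclidean distance)."""
    dists = np.linalg.norm(prototypes - x, axis=1)
    return prototype_labels[np.argmin(dists)]

print(classify(np.array([1.0, 0.5]), prototypes, prototype_labels))  # → 0
print(classify(np.array([4.0, 5.5]), prototypes, prototype_labels))  # → 1
```

Because each prediction only requires distances to a small set of prototypes, the scheme is fast at test time, and each decision can be explained by pointing to the responsible prototype.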