LabelMovie: a Semi-supervised Machine Annotation Tool with Quality Assurance and Crowd-sourcing Options for Videos

Zsolt Palotai, Miklós Láng, András Sárkány, Zoltán Tősér, Daniel Sonntag, Takumi Toyama, András Lőrincz

In: Proceedings of the 12th International Workshop on Content-Based Multimedia Indexing (CBMI-14), IEEE, 2014.


For multiple reasons, the automatic annotation of video recordings is challenging: first, the number of database video instances to be annotated is huge; second, tedious manual labelling sessions are required; third, multimodal annotation needs precise information about space, time, and context; fourth, different labelling options (e.g., in the case of affects) require special agreements between annotators; and so forth. Crowdsourcing with quality assurance by experts may come to the rescue here. We have developed a special tool with which individual experts can annotate videos over the Internet, their work can be joined and filtered, the annotated material can be evaluated by machine learning methods, and automated annotation starts once a predefined confidence level is reached. High-quality manual labelling instances by humans, the seeds, ensure that relatively small samples of manual annotations can effectively bootstrap the machine annotation procedure. The annotation tool features special visualization methods for crowdsourced users not familiar with machine learning methods and, in turn, ignites the bootstrapping process.
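The bootstrapping workflow the abstract outlines, expert-labelled seeds train a model, which then auto-annotates only items it classifies above a predefined confidence level, can be sketched as a simple self-training loop. The function names, the margin-based confidence measure, and the toy nearest-centroid classifier below are illustrative assumptions, not the paper's actual implementation.

```python
def centroid(points):
    """Mean of a list of equal-length feature vectors."""
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def bootstrap_annotate(seeds, unlabelled, confidence=0.8, rounds=5):
    """Confidence-thresholded self-training (illustrative sketch).

    seeds      -- dict mapping label -> list of expert-annotated vectors
                  (at least two labels are assumed)
    unlabelled -- list of feature vectors awaiting annotation
    Items whose confidence exceeds the threshold are machine-annotated;
    the rest are returned for human review.
    """
    labelled = {lab: list(pts) for lab, pts in seeds.items()}
    remaining = list(unlabelled)
    for _ in range(rounds):
        cents = {lab: centroid(pts) for lab, pts in labelled.items()}
        still_unlabelled = []
        for x in remaining:
            dists = sorted((distance(x, c), lab) for lab, c in cents.items())
            best, second = dists[0], dists[1]
            # Confidence: relative margin between the two nearest centroids.
            conf = 1 - best[0] / (second[0] + 1e-9)
            if conf >= confidence:
                labelled[best[1]].append(x)     # machine annotation accepted
            else:
                still_unlabelled.append(x)      # left for human annotators
        if len(still_unlabelled) == len(remaining):
            break                               # no progress; stop early
        remaining = still_unlabelled
    return labelled, remaining
```

For example, with seeds `{"smile": [[0.0], [0.1]], "neutral": [[1.0], [0.9]]}`, clear-cut frames near either centroid are auto-labelled in the first round, while an ambiguous frame such as `[0.5]` stays in the remainder for human review.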

Deutsches Forschungszentrum für Künstliche Intelligenz
German Research Center for Artificial Intelligence