Florian Daiber

Researcher | florian.daiber@dfki.de

ABOUT


I am a post-doctoral researcher at the Innovative Retail Laboratory (IRL), led by Prof. Dr. Antonio Krüger, at the German Research Center for Artificial Intelligence (DFKI) in Saarbrücken, Germany. My main research interests are human-computer interaction, 3D user interfaces, and ubiquitous sports technologies.

At DFKI, I am working on the EIT ALIGRE project, which investigates how lighting can help attract shoppers to certain areas in grocery stores. The project provides new, more advanced retail lighting solutions using light zoning and semantic lighting. The benefits are supported by real-life test installations and an accompanying user study in an operational grocery store.

I was a Marie Curie Early Stage Researcher at the School of Computing and Communications in Lancaster, UK, where I worked in the group of Prof. Hans Gellersen. Besides developing a post-doc research profile, I explored gaze-based interaction.

In 2015, I defended my doctoral thesis on "Interaction with Stereoscopic Data on and above Interactive Surfaces". In 2008, I received a diploma in Geoinformatics from the Institute for Geoinformatics, University of Münster, Germany.

I have experience organizing workshops and conferences, e.g. the CHI SIG "Touching the 3rd Dimension", and I was also involved in the follow-up CHI workshop "The 3rd Dimension of CHI (3DCHI)" and the Dagstuhl Seminar "Touching the 3rd Dimension". I organized the Tutorial and Workshop on Interactive Surfaces for Interaction with Stereoscopic 3D (ISIS3D) at ITS 2013. In the context of my ubiquitous sports technology research, I organized the UbiMount workshop at UbiComp 2016, and I am a co-organizer of the CHI 2017 SIG on Interactive Computing in Outdoor Recreation.

I was Web and Social Media Chair at the ACM Symposium on Spatial User Interaction (SUI) 2014 and the ACM Symposium on User Interface Software and Technology (UIST), Publication Chair at UbiComp 2016, and a PC member at ITS 2015, IUI 2015, and IUI 2016. Currently, I am Poster Chair at SUI 2017 and serve as a PC member at CHI PLAY 2017.

EDUCATION

Interaction with Stereoscopic Data on and above Multi-touch Surfaces
This doctoral thesis project evaluated multi-touch and gestural 3D interaction on and above interactive surfaces and explored the design space of interaction with stereoscopic data.
Saarbrücken Graduate School of Computer Science

MAY 2015

Gestural Multi-touch Interaction with Virtual Globes
Diploma in Geoinformatics
University of Münster

JULY 2008



PROJECTS

EIT ALIGRE
Affective lighting for novel grocery retail experiences

Today's grocery retail stores are typically lit with homogeneous ambient lighting of a single color temperature. In lighting design, however, it is well known that varying brightness and color temperature creates a more immersive experience, guides people's attention better, and enhances perception. With the latest developments in LED technology and controls, it has become affordable for retailers to differentiate the lighting conditions for the various zones in a supermarket (e.g. for product segments like wine and health products, but also to fit zones to themes like Easter or Christmas). Moreover, zones with semantic lighting technology offer novel ways of interacting with products and smartphones.
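A light-zoning setup of this kind can be thought of as a simple mapping from store zones to lighting parameters. The following minimal sketch is purely illustrative; the zone names, color temperatures, and brightness values are invented for the example, not taken from the project.

# Illustrative light-zoning configuration: each store zone gets its own
# color temperature (CCT, in Kelvin) and relative brightness.
# All values are invented for this example.
ZONES = {
    "wine":     {"cct_kelvin": 2700, "brightness": 0.6},  # warm accent lighting
    "health":   {"cct_kelvin": 5000, "brightness": 0.9},  # cool, fresh look
    "seasonal": {"cct_kelvin": 3000, "brightness": 0.8},  # e.g. Easter or Christmas theme
    "ambient":  {"cct_kelvin": 4000, "brightness": 0.7},  # store-wide default
}

def luminaire_setting(zone: str) -> dict:
    """Look up the lighting parameters for a zone, falling back to ambient."""
    return ZONES.get(zone, ZONES["ambient"])

print(luminaire_setting("wine"))  # {'cct_kelvin': 2700, 'brightness': 0.6}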

The ALIGRE project will test and validate the effect of the new lighting solutions on the shopping experience using highly advanced sensor and data analytics tools, thereby creating the necessary quantitative proof points to commercialize the propositions. The benefits are supported by real-life test installations and an accompanying user study in an operational grocery store.

2016 - current

T3D
Touching the 3rd Dimension

Two technologies have dominated recent tech exhibitions as well as the entertainment market: multi-touch surfaces and 3D stereoscopic displays. Currently, these promising technologies are combined in different setups, and first commercial systems are available that support (multi-)touch interaction as well as stereoscopic display. Recent research projects address technological questions of how users interact with stereoscopically displayed three-dimensional content on a two-dimensional touch surface. The approach of combining multi-touch surfaces and 3D stereoscopic displays has great potential to provide plausible as well as natural interaction for a wide range of applications, e.g. in entertainment, planning and design, education, and decision-making. It can also be applied to different user interface systems ranging from 3D desktop environments to more immersive collaborative large tabletop or other projection-based setups.

Although stereoscopic multi-touch enabled surfaces induce several perceptual conflicts, e.g. visual-haptic or accommodation-vergence conflicts, it is reasonable to expect that they will dominate future user interfaces in various settings due to their potential as well as their attractiveness to users. So far, most approaches have not taken these perceptual conflicts into account and are mostly limited to the actual moment of touch (i.e. when the finger touches the surface), whereas the essential time period before the touch is rarely considered. In the case of stereoscopic display, these moments are particularly important since most virtual objects are rendered not on the surface but in front of or behind it. Hence, touching a virtual object and touching the physical surface usually occur at different moments during the interaction. The benefits, challenges, and limitations of this combination have not been examined in depth and are so far not well understood.

The project Touching the 3rd Dimension (T3D) therefore aims to address these questions by analyzing the perceptual aspects during the lifetime of a touch, i.e. the pre-touch as well as the actual touch phase. On the one hand, we intend to design and evaluate different interaction concepts for stereoscopic multi-touch enabled surfaces based on the perceptual limitations of the user; on the other hand, we will exploit our setup to gain novel insights into the nature of touch and perception in the real world. In addition, we will explore potential application areas, in particular 3D modeling in the domains of city modeling and computer-aided design (CAD).
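As a minimal illustration of why the pre-touch phase matters here, the following sketch classifies the phases of a touch on a stereoscopic surface from a tracked fingertip height. The thresholds and function names are assumptions for illustration, not part of the project.

# Sketch of the touch lifecycle on a stereoscopic multi-touch surface.
# With negative parallax, the virtual object is "touched" above the display,
# before the finger reaches the physical surface. Values are illustrative.

def touch_phase(finger_height_m: float, object_depth_m: float,
                contact_eps: float = 0.005) -> str:
    """finger_height_m: tracked fingertip height above the display surface.
    object_depth_m: rendered depth of the object; > 0 floats in front of
    the display, 0 lies on it, < 0 appears behind it."""
    if finger_height_m <= contact_eps:
        return "surface contact"          # the physical touch event
    if finger_height_m <= max(object_depth_m, 0.0) + contact_eps:
        return "virtual object touched"   # happens before physical contact
    return "pre-touch approach"

print(touch_phase(0.08, 0.05))  # pre-touch approach
print(touch_phase(0.04, 0.05))  # virtual object touched
print(touch_phase(0.00, 0.05))  # surface contact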

JULY 2013 - 2016

Nuance-Project
Multi-modal interaction with distant objects using eye gaze and multi-touch input

Tabletop interaction with objects in and out of reach is a common task in the real world as well as in virtual environments. Gaze as an additional input modality might support these interactions in terms of search, selection, and manipulation of objects on a digital tabletop. The aim of this work is the design and evaluation of interaction techniques that rely on gaze and gestural multi-touch input. In particular, the selection and manipulation of distant objects will be investigated. This approach allows interaction with different kinds of distant objects: objects out of physical reach are easily made available to the user without forcing extreme and exhausting body movements. We aim to investigate the performance and accuracy of combined selection and manipulation using multi-modal input, i.e. explicit manipulation of implicitly selected objects. Through our multi-modal approach we expect an improvement in accuracy and task completion time.
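A hypothetical sketch of this interaction pattern: the gaze point implicitly selects the nearest distant object, and a relative multi-touch drag then explicitly manipulates it. All names, coordinates, and the selection radius below are illustrative assumptions, not the project's implementation.

# Hypothetical gaze-plus-touch pattern: gaze implicitly selects, touch
# explicitly manipulates. Coordinates are tabletop pixels; the selection
# radius is an invented parameter.
import math

def gaze_select(gaze_xy, objects, max_dist=80.0):
    """Return the object closest to the gaze point, or None if none is near."""
    best, best_d = None, max_dist
    for obj in objects:
        d = math.dist(gaze_xy, obj["pos"])
        if d < best_d:
            best, best_d = obj, d
    return best

def on_touch_drag(gaze_xy, drag_dx, drag_dy, objects):
    """Apply a relative touch drag to the gaze-selected (possibly distant) object."""
    target = gaze_select(gaze_xy, objects)
    if target is not None:
        x, y = target["pos"]
        target["pos"] = (x + drag_dx, y + drag_dy)
    return target

objects = [{"id": "a", "pos": (900.0, 200.0)}, {"id": "b", "pos": (400.0, 520.0)}]
on_touch_drag((890.0, 210.0), -30.0, 15.0, objects)
print(objects[0])  # object "a" moved without reaching across the table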

JULY 2012 - JULY 2013

iMUTS
Interscopic Multi-touch Surfaces

In recent years, visualization of and interaction with three-dimensional data have become more and more popular and widespread due to the requirements of numerous application areas. Two-dimensional desktop systems are often limited in cases where natural and intuitive interfaces are desired. Sophisticated 3D user interfaces, as provided by virtual reality (VR) systems consisting of stereoscopic projection and tracked input devices, are rarely adopted by ordinary users or even by experts. Since most applications dealing with three-dimensional data still use traditional 2D GUIs, current user interface designs obviously lack adequate 3D features and user support.

Multi-touch interaction has received considerable attention in the last few years, in particular for non-immersive, natural 2D interaction. Some multi-touch devices even support three degrees of freedom (DoF) in terms of 2D position on the surface and varying levels of pressure. Since multi-touch interfaces represent a good trade-off between intuitive, constrained interaction on a touch surface providing tangible feedback, and unrestricted natural interaction without any instrumentation, they have the potential to form the foundation of next-generation 2D and 3D user interfaces. Stereoscopic display of 3D data provides an additional depth cue, but until now the challenges and limitations for multi-touch interaction in this context have not been considered. In this project we aim to develop interscopic multi-touch user interfaces. An interscopic multi-touch surface (iMUTS) will allow users to interact intuitively with both stereoscopically displayed 3D objects and usually monoscopically displayed 2D content.
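One way to picture the third degree of freedom mentioned above is to let touch pressure drive depth for stereoscopic content. The following minimal sketch assumes a linear mapping and an invented depth range; it is one possible illustration, not the iMUTS design.

# Minimal sketch: map a 3-DoF touch sample (x, y, pressure) to a 3D cursor,
# using pressure to push the cursor behind the screen plane. The linear
# mapping and the depth range are invented for this example.

def touch_to_3d(x_px: float, y_px: float, pressure: float,
                max_depth_cm: float = 20.0) -> tuple:
    """pressure in [0, 1]; 0 keeps the cursor on the surface, 1 pushes it
    max_depth_cm behind the display plane (negative z)."""
    p = min(max(pressure, 0.0), 1.0)  # clamp to the valid range
    return (x_px, y_px, -max_depth_cm * p)

print(touch_to_3d(512.0, 384.0, 0.5))  # (512.0, 384.0, -10.0)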

JANUARY 2010 - DECEMBER 2012

SoKNOS
Service-orientierte ArchiteKturen zur Unterstützung von Netzwerken im Rahmen Oeffentlicher Sicherheit (Service-Oriented Architectures Supporting Networks of Public Security)

The SoKNOS research project aimed to develop concepts that are valuable in supporting governmental agencies, private companies, and other organizations involved in handling disastrous events in the public security sector. SoKNOS was funded by the Federal Ministry of Education and Research within the security research program of the German federal government.

SoKNOS developed data-based solutions that particularly shorten the structuring phase, i.e., the phase right after the occurrence of a disaster. SoKNOS aimed to support cross-organizational collaboration, in real time and on all levels, between local, regional, national, and international organizations.

JULY 2008 - DECEMBER 2009


SELECTED PUBLICATIONS

FootStriker

FootStriker: An EMS-based Foot Strike Assistant for Running.

Mahmoud Hassan, Florian Daiber, Frederik Wiehr, Felix Kosmalla, and Antonio Krüger
Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 1, 1, Article 2 (March 2017), 18 pages.
Electrical Muscle Stimulation, Wearable Devices, Wearables, Real-time Feedback, Motor Skills, Motor Learning, Sports Training, Running, In-situ Feedback, Online Feedback, Real-time Assistance

In running, knee-related injuries are very common. The main cause is high impact forces when striking the ground with the heel first. Mid- or forefoot running is generally known to reduce impact loads and to be a more efficient running style. In this paper, we introduce a wearable running assistant consisting of an electrical muscle stimulation (EMS) device and an insole with force sensing resistors. It detects heel striking and actuates the calf muscles during the flight phase to control the foot angle before landing. We conducted a user study in which we compared the classical coaching approach, using slow-motion video analysis as terminal feedback, to our proposed real-time EMS feedback. The results show that EMS actuation significantly outperforms traditional coaching, i.e., it decreases the average heel striking rate when using the system. As an implication, EMS feedback can generally be beneficial for the motor learning of complex, repetitive movements.
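The detection-and-actuation loop described in the abstract can be sketched as follows. The thresholds, the two-sensor insole model, and the ems_pulse() hook are illustrative assumptions, not the published implementation.

# Sketch of the FootStriker feedback loop: detect heel strikes from insole
# force sensing resistors (FSRs) and trigger EMS during the next flight
# phase. Thresholds and sensor layout are illustrative assumptions.

CONTACT = 0.6  # normalized FSR reading treated as ground contact

def feedback_loop(frames, ems_pulse):
    """frames: iterable of (heel_fsr, toe_fsr) readings, one per sample.
    ems_pulse: hardware-specific callback that actuates the calf muscles."""
    in_flight = True
    heel_struck = False
    for heel, toe in frames:
        on_ground = heel >= CONTACT or toe >= CONTACT
        if on_ground and in_flight:            # landing detected
            heel_struck = heel >= CONTACT and toe < CONTACT
            in_flight = False
        elif not on_ground and not in_flight:  # take-off into flight phase
            in_flight = True
            if heel_struck:
                ems_pulse()  # actuate the calf to adjust the foot angle before landing

# Toy trace: heel strike, stance, flight (pulse fires), mid-foot strike, flight.
frames = [(0.9, 0.1), (0.8, 0.7), (0.0, 0.0), (0.7, 0.8), (0.0, 0.0)]
feedback_loop(frames, ems_pulse=lambda: print("EMS pulse to calf"))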


ClimbSense - DIY Wrist-worn IMUs

ClimbSense - Automatic Climbing Route Recognition using Wrist-worn Inertia Measurement Units

Felix Kosmalla, Florian Daiber, Antonio Krüger
In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM International Conference on Human Factors in Computing Systems, ACM, 2015.
Climbing, Sports Technologies, Inertial Sensors, Machine Learning

Today, sports and activity trackers are ubiquitous. Especially runners and cyclists have a variety of possibilities to record and analyze their workouts. In contrast, climbing has not found much attention in consumer electronics and human-computer interaction. If quantified data similar to cycling or running data were available for climbing, several applications would be possible, ranging from simple training diaries to virtual coaches or usage analytics for gym operators. This paper introduces a system that automatically recognizes climbed routes using wrist-worn inertia measurement units (IMUs). This is achieved by extracting features of a recorded ascent and using them as training data for the recognition system. To verify the recognition system, cross-validation methods were applied to a set of ascent recordings collected during a user study with eight climbers in a local climbing gym. The evaluation resulted in a high recognition rate, showing that our approach is feasible and operational.
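In spirit, such a recognition pipeline could look like the following sketch: one fixed-length statistical feature vector per recorded ascent, a standard classifier, and cross-validation over the recordings. The feature set, classifier choice, and data shapes are assumptions for illustration, not the exact pipeline from the paper.

# Illustrative route-recognition pipeline: per-ascent IMU features,
# a k-NN classifier, and cross-validation. Feature set and classifier
# are assumptions, not the paper's exact pipeline.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def ascent_features(imu: np.ndarray) -> np.ndarray:
    """imu: (n_samples, 6) accelerometer + gyroscope recording of one ascent.
    Returns a fixed-length vector of simple per-axis statistics."""
    return np.concatenate([imu.mean(0), imu.std(0), imu.min(0), imu.max(0)])

# Stand-in data: 8 routes, 5 recorded ascents each (random placeholders).
rng = np.random.default_rng(0)
recordings = [rng.normal(size=(800, 6)) for _ in range(40)]
labels = np.repeat(np.arange(8), 5)

X = np.stack([ascent_features(r) for r in recordings])
scores = cross_val_score(KNeighborsClassifier(n_neighbors=3), X, labels, cv=5)
print("per-fold recognition accuracy:", scores.round(2))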


Hoverspace

Hoverspace

Paul Lubos, Oscar Ariza, Gerd Bruder, Florian Daiber, Frank Steinicke, Antonio Krüger
In: Julio Abascal; Simone Barbosa; Mirko Fetter; Tom Gross; Philippe Palanque; Marco Winckler (Eds.). Human-Computer Interaction – INTERACT 2015. Pages 259-277, Lecture Notes in Computer Science (LNCS), Vol. 9298, ISBN 978-3-319-22697-2, Springer International Publishing, 2015.
Hover Space, Touch Interaction, Stereoscopic Displays, 3D Interaction

Recent developments in the area of stereoscopic displays and tracking technologies have paved the way to combining touch interaction on interactive surfaces with spatial interaction above the surface of a stereoscopic display. This holistic design space supports novel affordances and user experiences during touch interaction, but also induces challenges for the interaction design. In this paper we introduce the concept of hover interaction for such setups. Therefore, we analyze the non-visual volume above a virtual object, which is perceived as the corresponding hover space for that object. The results show that users' perceptions of hover spaces can be categorized into two groups: users assume that the shape of the hover space is extruded and scaled either towards their head or along the normal vector of the interactive surface. We provide a corresponding model to determine the shapes of these hover spaces, and confirm the findings in a practical application. Finally, we discuss important implications for the development of future touch-sensitive interfaces.
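The two perceived shapes reported above can be captured in a small geometric model: the hover volume over an object widens with height and is extruded either along the display normal or towards the head. The following sketch, with invented scaling parameters, is one possible formalization, not the paper's exact model.

# Geometric sketch of the two hover-space models: the volume above an
# object widens with height and is extruded either along the surface
# normal or towards the user's head. Parameters are invented.
import numpy as np

def in_hover_space(finger, obj_center, obj_radius, head=None, widen=2.0):
    """True if the fingertip lies in the object's hover volume.
    All points are 3D, with the display surface at z = 0 (normal = +z).
    head=None extrudes along the normal; otherwise towards the head."""
    height = finger[2] - obj_center[2]
    if height < 0:
        return False
    if head is None:
        axis = obj_center + np.array([0.0, 0.0, height])
    else:
        t = height / (head[2] - obj_center[2])   # fraction along object-head ray
        axis = obj_center + t * (head - obj_center)
    allowed = obj_radius * (1.0 + widen * height)  # radius grows with height
    return float(np.hypot(*(finger[:2] - axis[:2]))) <= allowed

obj = np.array([0.0, 0.0, 0.0])
head = np.array([0.0, -0.4, 0.6])
finger = np.array([0.01, -0.05, 0.10])
print(in_hover_space(finger, obj, 0.03))             # normal extrusion: False
print(in_hover_space(finger, obj, 0.03, head=head))  # head-oriented: True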


Interacting with 3D Content on Stereoscopic Displays

Interacting with 3D Content on Stereoscopic Displays.

Florian Daiber, Marco Speicher, Sven Gehring, Markus Löchtefeld, Antonio Krüger
In: Proceedings of the International Symposium on Pervasive Displays. International Symposium on Pervasive Displays. Pages 32:32-32:37, ACM, 2014.
Spatial Interaction, Gestural Interaction, Mobile Interaction, 3D Travel, Large Displays, Media Facades

Along with the growing number of pervasive displays in urban environments, recent advances in technology make it possible to display three-dimensional (3D) content on these displays. However, current input techniques for pervasive displays usually focus on interaction with 2D data. To enable interaction with 3D content on pervasive displays, we need to adapt existing interaction techniques and create novel ones. In this paper we investigate remote interaction with 3D content on pervasive displays. We introduce and evaluate four 3D travel techniques that rely on well-established interaction metaphors and use either a mobile device or depth tracking as spatial input. Our study on a large-scale stereoscopic display shows that the physical travel techniques outperformed the virtual techniques with respect to task performance time and error rate.


Autostereoscopic Handheld AR

Is Autostereoscopy Useful for Handheld AR?

Frederic Kerber; Pascal Lessel; Michael Mauderer; Florian Daiber; Antti Oulasvirta; Antonio Krüger
In: Proceedings of the 12th International Conference on Mobile and Ubiquitous Multimedia. ACM, 2013.
Autostereoscopy, mobile devices, depth discrimination, empirical and quantitative user study, augmented reality

Some recent mobile devices have autostereoscopic displays that enable users to perceive stereoscopic 3D without lenses or filters. This might be used to improve depth discrimination of objects overlaid on a camera viewfinder in augmented reality (AR). However, it is not known whether autostereoscopy is useful in the viewing conditions typical of mobile AR. This paper investigates the use of autostereoscopic displays in a psychophysical experiment with twelve participants using a state-of-the-art commercial device. The main finding is that stereoscopy has a negligible effect, if any, on a small screen, even in favorable viewing conditions. Instead, the traditional depth cues, in particular object size, drive depth discrimination.


Interactive surfaces for interaction with stereoscopic 3d

Interactive Surfaces for Interaction with Stereoscopic 3D (ISIS3D): Tutorial and Workshop at ITS 2013

Florian Daiber; Bruno Rodrigues De Araujo; Frank Steinicke; Wolfgang Stuerzlinger
In: Proceedings of the 2013 ACM International Conference on Interactive Tabletops and Surfaces. Pages 483-486, ACM, 2013.
Stereoscopic Displays, 3D User Interfaces and Interaction, Touch- and Gesture-based Interfaces, Adaptive and Perception-inspired Interfaces, Psychophysiological Studies related to Stereoscopy

With the increasing distribution of multi-touch capable devices, multi-touch interaction becomes more and more ubiquitous. Multi-touch interaction offers new ways to deal with 3D data, allowing a high degree of freedom (DOF) without instrumenting the user. Due to the advances in 3D technologies, designing for 3D interaction is now more relevant than ever. With more powerful engines and high-resolution screens, even mobile devices can run advanced 3D graphics; 3D UIs are emerging beyond the game industry, and recently, first prototypes as well as commercial systems bringing (auto-)stereoscopic display to touch-sensitive surfaces have been proposed. With the Tutorial and Workshop on "Interactive Surfaces for Interaction with Stereoscopic 3D (ISIS3D)" we aim to provide an interactive forum that focuses on the challenges that appear when the flat digital world of surface computing meets the curved, physical, 3D space we live in.


Designing Gestures for Mobile 3D Gaming

Florian Daiber; Lianchao Li; Antonio Krüger
In: Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia. ACM, 2012.
3D User Interfaces, Gestural Interaction, Mobile Interaction, Mobile Gaming, Stereoscopic Display

In recent years, 3D has become more and more popular. Besides the increasing number of movies for stereoscopic 3D cinema and television, serious steps have also been taken in the field of 3D gaming. Games with stereoscopic 3D output are now available not only for gamers with high-end PCs but also on handheld devices equipped with autostereoscopic 3D displays. Recent smartphone technology has powerful processors that allow complex tasks like image processing, e.g. as used in augmented reality applications. Moreover, these devices are nowadays equipped with various sensors that allow additional input modalities far beyond joystick, mouse, keyboard, and other traditional input methods. In this paper we propose an approach for sensor-based interaction with stereoscopically displayed 3D data on mobile devices and present a mobile 3D game that makes use of these concepts.


Balloon Selection Revisited - Multi-touch Selection Techniques for Stereoscopic Data

Florian Daiber; Eric Falk; Antonio Krüger
In: Proceedings of the International Conference on Advanced Visual Interfaces. Pages 441-444, ACM, 2012.
3D User Interfaces, Gestural Interaction, Selection Techniques, Stereoscopic Display




CONTACT

Email
florian.daiber@dfki.de

Address
Innovative Retail Laboratory, DFKI GmbH
Stuhlsatzenhausweg 3, D-66123 Saarbrücken
Campus D3_2, Room 1.17

Phone
+49(0)681 85775 5115
