Florian Daiber

Researcher | florian.daiber@dfki.de

ABOUT

I am a researcher at the German Research Center for Artificial Intelligence (DFKI) in Saarbrücken, Germany. My main research is in the field of human-computer interaction, intelligent user interfaces and computer-supported cooperative work, with a strong interest in stereoscopic tabletop interaction. In the Innovative Retail Laboratory (IRL), led by Prof. Dr. Antonio Krüger, I am currently working in the DFG-funded project "Touching the 3rd Dimension (T3D) - Design and Analysis of Perceptually-Inspired Interaction Concepts for Stereoscopic Multi-touch Surfaces", which focuses on the technological questions of how users interact with stereoscopically displayed three-dimensional content on a two-dimensional touch surface. My doctoral thesis also addresses this topic and will presumably be finished in 2014. In 2008 I received a diploma in geoinformatics from the Institute for Geoinformatics, University of Münster, Germany.

I have experience in organizing workshops and conferences, e.g. the "CHI SIG Touching the 3rd Dimension", and was also involved in the follow-up CHI workshop "The 3rd Dimension of CHI (3DCHI)" and the Dagstuhl Seminar "Touching the 3rd Dimension". I organized the Tutorial and Workshop on Interactive Surfaces for Interaction with Stereoscopic 3D (ISIS3D) at ITS 2013. Currently, I am Web and Social Media Chair of the ACM Symposium on Spatial User Interaction (SUI) 2014 and the ACM Symposium on User Interface Software and Technology (UIST).

EDUCATION

Interaction with Stereoscopic Data on and above Multi-touch Surfaces
This doctoral thesis project evaluates multi-touch and gestural 3D interaction on and above interactive surfaces and explores the design space of interaction with stereoscopic data.
Saarbrücken Graduate School of Computer Science

GRADUATING IN 2014
IN PROGRESS

Gestural Multi-touch Interaction with Virtual Globes
Diploma in Geoinformatics
University of Münster

JULY 2008



PROJECTS

T3D
Touching the 3rd Dimension

Two technologies have dominated recent tech exhibitions as well as the entertainment market: multi-touch surfaces and 3D stereoscopic displays. These promising technologies are currently being combined in different setups, and the first commercial systems that support (multi-)touch interaction as well as stereoscopic display are available. Recent research projects address the technological questions of how users interact with stereoscopically displayed three-dimensional content on a two-dimensional touch surface. Combining multi-touch surfaces and 3D stereoscopic displays has great potential to provide plausible as well as natural interaction for a wide range of applications, e.g. in entertainment, planning and design, education, and decision-making. It can also be applied to different user interface systems, ranging from 3D desktop environments to more immersive collaborative setups such as large tabletops or other projection-based systems.

Although stereoscopic multi-touch enabled surfaces induce several perceptual conflicts, e.g. visual-haptic or accommodation-vergence conflicts, it is reasonable to expect that they will increasingly dominate future user interfaces in various settings, given their potential as well as their attractiveness to human users. So far, most approaches have not taken these perceptual conflicts into account and are mostly limited to the actual moment of touch (i.e. when the finger touches the surface), whereas the essential time period before the touch is rarely considered. In the case of stereoscopic display, these moments are particularly important, since most virtual objects are rendered not on the surface but in front of or behind it. Hence, touching a virtual object and touching the physical surface usually occur at different moments during the interaction. The benefits, challenges and limitations of this combination have not been examined in depth and are so far not well understood.

The project Touching the 3rd Dimension (T3D) therefore aims to address these questions by analyzing the perceptual aspects during the lifetime of a touch, i.e. the pre-touch phase as well as the actual touch phase. On the one hand, we intend to design and evaluate different interaction concepts for stereoscopic multi-touch enabled surfaces based on the perceptual limitations of the user; on the other hand, we will exploit our setup to gain novel insights into the nature of touch and perception in the real world. In addition, we will explore potential application areas, in particular 3D modeling in the domains of city modeling and computer-aided design (CAD).
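
To make the pre-touch problem concrete: with stereoscopic display, the object a user visually touches lies on the line of sight from the eye through the finger, not at the finger's 2D contact point on the screen. The following minimal Python sketch illustrates this with an eye-through-touch ray cast against spherical objects; the scene representation and all names are illustrative assumptions, not T3D project code.

    import math
    from dataclasses import dataclass

    @dataclass
    class Sphere:
        """Illustrative stand-in for a stereoscopically rendered object.
        z = 0 is the touch surface; z > 0 floats in front of it, z < 0 behind."""
        center: tuple  # (x, y, z)
        radius: float

    def pick(touch_xy, eye_xyz, objects):
        """Cast a ray from the (tracked) eye position through the 2D touch
        point on the z = 0 surface and return the nearest hit object.

        Because most stereoscopic objects do not lie on the surface itself,
        the ray meets them in front of or behind the screen, i.e. visually
        'touching' an object and physically touching the surface happen at
        different depths and moments."""
        ox, oy, oz = eye_xyz
        dx, dy, dz = touch_xy[0] - ox, touch_xy[1] - oy, -oz
        norm = math.sqrt(dx * dx + dy * dy + dz * dz)
        d = (dx / norm, dy / norm, dz / norm)
        best, best_t = None, math.inf
        for obj in objects:
            # Ray-sphere intersection: solve |eye + t*d - center|^2 = r^2.
            oc = (ox - obj.center[0], oy - obj.center[1], oz - obj.center[2])
            b = sum(o * di for o, di in zip(oc, d))
            c = sum(o * o for o in oc) - obj.radius ** 2
            disc = b * b - c
            if disc >= 0:
                t = -b - math.sqrt(disc)  # nearer of the two intersections
                if 0 < t < best_t:
                    best, best_t = obj, t
        return best

    # Example: an eye 60 cm in front of the surface, touching at (10, 5),
    # selects a sphere floating 8 cm in front of the display.
    hit = pick((10.0, 5.0), (0.0, 0.0, 60.0), [Sphere((11.0, 5.5, 8.0), 3.0)])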

JULY 2013 - CURRENT

Nuance-Project
Multi-modal interaction with distant objects using eye gaze and multi-touch input

Tabletop interaction with objects in and out of reach is a common task in the real world as well as in virtual settings. Gaze as an additional input modality might support these interactions in terms of search, selection and manipulation of objects on digital tabletops. The aim of this work is the design and evaluation of interaction techniques that rely on gaze and gestural multi-touch input. In particular, the selection and manipulation of distant objects will be investigated. This approach allows interaction with different kinds of distant objects. First, objects out of physical reach are easily made available to the user without forcing her into extreme and exhausting body movements. We aim to investigate the performance and accuracy of combined selection and manipulation using multi-modal input, i.e. explicit manipulation of implicitly selected objects. Through this multi-modal approach we expect an improvement in accuracy and task completion time.
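
As a rough illustration of the idea (not the project's implementation), the sketch below fuses an implicit gaze point with an explicit touch flick: candidate objects inside a cone around the flick direction are scored by their proximity to the current gaze point, and the best-scoring object is selected. All names, parameters and weights are hypothetical.

    import math

    def gaze_flick_select(flick_origin, flick_dir, gaze_point, objects,
                          cone_deg=25.0, gaze_weight=0.7):
        """Pick the distant object a flick most plausibly targets.

        objects: mapping of object id -> (x, y) tabletop position.
        cone_deg and gaze_weight are hypothetical tuning parameters: the
        cone filters candidates by flick direction, the weight trades off
        flick alignment against closeness to the current gaze point."""
        fn = math.hypot(flick_dir[0], flick_dir[1])
        fx, fy = flick_dir[0] / fn, flick_dir[1] / fn
        min_cos = math.cos(math.radians(cone_deg))
        best, best_score = None, -math.inf
        for obj_id, (x, y) in objects.items():
            vx, vy = x - flick_origin[0], y - flick_origin[1]
            vlen = math.hypot(vx, vy)
            if vlen == 0:
                continue
            cos_a = (vx * fx + vy * fy) / vlen  # alignment with the flick
            if cos_a < min_cos:
                continue  # outside the flick cone
            gaze_dist = math.hypot(x - gaze_point[0], y - gaze_point[1])
            score = (1 - gaze_weight) * cos_a + gaze_weight / (1.0 + gaze_dist)
            if score > best_score:
                best, best_score = obj_id, score
        return best

    # Example: a flick towards the upper right is disambiguated by the gaze
    # resting near object "b".
    target = gaze_flick_select((0, 0), (1, 1), gaze_point=(48, 52),
                               objects={"a": (60, 40), "b": (50, 50)})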

JULY 2012 - JULY 2013

iMUTS
Interscopic Multi-touch Surfaces

In recent years visualization of and interaction with three-dimensional data have become more and more popular and widespread due to the requirements of numerous application areas. Two-dimensional desktop systems are often limited in cases where natural and intuitive interfaces are desired. Sophisticated 3D user interfaces, as provided by virtual reality (VR) systems consisting of stereoscopic projection and tracked input devices, are rarely adopted by ordinary users or even by experts. Since most applications dealing with three-dimensional data still use traditional 2D GUIs, current user interface designs obviously lack adequate 3D features and user support.

Multi-touch interaction has received considerable attention in the last few years, in particular for non-immersive, natural 2D interaction. Some multi-touch devices even support three degrees of freedom (DoF) in terms of 2D position on the surface and varying levels of pressure. Since multi-touch interfaces represent a good trade-off between intuitive, constrained interaction on a touch surface providing tangible feedback, and unrestricted natural interaction without any instrumentation, they have the potential to form the foundation of the next generation of 2D and 3D user interfaces. Stereoscopic display of 3D data provides an additional depth cue, but until now the challenges and limitations for multi-touch interaction in this context have not been considered. In this project we aim to develop interscopic multi-touch user interfaces. An interscopic multi-touch surface (iMUTS) will allow users to interact intuitively with stereoscopically displayed 3D objects as well as with usually monoscopically displayed 2D content.

JANUARY 2010 - DECEMBER 2012

SoKNOS
Service-orientierte ArchiteKturen zur Unterstützung von Netzwerken im Rahmen Oeffentlicher Sicherheit (Service-Oriented ArchiteCtures Supporting Networks of Public Security)

The SoKNOS research project aimed to develop concepts that support governmental agencies, private companies, and other organizations in handling disastrous events in the public security sector. SoKNOS was funded by the Federal Ministry of Education and Research within the security research program of the German federal government.

SoKNOS developed data-based solutions that particularly shorten the structuring phase, i.e. the phase immediately after the occurrence of a disaster. SoKNOS aimed to support cross-organizational collaboration, in real time and at all levels, between local, regional, national, and international organizations.

JULY 2008 - DECEMBER 2009

TEACHING

Libavg II
Seminar

WINTER 2010/11

Libavg I
Seminar

WINTER 2009/10


SELECTED PUBLICATIONS

Is Autostereoscopy Useful for Handheld AR?

Frederic Kerber; Pascal Lessel; Michael Mauderer; Florian Daiber; Antti Oulasvirta; Antonio Krüger
In: Proceedings of the 12th International Conference on Mobile and Ubiquitous Multimedia. ACM, 2013.
Autostereoscopy, mobile devices, depth discrimination, empirical and quantitative user study, augmented reality

Some recent mobile devices have autostereoscopic displays that enable users to perceive stereoscopic 3D without lenses or filters. This might be used to improve the depth discrimination of objects overlaid on a camera viewfinder in augmented reality (AR). However, it is not known whether autostereoscopy is useful in the viewing conditions typical of mobile AR. This paper investigates the use of autostereoscopic displays in a psychophysical experiment with twelve participants using a state-of-the-art commercial device. The main finding is that stereoscopy has a negligible effect, if any, on a small screen, even in favorable viewing conditions. Instead, the traditional depth cues, in particular object size, drive depth discrimination.


Interactive Surfaces for Interaction with Stereoscopic 3D (ISIS3D): Tutorial and Workshop at ITS 2013

Florian Daiber; Bruno Rodrigues De Araujo; Frank Steinicke; Wolfgang Stuerzlinger
In: Proceedings of the 2013 ACM International Conference on Interactive Tabletops and Surfaces. Pages 483-486, ACM, 2013.
Stereoscopic Displays, 3D User Interfaces and Interaction, Touch- and Gesture-based Interfaces, Adaptive and Perception-inspired Interfaces, Psychophysiological Studies related to Stereoscopy

With the increasing distribution of multi-touch capable devices, multi-touch interaction becomes more and more ubiquitous. Multi-touch interaction offers new ways to deal with 3D data, allowing a high number of degrees of freedom (DOF) without instrumenting the user. Due to the advances in 3D technologies, designing for 3D interaction is now more relevant than ever. With more powerful engines and high-resolution screens, even mobile devices can run advanced 3D graphics, 3D UIs are emerging beyond the game industry, and recently first prototypes as well as commercial systems bringing (auto-)stereoscopic display to touch-sensitive surfaces have been proposed. With the Tutorial and Workshop on "Interactive Surfaces for Interaction with Stereoscopic 3D (ISIS3D)" we aim to provide an interactive forum that focuses on the challenges that appear when the flat digital world of surface computing meets the curved, physical, 3D space we live in.


Combining Touch and Gaze for Distant Selection in a Tabletop Setting

Michael Mauderer; Florian Daiber; Antonio Krüger
In: CHI 2013: Workshop on Gaze Interaction in the Post-WIMP World. ACM International Conference on Human Factors in Computing Systems (CHI-13), 2013.
Gaze input, touch interaction, selection, flicking, gaze-supported interaction

Tabletop interaction with objects in and out of reach is a common task in the real world as well as in virtual settings. Gaze as an additional input modality might support these interactions on tabletops in terms of search, selection and manipulation of distant objects. The aim of this work is to design and evaluate an interaction technique that relies on gaze and gestural touch input for the selection of distant objects. The proposed approach makes objects that are out of physical reach easily available to the user, and aims to provide increased selection accuracy compared to single-modality approaches. The paper contributes a setup that allows people to be tracked with a static eye-tracker in front of a tabletop and investigates an interaction technique that uses the flicking gesture, augmented by gaze information, to select distant objects.


Designing Gestures for Mobile 3D Gaming

Florian Daiber; Lianchao Li; Antonio Krüger
In: Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia. ACM, 2012.
3D User Interfaces, Gestural Interaction, Mobile Interaction, Mobile Gaming, Stereoscopic Display

In recent years 3D has become more and more popular. Besides the increasing number of movies for stereoscopic 3D cinema and television, serious steps have also been taken in the field of 3D gaming. Games with stereoscopic 3D output are now available not only for gamers with high-end PCs but also on handheld devices equipped with 3D autostereoscopic displays. Recent smartphone technology has powerful processors that allow complex tasks like image processing, e.g. as used in augmented reality applications. Moreover, these devices are nowadays equipped with various sensors that enable additional input modalities far beyond joystick, mouse, keyboard and other traditional input methods. In this paper we propose an approach for sensor-based interaction with stereoscopically displayed 3D data on mobile devices and present a mobile 3D game that makes use of these concepts.


Balloon Selection revisited - Multi-touch Selection Techniques for Stereoscopic Data

Florian Daiber; Eric Falk; Antonio Krüger
In: Proceedings of the International Conference on Advanced Visual Interfaces. Pages 441-444, ACM, 2012.
3D User Interfaces, Gestural Interaction, Selection Techniques, Stereoscopic Display



CONTACT

Email
florian.daiber@dfki.de

Address
Innovative Retail Laboratory, DFKI GmbH
Stuhlsatzenhausweg 3, D-66123 Saarbrücken
Campus D3_2, Room 1.17

Phone
+49(0)681 85775 5115
