A framework for bimanual inter-device interactions |
| |
Affiliation: | 1. Department of Computer Science and Engineering, National Chung Hsing University, Taichung, Taiwan; 2. Department of Communications Engineering, National Chung Cheng University, Chiayi, Taiwan; 1. School of Software, Central South University, Changsha, China; 2. School of Information Science and Engineering, Hunan International Economics University, Changsha, China; 1. Dipartimento di Informatica, Università degli Studi di Bari Aldo Moro, Via Orabona 4, 70125 Bari, Italy; 2. Dipartimento di Informatica, Sapienza Università di Roma, Viale Regina Elena 295, 00161 Roma, Italy; 3. Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano, P.zza L. da Vinci 32, 20133 Milano, Italy |
| |
Abstract: | A shared interactive display (e.g., a tabletop) provides a large space for collaborative interactions. However, a public display lacks a private space for accessing sensitive information. A mobile device, on the other hand, offers a private display and a variety of modalities for personal applications, but it is limited by a small screen. We have developed a framework that supports fluid and seamless interactions between a tabletop and multiple mobile devices. This framework continuously tracks each user's actions (e.g., hand movements or gestures) on top of a tabletop and then automatically generates a unique personal interface on an associated mobile device. This type of inter-device interaction integrates a collaborative workspace (i.e., a tabletop) and a private area (i.e., a mobile device) with multimodal feedback. To support this interaction style, an event-driven architecture is applied to implement the framework on the Microsoft PixelSense tabletop. The framework hides the details of user tracking and inter-device communication, so interface designers can focus on developing domain-specific interactions by mapping a user's actions on the tabletop to a personal interface on his/her mobile device. The results from two different studies justify the usability of the proposed interaction. |
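To make the abstract's event-driven pattern concrete, the following is a minimal illustrative sketch, not the authors' PixelSense implementation: a publish/subscribe bus dispatches a tracked tabletop action to a mapper that generates a personal-interface command for the associated user's device. All class and field names (`EventBus`, `PersonalInterfaceMapper`, the `hand_gesture` event) are hypothetical.

```python
class EventBus:
    """Minimal publish/subscribe hub for tabletop events (illustrative only)."""
    def __init__(self):
        self._handlers = {}

    def subscribe(self, event_type, handler):
        self._handlers.setdefault(event_type, []).append(handler)

    def publish(self, event_type, payload):
        # Deliver the event to every handler registered for this type.
        for handler in self._handlers.get(event_type, []):
            handler(payload)


class PersonalInterfaceMapper:
    """Maps each user's tabletop action to a UI command for their own device.

    In the framework described above, this mapping step is the part an
    interface designer would customize per application domain.
    """
    def __init__(self, bus):
        self.device_messages = {}  # user_id -> queued UI commands
        bus.subscribe("hand_gesture", self.on_gesture)

    def on_gesture(self, payload):
        user = payload["user_id"]
        # Hypothetical domain-specific mapping: show a private panel for the
        # tabletop object the user touched.
        command = {"action": "show_panel", "target": payload["object"]}
        self.device_messages.setdefault(user, []).append(command)


bus = EventBus()
mapper = PersonalInterfaceMapper(bus)
bus.publish("hand_gesture", {"user_id": "alice", "object": "document_42"})
print(mapper.device_messages["alice"])
# → [{'action': 'show_panel', 'target': 'document_42'}]
```

The bus stands in for the tracking and inter-device communication layer that the framework hides from designers; only the `on_gesture` mapping would need to change across applications.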
| |
Keywords: | Bimanual interaction; Multimodal interface; Tangible interface; Human-computer interaction |
Indexed in ScienceDirect and other databases.
|