An integrated highly synchronous, high resolution, real time eye tracking system for dynamic flight movement
Affiliation: 1. ISAE-SUPAERO, Université de Toulouse, France; 2. Airbus Group Innovations, Toulouse, France; 3. Laboratoire de neurosciences cognitives, Département d’études cognitives, École normale supérieure, INSERM, PSL Research University, 75005 Paris, France; 4. ENAC, University of Toulouse, France; 1. Barrow Neurological Institute, Phoenix, AZ, USA; 2. Mind, Brain, and Behavior Research Center (CIMCYC), University of Granada, Granada, Spain; 3. Joint Center University of Granada — Spanish Army Training and Doctrine Command, Granada, Spain; 4. Department of Ophthalmology, State University of New York, Downstate Medical Center, Brooklyn, NY, USA; 5. Aviation Survival Training Center Miramar, San Diego, CA, USA; 6. Marine Aircraft Group 39, 3rd Marine Aircraft Wing, Marine Corps Base, Camp Pendleton, CA, USA
Abstract: Electronic surveillance systems are being adopted rapidly today, ranging from simple video cameras to complex biometric systems for facial patterns and intelligent computer-vision-based surveillance systems, and they are applied in many fields such as home monitoring, security surveillance of important sites, and mission-critical tasks like air traffic control surveillance. Such systems normally involve a computer system and a human surveillance operator, who watches a dynamic display to perform surveillance tasks. Exploiting the information shared between these heterogeneous physical data-capture systems and the human operator's activity is an emerging aspect of electronic surveillance that has yet to be addressed in depth. Hence, an innovative interaction interface for such knowledge extraction and representation is required. Such an interface should establish a data activity register frame that captures information depicting the various surveillance activities at a specified spatial and temporal reference.

This paper presents a real-time eye tracking system that integrates two sets of activity data synchronously, in real time and under highly dynamic changes, with respect to both spatial and temporal frames, through the “Dynamic Data Alignment and Timestamp Synchronisation Model”. This model matches the timestamps of the two data streams and aligns them to the same spatial reference frame before fusing them into a data activity register frame. The Air Traffic Control (ATC) domain is used to illustrate the model, with experiments conducted under simulated radar traffic situations with participants and their radar input data. Test results revealed that the model is able to synchronise the timestamps of the eye and dynamic display data and align both data streams spatially, while taking into account dynamic changes in space and time on a simulated radar display. The system can also distinguish and display variations in the monitoring behaviour of participants. New knowledge can thus be extracted and represented through this innovative interface, which can then be applied to other applications in the field of electronic surveillance to uncover the monitoring behaviour of the human surveillance operator.
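To make the two operations attributed to the model more concrete, the following is a minimal sketch, not the authors' implementation: it pairs each gaze sample with the radar-display frame closest in time (timestamp matching), maps gaze from screen pixels onto the radar plane through an affine calibration (spatial alignment), and fuses the result into activity-register records noting which displayed tracks the operator was looking at. All names (GazeSample, RadarTrack, build_register), the calibration parameters, and the thresholds are hypothetical assumptions for illustration.

```python
# Sketch of timestamp matching + spatial alignment + fusion into an
# activity register, under the assumptions stated above.
from bisect import bisect_left
from dataclasses import dataclass


@dataclass
class GazeSample:            # eye-tracker stream, screen pixel coordinates
    t_ms: int
    x_px: float
    y_px: float


@dataclass
class RadarTrack:            # radar-display stream, radar-plane coordinates
    t_ms: int
    callsign: str
    x_nm: float
    y_nm: float


def screen_to_radar(x_px, y_px, scale_nm_per_px, origin_px):
    """Hypothetical affine calibration mapping screen pixels onto the radar
    plane so both streams share one spatial reference frame."""
    return ((x_px - origin_px[0]) * scale_nm_per_px,
            (origin_px[1] - y_px) * scale_nm_per_px)   # screen y grows downward


def nearest_index(sorted_ts, t):
    """Index of the timestamp in sorted_ts closest to t."""
    i = bisect_left(sorted_ts, t)
    if i == 0:
        return 0
    if i == len(sorted_ts):
        return len(sorted_ts) - 1
    return i if sorted_ts[i] - t < t - sorted_ts[i - 1] else i - 1


def build_register(gaze, radar_frames, scale_nm_per_px, origin_px,
                   max_skew_ms=20, fixation_radius_nm=2.0):
    """Fuse the two streams: for each gaze sample, take the radar frame
    closest in time and record which tracks the aligned gaze falls on.
    radar_frames is a time-sorted list of frames, each frame a non-empty
    list of RadarTrack objects sharing one timestamp."""
    frame_ts = [frame[0].t_ms for frame in radar_frames]
    register = []
    for g in gaze:
        i = nearest_index(frame_ts, g.t_ms)
        if abs(frame_ts[i] - g.t_ms) > max_skew_ms:
            continue                                   # streams too far apart here
        gx, gy = screen_to_radar(g.x_px, g.y_px, scale_nm_per_px, origin_px)
        looked_at = [trk.callsign for trk in radar_frames[i]
                     if (trk.x_nm - gx) ** 2 + (trk.y_nm - gy) ** 2
                     <= fixation_radius_nm ** 2]
        register.append({"t_ms": g.t_ms, "tracks": looked_at})
    return register
```

In such a design the gaze-to-track association must be recomputed against each matched frame rather than against a static screen layout, which is what lets the register reflect the dynamic changes in space and time that the abstract describes.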
Keywords: Innovative interaction interface; Knowledge extraction and representation; Electronic surveillance; Air traffic control; Visual monitoring; Eye tracking
This article is indexed in ScienceDirect and other databases.