In manned aviation, non-verbal communication via hand signals is a well-established fallback in the event of technical or environmental communication problems. Despite state-of-the-art imaging sensors and increasing automation, unmanned aerial vehicles (UAVs) have so far lacked such a visual interface: even within visual range, information is exchanged primarily over a radio link, which can only be established via a compatible technical communication device. Loss of this link generally means loss of control of the unmanned system. Moreover, in certain applications the active emission of an electromagnetic signature can be undesirable or even dangerous for the system's user. In the field of collocated Human-Drone Interaction (cHDI), various efforts have therefore been made to enable natural and direct human-UAV interaction without additional communication devices. In this context, nonverbal interaction via a visual interface offers the most advantages compared to other modalities, although its potential is currently not fully exploited because research primarily addresses the interaction aspect rather than the communication aspect. Bidirectional, goal-directed information exchange is, however, a basic requirement for a natural human-UAV interface, and it in turn requires intention recognition. The goal of this work is therefore to expand the communicative capabilities of UAVs and to enable an alternative means of system access. To this end, an interdisciplinary concept for a visual human-UAV communication system was first developed that optically detects relevant nonverbal body signals and decodes them into user intentions in a context-sensitive manner by means of a task-based dialog system. Based on this methodological foundation, a real-time-capable experimental system was integrated into a UAV and its classification performance was evaluated.
In subsequent human-machine experiments, the system concept was evaluated, and it was additionally investigated which nonverbal subsignals are used for communication, especially by untrained users, which guidance procedures are suitable for collocated UAV guidance via a visual interface, and which factors promote comprehension problems. The suitability of the system concept was demonstrated in different usage contexts, and context-sensitive signal decoding was shown to significantly increase effectiveness. Dialog support functions can also help reduce both user-induced comprehension problems (e.g., representation errors) and system-induced ones (e.g., classification errors) during a nonverbal human-UAV dialog, and can help reach the communication goal more efficiently. However, transmitting more sophisticated communication content (e.g., mission data) places higher demands on the user's nonverbal capabilities, since a larger set of signals must be reproduced. An ambiguous vocabulary with gestural metaphors and synonyms can increase intuitiveness and give the user options for action in case of user-specific representation deficits or system-related recognition problems. Further potential lies in the three-dimensional representation of the user, in the inclusion of additional visually perceptible body signals such as facial expressions, and in overall more intelligent UAV behavior on the way to natural human-UAV communication.