Project Description

Novel In-vehicle Interaction Design and Evaluation



  • Sponsor: Hyundai Motor Company
  • Amount of Support: $130,236
  • Duration of Support: 1 year

Purpose & Target

To investigate the effectiveness of an in-vehicle gesture control system and culture-specific sound preferences, Michigan Tech will design a prototype in-air gesture system with auditory displays and conduct successive experiments using a medium-fidelity driving simulator.

Technical Background

Touchscreens have become increasingly popular in vehicles in recent years. They offer many benefits over traditional analog controls such as buttons and knobs, but they also introduce new problems. Because touchscreens are visual displays, using them demands a relatively large share of the driver's visual-attentional resources. Driving is itself a visually demanding task, and competition between driving and touchscreen use for these resources has been shown to increase unsafe driving behaviors and crash risk [1]. Driving researchers have therefore called for new infotainment system designs that reduce visual demands on drivers [2].

Recent technological advances have made it possible to develop in-air gesture controls. If supported with appropriate auditory feedback, in-air gesture controls may limit visual demands and allow drivers to navigate menus and controls without looking away from the road. Research has shown that the accuracy of surface gesture movements can be increased with the addition of auditory feedback [3]. However, many questions surrounding the development of an auditory-supported, in-air gesture-controlled infotainment system remain unanswered: What type of auditory feedback do users prefer? How can auditory feedback be displayed so as to limit cognitive load? What type of menu offers an easily navigable interface for both novice and experienced users? More importantly, do these displays reduce eyes-off-road time and the frequency of long off-road glances? Does the system improve driving safety overall compared with touchscreens or analog interfaces? These are among the many questions this research project attempts to address. In addition, we want to explore whether there are cultural differences in auditory perception. As a starting point, HMC and MTU will design in-vehicle sounds and conduct an experiment with an American population.


References

  [1] W. J. Horrey and C. D. Wickens, "In-vehicle glance duration: Distributions, tails, and model of crash risk," Transportation Research Record: Journal of the Transportation Research Board, vol. 2018, pp. 22-28, 2007.
  [2] P. Green, "Crashes induced by driver information systems and what can be done to reduce them," in Proc. SAE Conference, SAE, 2000.
  [3] Hatfield, W. Wyatt, and J. Shea, "Effects of auditory feedback on movement time in a Fitts task," Journal of Motor Behavior, vol. 42, no. 5, pp. 289-293, 2010.