AMICI: Adaptive Multimodal In-Car Interaction

The AMICI project develops adaptive, multimodal human-vehicle interaction technology for future vehicles.

About the AMICI project

The project develops methods for machine understanding of the driving situation, particularly for monitoring the driver's state (e.g., gaze direction and alertness). This makes it possible to optimise the presented information according to the situation and its urgency. Another focus is multimodal interaction: information is presented on screens, as audio, or as haptic feedback, and the chosen modality depends on the type of information and the current driving situation. Entirely new possibilities are expected from LIDAR sensing inside the car, which would improve monitoring of the occupants' state and enable new interaction methods.
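As an illustration of the adaptation described above, the following minimal Python sketch selects an output modality from a monitored driver state and the urgency of a message. The classes, thresholds and selection rules are illustrative assumptions for this sketch, not the project's actual design.

from dataclasses import dataclass
from enum import Enum, auto


class Modality(Enum):
    """Output channels mentioned in the project description."""
    SCREEN = auto()
    AUDIO = auto()
    HAPTIC = auto()


@dataclass
class DriverState:
    """Simplified driver-monitoring output (fields are illustrative)."""
    eyes_on_road: bool   # e.g. derived from gaze direction
    alertness: float     # 0.0 (drowsy) .. 1.0 (fully alert)


def choose_modality(state: DriverState, urgency: float) -> Modality:
    """Pick an output modality from driver state and message urgency.

    The rules below are a made-up example of situation-dependent
    modality selection, not the project's actual policy.
    """
    if urgency > 0.8 or state.alertness < 0.3:
        # Urgent messages or a drowsy driver: use haptic feedback,
        # which does not depend on where the driver is looking.
        return Modality.HAPTIC
    if not state.eyes_on_road:
        # Driver is looking away from the road: prefer audio over a screen.
        return Modality.AUDIO
    # Calm situation, eyes on the road: a visual notification is enough.
    return Modality.SCREEN


if __name__ == "__main__":
    state = DriverState(eyes_on_road=False, alertness=0.7)
    print(choose_modality(state, urgency=0.4))  # -> Modality.AUDIO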

AMICI will validate technological solutions by building prototypes and analysing how users interact with them. One planned prototype extends in-car systems for use by the driver and passengers throughout the vehicle. The Origo steering wheel concept, developed in the previous MIVI project, serves as the technological basis for this work.

Six companies with parallel projects, six companies with in-kind contributions and two research partners work jointly in AMICI. The company partners develop solutions for AI, spatial audio, simulation, haptics and speech control, among other areas, and the results will be integrated into a common platform. The research partners bring the latest research results to the development and validate the outcomes through scientific studies.

Funded by Business Finland

The project is funded by Business Finland and has 14 project partners: two universities and 12 companies in the field.