MobileHCI 2024 Late-Breaking Work: What's this?: Understanding User Interaction Behaviour with Multimodal Input Information Retrieval System

Congratulations to the authors of “What’s this?: Understanding User Interaction Behaviour with Multimodal Input Information Retrieval System”!

This paper has been accepted to the MobileHCI 2024 Late-Breaking Work track. Fun fact: the changeable lizard (Calotes versicolor) pictured in this post is a real lizard the first author encountered on the NUS campus, which made him wonder, “What’s this?” — and the paper’s title was decided shortly after.

Authors

Silang Wang, Hyeongcheol Kim, Nuwan Janaka, Kun Yue, Hoang-Long Nguyen, Shengdong Zhao, Haiming Liu, Khanh-Duy Le

Abstract

Human communication relies on integrated multimodal channels to facilitate rich information exchange. Building on this foundation, researchers have long speculated about the potential benefits of incorporating multimodal input channels into conventional information retrieval (IR) systems to support users’ complex daily IR tasks more effectively. However, the true benefits of such integration remain uncertain. This paper presents a series of exploratory pilot tests comparing Multimodal Input IR (MIIR) with Unimodal Input IR (UIIR) across various IR scenarios, concluding that MIIR offers distinct advantages over UIIR in terms of user experience. Our preliminary results suggest that MIIR could reduce the cognitive load associated with IR query formulation by allowing users to formulate different query components in a unified manner across different input modalities, particularly when conducting complex exploratory search tasks in unfamiliar, in-situ contexts. The discussion stemming from these findings draws scholarly attention to, and suggests new angles for, the design and development of MIIR systems.