Designing and evaluating multimodal interactions for facilitating visual analysis with dashboards

Imran Chowdhury, Abdul Moeid, Enamul Hoque, Ashad Kabir, Md. Sabir Hossain, Mohammad Mainul Islam

Research output: Contribution to journal › Article › peer-review


Abstract

Exploring and analyzing data using visualizations is at the heart of many decision-making tasks. Typically, people perform visual data analysis using mouse and touch interactions. While such interactions are often easy to use, they can be inadequate for expressing complex information and may require many steps to complete a task. Recently, natural language interaction has emerged as a promising technique for supporting exploration with visualization, as the user can express a complex analytical question more easily. In this paper, we investigate how to synergistically combine language and mouse-based direct manipulation so that the weakness of one modality can be complemented by the other. To this end, we have developed a novel system, named Multimodal Interactions System for Visual Analysis (MIVA), that allows users to provide input using both natural language (e.g., through speech) and direct manipulation (e.g., through mouse or touch) and presents the answer accordingly. To answer the current question in the context of past interactions, the system incorporates previous utterances and direct manipulations made by the user within a finite-state model. The uniqueness of our approach is that, unlike most previous approaches which typically support multimodal interactions with a single visualization, MIVA enables multimodal interactions with multiple coordinated visualizations of a dashboard that visually summarizes a dataset. We tested MIVA's applicability on several dashboards, including a COVID-19 dashboard that visualizes coronavirus cases around the globe. We further empirically evaluated our system through a user study with twenty participants. The results of our study revealed that MIVA enhances the flow of visual analysis by enabling fluid, iterative exploration and refinement of data in a dashboard with multiple coordinated views.
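The abstract's finite-state context model can be illustrated with a minimal sketch. This is a hypothetical illustration, not the authors' implementation: the class names (`DialogueContext`), views (`cases_by_country`), and rule-based reference resolution are assumptions chosen to show how a follow-up utterance such as "show the trend for this chart" could be resolved against state accumulated from earlier mouse interactions.

```python
# Hypothetical sketch (not MIVA's actual code): a small finite-state context
# that merges natural-language utterances and direct manipulations, so that a
# follow-up question can be interpreted relative to prior interactions.

class DialogueContext:
    """Tracks the active view and accumulated filters across modalities."""

    def __init__(self):
        self.active_view = None   # chart last clicked/touched or named in speech
        self.filters = {}         # accumulated filter state, e.g. {"country": "Italy"}

    def on_direct_manipulation(self, view, selection=None):
        # A click or touch sets the active view and, optionally, a selection filter.
        self.active_view = view
        if selection:
            self.filters.update(selection)

    def on_utterance(self, utterance):
        # Toy rule-based resolution: deictic phrases like "this chart" are
        # resolved against the stored state rather than the utterance alone.
        query = {"view": self.active_view, "filters": dict(self.filters)}
        if "this chart" in utterance and self.active_view is None:
            query["needs_clarification"] = True  # no referent available yet
        return query


# Example flow: the user clicks a bar for Italy, then asks a follow-up question.
ctx = DialogueContext()
ctx.on_direct_manipulation("cases_by_country", {"country": "Italy"})
result = ctx.on_utterance("show the trend for this chart")
```

Here `result` carries the view and filter inherited from the earlier click, which is the essence of letting one modality complement the other: the speech query stays short because the pointing gesture already fixed the referent.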
Original language: English
Pages (from-to): 60-71
Number of pages: 12
Journal: IEEE Access
Volume: 9
DOIs
Publication status: Published - 22 Dec 2020


