A Study of User Intent in Immersive Smart Spaces

Document Type

Conference Proceeding

Publication Date

3-1-2019

Abstract

© 2019 IEEE. Smart spaces are typically augmented with devices capable of sensing various inputs and reacting to them. Data from these devices can be used to support system adaptation, reducing user intervention; however, mapping sensor data to user intent is difficult without a large amount of human-labeled data. We leverage the capabilities of head-mounted immersive technologies to actively capture users' visual attention in an unobtrusive manner. Our contributions are three-fold: (1) we developed a novel prototype that enables studies of user intent in an immersive environment, (2) we conducted a proof-of-concept experiment to capture internal and external state data from smart devices together with head orientation information from participants to approximate their gaze, and (3) we report on both quantitative and qualitative evaluations of the data logs and pre-/post-study survey data using machine learning and statistical analysis techniques. Our results motivate the use of direct user input (e.g., gaze inferred from head orientation) in smart home environments to infer user intent, allowing us to train better activity recognition algorithms. In addition, this initial study paves the way for repeatable experimentation with smart space technologies at lower cost.
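The record does not include the study's pipeline; as a rough illustration of the core idea described in the abstract, the Python sketch below maps a head-orientation reading to the smart device a user is most likely attending to. The device names, angles, and the 20° acceptance threshold are hypothetical assumptions for the example, not values from the paper.

```python
import math
from dataclasses import dataclass


@dataclass
class Device:
    name: str
    yaw: float    # direction to the device from the user's position, degrees
    pitch: float  # elevation to the device, degrees


def angular_distance(yaw1, pitch1, yaw2, pitch2):
    """Angular separation (degrees) between two view directions."""
    def to_vec(yaw, pitch):
        y, p = math.radians(yaw), math.radians(pitch)
        return (math.cos(p) * math.cos(y), math.cos(p) * math.sin(y), math.sin(p))
    a, b = to_vec(yaw1, pitch1), to_vec(yaw2, pitch2)
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    return math.degrees(math.acos(dot))


def infer_attended_device(head_yaw, head_pitch, devices, threshold_deg=20.0):
    """Return the device closest to the head-orientation ray, if within threshold."""
    best = min(devices,
               key=lambda d: angular_distance(head_yaw, head_pitch, d.yaw, d.pitch))
    if angular_distance(head_yaw, head_pitch, best.yaw, best.pitch) <= threshold_deg:
        return best.name
    return None  # no device plausibly in view


if __name__ == "__main__":
    # Hypothetical device layout for illustration only.
    devices = [Device("lamp", 30.0, 5.0),
               Device("thermostat", -60.0, 0.0),
               Device("tv", 0.0, 10.0)]
    print(infer_attended_device(head_yaw=5.0, head_pitch=8.0, devices=devices))  # -> "tv"
```

Labels produced this way (device attended at each timestep) could then serve as weak supervision for an activity recognition model, in the spirit of the abstract's claim.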

Journal Title

2019 IEEE International Conference on Pervasive Computing and Communications Workshops, PerCom Workshops 2019

First Page

227

Last Page

232

ISBN

9781538691519

DOI

10.1109/PERCOMW.2019.8730692

First Department

Computing
