Measuring interaction proxemics with wearable light tags

Montanari, Alessandro, Tian, Zhao, Francu, Elena, Lucas, Benjamin, Jones, Brian, Zhou, Xia and Mascolo, Cecilia (2018) Measuring interaction proxemics with wearable light tags. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2 (1). pp. 1-30. ISSN 2474-9567

Full text: 14387.pdf (PDF, 4MB)

Abstract

The proxemics of social interactions (e.g., body distance, relative orientation) influences many aspects of our everyday life: from patients' reactions to interaction with physicians, to success in job interviews, to effective teamwork. Traditionally, interaction proxemics has been studied via questionnaires and participant observation, which impose a high burden on users, offer low scalability and precision, and often introduce biases. In this paper we present Protractor, a novel wearable technology for measuring interaction proxemics, as part of non-verbal behavior cues, with fine granularity. Protractor employs near-infrared light to monitor both the distance and relative body orientation of interacting users. We leverage the characteristics of near-infrared light (i.e., line-of-sight propagation) to accurately and reliably identify interactions; a pair of collocated photodiodes aids the inference of relative interaction angle and distance. We achieve robustness against temporary blockage of the light channel (e.g., by the user's hand or clothes) by designing sensor fusion algorithms that exploit inertial sensors to compensate for the absence of light-tracking results. We fabricated Protractor tags and conducted real-world experiments. Results show their accuracy in tracking body distances and relative angles: the framework achieves less than 6° error 95% of the time when measuring relative body orientation, and a 2.3 cm – 4.9 cm mean error when estimating interaction distance. We deployed Protractor tags to track users' non-verbal behaviors during collaborative group tasks. Results with 64 participants show that distance and angle data from Protractor tags can help assess an individual's task role with 84.9% accuracy, and identify the task timeline with 93.2% accuracy.
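The abstract states that a pair of collocated photodiodes yields both relative angle and distance, and that inertial sensors take over when the light channel is blocked, but it does not spell out the geometry. The sketch below is a minimal, hypothetical illustration of one way such an estimator could work, assuming ideal Lambertian (cosine) photodiode responses, inverse-square intensity falloff, and a fixed tilt DELTA between the two diodes; the constants and function names are illustrative and are not taken from the paper.

```python
import math

# Hypothetical geometry: the two collocated photodiodes are tilted
# +/- DELTA radians away from the tag's surface normal.
DELTA = math.radians(15.0)

# Hypothetical calibration constant: combined photocurrent expected
# head-on (theta = 0) at 1 m; in practice calibrated per tag pair.
C_CAL = 1.0

# Readings below this are treated as a blocked light channel.
BLOCK_THRESHOLD = 1e-4


def angle_from_ratio(p1, p2):
    """Estimate the incidence angle theta from the reading ratio.

    With ideal cosine angular responses,
        p1 / p2 = cos(theta - DELTA) / cos(theta + DELTA),
    which solves in closed form for theta.
    """
    r = p1 / p2
    return math.atan((r - 1.0) / ((r + 1.0) * math.tan(DELTA)))


def distance_from_intensity(p1, p2, theta):
    """Estimate distance from the inverse-square falloff of the sum.

    Assumes p1 + p2 ~ C_CAL * cos(theta) / d**2 (point source,
    cosine receivers), so d follows from the calibrated constant.
    """
    return math.sqrt(C_CAL * math.cos(theta) / (p1 + p2))


def fuse(p1, p2, gyro_yaw_rate, dt, last_theta):
    """Fall back to gyroscope dead reckoning when light is blocked."""
    if p1 < BLOCK_THRESHOLD and p2 < BLOCK_THRESHOLD:
        # Channel blocked (e.g., by a hand): propagate the last
        # light-based angle with the integrated gyroscope yaw rate.
        return last_theta + gyro_yaw_rate * dt, None
    theta = angle_from_ratio(p1, p2)
    return theta, distance_from_intensity(p1, p2, theta)


# Quick self-check: synthesize readings for theta = 20 deg, d = 1.5 m.
theta_true, d_true = math.radians(20.0), 1.5
p1 = C_CAL * math.cos(theta_true - DELTA) / (2 * math.cos(DELTA) * d_true**2)
p2 = C_CAL * math.cos(theta_true + DELTA) / (2 * math.cos(DELTA) * d_true**2)
theta_est, d_est = fuse(p1, p2, gyro_yaw_rate=0.0, dt=0.02, last_theta=0.0)
print(math.degrees(theta_est), d_est)  # ~20.0 deg, ~1.5 m
```

The key idea this sketch captures is that the ratio of the two readings depends only on angle (distance cancels), while the sum recovers distance once the angle is known; Protractor's actual algorithms, including its handling of noise and calibration, are described in the paper itself.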

Item Type: Article
Additional Information: © ACM, 2018. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (Volume 2, Issue 1, March 2018), http://doi.acm.org/10.1145/3191757.
Keywords: Human-centered computing→Ubiquitous and mobile computing systems and tools; Computer systems organization→Embedded systems; Face-to-face interactions, non-verbal behaviors, light sensing
Schools/Departments: University of Nottingham, UK > Faculty of Social Sciences > Nottingham University Business School
Identification Number: 10.1145/3191757
Depositing User: Eprints, Support
Date Deposited: 28 Mar 2018 10:57
Last Modified: 28 Mar 2018 12:24
URI: https://eprints.nottingham.ac.uk/id/eprint/50762
