Authors: Dunkelberger, Nathan; Sullivan, Jenny; Bradley, Joshua; Walling, Nickolas P.; Manickam, Indu; Dasarathy, Gautam; Israr, Ali; Lau, Frances W.Y.; Klumb, Keith; Knott, Brian; Abnousi, Freddy; Baraniuk, Richard; O'Malley, Marcia K.

Title: Conveying language through haptics: a multi-sensory approach

Type: Journal article

Date issued: 2018
Date deposited: 2018-10-31

Citation: Dunkelberger, Nathan, Sullivan, Jenny, Bradley, Joshua, et al. "Conveying language through haptics: a multi-sensory approach." Proceedings of the 2018 ACM International Symposium on Wearable Computers (2018), ACM: 25-32. https://doi.org/10.1145/3267242.3267244.

Handle: https://hdl.handle.net/1911/103252
DOI: https://doi.org/10.1145/3267242.3267244

Abstract: In our daily lives, we rely heavily on our visual and auditory channels to receive information from others. In the case of impairment, or when large amounts of information are already transmitted visually or aurally, alternative methods of communication are needed. A haptic language offers the potential to provide information to a user when the visual and auditory channels are unavailable. Previously created haptic languages include deconstructions of acoustic signals into features displayed through a haptic device, as well as haptic adaptations of Braille or Morse code; however, these approaches are unintuitive, slow at presenting language, or require a large surface area. We propose using a multi-sensory haptic device called MISSIVE, which can be worn on the upper arm and is capable of producing brief cues, sufficient in quantity to encode the full English phoneme set. We evaluated our approach by teaching subjects a subset of 23 phonemes and demonstrated 86% accuracy in a 50-word identification task after 100 minutes of training.

Language: English

Rights: This is the author's peer-reviewed final manuscript, as accepted by the publisher. The published article is copyrighted by ACM.