Research Article

Learning the Vibrotactile Morse Code Alphabet

Published: 03 August 2020

Abstract

Vibrotactile Morse code provides a way to convey words through the sense of touch using vibrations. This can be useful in applications for users with a visual and/or auditory impairment. An advantage of vibrotactile Morse code is that it is technically easy to implement. Its usefulness, however, also depends on how easy it is to learn without a visual representation of the code. Here we investigated learning of the vibrotactile Morse code alphabet without any visual representation of the code, and whether the learned letters can immediately be used to recognize words. Two vibration motors were used: one attached to the left arm (dots) and the other to the right arm (dashes). We gave participants a 30-minute learning session and determined how many letters they had learned. All participants managed to learn at least 15 letters in this time. Directly afterward, they were presented with 2-, 3-, 4-, or 5-letter words consisting only of the letters they had learned. Participants were able to identify words, but correct rates decreased rapidly with word length. We conclude that it is possible to learn vibrotactile Morse code using only a vibrotactile representation (15 to 24 letters in 30 minutes). After the learning session, it was possible to recognize words, but extra training would be beneficial to increase recognition rates.
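The two-motor scheme described in the abstract can be sketched in code. This is a minimal illustration, not the authors' implementation: the `MORSE` table is the standard International Morse alphabet, and the `"left"`/`"right"` pulse labels (dots on the left-arm motor, dashes on the right-arm motor) are assumptions standing in for whatever motor-driver calls an actual device would use.

```python
# Standard International Morse code for the 26 letters.
MORSE = {
    "a": ".-", "b": "-...", "c": "-.-.", "d": "-..", "e": ".",
    "f": "..-.", "g": "--.", "h": "....", "i": "..", "j": ".---",
    "k": "-.-", "l": ".-..", "m": "--", "n": "-.", "o": "---",
    "p": ".--.", "q": "--.-", "r": ".-.", "s": "...", "t": "-",
    "u": "..-", "v": "...-", "w": ".--", "x": "-..-", "y": "-.--",
    "z": "--..",
}

def encode_letter(letter: str) -> list[str]:
    """Map one letter to a pulse sequence: a "left" pulse per dot,
    a "right" pulse per dash (hypothetical motor labels)."""
    return ["left" if s == "." else "right" for s in MORSE[letter.lower()]]

def encode_word(word: str) -> list[list[str]]:
    """Encode a word as a list of per-letter pulse sequences; an
    inter-letter pause would separate the sublists on a real device."""
    return [encode_letter(ch) for ch in word]
```

For example, `encode_word("sos")` yields three left pulses, three right pulses, and three left pulses again. Timing (pulse duration, inter-pulse gaps, and inter-letter pauses) is deliberately left out, since the paper does not specify it here.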


Published in: ACM Transactions on Applied Perception, Volume 17, Issue 3 (July 2020), 85 pages. ISSN 1544-3558, EISSN 1544-3965. Issue DOI: 10.1145/3415024. Copyright © 2020 ACM.


Publisher: Association for Computing Machinery, New York, NY, United States.

Publication history: Received 1 December 2019; revised 1 May 2020; accepted 1 May 2020; published 3 August 2020.


Qualifiers: refereed research article.
