Learning the Vibrotactile Morse Code Alphabet

Abstract
Vibrotactile Morse code conveys words through the sense of touch using vibrations, which can be useful in applications for users with a visual and/or auditory impairment. An advantage of vibrotactile Morse code is that it is technically easy to implement. Its usefulness, however, also depends on how easily it can be learned without a visual representation of the code. Here we investigated learning of the vibrotactile Morse code alphabet without any visual representation of the code, and whether the learned letters could immediately be used to recognize words. Two vibration motors were used: one attached to the left arm (dots) and the other to the right arm (dashes). Participants were given a 30-minute learning session, after which we determined how many letters they had learned. All participants learned at least 15 letters in this time. Directly afterward, they were presented with 2-, 3-, 4-, or 5-letter words consisting only of the letters they had learned. Participants were able to identify words, but correct-response rates decreased rapidly with word length. We conclude that it is possible to learn vibrotactile Morse code using only a vibrotactile representation (15 to 24 letters in 30 minutes). After the learning session, participants could recognize words, but additional training would be needed to improve recognition rates.
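The two-motor encoding described above can be sketched in code. This is a minimal illustration of the general idea, not the authors' implementation: the channel labels (`left`/`right`) and the inter-letter pause marker are assumptions made for the sketch.

```python
# Sketch (hypothetical, not from the paper): encode a word as two-channel
# vibration events -- dots drive a motor on the left arm, dashes a motor
# on the right arm, as in the study's setup.

MORSE = {
    "a": ".-", "b": "-...", "c": "-.-.", "d": "-..", "e": ".",
    "f": "..-.", "g": "--.", "h": "....", "i": "..", "j": ".---",
    "k": "-.-", "l": ".-..", "m": "--", "n": "-.", "o": "---",
    "p": ".--.", "q": "--.-", "r": ".-.", "s": "...", "t": "-",
    "u": "..-", "v": "...-", "w": ".--", "x": "-..-", "y": "-.--",
    "z": "--..",
}

def to_vibration_events(word):
    """Return a list of (channel, symbol) pulses for a word.

    Dots map to the 'left' channel, dashes to 'right'; a 'pause'
    event marks the gap between letters.
    """
    events = []
    for letter in word.lower():
        for symbol in MORSE[letter]:
            channel = "left" if symbol == "." else "right"
            events.append((channel, symbol))
        events.append(("pause", " "))  # inter-letter gap
    return events
```

For example, `to_vibration_events("sos")` yields three left-arm pulses, a pause, three right-arm pulses, a pause, three left-arm pulses, and a final pause. In a real device the pause event would be replaced by a timed silence between letter patterns.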