Exploring touch feedback display of virtual keyboards for reduced eye movements☆
Introduction
Mobile devices are becoming an integral part of our daily lives, with the time people spend on mobile devices continuing to grow. According to recent studies [1], the average American adult (18+) spends about 3 h on their smartphone every day. Younger adults are known to spend even more time on mobile devices than older demographics. The majority of those users routinely type on smartphones and palm tablets.
Mobile typing is a complex process involving vision, touch, motion, memory, learning, and other cognitive functions [2]. Most of today’s mobile devices have touchscreen interfaces with virtual keyboards based on the QWERTY layout. Typing on a virtual keyboard is performed by tapping virtual buttons displayed on the touchscreen. Although most people are familiar with the QWERTY layout, typing on a virtual keyboard is not as easy as typing on a physical keyboard: virtual keyboards provide no tactile feedback, and the small keys mean fingers often occlude or stray onto neighboring keys. User experience issues with virtual keyboards on high-touch mobile surfaces are becoming more critical as shared devices grow increasingly popular in classrooms and hospitals. From a public health and well-being perspective, well-designed virtual keyboard interfaces offer a significant benefit toward the successful adoption of non-touch key input surfaces that prevent germ transmission in public service settings. Although people type on mobile devices more than ever, one of the most commonly reported problems remains the small virtual keyboard [3]. If mobile devices completely replace computers, as some predict [4], the issues with virtual keyboards will become even more important and will need greater attention.
App developers and researchers have made substantial efforts to address the shortcomings of virtual keyboards, exploring alternative keyboard layouts, touch gesture recognition, handwriting recognition, word prediction, visual highlighting, and multimodal feedback. Despite these continuous efforts by the mobile industry and interested researchers, to the best of our knowledge no study has specifically explored visual touch feedback displays aimed at reducing cognitive load, analyzed with eye tracking. In this study, we propose a new approach that improves virtual keyboards with in-display visual feedback designed to reduce eye movements.
Section snippets
Issues with virtual keyboards
In text-input studies, the most common performance measure is typing speed in words per minute (WPM). The average input speed with physical QWERTY keyboards on desktop PCs is 40–60 WPM [5]. According to West [6], QWERTY users reached a typing speed of approximately 80 WPM. Feit et al. [7] found that routine typists could achieve entry speeds above 70 WPM regardless of the number of fingers involved. Even with two thumbs on a physical mini-QWERTY keyboard, Clawson [8] showed that
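The WPM and error-rate conventions used in text-entry studies can be sketched in a few lines of Python. This is a minimal illustration of the standard metrics, not code from this study: one "word" is conventionally counted as five characters, and the uncorrected error rate is typically the minimum string distance (Levenshtein) between the presented and transcribed phrases, normalized by the longer length.

```python
def words_per_minute(transcribed: str, seconds: float) -> float:
    """Conventional WPM: one 'word' is five characters (spaces included)."""
    return (len(transcribed) / 5.0) / (seconds / 60.0)

def msd(a: str, b: str) -> int:
    """Minimum string distance (Levenshtein) between two phrases."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,          # deletion
                                     dp[j - 1] + 1,      # insertion
                                     prev + (ca != cb))  # substitution
    return dp[-1]

def error_rate(presented: str, transcribed: str) -> float:
    """Uncorrected error rate: MSD normalized by the longer phrase."""
    return msd(presented, transcribed) / max(len(presented), len(transcribed))

# 200 characters entered in 60 s -> 40 WPM, the low end of the desktop range above
print(words_per_minute("x" * 200, 60.0))  # 40.0
print(error_rate("the quick brown", "the quick brwn"))
```

Under these conventions, a faster typist who leaves more mistakes uncorrected trades WPM against error rate, which is why the two measures are usually reported together.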
Visual touch feedback display
Typical virtual keyboards on smartphones or palm tablets provide visual touch feedback similar to what is shown in Fig. 1, where the typed character appears as a popup when the user touches a key. We refer to this type of touch feedback display as character-by-character (CBC) display. The popup is usually located to the upper right of the key the user’s finger touches. However, with this type of touch feedback, users look frequently and alternately at both the keyboard and the text to check the
Materials and procedure
In addition to a control system representing a more conventional display with character-by-character (CBC) feedback, two proposed display systems were developed with an identical virtual keyboard (Fig. 4). Participants were asked to follow and type the exact phrase on the screen using the virtual keyboard. The participant could see the text being typed between the given phrase and the keyboard.
Depending on the mobile app, the space between the keyboard and the text may vary. Most chatting apps
Subjective ratings
The average subjective ratings given by the participants regarding the effectiveness, satisfaction, and preference for each type of touch feedback display are shown in Fig. 5.
Discussion
From a series of analyses, we found that the dynamic and static WBW displays allow higher typing performance than the commonly used character-by-character (CBC) display, especially in the SLOW group. However, we did not find significant differences in WPM or error rate between the static and dynamic WBW displays for the SLOW group. This could be because the user can better detect errors when typing slowly on a word-by-word basis with the WBW feedback displays when typing while viewing the
Conclusions
It is natural behavior to look at both the keyboard and what is being typed in the text display area while typing. When typing on a virtual keyboard on a smartphone or small tablet, the frequency of glances at each area substantially increases, in part because users no longer benefit from long-established typing motor skills, i.e., muscle memory. Additionally, the substantially reduced keyboard often allows only two thumbs for text input. The absence of tactile feedback can make screen
Conflicts of interest
None.
Acknowledgements
This study was supported by the Research Program funded by Seoul National University of Science and Technology, South Korea.
References (46)
- et al., New chording text entry methods combining physical and virtual buttons on a mobile phone, Appl. Ergon. (2014)
- et al., Effect of key size and activation area on the performance of a regional error correction method in a touch-screen QWERTY keyboard, Int. J. Ind. Ergon. (2009)
- et al., Enhanced auditory feedback for Korean touch screen keyboards, Int. J. Hum. Comput. Stud. (2015)
- et al., Hierarchical control of cognitive processes: the case for skilled typewriting, Psychol. Learn. Motiv. (2011)
- et al., Postures, typing strategies, and gender differences in mobile device usage: an observational study, Appl. Ergon. (2012)
- et al., Investigating text input methods for mobile phones, Telemat. Inform. (2006)
- et al., Eye tracking in human-computer interaction and usability research: ready to deliver the promises, The Mind’s Eye (2003)
- A. Lipsman, A. Lella, The 2017 U.S. Cross-Platform Future in Focus, ...
- Two Ways to Fix the Typing-on-Touch-Screens Problem, MIT Technol. Rev. (2013)
- In less than two years, a smartphone could be your only computer, Wired Mag. (2015)
- Theoretical upper and lower bounds on typing speeds using a stylus and keyboard, Behav. Inf. Technol.
- The Standard and Dvorak Keyboards Revisited: Direct Measures of Speed, Res. Econ.
- How we type: Movement strategies and performance in everyday typing
- On-the-go text entry: Evaluating and improving mobile text input on mini-QWERTY keyboards, Georgia Institute of Technology
- Investigating touchscreen typing: The effect of keyboard size on typing speed, Behav. Inf. Technol.
- Predicting text entry speeds on mobile phones, Hum. Factors
- An evaluation of text-entry in Palm OS – Graffiti and the virtual keyboard
- Multidimensional Pareto optimization of touchscreen keyboards for speed, familiarity and improved spell checking
- Text Entry on Tiny QWERTY Soft Keyboards
- Performance and User Experience of Touchscreen and Gesture Keyboards in a Lab Setting and in the Wild
- Enhancing typing performance of older adults on tablets, Univers. Access Inf. Soc.
- Smartphone text input method performance, usability, and preference with younger and older adults, Hum. Factors
- Predictive text entry methods for mobile phones, Pers. Technol.
Cited by (11)
- A machine learning approach to primacy-peak-recency effect-based satisfaction prediction (2023, Information Processing and Management)
- 3D face reconstruction and dense alignment with a new generated dataset (2021, Displays). Citation excerpt: “3D face reconstruction and face alignment are two fundamental topics in computer vision, as they are essential preprocessing steps for many facial analysis tasks [1–4], such as recognition [5–10], animation [11,12], tracking [13–16], attribute classification [17], and image restoration [18–21]. It also can be applied to virtual reality [2,22], visual interactive [23,24] and display [25]. Most traditional studies [26,27] have focused on 3D Morphable Model (3DMM) parametric regression based on optimization algorithms and the iterative closest point [26].”
- Impact of button position and touchscreen font size on healthcare device operation by older adults (2020, Heliyon). Citation excerpt: “Presentation time and font size (Borg et al., 2015; Mahmud et al., 2010; Huang and Yeh, 2007) also have an impact on older users’ processing and interpretation of information. Also, for less experienced users (Kim et al., 2019), character display will affect frequent eye movements, which impacts information processing. Charness and Bosman (1990) found that older user groups preferred black and white characters and backgrounds over similar colored features, whereas younger user groups exhibited no such preference.”
- Eye movement measures for predicting eye gaze accuracy and symptoms in 2D and 3D displays (2019, Displays). Citation excerpt: “To assess VR, both subjective and objective methods have been proposed. The subjective method mostly consists of symptoms questionnaires [16,28] and the objective method mostly measure the performance of the task using an eye-tracker [19,27,21,35,37]. Eye-tracker has been extensively used as a device to evaluate VR, especially for collecting and analyzing information about the users [17].”
- T-Force: Exploring the Use of Typing Force for Three State Virtual Keyboards (2023, Conference on Human Factors in Computing Systems - Proceedings)
☆ This paper was recommended for publication by Richard H.Y. So.