Displays

Volume 56, January 2019, Pages 38-48

Exploring touch feedback display of virtual keyboards for reduced eye movements

https://doi.org/10.1016/j.displa.2018.11.004

Highlights

  • User experience of word-by-word (WBW) feedback displays for smartphones/palm tablets was tested.

  • Participants reported higher satisfaction with the static and dynamic WBW displays than with the character-by-character (CBC) display.

  • SLOW typists performed better with the dynamic WBW display than with CBC.

  • SLOW typists looked at the text display less while typing with WBW than with CBC.

Abstract

When typing on smartphones or palm tablets, users generally make an effort to type correctly while simultaneously checking the small keyboard and the text display. Unlike physical keyboards that allow users to perform typing based on long-term muscle memory, virtual keyboards typically require more frequent eye movements between the keyboard and the text display areas.

This study proposes a new way of designing a virtual keyboard display to reduce the effort associated with frequent eye movements. For this study, we developed virtual keyboard display systems featuring both static and dynamic word-by-word (WBW) feedback displays. The two display systems were examined in comparison with the more conventional method known as character-by-character (CBC) feedback display. We investigated user satisfaction, typing performance, and users’ eye-gaze shifts. Eye-gaze shifts between the keyboard and the text display areas were measured across the three conditions using self-report, log, and eye-tracking measures. In the static WBW condition, the word being typed is displayed in a fixed area at the top of the virtual keyboard; in the dynamic WBW condition, the word is displayed in a small popup window near the selected key.
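
As a concrete illustration of the gaze-shift measure, the sketch below counts transitions between two areas of interest (keyboard vs. text display) given a temporally ordered sequence of AOI-labeled fixations; the function name and labels are illustrative assumptions, not the study’s analysis code.

    from typing import Iterable

    def count_gaze_shifts(fixation_aois: Iterable[str]) -> int:
        """Count gaze transitions between the 'keyboard' and 'text' AOIs.

        fixation_aois: one AOI label per fixation, in temporal order,
        e.g. ["keyboard", "keyboard", "text", "keyboard"].
        """
        shifts = 0
        previous = None
        for aoi in fixation_aois:
            if aoi not in ("keyboard", "text"):
                continue  # ignore fixations that fall outside both AOIs
            if previous is not None and aoi != previous:
                shifts += 1
            previous = aoi
        return shifts

    # Example: two shifts (keyboard -> text, then text -> keyboard)
    print(count_gaze_shifts(["keyboard", "keyboard", "text", "keyboard"]))  # 2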

Using a repeated-measures experiment across the three display conditions, participants were asked to type fifteen phrases on a palm tablet while wearing eye-tracking glasses for each condition. We conducted a mixed-design ANOVA with group (SLOW vs. FAST typists; men vs. women) as between-subject factors and display condition (CBC vs. static WBW vs. dynamic WBW) as the within-subject factor. We found a significant (11%) improvement in typing speed with the dynamic WBW display over the CBC display for less experienced keyboard users. In addition, participants reported higher satisfaction with the two WBW conditions than with the CBC condition. Eye fixations, dwell times, and heat map data also indicated that WBW displays are advantageous for less experienced, slower typists by helping them stay focused on the keyboard, thus reducing eye transitions to the text display. Our study systematically demonstrates how, and to what extent, the virtual keyboard display strategy influences typing performance and subjective experience, based on self-reports and eye-tracking measures. The approach and findings of this study should provide useful information and practical guidance to mobile application developers and designers interested in improving virtual keyboard functionality and user satisfaction.
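
For readers who want to reproduce this kind of analysis, the following is a minimal sketch of a mixed-design ANOVA with a within-subject display factor and a between-subject group factor, using synthetic data and the pingouin library; gender, the second between-subject factor in the paper, is omitted here for brevity, and all column names are hypothetical.

    # Minimal sketch of a mixed-design ANOVA: display condition as the
    # within-subject factor and typing-speed group as a between-subject factor.
    # The data below are synthetic, not from the study.
    import numpy as np
    import pandas as pd
    import pingouin as pg

    rng = np.random.default_rng(0)
    rows = []
    for i in range(20):
        participant = f"p{i:02d}"
        group = "SLOW" if i < 10 else "FAST"
        base = 25 if group == "SLOW" else 45          # synthetic baseline WPM
        for cond, bonus in [("CBC", 0), ("static_WBW", 1), ("dynamic_WBW", 3)]:
            rows.append({"participant": participant, "group": group,
                         "condition": cond,
                         "wpm": base + bonus + rng.normal(0, 2)})
    df = pd.DataFrame(rows)

    # Mixed-design ANOVA: within = condition, between = group
    aov = pg.mixed_anova(data=df, dv="wpm", within="condition",
                         between="group", subject="participant")
    print(aov.round(3))

    # Pairwise follow-up comparisons (Bonferroni-corrected), pingouin >= 0.5.2
    posthoc = pg.pairwise_tests(data=df, dv="wpm", within="condition",
                                between="group", subject="participant",
                                padjust="bonf")
    print(posthoc.round(3))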

Introduction

Mobile devices are becoming an integral part of our daily lives, with the time people spend on mobile devices continuing to grow. According to recent studies [1], the average American adult (18+) spends about 3 h on their smartphone every day. Younger adults are known to spend even more time on mobile devices than older demographics. The majority of those users routinely type on smartphones and palm tablets.

Mobile typing is a complex process involving vision, touch, motion, memory, learning, and other cognitive functions [2]. Most of today’s mobile devices have touchscreen interfaces with virtual keyboards based on the QWERTY layout. Typing on a virtual keyboard is performed by tapping virtual buttons displayed on the touchscreen. Despite most people’s familiarity with the QWERTY layout, typing on a virtual keyboard is not as easy as typing on a physical keyboard. Virtual keyboards provide no tactile feedback, and their small keys mean that fingers often block or overshoot neighboring keys. User experience issues with high-touch mobile surfaces that require virtual keyboards are becoming more critical as shared technology in classrooms and hospitals grows increasingly popular. From a public health and well-being perspective, well-designed virtual keyboard interfaces offer a significant benefit toward the successful adoption of non-touch key input surfaces that prevent germ transmission in public service settings. Although people type on mobile devices more than ever, one of the most common problems remains the small virtual keyboard [3]. If mobile devices completely replace computers as some predict [4], the issues with the virtual keyboard will become even more important and will need greater attention.

App developers and researchers have made substantial efforts to tackle the shortcomings of virtual keyboards by exploring alternative keyboard layouts, handwriting and touch-gesture recognition, word prediction, visual highlighting, and multimodal feedback. Despite these continuous efforts by the mobile industry and interested researchers, to the best of our knowledge, no study has specifically explored visual touch feedback displays aimed at reducing cognitive load and evaluated with eye-tracking analysis. In this study, we propose a new approach to improving virtual keyboards with visual feedback in the display designed to reduce eye movements.

Section snippets

Issues with virtual keyboards

In text-input studies, the most common performance measure is typing speed in words per minute (WPM). The average input speed with physical QWERTY keyboards on desktop PCs is 40–60 WPM [5]. According to West [6], QWERTY users reached a typing speed of approximately 80 WPM. Feit et al. [7] found that routine typists could achieve entry speeds above 70 WPM regardless of the number of fingers involved. Even with two thumbs on a physical mini-QWERTY keyboard, Clawson [8] showed that…
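
For reference, the standard WPM metric used in text-entry research treats five characters as one word and times the phrase from the first to the last keystroke; the function below is a minimal illustrative sketch of that convention, not code from this study.

    # Standard WPM convention in text-entry research: a "word" is five
    # characters, and timing runs from the first to the last keystroke.
    def words_per_minute(transcribed: str, seconds: float) -> float:
        if seconds <= 0:
            raise ValueError("elapsed time must be positive")
        return ((len(transcribed) - 1) / seconds) * 60.0 / 5.0

    # Example: a 30-character phrase typed in 12 seconds gives 29 WPM.
    print(round(words_per_minute("the quick brown fox jumps over", 12.0), 1))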

Visual touch feedback display

Typical virtual keyboards on smartphones or palm tablets provide visual touch feedback similar to what is shown in Fig. 1, where the typed character appears as a popup when the user touches a key. We refer to this type of touch feedback display as character-by-character (CBC) display. The popup is usually located to the upper right of the key the user’s finger touches. However, with this type of touch feedback, users frequently shift their gaze between the keyboard and the text display to check the…
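
To make the three feedback strategies concrete, here is a minimal, framework-agnostic sketch of the feedback logic in Python; the class names, coordinates, and popup offsets are illustrative assumptions rather than the implementation used in the study.

    from dataclasses import dataclass
    from enum import Enum, auto


    class FeedbackMode(Enum):
        CBC = auto()          # character-by-character popup above the touched key
        STATIC_WBW = auto()   # current word shown in a fixed strip above the keyboard
        DYNAMIC_WBW = auto()  # current word shown in a popup that follows the touched key


    @dataclass
    class Feedback:
        text: str   # what to render
        x: float    # popup position, in keyboard-relative units
        y: float


    def touch_feedback(mode: FeedbackMode, word_buffer: str, key_char: str,
                       key_x: float, key_y: float,
                       strip_x: float = 0.0, strip_y: float = -1.0) -> Feedback:
        """Return what to display, and where, for a single key press.

        word_buffer: characters of the current word typed so far (excluding key_char)
        key_x, key_y: position of the touched key
        strip_x, strip_y: position of the fixed word strip (static WBW only)
        """
        word = word_buffer + key_char
        if mode is FeedbackMode.CBC:
            # only the last character, offset to the upper right of the key
            return Feedback(text=key_char, x=key_x + 0.5, y=key_y - 1.0)
        if mode is FeedbackMode.STATIC_WBW:
            # the whole word so far, always in the same fixed area above the keyboard
            return Feedback(text=word, x=strip_x, y=strip_y)
        # DYNAMIC_WBW: the whole word so far, anchored just above the touched key
        return Feedback(text=word, x=key_x, y=key_y - 1.0)


    # Example: pressing "k" after having typed "boo" under each mode
    for mode in FeedbackMode:
        print(mode.name, touch_feedback(mode, "boo", "k", key_x=7.0, key_y=2.0))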

Materials and procedure

In addition to a control system representing the more conventional character-by-character (CBC) feedback display, two proposed display systems were developed with an identical virtual keyboard (Fig. 4). Participants were asked to follow and type the exact phrase shown on the screen using the virtual keyboard. Participants could see the text being typed between the given phrase and the keyboard.

Depending on the mobile app, the space between the keyboard and the text may vary. Most chat apps…

Subjective ratings

The average subjective ratings given by the participants regarding the effectiveness, satisfaction, and preference for each type of touch feedback display are shown in Fig. 5.

Discussion

From a series of analyses, we found that the dynamic and static WBW displays allow higher typing performance than the commonly used character-by-character (CBC) system, especially in the SLOW group. However, we did not find significant differences in WPM or error rate between the static and dynamic WBW displays for the SLOW group. This could be because, with the WBW feedback displays, slower typists can better detect errors on a word-by-word basis while viewing the…

Conclusions

It is natural behavior to look at both the keyboard and what is being typed in the text display area while typing. When typing on a virtual keyboard on smartphones or small tablets, the frequency of looking at each area increases substantially, in part because users no longer benefit from long-established typing motor skills, i.e., muscle memory. Additionally, the substantially reduced keyboard often allows only two thumbs for text input. The absence of tactile feedback can make screen…

Conflicts of interest

None.

Acknowledgements

This study was supported by the Research Program funded by Seoul National University of Science and Technology, South Korea.

References (46)

  • R.W. Soukoreff et al.

    Theoretical upper and lower bounds on typing speeds using a stylus and keyboard

    Behav. Info. Tech.

    (1995)
  • L.J. West

    The standard and Dvorak keyboards revisited: Direct measures of speed

    Res. Econ.

    (1998)
  • A.M. Feit et al.

    How we type: Movement strategies and performance in everyday typing

  • J. Clawson

    On-the-go text entry: Evaluating and improving mobile text input on mini-QWERTY keyboards

    Georgia Institute of Technology

    (2012)
  • A. Sears et al.

    Investigating touchscreen typing: The effect of keyboard size on typing speed

    Behav. Info. Tech.

    (1993)
  • M. Silfverberg et al.

    Predicting text entry speed on mobile phones

    Hum. Factors.

    (2000)
  • M.D. Fleetwood et al.

    An evaluation of text-entry in Palm OS – Graffiti and the virtual keyboard

  • M. Dunlop et al.

    Multidimensional Pareto optimization of touchscreen keyboards for speed, familiarity and improved spell checking

  • L.A. Leiva et al.

    Text entry on tiny QWERTY soft keyboards

  • S. Reyal et al.

    Performance and user experience of touchscreen and gesture keyboards in a lab setting and in the wild

  • É. Rodrigues et al.

    Enhancing typing performance of older adults on tablets

    Univers. Access Inf. Soc.

    (2016)
  • A.L. Smith et al.

    Smartphone text input method performance, usability, and preference with younger and older adults

    Hum. Factors.

    (2015)
  • M.D. Dunlop et al.

    Predictive text entry methods for mobile phones

    Pers. Technol.

    (2000)

This paper was recommended for publication by Richard H.Y. So.
