
Affording embodied cognition through touchscreen and above-the-surface gestures during collaborative tabletop science learning

Published in: International Journal of Computer-Supported Collaborative Learning

Abstract

This paper draws upon the theory of embodied cognition to provide a robust account of how gestural interactions with and around multi-touch tabletops can play an important role in facilitating collaborative meaning-making, particularly in the context of science data visualizations. Embodied cognition is a theory of learning that posits that thinking and perception are shaped by interactions with the physical environment. Previous research has used embodied cognition as a theoretical framework to inform the design of large touchscreen learning applications such as those for multi-touch tabletops. However, this prior work has primarily assumed that learning occurs during any motion or interaction, without considering how specific interactions may be linked to particular instances of collaborative learning supported by embodiment. We investigate this question in the context of collaborative learning from data visualizations of global phenomena such as ocean temperatures. We followed a user-centered, iterative design approach to build a tabletop prototype that facilitated collaborative meaning-making and used this prototype as a testbed in a laboratory study with 11 family groups. We qualitatively analyzed learner groups’ co-occurring utterances and gestures to identify the nature of the gestural interactions that groups used when their utterances signaled the occurrence of embodiment during collaborative meaning-making. Our findings present an analysis of both touchscreen and above-the-surface gestural interactions that were associated with instances of embodied cognition. We identified four types of gestural interactions that promote scientific discussion and collaborative meaning-making through embodied cognition: (T1) gestures for orienting the group; (T2) cooperative gestures for facilitating group meaning-making; (T3) individual intentional gestures for facilitating group meaning-making; and (T4) gestures for articulating conceptual understanding to the group. Our work illustrates interaction design opportunities for affording embodied cognition and will inform the design of future interactive tabletop experiences in the domain of science learning.


Notes

  1. https://neo.sci.gsfc.nasa.gov/

  2. http://www.nextgenscience.org/get-to-know

  3. http://openexhibits.org/downloads/sdk

  4. http://www.bom.gov.au/

  5. http://www.grida.no/resources/7045

  6. https://www.accuweather.com/

  7. https://svs.gsfc.nasa.gov/2915


Acknowledgments

This work is partially supported by National Science Foundation Grant Awards #DRL-1612485 and #IIS-1552598. Any opinions, findings, and conclusions or recommendations expressed in this paper are those of the authors and do not necessarily reflect these agencies’ views. The authors also thank the Florida Museum of Natural History for allowing us to recruit participants from their visitors; our advisory board members, Betty Dunckel and Julia Plummer, for contributing to our thinking related to embodied cognition; and all our pilot participants. A preliminary version of the analysis in this paper, covering only T1 and T2, was presented at CSCL 2019 (Soni et al. 2019b). This work was conducted while author Jeremy Alexandre was a summer intern at the University of Florida.

Availability of data and material

The authors plan to make the anonymized study data publicly available in the future via the Open Science Framework repository: https://osf.io/rjzkh/

Code availability

The authors plan to make the codebase for the multi-touch tabletop application used in the study publicly available in the future via the Open Science Framework repository: https://osf.io/rjzkh/

Funding

This work is partially supported by National Science Foundation Grant Awards #DRL-1612485 and #IIS-1552598.

Author information

Corresponding author

Correspondence to Nikita Soni.

Ethics declarations

Conflicts of interest/competing interests

Not Applicable.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Soni, N., Darrow, A., Luc, A. et al. Affording embodied cognition through touchscreen and above-the-surface gestures during collaborative tabletop science learning. Intern. J. Comput.-Support. Collab. Learn 16, 105–144 (2021). https://doi.org/10.1007/s11412-021-09341-x


  • DOI: https://doi.org/10.1007/s11412-021-09341-x
