
RFID-based tangible and touch tabletop for dual reality in crisis management context

  • Original Paper
  • Published:
Journal on Multimodal User Interfaces

Abstract

Robots are increasingly present in many domains of daily life, including industry, home automation, space exploration, and military operations. Robots can also be used in crisis management situations, where the intervention area is inaccessible or too dangerous for humans. The present work compares users’ performance on tangible and touch user interfaces for a tabletop crisis management application. The studied task consists of remotely controlling robots in a simulated disaster/intervention area using a tabletop equipped with a layer of RFID antennas, by displacing mini-robots on its surface to match the situation of the real robots on the ground. Dual reality enforces an accurate and up-to-date mapping between the real robots and the mini-robots on the tabletop surface. Our findings show that tangible interaction outperforms touch interaction in effectiveness, efficiency, and usability when remotely controlling one or two robots; only when the user manipulates a single robot is efficiency unchanged between tangible and touch interaction. Results also show that the tangible interaction technique does not significantly lower the users’ workload. Finally, we report post-experiment interview and questionnaire results assessing the participants’ overall satisfaction and agreement on using tangible objects on a tabletop.
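The dual-reality requirement described in the abstract, an accurate and up-to-date mapping between the real robots in the field and the mini-robots on the tabletop, amounts to continuously reconciling two pose states. As a rough illustration only (not the authors’ implementation; all names such as `GridPose` and `DualRealityMapper` are hypothetical), such a reconciliation can be sketched as:

```python
# Illustrative sketch only (not from the paper): the dual-reality constraint
# requires the tangible mini-robot on the tabletop and the real robot in the
# field to share one up-to-date pose. All names here are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class GridPose:
    """A pose discretized to the tabletop's RFID antenna grid."""
    x: int          # antenna-cell column
    y: int          # antenna-cell row
    heading: int    # heading in degrees

class DualRealityMapper:
    """Tracks one robot's pose in both realities and reports divergence."""

    def __init__(self) -> None:
        self.real: Optional[GridPose] = None      # last pose reported by the field robot
        self.tangible: Optional[GridPose] = None  # last pose read from the RFID layer

    def update_real(self, pose: GridPose) -> None:
        self.real = pose

    def update_tangible(self, pose: GridPose) -> None:
        self.tangible = pose

    def in_sync(self) -> bool:
        """Dual reality holds only when both sides agree at cell resolution."""
        return self.real is not None and self.real == self.tangible

mapper = DualRealityMapper()
mapper.update_real(GridPose(3, 5, 90))
mapper.update_tangible(GridPose(3, 5, 90))
print(mapper.in_sync())                 # True: tangible object matches the robot
mapper.update_real(GridPose(4, 5, 90))  # the field robot moved one cell
print(mapper.in_sync())                 # False: the mapping must be refreshed
```

Since an RFID antenna layer localizes tangible objects at antenna-cell granularity, the sketch compares poses at cell resolution rather than in continuous coordinates.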


Notes

  1. Still accessible on 09-16-2019.

  2. German Research Center for Artificial Intelligence. Website: www.dfki.de.

  3. Massachusetts Institute of Technology; www.mit.edu.

  4. The tabletop was designed by Rfidées company: www.rfidees.com.

References

  1. Boden M, Bryson J, Caldwell D, Dautenhahn K, Edwards L, Kember S, Newman P, Parry V, Pegman G, Rodden T et al (2017) Principles of robotics: regulating robots in the real world. Connect Sci 29(2):124–129

  2. Dunbabin M, Marques L (2012) Robots for environmental monitoring: Significant advancements and applications. IEEE Robot Autom Mag 19(1):24–39

  3. Manti M, Cacucciolo V, Cianchetti M (2016) Stiffening in soft robotics: a review of the state of the art. IEEE Robot Autom Mag 23(3):93–106

  4. Marconi L, Melchiorri C, Beetz M, Pangercic D, Siegwart R, Leutenegger S, Carloni R, Stramigioli S, Bruyninckx H, Doherty P et al (2012) The sherpa project: smart collaboration between humans and ground-aerial robots for improving rescuing activities in alpine environments. In: 2012 IEEE international symposium on safety, security, and rescue robotics (SSRR), pp. 1–4. IEEE

  5. Chen JYC, Terrence PI (2008) Effects of tactile cueing on concurrent performance of military and robotics tasks in a simulated multitasking environment. Ergonomics 51(8):1137–1152

  6. Habib L, Pacaux-Lemoine MP, Millot P (2016) Towards adaptability of levels of automation with Human-machine cooperation approach. In: 2016 IEEE international conference on systems, man, and cybernetics (SMC), pp 001081–001086, Budapest, Hungary. IEEE

  7. Habib L, Pacaux-Lemoine MP, Millot P (2017) Adaptation of the level of automation according to the type of cooperative partner. In: IEEE international conference on systems, man, and cybernetics (SMC), 2017, pp 864–869. IEEE

  8. Habib L, Pacaux-Lemoine MP, Millot P (2018) Human-robots team cooperation in crisis management mission. In: IEEE international conference on systems, man, and cybernetics. Miyazaki, Japan, pp 3209–3214

  9. Lifton J, Paradiso JA (2009) Dual reality: merging the real and virtual. In: International conference on facets of virtual environments, pp 12–28. Springer

  10. Lifton JH (2007) Dual reality: an emerging medium. PhD thesis, Massachusetts Institute of Technology

  11. Merrad W, Habib L, Heloir A, Kolski C, Krüger A (2019) Tangible tabletops and dual reality for crisis management: case study with mobile robots and dynamic tangible objects. In: ANT 2019, the 10th international conference on ambient systems, networks and technologies, Leuven, Belgium

  12. LindenLab. Second life (2003). https://secondlife.com/

  13. Rymaszewski M, Au WJ, Wallace M, Winters C, Ondrejka C, Batstone-Cunningham B (2007) Second life: the official guide. Wiley, Chichester

  14. Back M, Kimber D, Rieffel E, Dunnigan A, Liew B, Gattepally S, Foote J, Shingu J, Vaughan J (2010) The virtual chocolate factory: mixed reality industrial collaboration and control. In: Proceedings of the 18th ACM international conference on multimedia, ACM, pp 1505–1506

  15. Tcho Ventures, INC. Tcho Chocolate. https://tcho.com/. Accessed on 2019-05-02

  16. Lifton J, Feldmeier M, Ono Y, Lewis C, Paradiso JA (2007) A platform for ubiquitous sensor deployment in occupational and domestic environments. In: Proceedings of the 6th international conference on information processing in sensor networks, pp 119–127. ACM

  17. Lifton J, Mittal M, Lapinski M, Paradiso JA (2007) Tricorder: a mobile sensor network browser. In: Proceedings of the ACM CHI 2007 conference-mobile spatial interaction workshop

  18. Raber F, Krüger A, Kahl G (2015) The comparison of performance, efficiency, and task solution strategies in real, virtual and dual reality environments. In: INTERACT 2015. Springer

  19. Kahl G, Warwas S, Liedtke P, Spassova L, Brandherm B (2011) Management dashboard in a retail scenario. In: Workshop on location awareness in dual and mixed reality. International conference on intelligent user interfaces (IUI-11), pp 22–25

  20. Spassova L, Schöning J, Kahl G, Krüger A (2009) Innovative retail laboratory. In: Roots for the future of ambient intelligence. European conference on ambient intelligence (AmI-09), 3rd, November, pp 18–21. Citeseer

  21. Khan VJ, Nuijten KCM, Deslé N (2011) Pervasive application evaluation within virtual environments. In: 1st international conference on pervasive and embedded computing and communication system, pp 261–264

  22. Merrad W, Heloir A, Kolski C (2017) Reformulating Clancey’s generic tasks for bridging both sides of dual reality. In: 29ème conférence francophone sur l’Interaction Homme-Machine, pp 137–146. ACM

  23. Arfib D, Filatriau JJ, Kessous L (2009) Prototyping musical experiments for TangiSense, a tangible and traceable table. In: Proceedings of SMC 2009, 6th sound and music computing conference, Porto, Portugal, pp 247–252

  24. Kubicki S, Lepreux S, Kolski C (2011) Evaluation of an interactive table with tangible objects: application with children in a classroom. In: Proceedings of the 2nd workshop on child computer interaction “UI technologies and educational pedagogy”, at CHI

  25. Lebrun Y, Lepreux S, Haudegond S, Kolski C, Mandiau R (2014) Management of distributed RFID surfaces: a cooking assistant for ambient computing in kitchen. Procedia Comput Sci 32:21–28

  26. Rekik Y, Grisoni L, Roussel N (2013) Towards many gestures to one command: a user study for tabletops. In: Kotzé P, Marsden G, Lindgaard G, Wesson J, Winckler M (eds) Human-computer interaction – INTERACT 2013, pp 246–263, Springer, Berlin

  27. Rekik Y, Vatavu RD, Grisoni L (2014) Match-up & conquer: a two-step technique for recognizing unconstrained bimanual and multi-finger touch input. In: Proceedings of the 2014 international working conference on advanced visual interfaces, AVI ’14, pp 201–208, New York, NY, USA. ACM

  28. Rekik Y, Vatavu RD, Grisoni L (2014) Understanding users’ perceived difficulty of multi-touch gesture articulation. In: Proceedings of the 16th international conference on multimodal interaction, ICMI ’14, pp 232–239, New York, NY, USA. ACM

  29. Wobbrock JO, Morris MR, Wilson AD (2009) User-defined gestures for surface computing. In: Proceedings of the SIGCHI conference on human factors in computing systems, CHI ’09, pp 1083–1092, New York, NY, USA. ACM

  30. Goguey A, Nancel M, Casiez G, Vogel D (2016) The performance and preference of different fingers and chords for pointing, dragging, and object transformation. In: Proceedings of the 2016 CHI conference on human factors in computing systems, ACM, pp 4250–4261

  31. Cattan E, Rochet-Capellan A, Bérard F (2016) Effect of touch latency on elementary vs. bimanual composite tasks. In: Proceedings of the 2016 ACM on interactive surfaces and spaces–ISS ’16, Niagara Falls, Ontario, Canada, ACM Press, pp 103–108

  32. Anthony L, Vatavu RD, Wobbrock JO (2013) Understanding the consistency of users’ pen and finger stroke gesture articulation. In: Proceedings of graphics interface 2013, GI ’13, Toronto, Ont., Canada, 2013. Canadian Information Processing Society, pp 87–94

  33. Vatavu RD, Vogel D, Casiez G, Grisoni L (2011) Estimating the perceived difficulty of pen gestures. In: Campos P, Graham N, Jorge J, Nunes N, Palanque P, Winckler M (eds) Human-computer interaction - INTERACT 2011. Springer, Berlin, pp 89–106

  34. Rekik Y, Merrad W, Kolski C (2019) Understanding the attention demand of touch and tangible interaction on a composite task. In: Proceedings of the 21st international conference on multimodal interaction, ICMI ’19, New York, NY, USA. ACM

  35. Mott ME, Vatavu RD, Kane SK, Wobbrock JO (2016) Smart touch: improving touch accuracy for people with motor impairments with template matching. In: Proceedings of the 2016 CHI conference on human factors in computing systems, CHI ’16, pp 1934–1946, New York, NY, USA. ACM

  36. Ungurean OC, Vatavu RD, Leiva LA, Plamondon R (2018) Gesture input for users with motor impairments on touchscreens: empirical results based on the kinematic theory. In: Extended abstracts of the 2018 CHI conference on human factors in computing systems, CHI EA ’18, pp LBW537:1–LBW537:6, New York, NY, USA. ACM

  37. Tuddenham P, Kirk D, Izadi S (2010) Graspables revisited: multi-touch vs. tangible input for tabletop displays in acquisition and manipulation tasks. In: Proceedings of the SIGCHI conference on human factors in computing systems, pp 2223–2232. ACM

  38. Don L, Smith SP (2010) Applying bimanual interaction principles to text input on multi-touch surfaces and tabletops. In: ACM international conference on interactive tabletops and surfaces - ITS ’10, Saarbrücken, Germany. ACM Press, pp 253–254

  39. Terrenghi L, Kirk D, Sellen A, Izadi S (2007) Affordances for manipulation of physical versus digital media on interactive surfaces. In: Proceedings of the SIGCHI conference on Human factors in computing systems, pp 1157–1166. ACM

  40. Appert C, Zhai S (2009) Using strokes as command shortcuts: cognitive benefits and toolkit support. In: Proceedings of the SIGCHI conference on human factors in computing systems, CHI ’09, New York, NY, USA, 2009. ACM, pp 2289–2298

  41. Nacenta MA, Kamber Y, Qiang Y, Kristensson PO (2013) Memorability of pre-designed and user-defined gesture sets. In: Proceedings of the SIGCHI conference on human factors in computing systems, CHI ’13, pp 1099–1108, New York, NY, USA. ACM

  42. Nielsen M, Störring M, Moeslund TB, Granum E (2004) A procedure for developing intuitive and ergonomic gesture interfaces for HCI. In: Camurri A, Volpe G (eds) Gesture-based communication in human-computer interaction, pp 409–420, Springer, Berlin

  43. Morris MR, Wobbrock JO, Wilson AD (2010) Understanding users’ preferences for surface gestures. In: Proceedings of graphics interface 2010, GI ’10, pp 261–268, Toronto, ON, Canada, Canada. Canadian Information Processing Society

  44. Kato J, Sakamoto D, Inami M, Igarashi T (2009) Multi-touch interface for controlling multiple mobile robots. In: CHI’09 extended abstracts on human factors in computing systems, pp 3443–3448. ACM

  45. Hornecker E, Buur J (2006) Getting a grip on tangible interaction: a framework on physical space and social interaction. In: Proceedings of the SIGCHI conference on human factors in computing systems, CHI ’06, New York, NY, USA. ACM, pp 437–446

  46. Ishii H (2008) Tangible bits: beyond pixels. In: Proceedings of the 2nd international conference on Tangible and embedded interaction. ACM

  47. Ullmer B, Ishii H (2000) Emerging frameworks for tangible user interfaces. IBM Syst J 39(3–4):915–931

  48. Rubenstein M, Cornejo A, Nagpal R (2014) Programmable self-assembly in a thousand-robot swarm. Science 345(6198):795–799

  49. Guo C, Sharlin E (2008) Utilizing physical objects and metaphors for human robot interaction. In: Proceedings of the artificial intelligence and the simulation of behaviour convention (AISB ’08)

  50. Guo C, Young JE, Sharlin E (2009) Touch and toys: new techniques for interaction with a remote group of robots. In: Proceedings of the SIGCHI conference on human factors in computing systems, pp 491–500. ACM

  51. Krzywinski A, Mi H, Chen W, Sugimoto M (2009) Robotable: a tabletop framework for tangible interaction with robots in a mixed reality. In: Proceedings of the international conference on advances in computer enterntainment technology, ACM, pp 107–114

  52. Jordà S, Geiger G, Alonso M, Kaltenbrunner M (2007) The reacTable: exploring the synergy between live music performance and tabletop tangible interfaces. In: Proceedings of the 1st international conference on tangible and embedded interaction, pp 139–146. ACM

  53. Anastasiou D, Ras E (2017) A questionnaire-based case study on feedback by a tangible interface. In: Proceedings of the 2017 ACM workshop on intelligent interfaces for ubiquitous and smart learning, pp 39–42. ACM

  54. Kubicki S, Lepreux S, Kolski C (2012) RFID-driven situation awareness on TangiSense, a table interacting with tangible objects. Pers Ubiquit Comput 16(8):1079–1094

  55. Schneider B, Jermann P, Zufferey G, Dillenbourg P (2011) Benefits of a tangible interface for collaborative learning and interaction. IEEE Trans Learn Technol 4(3):222–232

  56. Tabard A, Hincapié-Ramos JD, Esbensen M, Bardram JE (2011) The elabbench: an interactive tabletop system for the biology laboratory. In: Proceedings of the ACM international conference on interactive tabletops and surfaces, pp 202–211. ACM

  57. Detken K, Martinez C, Schrader A (2009) The search wall: tangible information searching for children in public libraries. In: Proceedings of the 3rd international conference on tangible and embedded interaction, pp 289–296. ACM

  58. Buur J, Jensen MV, Djajadiningrat T (2004) Hands-only scenarios and video action walls: novel methods for tangible user interaction design. In: Proceedings of the 5th conference on designing interactive systems: processes, practices, methods, and techniques, ACM, pp 185–192

  59. Mandryk RL, Scott SD, Inkpen KM (2002) Display factors influencing co-located collaboration. In: Conference supplement to ACM CSCW, p 2

  60. Rogers Y, Lindley S (2004) Collaborating around large interactive displays: which way is best to meet? Interact Comput 16(6):1133–1152

  61. Scott SD, Grant KD, Mandryk RL (2003) System guidelines for co-located, collaborative work on a tabletop display. In: ECSCW 2003. Springer

  62. Bouabid A, Lepreux S, Kolski C (2018) Study on generic tangible objects used to collaborate remotely on RFID tabletops. J Multimodal User Interfaces 12(3):161–180

  63. Kubicki S, Lepreux S, Kolski C, Perrot C, Caelen J (2009) TangiSense: présentation d’une table interactive avec technologie RFID permettant la manipulation d’objets tangibles et traçables. In: Proceedings of the 21st international conference on association francophone d’Interaction Homme-Machine - IHM ’09, Grenoble, France. ACM Press

  64. Mediamatic. Symbolic table : 100% interface-free media player. https://www.mediamatic.net/en/page/15897/symbolic-table-100-interface-free-media-player. Accessed on 2019-08-21

  65. Wall J (2009) Demo i microsoft surface and the single view platform. In: 2009 international symposium on collaborative technologies and systems, pp xxxi–xxxii. IEEE

  66. Mazalek A, Reynolds M, Davenport G, Magerkurth C, Röcker C (2007) The tviews table for storytelling and gameplay. In: Concepts and technologies for pervasive games: a reader for pervasive gaming research, vol 1, pp 265–290. Shaker Verlag

  67. Baudisch P, Becker T, Rudeck F (2010) Lumino: tangible blocks for tabletop computers based on glass fiber bundles. In: Proceedings of the 28th international conference on human factors in computing systems–CHI ’10, pp 1165, Atlanta, Georgia, USA. ACM Press

  68. Weiss M, Schwarz F, Jakubowski S, Borchers J (2010) Madgets: actuating widgets on interactive tabletops. In: Proceedings of the 23rd annual ACM symposium on user interface software and technology, pp 293–302. ACM

  69. Hauert S, Reichmuth D (2006) Instant city, ein elektronischer musik bau spiel automat. http://www.hauert-reichmuth.ch/en/projekte/instant-city/, Accessed on 2019-08-21

  70. Kubicki S (2011) Contribution à la prise en considération du contexte dans la conception de tables interactives sous l’angle de l’IHM, application à des contextes impliquant table interactive RFID et objets tangibles. PhD thesis, Université de Valenciennes et du Hainaut-Cambresis, France

  71. Nakagaki K, Fitzgerald D, Ma ZJ, Vink L, Levine D, Ishii H (2019) inFORCE: bi-directional “force” shape display for haptic interaction. In: Proceedings of the thirteenth international conference on tangible, embedded, and embodied interaction, pp 615–623. ACM

  72. Nakagaki K, Vink L, Counts J, Windham D, Leithinger D, Follmer S, Ishii H (2016) Materiable: rendering dynamic material properties in response to direct physical touch with shape changing interfaces. In: Proceedings of the 2016 CHI conference on human factors in computing systems–CHI ’16, pp 2764–2772, Santa Clara, California, USA. ACM Press

  73. Colter A, Davivongsa P, Haddad DD, Moore H, Tice B, Ishii H (2016) SoundFORMS: manipulating sound through touch. In: Proceedings of the 2016 CHI conference extended abstracts on human factors in computing systems–CHI EA ’16, Santa Clara, California, USA, 2016. ACM Press, pp 2425–2430

  74. Levine D, Shortridge W, Retzepi K, Alkestrup J, Ishii H (2018) Conjure. http://tangible.media.mit.edu/project/conjure/

  75. Tang SK, Sekikawa Y, Leithinger D, Follmer S, Ishii H (2019) Tangible cityscape. http://tangible.media.mit.edu/project/tangible-cityscape/

  76. Leithinger D, Follmer S, Olwal A, Ishii H (2014) Physical telepresence: shape capture and display for embodied, computer-mediated remote collaboration. In: Proceedings of the 27th annual ACM symposium on user interface software and technology, pp 461–470. ACM

  77. Leithinger D, Follmer S, Olwal A, Ishii H (2015) Shape displays: spatial interaction with dynamic physical form. IEEE Comput Graphics Appl 35(5):5–11

  78. Umapathi U, Shin P, Nakagaki K, Chin S, Xu T, Gu J, Walker W, Leithinger D, Ishii H (2018) Programmable droplets for interaction. https://tangible.media.mit.edu/project/programmable-droplets-for-interaction/

  79. Umapathi U, Shin P, Nakagaki K, Leithinger D, Ishii H (2018) Programmable droplets for interaction. In: Extended abstracts of the 2018 CHI conference on human factors in computing systems - CHI ’18, pp 1–1, Montreal QC, Canada. ACM Press

  80. Fleck S, Baraudon C, Frey J, Lainé T, Hachet M (2018) Teegi, he’s so cute: example of pedagogical potential testing of an interactive tangible interface for children at school. In: Proceedings of the 29th conference on l’Interaction Homme–machine - IHM ’17, ACM Press, Poitiers, France, pp 1–12

  81. Frey J, Gervais R, Fleck S, Lotte F, Hachet M (2014) Teegi: tangible EEG interface. In: Proceedings of the 27th annual ACM symposium on user interface software and technology–UIST ’14, Honolulu, Hawaii, USA. ACM Press, pp 301–308

  82. Frey J, Gervais R, Lainé T, Duluc M, Germain H, Fleck S, Lotte F, Hachet M (2017) Scientific outreach with Teegi, a tangible EEG interface to talk about neurotechnologies. In: Proceedings of the 2017 CHI conference extended abstracts on human factors in computing systems–CHI EA ’17, Denver, ACM Press, pp 405–408

  83. Le Goc M, Kim LH, Parsaei A, Fekete JD, Dragicevic P, Follmer S (2016) Zooids: building blocks for swarm user interfaces. In: Proceedings of the 29th annual symposium on user interface software and technology, pp 97–109. ACM

  84. Melcer EF, Isbister K (2018) Bots & (Main) frames: exploring the impact of tangible blocks and collaborative play in an educational programming game. In: Proceedings of the 2018 CHI conference on human factors in computing systems–CHI ’18, pp 1–14, Montreal QC, Canada. ACM Press

  85. Nakagaki K, Follmer S, Dementyev A, Paradiso JA, Ishii H (2017) Designing line-based shape-changing interfaces. IEEE Pervasive Comput 16(4):36–46

  86. Nakagaki K, Dementyev A, Follmer S, Paradiso JA, Ishii H (2016) ChainFORM: a linear integrated modular hardware system for shape changing interfaces. In: Proceedings of the 29th annual symposium on user interface software and technology - UIST ’16, pp 87–96, Tokyo, Japan, ACM Press

  87. Villar N, Zhang H, Cletheroe D, Saul G, Holz C, Regan T, Salandin O, Sra M, Yeo HS, Field W (2018) Project Zanzibar: a portable and flexible tangible interaction platform. In: Proceedings of the 2018 CHI conference on human factors in computing systems–CHI ’18, pp 1–13, Montreal QC, Canada. ACM Press

  88. Fitzmaurice GW, Ishii H, Buxton WAS (1995) Bricks: laying the foundations for graspable user interfaces. In: Proceedings of the SIGCHI conference on Human factors in computing systems, ACM Press, pp 442–449

  89. Fitzmaurice GW, Buxton W (1997) An empirical evaluation of graspable user interfaces: towards specialized, space-multiplexed input. In: Proceedings of the ACM SIGCHI conference on human factors in computing systems, CHI ’97, ACM, New York, NY, USA, pp 43–50

  90. Almukadi W, Stephane AL (2015) Blackblocks: tangible interactive system for children to learn 3-letter words and basic math. In: Proceedings of ITS, pp 421–424

  91. Manches A, O’Malley C, Benford S (2009) Physical manipulation: evaluating the potential for tangible designs. In: Proceedings of the 3rd international conference on tangible and embedded interaction, pp 77–84. ACM

  92. Marshall P (2007) Do tangible interfaces enhance learning? In: Proceedings of the 1st international conference on Tangible and embedded interaction, pp 163–170. ACM

  93. Kubicki S, Wolff M, Lepreux S, Kolski C (2015) RFID interactive tabletop application with tangible objects: exploratory study to observe young children’s behaviors. Pers Ubiquit Comput 19(8):1259–1274

  94. Everitt KM, Klemmer SR, Lee R, Landay JA (2003) Two worlds apart: bridging the gap between physical and virtual media for distributed design collaboration. In: Proceedings of the SIGCHI conference on human factors in computing systems, ACM, pp 553–560

  95. Ullmer B, Ishii H (1997) The metadesk: models and prototypes for tangible user interfaces. In: Proceedings of symposium on user interface software and technology (UIST 97), ACM

  96. Price S, Rogers Y, Scaife M, Stanton D, Neale H (2003) Using tangibles to promote novel forms of playful learning. Interact Comput 15(2):169–185

  97. Pangaro G, Maynes-Aminzade D, Ishii H (2002) The actuated workbench: computer-controlled actuation in tabletop tangible interfaces. In: Proceedings of the 15th annual ACM symposium on User interface software and technology, pp 181–190. ACM

  98. Richter J, Thomas BH, Sugimoto M, Inami M (2007) Remote active tangible interactions. In: Proceedings of the 1st international conference on Tangible and embedded interaction, pp 39–42. ACM

  99. Riedenklau E, Hermann T, Ritter H (2012) An integrated multi-modal actuated tangible user interface for distributed collaborative planning. In: Proceedings of the sixth international conference on tangible, embedded and embodied interaction, pp 169–174. ACM

  100. Ducasse J, Macé M, Oriola B, Jouffrais C (2018) Botmap: non-visual panning and zooming with an actuated tabletop tangible interface. ACM Trans Comput Hum Interact 25(4):24:1–24:42

  101. McGookin D, Robertson E, Brewster S (2010) Clutching at straws: using tangible interaction to provide non-visual access to graphs. In: Proceedings of the SIGCHI conference on human factors in computing systems, pp 1715–1724. ACM

  102. Lucchi A, Jermann P, Zufferey G, Dillenbourg P (2010) An empirical evaluation of touch and tangible interfaces for tabletop displays. In: Proceedings of the fourth international conference on Tangible, embedded, and embodied interaction, pp 177–184. ACM

  103. North C, Dwyer T, Lee B, Fisher D, Isenberg P, Robertson G, Inkpen K (2009) Understanding multi-touch manipulation for surface computing. In: IFIP conference on human-computer interaction, pp 236–249. Springer

  104. Hong A, Lee DG, Bülthoff HH, Son HI (2017) Multimodal feedback for teleoperation of multiple mobile robots in an outdoor environment. J Multimodal User Interfaces 11(1):67–80

  105. Abich J, Barber DJ (2017) The impact of human-robot multimodal communication on mental workload, usability preference, and expectations of robot behavior. J Multimodal User Interfaces 11(2):211–225

  106. Biswas M, Romeo M, Cangelosi A, Jones RB (2020) Are older people any different from younger people in the way they want to interact with robots? Scenario-based survey. J Multimodal User Interfaces 14(1):61–72

  107. ITU-T (2014) The Tactile Internet. ITU-T Technology Watch Report

  108. Kubicki S, Lepreux S, Lebrun Y, Dos Santos P, Kolski C, Caelen J (2009) New human-computer interactions using tangible objects: Application on a digital tabletop with RFID technology. In: International conference on human-computer interaction, Springer, pp 446–455

  109. Nowacka D, Ladha K, Hammerla NY, Jackson D, Ladha C, Rukzio E, Olivier P (2013) Touchbugs: actuated tangibles on multi-touch tables. In: Proceedings of the SIGCHI conference on human factors in computing systems, pp 759–762

  110. Christensen D. 30 years after the Chernobyl meltdown, why is the Ukrainian government pushing nuclear energy? https://www.thenation.com/article/30-years-after-the-chernobyl-meltdown-why-is-the-ukrainian-government-pushing-nuclear-energy/. Accessed on 2019-08-04

  111. Pedneault S. Fire inside an abandoned convent in Massueville, Quebec, Canada. https://en.wikipedia.org/wiki/Structure_fire#/media/File:Fire_inside_an_abandoned_convent_in_Massueville,_Quebec,_Canada.jpg. Accessed on 2019-08-04

  112. CNN (2014) The battle of Aleppo in 20 photos. https://edition.cnn.com/2016/12/22/middleeast/syrian-regime-takes-full-control-of-aleppo/index.html. Accessed on 2019-09-19

  113. Marzat J, Piet-Lahanier H, Kahn A (2014) Cooperative guidance of Lego Mindstorms NXT mobile robots. In: 2014 11th international conference on informatics in control, automation and robotics (ICINCO), vol 2, pp 605–610. IEEE

  114. Hart SG (1986) NASA task load index (TLX), vol 1.0; paper and pencil package

  115. Brooke J (1996) SUS: a quick and dirty usability scale. In: Usability evaluation in industry. Taylor and Francis, London, pp 189–194

  116. ISO/IEC. ISO/IEC TR 9126-4:2004 – Software engineering – product quality – part 4: quality in use metrics. https://www.iso.org/standard/39752.html. Accessed on 2019-08-04

  117. Mifsud J. Usability metrics – a guide to quantify the usability of any system. https://usabilitygeek.com/usability-metrics-a-guide-to-quantify-system-usability/. Accessed on 2019-08-05

  118. Assila A, Ezzedine H, de Oliveira KM (2016) Standardized usability questionnaires: features and quality focus. Electron J Comput Sci Inf Technol (eJCIST) 6(1):15–31

  119. Hornecker E (2006) Physicality in tangible interaction: bodies and the world. In: First international workshop on physicality, p 21

  120. Bakker S, van den Hoven E, Eggen B (2010) Design for the periphery. In: EuroHaptics 2010, p 71

  121. Pacaux-Lemoine MP, Millot P (2016) Adaptive Level of Automation for risk management. IFAC-PapersOnLine 49(19):48–53


Acknowledgements

We thank the automation department of the Laboratory of Industrial and Human Automation control, Mechanical engineering and Computer Science (LAMIH), particularly Marie-Pierre Pacaux-Lemoine and Patrick Millot, for lending us the Lego robots (used in the SUCRé project [6, 121]). We also thank Lydia Habib for her help with the robots’ programming and communication. Finally, we warmly thank the anonymous reviewers for their numerous constructive remarks.

Author information

Corresponding author

Correspondence to Christophe Kolski.

Ethics declarations

Conflict of interest

On behalf of all authors, the corresponding author states that there is no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

A Appendix: mobile-robots study questionnaires

In addition to those presented in Sect. 3, this study used two further questionnaires: a pre-experiment questionnaire and a post-experiment questionnaire.

1.1 A.1 Pre-experiment questionnaire

  • Participant ID (given by experimenter): . . . . . . . . . . . .

  • Age: . . . . . . . . . . . .

  • Gender:

    ☐ Male        ☐ Female

  • Occupation and field: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

  • Dominant hand:

    ☐ Left        ☐ Right

Circle your answers for each of the following questions.

  • Have you ever used an interactive tabletop before?

    Very infrequently    1    2    3    4    5    very frequently

  • Have you ever used a tangible interactive tabletop before?

    Very infrequently    1    2    3    4    5    very frequently

  • How frequently do you use a touch-interface (smartphone, tablet ...)?

    Very infrequently    1    2    3    4    5    very frequently

  • Have you ever used a big sized interface (such as a tabletop)?

    Very infrequently    1    2    3    4    5    very frequently

  • Have you ever remotely controlled a robot?

    Very infrequently    1    2    3    4    5    very frequently

1.2 A.2 Post-experiment questionnaire

Questionnaire about the tangible objects and general tabletop usage feedback: for each of the following statements, circle the answer that best describes your reaction to the objects.

  • The “robot” tangible object is easy to manipulate.

    Strongly disagree    1    2    3    4    5    strongly agree

    Justification (optional):

  • The “robot” tangible object seems significant (meaningful) to you in relation to its role in the application.

    Strongly disagree    1    2    3    4    5    strongly agree

    Justification (optional):

  • I had full control over the “robot” tangible object while using it (not the real robot).

    Strongly disagree    1    2    3    4    5    strongly agree

    Justification (optional):

  • I had full control over the “graphical robot object” while using it (not the real robot).

    Strongly disagree    1    2    3    4    5    strongly agree

    Justification (optional):

  • The “take picture” tangible object is easy to manipulate.

    Strongly disagree    1    2    3    4    5    strongly agree

    Justification (optional):

  • The “take picture” tangible object seems significant (meaningful) to you in relation to its role in the application.

    Strongly disagree    1    2    3    4    5    strongly agree

    Justification (optional):

Comments and suggestions about the experiment (optional):

B Appendix: Tasks scenario and sequence

Fig. 20

Tasks and scenario progress. “T1” refers to task one, “T2” refers to task two, “S1” refers to system one (tangible version of the application) and “S2” refers to system two (touch version of the application)


About this article

Cite this article

Merrad, W., Héloir, A., Kolski, C. et al. RFID-based tangible and touch tabletop for dual reality in crisis management context. J Multimodal User Interfaces 16, 31–53 (2022). https://doi.org/10.1007/s12193-021-00370-2
