Article

AR Book-Finding Behavior of Users in Library Venue

Department of Multimedia and Animation Arts, National Taiwan University of Arts, New Taipei City 22058, Taiwan
*
Author to whom correspondence should be addressed.
Appl. Sci. 2020, 10(20), 7349; https://doi.org/10.3390/app10207349
Submission received: 4 September 2020 / Revised: 27 September 2020 / Accepted: 19 October 2020 / Published: 20 October 2020
(This article belongs to the Special Issue Extended Reality: From Theory to Applications)

Abstract

ARKit and ARCore, key technologies in recent augmented reality (AR) development, have allowed AR to become more integrated into our lives. However, how effective AR is in an auxiliary role in venue guidance, and how to collect the actual behaviors of users in physical venues, are worth exploring. This study used navAR, a spatial behavior analysis app that our research team developed, to collect the actual behaviors of participants in physical space via a smartphone, such as time, distance travelled, and trajectory, and compared their book-finding behaviors in a library venue under a text scenario and an AR scenario, without any additional sensors or cameras. The experiment results revealed that (1) AR targets made a significant difference in book search time, and the participants found some of the books significantly faster; (2) the participants presented no significant differences in distance travelled; (3) with an AR target, the book-finding trajectories of the participants were significantly more regular; and (4) the AR guidance system had good usability. The results of this study can facilitate the planning of indoor venue routes with AR, improve venue and exhibition tour experiences, and enable AR to be used for crowd flow diversion. Furthermore, this study provides a methodology for future analyses of user behavior in physical spaces.

1. Introduction

Research has shown that including augmented reality (AR) in exhibition content sparks greater interest in exhibition visitors and also creates virtual and real learning environments [1]. Furthermore, incorporating AR into guided tours results in better learning effectiveness than audio guides or text-only guides [2]. Research has also shown that scanning labels in museums is not very intuitive [3] and that visitors tend to neglect the physical environment [4] and may even leave the scenario [5,6] when AR is the focus of an exhibition experience. If AR on a mobile phone is to play an auxiliary role in venue tours, such as navigation and route planning, the usability, functionality, and presentation of the AR guidance in the venue must be considered to prevent it from losing its intended auxiliary function; for example, users may be distracted by inconsistent displays, complicated execution, and inappropriate functions and operating methods [7]. Thus, how to help AR play its role in the venue and enable it to really assist users is an issue worth studying.
To understand whether AR can achieve auxiliary effects in a venue, we chose a library as our experiment venue. We investigated whether users can find books more efficiently with the aid of AR than with text and assessed the usability of this approach. This experiment can serve as a reference for libraries aiming to incorporate AR book finding in the future, and the data analysis can also provide subsequent AR guide applications with a more reliable reference, furthering investigations into whether AR technology can help with route planning, gamified guidance, and diverting crowd flow. This study also demonstrates a methodology that uses AR technology to analyze spatial behavior, which can facilitate the analysis of the trajectories of various users in indoor venues and aid in venue traffic flow adjustments in the future.

2. Related Work

2.1. AR Navigation

Presently, there are many studies and applications that use AR for positioning and navigation. For example, visual tags pasted in various places can be scanned by smartphones to determine one's current location [8]; in libraries, they can help readers locate bookshelves [9], access book metadata [10], label misplaced books on the shelf [11,12], or provide library instruction for students [13]. To counteract the vulnerability of these visual tags to ambient light and shadows, a hybrid method combining Bluetooth positioning with visual tag scanning has been developed: Bluetooth positioning determines the general position, and visual tags are then scanned to refine it, which has significantly improved indoor navigation and positioning [14]. In recent years, researchers have successively proposed combining AR with WiFi positioning technology [15], or with the accelerometer, camera, and compass functions of smartphones [16], and have even built the HyMoTrack system [17,18] to develop indoor navigation systems.
To date, AR spatial positioning technology has made breakthroughs, mainly in Apple's ARKit and Google's Tango and ARCore. Apple's ARKit can be used with CoreLocation for outdoor GPS positioning and road navigation. Tango requires supplementary depth and motion-capture camera lenses; its core technologies include motion tracking, area learning, and depth perception, which jointly allow a 3D space to be scanned into a point cloud to facilitate spatial positioning. Google's ARCore [19] includes motion tracking, environmental understanding, and light estimation, which can detect a phone's direction of movement, identify surrounding features such as the ground and walls, and determine the position and brightness of virtual objects. Using this technology, a user can position virtual objects without frequently scanning visual tags. With ARKit and ARCore's current spatial positioning technology, applications can perform physical-space positioning on a regular smartphone, and even without additional positioning hardware they can achieve more accurate indoor positions than Beacon [20]. With the gradual development of the two technologies and the popularity of smartphones, there are already examples of using them to construct indoor navigation [21] and extended applications. For example, Diao and Shih developed the mobile AR indoor navigation system (MARINS), which uses a smartphone to guide users to exits in a dark environment with the path illuminated only by the phone camera's LED [22].

2.2. AR Effectiveness

Swan and Gabbard [23] found that, from a human–computer interaction (HCI) perspective, user-centric AR design studies accounted for only 14.3% of papers (38 of 266), while AR experiments with general users accounted for only 7.9% (21 of 266). Dünser et al. [24] also pointed out that only 10% of 161 surveyed studies conducted AR experiments with users. This lack of research on AR user experience has led to insufficient training in how to evaluate AR experiences, design experiments, choose and apply appropriate methods, and analyze experimental results. Therefore, finding effective experimental methods for analyzing AR user experience will help improve AR usability. At present, AR effectiveness is mainly analyzed using questionnaires, observation, semi-structured interviews, and video analysis [5,25]. The System Usability Scale (SUS), developed by John Brooke [26], is a widely used standardized questionnaire for determining subjective usability [27,28]. Ko, Chang, and Ji [29] developed usability principles for smartphone AR applications by analyzing existing research on heuristic evaluation methods; they then built an AR application prototype and conducted usability testing through observation of participants and questionnaires to verify the effectiveness of the principles. However, questionnaires and observation collect data from the user's perspective, whereas data on the actual behaviors of users in a venue, such as distance travelled and trajectory, usually require additional equipment to collect. Trajectory data can be collected by analyzing videos [30,31] or by placing external sensors in participants' pockets [32] or shoes [33] to determine walking direction, trajectory, and distance travelled. However, these approaches are limited when it comes to collecting large quantities of behavioral data.
Thus, to understand whether AR actually helps users in a space, we developed an AR space behavior analysis system navAR [34] to serve as a research tool without any additional sensors or cameras.

2.3. NavAR System

The visual-tag-based ARToolKit has been extended into many user interfaces, such as JARToolKit [35], MX Toolkit [36], and the Augmented Reality Interface Toolkit [37]. However, these toolkits were not developed for AR venue navigation interfaces. Therefore, this paper proposes a reliable system that complies with the user-interface requirements of AR navigation. NavAR is an ARKit-based system that supports four of the resulting AR application requirements: reliable real-time tracking, real-time 3D rendering of presentation techniques, handling of user interaction, and recording of user interaction [38]. The system offers three functions for visitors, content providers, and researchers (Figure 1): (1) target tour locations and information that content providers can customize (Figure 2a,b); (2) AR guide models for visitors (Figure 3a,b); and (3) 2D and 3D records that visualize the behavioral information of visitors in real space, including trajectory, distance travelled, and time (Figure 4a,b), and can be uploaded to the cloud for researchers to analyze. The purpose of this experimental design was to understand whether AR can assist users in finding targets in a space, based on time, distance travelled, and trajectory. The navAR system plays three roles in this study: (1) experiment designers can use it to set their own target points in a space; (2) for participants, it serves as the target point for book finding in a space; and (3) researchers can use it to collect the time, distance travelled, and trajectories of users in physical venues. We incorporated the Placenote SDK [39] into the system to provide an environment-scanning function that enhances the stability of indoor AR positioning and reduces offset.
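The behavioral metrics that navAR records (time, distance travelled, trajectory) can all be derived from a stream of timestamped positions. The following is a minimal illustrative sketch, not the actual navAR implementation, assuming the system samples 2D floor positions in metres:

```python
import math

def path_length(points):
    """Total distance travelled along a trajectory of (x, y) positions."""
    return sum(math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1))

def elapsed_time(samples):
    """Time spent, given (t, x, y) samples ordered by timestamp t."""
    return samples[-1][0] - samples[0][0] if samples else 0.0

# Hypothetical trajectory: a participant walks 3 m east, then 4 m north.
trajectory = [(0.0, 0.0), (3.0, 0.0), (3.0, 4.0)]
print(path_length(trajectory))  # 7.0
```

The trajectory itself is simply the ordered list of positions, which is what the 2D/3D visualizations in Figure 4 plot.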

3. Methodology

To understand whether the assistance of the AR guidance system can benefit users, we conducted this study using two designs: an experiment to record and analyze the book-finding behavior of users and a questionnaire to determine the usability of the AR guidance system as perceived by users.

3.1. Experiment

For the sake of convenience, we chose the fourth floor of the nearby National Taiwan University of Arts Library, measuring 16.5 m × 13.24 m, as the experiment venue. We adopted a between-subjects design with one participant at a time to avoid any unnecessary collisions. During the experiment, each participant was given an iPhone XR manufactured by Apple. The experiment was divided into two scenarios: a text scenario and an AR scenario. In the text scenario, the smartphone screen displayed only the title, author, and call number of a book, and the participant was required to locate the book based on the displayed text. In the AR scenario, the location of the book in the physical space was displayed directly along with the title, author, and call number of the book. The details are shown in Table 1 below.
We chose six random books in the library and referred to them as Books A, B, C, D, E, and F. Figure 5 displays their locations in the library. Because the experiment required a physical venue and participants familiar with operating a smartphone, we adopted purposive sampling, focusing on nearby students and faculty members at National Taiwan University of Arts over the age of 18. We recruited a total of 60 participants, who were randomly assigned to Group I or Group II. These two groups differed only in the order in which the books were searched for. The experiment was divided into two stages for each participant. In Group I, the participants first searched for Books A, B, and C in the text scenario and then searched for Books D, E, and F in the AR scenario. In Group II, the participants first searched for Books D, E, and F in the text scenario and then searched for Books A, B, and C in the AR scenario. We compared the behaviors of the Group I participants who searched for Books A, B, and C in the text scenario with those of the Group II participants who searched for the same books in the AR scenario; then, we compared the Group II participants who searched for Books D, E, and F in the text scenario with the Group I participants who searched for those books in the AR scenario. Finally, we analyzed the differences between the book-finding behaviors in the text and AR scenarios. Figure 6 presents the procedures of the experiment.
In the experiment, the participants first scanned a start image (Figure 7a) as instructed and then searched for the book based on the information provided on the smartphone screen (Figure 7b). Once they scanned the cover of the right book to indicate that they had successfully found it (Figure 7c), the screen showed the information for the next book. In both the text and AR scenarios, the smartphone recorded the trajectory, time, distance travelled, and direction of the participants as they moved through the physical space. At the end of the experiment, it automatically uploaded the data to the cloud for further analysis and comparison of the participants in the two scenarios. Afterwards, the participants were given a questionnaire regarding the usability of the AR guidance system and any suggestions they might have for future improvements.

3.2. User Experience Questionnaire

After the experiment, we conducted a user experience questionnaire survey using the System Usability Scale (SUS) to assess the usability of the AR guidance system in book finding. The SUS contains 10 question items, with Items 1, 3, 5, 7, and 9 being positively worded and Items 2, 4, 6, 8, and 10 being negatively worded. We also added Item 11, which was an open question asking participants for feedback. The first 10 items were measured on a five-point Likert scale including strong disagreement, disagreement, neutral, agreement, and strong agreement to assess whether the participants approved of the AR guidance system during the experiment and found it to be useful (Table 2).
In the SUS, 1–5 points were respectively given to the responses “strongly disagree,” “disagree,” “neutral,” “agree,” and “strongly agree.” A lower score indicated stronger disagreement, and a higher score indicated stronger agreement. The total SUS score is calculated by subtracting 1 point from the scores of the positively worded question items, subtracting the scores of the negatively worded question items from 5, adding all of the results together, and then multiplying the sum by 2.5. A full score is 100 points. The formula is as follows (1):
((Q1 − 1) + (5 − Q2) + (Q3 − 1) + (5 − Q4) + (Q5 − 1) + (5 − Q6) + (Q7 − 1) + (5 − Q8) + (Q9 − 1) + (5 − Q10)) × 2.5 = Total score
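The scoring formula above can be expressed as a short routine. This sketch assumes the responses are given as a list of ten integers from 1 (strongly disagree) to 5 (strongly agree), ordered Q1 through Q10:

```python
def sus_score(responses):
    """SUS total score: odd (positively worded) items contribute r - 1,
    even (negatively worded) items contribute 5 - r; the sum of all ten
    contributions is scaled by 2.5 for a maximum of 100 points."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses in the range 1-5")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

print(sus_score([3] * 10))  # 50.0: all-neutral responses score the midpoint
```

Scoring each participant this way and averaging yields the mean SUS score reported in Section 4.3.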
Finally, we calculated the SUS scores and divided them into five grades as done by Bangor et al. [40] to determine the usability of this guidance system. A SUS score of 70 indicates that the system is average and in the acceptable range, as shown in Figure 8.

4. Data Analysis

To understand how the book-finding behaviors of the participants differed across the six books in the two scenarios, we adopted an independent-samples t-test for analysis with the following hypotheses:
Hypothesis 1 (H1).
No significant differences exist between book-finding in the text scenario and that in the AR scenario.
Hypothesis 2 (H2).
Significant differences exist between book-finding in the text scenario and that in the AR scenario.
Through analysis of the time consumed by the participants, the distance travelled, the trajectories, and the usability of the app, we obtained the following results:

4.1. Time and Distance Travelled Analysis

Analysis of the actual behaviors of the participants in the venue (Table 3) indicated that the test statistic for Book A was t(36.096) = 2.234, p < 0.05. Thus, the null hypothesis (H1) was rejected. With an α = 0.05 level of significance, finding Book A presented significant differences with the aid of the AR guidance system. In addition, the average amount of time (103.9403 s) spent on finding Book A in the text scenario was more than that (71.4423 s) spent with the aid of the AR guidance system, thereby indicating that the AR guidance system made finding Book A faster.
The test statistic for Book C was t (41.849) = 3.569, p < 0.05. Thus, the null hypothesis (H1) was rejected. With an α = 0.05 level of significance, finding Book C presented significant differences with the aid of the AR guidance system. In addition, the average amount of time (98.7240 s) spent on finding Book C in the text scenario was also more than that (65.4917 s) spent with the aid of the AR guidance system, thereby indicating that the AR guidance system made finding Book C faster.
The p values for Books B, D, E, and F were greater than 0.05. Thus, the null hypothesis (H1) could not be rejected. With an α = 0.05 level of significance, finding these four books presented no significant differences in book search time with the aid of the AR guidance system.
In distance travelled, the p values for Books A, B, C, D, E, and F were all greater than 0.05. Thus, the null hypothesis (H1) could not be rejected. With an α = 0.05 level of significance, the text and AR scenarios showed no significant differences in distance travelled.
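The fractional degrees of freedom reported above (e.g., t(36.096)) indicate Welch's unequal-variance form of the independent-samples t-test. The following self-contained sketch computes that statistic, using hypothetical search times rather than the study's measurements:

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic and Welch-Satterthwaite degrees of
    freedom, appropriate for samples with unequal variances and sizes."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sa, sb = va / na, vb / nb                      # squared standard errors
    t = (ma - mb) / math.sqrt(sa + sb)
    df = (sa + sb) ** 2 / (sa ** 2 / (na - 1) + sb ** 2 / (nb - 1))
    return t, df

# Hypothetical search times (seconds): a text group vs. an AR group.
text_times = [110.0, 95.0, 120.0, 101.0, 98.0]
ar_times = [72.0, 68.0, 80.0, 65.0, 71.0]
t, df = welch_t(text_times, ar_times)  # t > 0: text searches took longer
```

In practice, `scipy.stats.ttest_ind(a, b, equal_var=False)` computes the same statistic along with the p value.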

4.2. Trajectory Analysis

Analysis of the trajectories automatically recorded by the smartphone during the experiment (Table 4), with different colors representing different trajectory samples, revealed that book finding in the text scenario resulted in more irregular trajectory patterns in both Group I and Group II, even causing significant deviations from the test area. This shows that with only the call number of a book, participants may have had no sense of which direction to go in the beginning. In contrast, book finding in the AR scenario resulted in significantly more regular and concentrated trajectories, showing that with an AR target, the locations of the books were more specific, giving participants a clearer target to aim for.

4.3. Usability Analysis

As described in Section 3.2, after the experiment, the 60 participants filled out the user experience questionnaire regarding book finding with the AR guidance system. We calculated each participant's score; the mean score was 74.75, a grade C and higher than 70 points. Thus, on the whole, the participants found the usability of book finding with the AR guidance system to be good.
Table 5 presents the mean scores of each question item; as can be seen, the 60 participants held a positive attitude toward book finding with the AR guidance system. However, in the last question of the questionnaire, 20 participants indicated that the positioning function of the AR guidance system was sometimes inaccurate. Two participants remarked that because the AR target only showed the direction of the book, it was difficult to ascertain whether the book was on the bookcase in front of them or on another one behind it. As a result, the book-finding process did not always go smoothly; this explains why Item 6, which concerns inconsistency in the system, received a lower score.

5. Discussion

Based on the analysis of our results, we make the following observations:
  • The AR guidance system can reduce the search time for users; however, current AR technologies still have positioning issues that may lead to misjudgment and increase the search time. Thus, with this positioning method, it is currently not recommended for use in venues where targets are densely distributed and the precision requirement is high.
  • In distance travelled, the results from the six books presented no significant differences between the two scenarios. We speculate that this is because the bookcases were spaced quite close to one another, and positioning inaccuracies increased the distance travelled. If a shorter distance travelled is a consideration, then navigation routes will be required to reduce unnecessary walking.
  • With the assistance of AR, users can clearly know where a book is, which enhances the directionality of their searches.
  • Users perceived good usability in book finding with the AR guidance system and held positive attitudes towards using the system to help with book finding in the future.
The limitations of this study were as follows. Using AR for positioning may result in errors ranging from 0.1 m to 0.5 m, which makes it difficult to precisely position books in a library. The participants expected the target to be accurate, which led to some misjudgment and caused participants to go to the wrong bookcase. Furthermore, books may not always be in the same spot on library bookshelves, so if AR is incorporated into library collection search functions, the target should be set on the shelf in coordination with the call number, instead of directly on the book. We suggest that, given current AR technology limitations, if this AR target function were used in venues where targets are less densely distributed, the slight positioning errors would be more acceptable to users.

6. Conclusions

With no additional hardware installed for spatial positioning, this study used only a smartphone and the navAR system that we developed to analyze the book-finding behavior of participants in a library. Of the six books in this study, the results for two indicated that with the assistance of the AR guidance system, participants could find books significantly faster. In distance travelled, the results presented no significant differences; because current AR positioning technology is still immature, participants aided only by the AR target may have misjudged the location of a book during their search, which increased both search time and distance travelled. However, the trajectory diagrams show that their trajectories were regular and concentrated, meaning that the AR target enhanced the directionality of their searches. As for the usability of the AR guidance system, the AR target had auxiliary effects, and based on the SUS standards, usability was good.
This study analyzed the effectiveness of AR as an auxiliary tool. We suggest that future studies assess and analyze navigation routes, analyze crowd flow diversion, and incorporate GPS positioning for trajectory analysis in outdoor venues. In terms of applications, AR target positioning can be used not only in libraries but also in museums and art galleries, and it can even be used to develop reality games. This study also demonstrated how to use AR trajectory analysis to perform analyses and assessments in indoor venues, to understand the behaviors of visitors in physical venues, and to improve user experience.

Author Contributions

Conceptualization, C.-I.L.; methodology, C.-I.L.; software, C.-I.L., F.-R.X. and Y.-W.H.; validation, C.-I.L., F.-R.X. and Y.-W.H.; formal analysis, C.-I.L.; investigation, F.-R.X. and Y.-W.H.; resources, C.-I.L.; data curation, C.-I.L.; writing—original draft preparation, C.-I.L.; writing—review and editing, C.-I.L.; visualization, Y.-W.H.; supervision, C.-I.L.; project administration, C.-I.L.; funding acquisition, C.-I.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministry of Science and Technology (MOST), grant number MOST 107-2410-H-144-006-MY2.

Acknowledgments

This study was supported by the Ministry of Science and Technology (MOST) and the National Taiwan University of Arts Library.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Damala, A.; Cubaud, P.; Bationo, A.; Houlier, P.; Marchal, I. Bridging the gap between the digital and the physical: Design and evaluation of a mobile augmented reality guide for the museum visit. In Proceedings of the 3rd International Conference on Digital Interactive Media in Entertainment and Arts, DIMEA ‘08, Athens, Greece, 10–12 September 2008; pp. 120–127.
  2. Chang, K.E.; Chang, C.T.; Hou, H.T.; Sung, Y.T.; Chao, H.L.; Lee, C.M. Development and behavioral pattern analysis of a mobile guide system with augmented reality for painting appreciation instruction in an art museum. Comput. Educ. 2014, 71, 185–197.
  3. Mata, F.; Claramunt, C.; Juarez, A. An experimental virtual museum based on augmented reality and navigation. In Proceedings of the 19th ACM SIGSPATIAL International Symposium on Advances in Geographic Information Systems, Chicago, IL, USA, 1–4 November 2011.
  4. Billinghurst, M.; Belcher, D.; Gupta, A.; Kiyokawa, K. Communication behaviors in colocated collaborative AR interfaces. Int. J. Hum. Comput. Interact. 2009, 16, 395–423.
  5. McCall, R.; Wetzel, R.; Löschner, J.; Braun, A.-K. Using presence to evaluate an augmented reality location aware game. Pers. Ubiquitous Comput. 2011, 15, 25–35.
  6. Chincholle, D.; Goldstein, M.; Nyberg, M.; Eriksson, M. Lost or found? A usability evaluation of a mobile navigation and location-based service. In Proceedings of the 4th International Symposium on Mobile Human–Computer Interaction, Pisa, Italy, 18–20 September 2002; pp. 211–224.
  7. Shneiderman, B. Designing the User Interface: Strategies for Effective Human–Computer Interaction, 5th ed.; Addison-Wesley: Boston, MA, USA, 2010.
  8. Bellot, S.A. Visual Tag Recognition for Indoor Positioning. Master’s Thesis, School of Electrical Engineering, Kungliga Tekniska Högskolan, Stockholm, Sweden, April 2011.
  9. Umlauf, E.J.; Piringer, H.; Reitmayr, G.; Schmalstieg, D. ARLib: The augmented library. In Proceedings of the First IEEE International Augmented Reality Toolkit Workshop, Darmstadt, Germany, 29 September 2002.
  10. Alex, A.G.A.; Jegatha, S.; Gnana, J.J.; Albert, R.S. SaaS framework for library augmented reality application. In Proceedings of the 2014 World Congress on Computing and Communication Technologies, Tamilnadu, India, 27 February–1 March 2014.
  11. Brinkman, B.; Brinkman, S. AR in the library: A pilot study of multi-target acquisition usability. In Proceedings of the 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Adelaide, Australia, 1–4 October 2013.
  12. Liu, D.Y. Combined with augmented reality navigation applications in the library. In Proceedings of the 2016 International Conference on Advanced Materials for Science and Engineering (ICAMSE), Tainan, Taiwan, 12–13 November 2016.
  13. Wang, Y.S.; Chen, C.M.; Hong, C.M.; Tsai, Y.N. Interactive augmented reality game for enhancing library instruction in elementary schools. In Proceedings of the 2013 IEEE 37th Annual Computer Software and Applications Conference Workshops, Kyoto, Japan, 22–26 July 2013.
  14. La Delfa, G.C.; Catania, V. Accurate indoor navigation using Smartphone, Bluetooth Low Energy and Visual Tags. In Proceedings of the MobileMed 2014, 2nd International Conference on Mobile and Information Technologies in Medicine, Prague, Czech Republic, 20–21 November 2014.
  15. Yan, X.; Liu, W.; Cui, X. Research and application of indoor guide based on mobile augmented reality. In Proceedings of the 2015 International Conference on Virtual Reality and Visualization, Xiamen, China, 17–18 October 2015.
  16. Cankaya, I.A.; Koyun, A.; Yigit, T.; Yuksel, A.S. Mobile indoor navigation system in iOS platform using augmented reality. In Proceedings of the 2015 9th International Conference on Application of Information and Communication Technologies (AICT), Rostov-on-Don, Russia, 14–16 October 2015.
  17. Gerstweiler, G.; Vonach, E.; Kaufmann, H. HyMoTrack: A Mobile AR Navigation System for Complex Indoor Environments. Sensors 2016, 16, 17.
  18. Gerstweiler, G. Guiding people in complex indoor environments using augmented reality. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces, Reutlingen, Germany, 18–22 March 2018.
  19. Google. ARCore Overview. 2017. Available online: https://developers.google.com/ar/discover/ (accessed on 10 February 2020).
  20. Makarov, A. How Augmented Reality-Based Indoor Navigation Works. 2019. Available online: https://mobidev.biz/blog/augmented-reality-indoor-navigation-app-developement-arkit (accessed on 10 February 2020).
  21. Hardy, J. Creating an ARCore Powered Indoor Navigation Application in Unity. 2019. Available online: https://blog.raccoons.be/arcore-powered-indoor-navigation-unity (accessed on 10 February 2020).
  22. Diao, P.H.; Shih, N.J. MARINS: A Mobile Smartphone AR System for Pathfinding in a Dark Environment. Sensors 2018, 18, 3442.
  23. Swan, J.E.; Gabbard, J.L. Survey of user-based experimentation in augmented reality. In Proceedings of the 1st International Conference on Virtual Reality, Las Vegas, NV, USA, 22–27 July 2005; pp. 1–9.
  24. Dünser, A.; Grasset, R.; Billinghurst, M. A survey of evaluation techniques used in augmented reality studies. In Proceedings of the ACM SIGGRAPH ASIA 2008 Courses, Singapore, 10–13 December 2008.
  25. Dias, M.; Jorge, J.; Carvalho, J.; Santos, P.; Luzio, J. Usability evaluation of tangible user interfaces for augmented reality. In Proceedings of the 2003 IEEE International Augmented Reality Toolkit Workshop, Tokyo, Japan, 7 October 2003.
  26. Brooke, J. SUS—A Quick and Dirty Usability Scale. In Usability Evaluation in Industry; Jordan, P.W., Thomas, B., Weerdmeester, B.A., McClelland, I.L., Eds.; Taylor & Francis: London, UK, 1996.
  27. Lewis, J.R. The System Usability Scale: Past, Present, and Future. Int. J. Hum. Comput. Interact. 2018, 34, 577–590.
  28. Quandt, M.; Beinke, T.; Freitag, M. User-centered evaluation of an augmented reality-based assistance system for maintenance. In Proceedings of the 53rd CIRP Conference on Manufacturing Systems, Chicago, IL, USA, 1–3 July 2020.
  29. Ko, S.M.; Chang, W.S.; Ji, Y.G. Usability Principles for Augmented Reality Applications in a Smartphone Environment. Int. J. Hum. Comput. Interact. 2013, 29, 501–515.
  30. Moore, D. A Real-World System for Human Motion Detection and Tracking; California Institute of Technology: Pasadena, CA, USA, 2003.
  31. Drab, S.; Artner, N. Motion Detection as Interaction Technique for Games & Applications on Mobile Devices. In Proceedings of the Interaction Devices (PERMID 2005) Workshop, Munich, Germany, 11 May 2005.
  32. Diaz, E.M. Inertial Pocket Navigation System: Unaided 3D Positioning. Sensors 2015, 15, 9156–9178.
  33. Saponas, T.S.; Lester, J.; Hartung, C.; Kohno, T. Devices That Tell on You: The Nike + iPod Sport Kit; University of Washington: Seattle, WA, USA, 2006.
  34. Lee, C.I.; Xiao, F.R.; Hsu, Y.W. Using augmented reality technology to construct a venue navigation and spatial behavior analysis system. In Proceedings of the 5th International Augmented and Virtual Reality Conference, Munich, Germany, 12–14 June 2019; pp. 161–170.
  35. Geiger, C.; Reimann, C.; Sticklein, J.; Paelke, V. JARToolKit—A java binding for ARToolKit. In Proceedings of the First IEEE International Augmented Reality Toolkit Workshop, Darmstadt, Germany, 29 September 2002.
  36. Dias, J.M.S.; Monteiro, L.; Santos, P.; Silvestre, R.; Bastos, R. Developing and authoring mixed reality with MX toolkit. In Proceedings of the 2003 IEEE International Augmented Reality Toolkit Workshop, Tokyo, Japan, 7 October 2003.
  37. Liarokapis, F.; White, M.; Lister, P. Augmented reality interface toolkit. In Proceedings of the Eighth International Conference on Information Visualisation (IV’04), London, UK, 14–16 July 2004.
  38. Paelke, V.; Stocklein, J.; Reimann, C.; Rosenbach, W. Supporting user interface evaluation of AR presentation and interaction techniques with ARToolkit. In Proceedings of the 2003 IEEE International Augmented Reality Toolkit Workshop, Tokyo, Japan, 7 October 2003.
  39. Placenote. What Is a Spatial App? 2019. Available online: https://docs.placenote.com/ (accessed on 10 February 2020).
  40. Bangor, A.; Kortum, P.; Miller, J. Determining what individual SUS scores mean: Adding an adjective rating scale. J. Usability Stud. 2009, 4, 114–123. [Google Scholar]
Figure 1. Service overview of the navAR augmented reality (AR) venue navigation system.
Figure 2. (a) The content provider sets target positioning; (b) the content provider customizes the navigational positioning target and tag information.
Figure 3. (a) The user enters the viewer navigational mode; (b) detailed exhibit information and audio-guided tours.
Figure 4. (a) 2D visualizations of the viewers' paths; (b) viewing a path on the smartphone screen.
Figure 5. Locations of the six books in the experiment venue.
Figure 6. Experiment procedure.
Figure 7. (a) Scanning the start image; (b) looking for books based on the information displayed on the screen; (c) scanning the book cover, indicating success in book finding.
Figure 8. System Usability Scale (SUS) grade scale.
Table 1. Two experiment scenarios.

| Scenario | Explanation |
| --- | --- |
| Text scenario | Only the name and call number of the book are displayed in text form. |
| AR scenario | The location of the book is marked using AR, in addition to the name and call number of the book. |
Table 2. Questionnaire survey. Items 1–10 are rated on a five-point scale (Strongly Disagree, Disagree, Neutral, Agree, Strongly Agree); item 11 is open-ended.

| Item | Content of Question Item |
| --- | --- |
| 1 | I think that I would like to use this AR guidance system frequently. |
| 2 | I found the AR guidance system unnecessarily complex. |
| 3 | I thought the AR guidance system was easy to use. |
| 4 | I think that I would need the support of a technical person to be able to use this AR guidance system. |
| 5 | I found the various functions in this AR guidance system were well integrated. |
| 6 | I thought there was too much inconsistency in this AR guidance system. |
| 7 | I would imagine that most people would learn to use this AR guidance system very quickly. |
| 8 | I found the AR guidance system very cumbersome to use. |
| 9 | I felt very confident using this app. |
| 10 | I needed to learn a lot of things before I could get going with this AR guidance system. |
| 11 | How was your overall experience with this AR guidance system? Do you have any suggestions? |
Table 3. Summary of t-test results for book finding in the two scenarios (n = 30 per scenario for each book).

Time (s)

| Book | Text: Mean (SD) | AR: Mean (SD) | t | df | p |
| --- | --- | --- | --- | --- | --- |
| A | 103.9403 (75.13599) | 71.4423 (26.48233) | 2.234 | 36.096 | 0.032 * |
| B | 83.5622 (50.23638) | 68.2506 (29.61240) | 1.438 | 58 | 0.156 |
| C | 98.7240 (8.38418) | 65.4917 (4.05256) | 3.569 | 41.849 | 0.001 *** |
| D | 61.4643 (25.84715) | 63.6035 (25.21416) | −0.324 | 58 | 0.747 |
| E | 95.6439 (57.31388) | 78.7332 (22.45845) | 1.505 | 37.701 | 0.141 |
| F | 62.1075 (21.56796) | 71.7537 (19.28487) | −1.826 | 58 | 0.073 |

Distance travelled (m)

| Book | Text: Mean (SD) | AR: Mean (SD) | t | df | p |
| --- | --- | --- | --- | --- | --- |
| A | 18.4692 (17.48780) | 14.5782 (6.30503) | 1.146 | 36.414 | 0.259 |
| B | 26.5776 (7.69989) | 25.1270 (10.24684) | 0.620 | 58 | 0.538 |
| C | 24.5508 (9.14227) | 22.3099 (9.31598) | 0.940 | 58 | 0.351 |
| D | 16.9706 (3.73355) | 18.2600 (8.38771) | −0.769 | 40.058 | 0.446 |
| E | 19.3012 (6.51152) | 21.6396 (8.12591) | −1.230 | 58 | 0.224 |
| F | 26.1385 (9.04486) | 28.4106 (9.42975) | −0.952 | 58 | 0.345 |

*: p ≤ 0.05, **: p ≤ 0.01, ***: p ≤ 0.001.
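The p values in Table 3 come from independent-samples t tests; the fractional degrees of freedom (e.g., 36.096 for Book A's time) indicate Welch's unequal-variance correction was applied where appropriate. As a sketch of how such a result can be reproduced from the published summary statistics alone — using `scipy.stats.ttest_ind_from_stats` as an illustration (the authors' statistics package is not stated), with Book A's search-time figures:

```python
# Reproducing the Book A search-time comparison in Table 3 from its
# summary statistics (mean, SD, n) with Welch's unequal-variance t-test.
# The numeric values are taken from the table; scipy is an assumed tool,
# not necessarily the one the authors used.
from scipy.stats import ttest_ind_from_stats

t_stat, p_value = ttest_ind_from_stats(
    mean1=103.9403, std1=75.13599, nobs1=30,  # Text scenario
    mean2=71.4423,  std2=26.48233, nobs2=30,  # AR scenario
    equal_var=False,                          # Welch's test -> df = 36.096
)
print(round(t_stat, 3), round(p_value, 3))  # ≈ 2.234 and ≈ 0.032, as in the table
```

The same call with `equal_var=True` yields the pooled-variance results (df = 58) reported for the other comparisons.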
Table 4. Trajectory analysis diagrams (N = 30 in each group), arranged as: Group I (text scenario) and Group II (AR scenario); Group II (text scenario) and Group I (AR scenario).
Table 5. Statistical results of the questionnaire (N = 60).

| Item | Content of Question Item | Mean |
| --- | --- | --- |
| 1 | I think that I would like to use this AR guidance system frequently. | 3.87 |
| 2 | I found the AR guidance system unnecessarily complex. | 2.08 |
| 3 | I thought the AR guidance system was easy to use. | 4.12 |
| 4 | I think that I would need the support of a technical person to be able to use this AR guidance system. | 2.27 |
| 5 | I found the various functions in this AR guidance system were well integrated. | 4.03 |
| 6 | I thought there was too much inconsistency in this AR guidance system. | 3.08 |
| 7 | I would imagine that most people would learn to use this AR guidance system very quickly. | 4.25 |
| 8 | I found the AR guidance system very cumbersome to use. | 1.73 |
| 9 | I felt very confident using this app. | 4.37 |
| 10 | I needed to learn a lot of things before I could get going with this AR guidance system. | 1.57 |
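Items 1–10 in Table 5 follow the standard System Usability Scale, so an overall score can be estimated from the published item means with Brooke's scoring rule (a sketch; because the scoring is linear, the score of the item means equals the mean of the per-respondent scores):

```python
# System Usability Scale scoring (Brooke, 1996): odd-numbered (positive)
# items contribute (response - 1), even-numbered (negative) items
# contribute (5 - response); the sum of the ten contributions is scaled
# by 2.5 onto a 0-100 range. Item means below are from Table 5 (N = 60).
item_means = [3.87, 2.08, 4.12, 2.27, 4.03, 3.08, 4.25, 1.73, 4.37, 1.57]

def sus_score(items):
    """items: ten Likert values (1-5), item 1 first; returns a 0-100 score."""
    total = sum((x - 1) if i % 2 == 1 else (5 - x)
                for i, x in enumerate(items, start=1))
    return total * 2.5

print(sus_score(item_means))  # ≈ 74.8 out of 100
```

A score of roughly 74.8 sits in the "good" band of the adjective rating scale of Bangor et al. shown in Figure 8, consistent with the paper's conclusion that the AR guidance system had good usability.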
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Lee, C.-I.; Xiao, F.-R.; Hsu, Y.-W. AR Book-Finding Behavior of Users in Library Venue. Appl. Sci. 2020, 10, 7349. https://doi.org/10.3390/app10207349
