Procedural versus human level generation: Two sides of the same coin?
Introduction
Many students perceive math as difficult, dislike it, and find the subject displeasing (Biswas et al., 2001). Digital Math Games (DMG) might be used to remedy this problem, improving students' math learning (McLaren et al., 2017) while increasing their positive attitudes toward the subject (Ke, 2008), and they can even reduce users' anxiety while increasing their engagement (Kiili and Ketamo, 2017). Additionally, students prefer computer-based practice over conventional paper-and-pencil exercises (Yurdabakan and Uzunkavak, 2012), which can also help address these problems. In this context, the use of this type of game is fundamental, as demonstrated by the significant attention that DMG have received from academia (de Carvalho et al., 2016; Cheng et al., 2015; Ibarra et al., 2016; Kiili et al., 2018). However, as with general-purpose games, their development is still a slow and costly task that commonly requires several designers, artists, and developers (Amato and Moscato, 2017; Hendrikx et al., 2013).
An alternative that might tackle these problems is Procedural Content Generation (PCG) (Carli et al., 2011; Hendrikx et al., 2013; Togelius et al., 2011). PCG has been shown to be a reliable tool that can provide diversified, automatically generated outputs controlled through generation parameters (Horn et al., 2014), and it has great potential for educational games (Hooshyar et al., 2018; Horn et al., 2016). In games, it has mainly been used to automate, aid creativity in, and speed up the creation of various types of content (Korn et al., 2017; Moghadam and Rafsanjani, 2017; Smith and Whitehead, 2010), such as vegetation, rivers, terrains, networks, scenarios, levels, and non-player characters' behavior, as well as to control a game's difficulty (Hendrikx et al., 2013). Furthermore, it is a powerful technique to tackle another problem faced in this context: technologies must provide positive experiences, otherwise players, especially children, are unlikely to interact with or accept them (Bauckhage et al., 2012; Sim and Horton, 2012). To tackle this problem, PCG might be used to constantly provide players with new, unseen content and thereby promote positive outcomes (Horn et al., 2014; Korn et al., 2017; Rodrigues et al., 2017; Togelius et al., 2011). This context demonstrates the value of the technique for enhancing game development, as well as how it can increase the amount of content available in a game without overburdening developers. However, the real impact of PCG on players has received little attention from the academic community (Korn et al., 2017).
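To make parameter-controlled generation concrete, the sketch below shows a minimal, hypothetical level generator for a math game. The `difficulty` parameter, the exercise format, and all numeric ranges are illustrative assumptions, not the generator used in this study; the point is only that one set of parameters can yield diverse yet controlled content.

```python
import random

def generate_level(difficulty, seed=None):
    """Generate one level as a list of (a, b, operator) exercises.

    `difficulty` is a hypothetical control parameter: it widens the
    operand range, unlocks multiplication, and lengthens the level.
    """
    rng = random.Random(seed)
    operators = ["+", "-"] if difficulty < 3 else ["+", "-", "*"]
    upper = 10 * difficulty  # operand range grows with difficulty
    level = []
    for _ in range(5 + difficulty):  # harder levels hold more exercises
        a = rng.randint(1, upper)
        b = rng.randint(1, upper)
        level.append((a, b, rng.choice(operators)))
    return level

# The same seed and parameters reproduce the same level; varying the
# seed yields new, unseen levels at the same controlled difficulty.
easy_level = generate_level(difficulty=1, seed=42)
```

Fixing the seed makes generated content reproducible for testing, while leaving it unset gives each player fresh content, which is the property the dynamic version relies on.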
Therefore, this work expands on the literature by investigating this gap. Within the DMG context, this research investigates the influence of procedurally generated levels on players' feelings through an A/B test. Using a game version containing human-authored content as the baseline, we demonstrate how procedural level generation influences players in a DMG. Thus, we contribute an empirical analysis of the effects that a computational intervention (PCG), which improves game development, has on users' perception of their interaction with a game, demonstrating how their experience is expressed in terms of psychological aspects. This research is therefore valuable to professionals who want to employ similar interventions, showcasing how they impact users' perceptions.
Hereafter, we refer to the game version using human-designed levels as the static version and the one using procedurally generated levels as the dynamic version. Additionally, in the scope of this work, we consider playing a single game level to be a gameplay, while playing a set of levels constitutes a game session. Hence, each level a player finishes (whether winning or losing) originates a gameplay. Considering this context and this research's goal, when comparing the experiences of players of procedurally generated versus human-designed levels, we assume the following:
Hypothesis 1 (H1): Players’ fun levels do not differ.
Hypothesis 2 (H2): Players’ willingness to play the game again - returnance - does not differ.
Hypothesis 3 (H3): Players’ curiosity levels do not differ.
Hypothesis 4 (H4): Players’ descriptions of their experiences do not differ.
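H1-H3 compare ordinal ratings (e.g., Likert-style fun scores) between the static and dynamic groups, and such two-sample A/B comparisons are commonly analyzed with nonparametric tests. The sketch below is a hypothetical, stdlib-only illustration of one such test, the Mann-Whitney U statistic with tie-corrected average ranks; it is not the paper's actual analysis code, and a production analysis would also compute a p-value.

```python
def average_ranks(values):
    """Return 1-based ranks of `values`, averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        # extend j over the run of tied values
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def mann_whitney_u(sample_a, sample_b):
    """U statistic for two independent samples (smaller of U1, U2)."""
    combined = list(sample_a) + list(sample_b)
    ranks = average_ranks(combined)
    rank_sum_a = sum(ranks[: len(sample_a)])
    u1 = rank_sum_a - len(sample_a) * (len(sample_a) + 1) / 2
    u2 = len(sample_a) * len(sample_b) - u1
    return min(u1, u2)

# Hypothetical 1-5 fun ratings for the two versions; a U near its
# maximum (n1*n2/2) is consistent with "no difference" nulls like H1.
static_fun = [4, 5, 3, 4, 5]
dynamic_fun = [4, 4, 3, 5, 4]
u = mann_whitney_u(static_fun, dynamic_fun)
```

In practice, the computed U would be compared against a critical value (or converted to a p-value) at the chosen significance level to decide whether each null hypothesis is retained.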
The remainder of this article presents background on PCG in Section 2.1, related work in Section 2.2, the research method in Section 3, analysis and results in Section 4, a discussion of its findings in Section 5, and our final considerations in Section 6.
Related work
First, this section provides a brief background on what PCG is, what constitutes content in the context of a game, a taxonomy of terms used to specify a PCG system, and the methods used to evaluate these systems. Then, it reviews studies that performed A/B comparisons of procedurally generated versus human-designed content in terms of players' perspectives, which is the main concern of this article.
Method
This section describes our method in terms of design, material, participants, measures, procedure, and data analysis.
Results
This section presents the results of the analysis of this study's hypotheses, beginning with the overall findings. Subsequently, further analyses are presented in terms of the participants' performance and the differences in responses within subsamples.
Discussion
This section discusses our findings in terms of whether they support our hypotheses, the rationales for the results achieved, and the limitations and issues that represent threats to the validity of our study. Overall, our findings support three of the four experimental predictions assumed in this research. These results are in line with previous research in this field, wherein PCG has been shown to provide experiences that are almost as good as human-designed levels (Butler,
Conclusions
The research presented in this article investigated players' interactions with a DMG. The goal was to identify whether a computational intervention that improves game development, creating game levels through PCG, could lead to player experiences (PX) as good as those led by human-designed levels. Our hypotheses were based on the assumption that the experience of players from one intervention would not differ from that of players from the other, considering four types of measures (i.e., fun,
Declarations of Competing Interest
None.
CRediT authorship contribution statement
Luiz Rodrigues: Conceptualization, Methodology, Software, Formal analysis, Investigation, Resources, Writing - original draft, Project administration. Robson Bonidia: Conceptualization, Methodology, Software, Validation, Writing - review & editing. Jacques Brancher: Conceptualization, Supervision, Writing - review & editing.
Acknowledgments
L. Rodrigues was supported by Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES) - Finance Code 001. R. Bonidia was supported by Federal University of Technology - Paraná (UTFPR - Grant: April/2018).
References (44)
A case study of computer gaming for math: engaged learning from gameplay? Comput. Educ. (2008)
Procedural content generation for game props? A study on the effects on user experience. Comput. Entertain. (2017)
Experience-driven procedural content generation. IEEE Trans. Affect. Comput. (2011)
Formal procedural content generation in games driven by social analyses. 2017 31st International Conference on Advanced Information Networking and Applications Workshops (WAINA) (2017)
How players lose interest in playing a game: an empirical study based on distributions of total playing times. 2012 IEEE Conference on Computational Intelligence and Games (CIG) (2012)
Extending intelligent learning environments with teachable agents to enhance learning. Artificial Intelligence in Education, J.D. Moore et al. (Eds.), IOS (2001)
Automatic game progression design through analysis of solution features. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (2015)
Interactive evolution for the procedural generation of tracks in a high-end racing game. Proceedings of the 13th Annual Conference on Genetic and Evolutionary Computation (2011)
A survey of procedural content generation techniques suitable to game development. SBGAMES 2011 (2011)
Digital games for math literacy: a systematic literature mapping on Brazilian publications
Math detective: digital game-based mathematical error detection, correction and explanation. 2015 IEEE 15th International Conference on Advanced Learning Technologies
Evaluating the impact of procedurally generated content on game immersion. Comput. Games J.
Linear levels through n-grams. Proceedings of the 18th International Academic MindTrek Conference: Media Business, Management, Content & Services
Procedural content generation for games: a survey. ACM Trans. Multimedia Comput. Commun. Appl.
A data-driven procedural-content-generation approach for educational games. J. Comput. Assist. Learn.
Design insights into the creation and evaluation of a computer science educational game. Proceedings of the 47th ACM Technical Symposium on Computing Science Education
A comparative evaluation of procedural level generators in the Mario AI framework. Foundations of Digital Games 2014
MathFraction: educational serious game for students' motivation for math learning. 2016 XI Latin American Conference on Learning Objects and Technology (LACLO)
Mixed-initiative design of game levels: integrating mission and space into level generation. FDG 2015
General video game level generation. 2016 GECCO
Evaluating cognitive and affective outcomes of a digital game-based math test. IEEE Transactions on Learning Technologies
Evaluating the effectiveness of a game-based rational number training - in-game metrics as learning indicators. Comput. Educ.
Cited by (3)
A Transfiguration Paradigm for Quest Design. Games and Culture (2023)
Are They Learning or Playing? Moderator Conditions of Gamification's Success in Programming Classrooms. ACM Transactions on Computing Education (2022)
Personalized gamification: A literature review of outcomes, experiments, and approaches. ACM International Conference Proceeding Series (2020)