UNIVERSIDAD DE COSTA RICA
SISTEMA DE ESTUDIOS DE POSGRADO

EMOTION DETECTION THROUGH FACIAL RECOGNITION IN ADAPTIVE VIDEO GAMES: USING AFFECT FOR CONTROLLING PROCEDURAL CONTENT GENERATION

(DETECCIÓN DE EMOCIONES A TRAVÉS DEL RECONOCIMIENTO FACIAL EN VIDEOJUEGOS ADAPTATIVOS: USO DEL AFECTO PARA CONTROLAR LA GENERACIÓN DE CONTENIDO PROCESAL)

Tesis sometida a la consideración de la Comisión del Programa de Estudios de Posgrado en Computación e Informática para optar al grado y título de Maestría Académica en Computación e Informática.

JONATHAN ESQUIVEL MONTOYA

Ciudad Universitaria Rodrigo Facio, Costa Rica
2024

TABLE OF CONTENTS

HOJA DE APROBACIÓN
TABLE OF CONTENTS
RESUMEN EN ESPAÑOL
ABSTRACT
LIST OF TABLES
LIST OF FIGURES
LIST OF ABBREVIATIONS
CHAPTER I. INTRODUCTION
  1.1 Problem Statement
  1.2 Justification
  1.3 Objectives
    1.3.1 General Objective
    1.3.2 Specific Objectives
  1.4 Scope and Delimitation
  1.5 Related Work
CHAPTER II. THEORETICAL FRAMEWORK
  2.1 The Player is Emotional
    2.1.1 Plutchik’s Wheel of Emotions
    2.1.2 The Psychology of Color
    2.1.3 Architectural Atmospheres
    2.1.4 Emotional Design in Video Games
    2.1.5 Emotional Music in Video Games
  2.2 Emotions Can be Predicted
    2.2.1 Facial Expression Recognition
      2.2.1.1 Landmarks
      2.2.1.2 Fisherfaces
      2.2.1.3 Eigenfaces
      2.2.1.4 Deepface
  2.3 The Video Game Adapts
    2.3.1 Adaptive Video Games
    2.3.2 Procedural Content Generation
  2.4 Improving the Experience
    2.4.1 Game Design
      2.4.1.1 Challenge
      2.4.1.2 Skill
      2.4.1.3 The Flow
    2.4.2 Affective Video Games
  2.5 Development Environment
CHAPTER III. METHODOLOGY
  3.1 Investigation and Analysis
  3.2 Image Perception Questionnaire
  3.3 Video Game Design
    3.3.1 Creating the Player and NPCs
    3.3.2 Modulating the Environment and Architecture
    3.3.3 Setting the Colors
    3.3.4 Emotional Music
  3.4 Emotion Detection
    3.4.1 Facial Recognition
    3.4.2 Parametrizing Emotions
  3.5 Video Game Adaptation
    3.5.1 Input: Player’s Emotions
    3.5.2 Output: Mechanics and Dynamics
    3.5.3 Output: Environment
  3.6 Affective Loop
    3.6.1 Emotional Feedback
    3.6.2 Reevaluating Parameters
  3.7 Evaluation Mechanism
  3.8 User Experience Questionnaire
CHAPTER IV. RESULTS
  4.1 Perception Results
    4.1.1 Anxiety
    4.1.2 The Flow
    4.1.3 Boredom
  4.2 User Experience Results
  4.3 Emotion Recognition Results
  4.4 Discussion
CHAPTER V. CONCLUSIONS
  5.1 Possible further work
GLOSSARY
BIBLIOGRAPHY
APPENDIX. QUESTIONNAIRES

RESUMEN EN ESPAÑOL

La investigación explora la integración del reconocimiento de emociones en tiempo real y la Generación de Contenido Procedural (PCG, en inglés) en videojuegos para mejorar la experiencia del jugador. El estudio se centra en crear un videojuego adaptativo que modifica dinámicamente su contenido, mecánicas y dificultad basándose en el estado emocional del jugador, detectado a través de sus expresiones faciales. El juego fue diseñado utilizando Unreal Engine 5, incorporando elementos de teoría del color y arquitectura emocional para crear una experiencia inmersiva y emocional. El estudio involucró a 22 participantes que jugaron dos versiones del juego: una con el sistema afectivo y otra sin él. Los resultados mostraron que el sistema afectivo impactó positivamente la autonomía, competencia e inmersión del jugador, destacando el potencial de la adaptación basada en emociones para crear experiencias de juego más personalizadas y atractivas.
El trabajo futuro podría involucrar ampliar el tamaño de la muestra, investigar los efectos a largo plazo, incorporar modalidades adicionales para el reconocimiento de emociones y explorar la aplicación de sistemas adaptativos en diferentes géneros de juegos.

ABSTRACT

The research explores the integration of real-time emotion recognition and Procedural Content Generation (PCG) in video games to enhance player experiences. The study focuses on creating an adaptive video game that dynamically modifies its content, mechanics, and difficulty based on the player's emotional state, detected through facial expressions. The game was designed using Unreal Engine 5, incorporating elements of color theory and emotional architecture to create an immersive and emotionally resonant experience. The study involved 22 participants who played two versions of the game: one with the adaptive system and one without. The results showed that the adaptive system positively impacted player autonomy, competence, and immersion, highlighting the potential of emotion-driven adaptation in creating more personalized and engaging gaming experiences. Future work could involve expanding the sample size, investigating long-term effects, incorporating additional modalities for emotion recognition, and exploring the application of adaptive systems in different game genres.

LIST OF TABLES

Table 1. Colors in Films
Table 2. Building Atmospheres
Table 3. Emotion Recognition Methods
Table 4. Predicted Emotion of Figure 10
Table 5. Adapting Mechanisms
Table 6. NPC States
Table 7. Emotional States
Table 8. Emotional Adaptation
Table 9. Emotional adaptation in game
Table 10. Flow Adaptation
Table 11. Anxiety Perception
Table 12. Flow Perception
Table 13. Boredom Perception
Table 14. UPEQ-PENS Results (With/Without)
Table 15. Anxiety, boredom and flow

LIST OF FIGURES

Figure 1. Responsive gaming experiences in Nevermind
Figure 2. Color-emotion theory in the game To the Skies!
Figure 3. PCG actors in Unreal Engine 5
Figure 4. Yannakakis and Togelius research on game adaptation
Figure 5. Affective Video Games
Figure 6. Plutchik’s Wheel of Emotions
Figure 7. Balanced Colors
Figure 8. The Fountain Chamber, Hades
Figure 9. Game Systems Emotional Loop
Figure 10. Landmarks
Figure 11. Linear Discriminant Analysis
Figure 12. Deepface analysis
Figure 13. Design your Wig, V&A
Figure 14. Character Creator, Spore
Figure 15. The Woman in the Red Dress, The Matrix
Figure 16. The Flow Channel
Figure 17. Affective Game Feedback Loop
Figure 18. First Character Tests
Figure 19. Finite State Machine
Figure 20. Base behavior tree for the enemy
Figure 21. PCG in Diablo
Figure 22. Base persistent level
Figure 23. Changes in persistent level
Figure 24. Brutalism in game design
Figure 25. A dark night with multiple colors in the game
Figure 26. Enemies color changes
Figure 27. Color Symbolism
Figure 28. Detecting Emotions
Figure 29. Adapting the Game
Figure 30. Increasing difficulty
Figure 31. Base/Anxiety/Flow/Boredom tiles
Figure 32. The Affective System
Figure 33. I experienced a lot of freedom in the game
Figure 34. The choices I made while playing influenced what happened
Figure 35. I was good at playing
Figure 36. My ability to play the game is well matched
Figure 37. The game felt lively and engaging with my actions
Figure 38. My mastery of the game improved with practice
Figure 39. UPEQ - PENS subscales
Figure 40. Emotions Distribution

LIST OF ABBREVIATIONS

[1] PCG: Procedural Content Generation
[2] ER: Emotion Recognition
[3] NPC: Non-Player Character
[4] UE5: Unreal Engine 5
[5] UPEQ: Ubisoft Perceived Experience Questionnaire
[6] PENS: Player Experience of Needs Satisfaction
[7] SDT: Self-Determination Theory
[8] LDA: Linear Discriminant Analysis
[9] CNN: Convolutional Neural Network
[10] ERT: Ensemble of Regression Trees
[11] SVM: Support Vector Machine
[12] EEG: Electroencephalography
[13] NIRS: Near-Infrared Spectroscopy
[14] MOBA: Multiplayer Online Battle Arena

CHAPTER I. INTRODUCTION

Video games are a rich area for investigation in several different fields, including artificial intelligence, but also extending to fields beyond Computer Science, like Psychology, Architecture and Graphic Design. Playing a game is an action that directly affects the emotions of the player. With the advance of artificial intelligence, the use of “Procedural Content Generation” (known as PCG by its initials) to enhance the player’s experience in video games has grown. There are different applications of PCG, including using the player’s emotions to build a unique profile of the person, to better fit each player’s preferences.

In 1980 the release of Rogue [1], created by Michael Toy, Glenn Wichman, and Ken Arnold for Unix systems as freely-distributed software, introduced a game that was never the same twice. The dungeons, monster placement and items throughout the game were set to different values by an algorithm on each playthrough. This set up the beginnings of a new video game genre, the roguelikes. Although Rogue did not really introduce the concept of procedural content in video games, it did lay an important stepping stone for the video game industry that would later help create games that can adapt to certain conditions.

Since the release of Rogue, adaptive video games have become increasingly common. Blizzard’s Diablo [2] uses PCG to create new dungeons for the player. This helps the development team during the creation process and gives the player many more places to explore and play. Maps and dungeons are not the only content built using procedural generation. Gearbox Software [3] has been working on PCG for their game Borderlands 3, an action first-person shooter video game published by 2K Games [4].
The first game offers six different parts for a weapon (stock, grip, barrel, body, magazine and scope), which can be further varied with different actions and settings for the weapon. This set of parameters leaves the game with around 17.75 million different weapons, a number that was increased with more parts, actions, perks and even elemental damage to over a billion gun variations in Borderlands 3. Approaches like this show how powerful and variable a game can be using PCG.

Game engines like the one developed by Hello Games [5] can even help create entire worlds with different flora and fauna based on geometry, textures and animation pieces. This allows game designers and even modders to add new pieces to each module and exponentially increase the variability in the game without spending too much time on the content creation process.

The constant use of algorithmic content generation can present problems in the distribution of space, the association of mechanics with the environment and other aspects of game design. This is clearest in games like No Man’s Sky [5], which was severely criticized for its design. The game was one of the most anticipated of the year (2016) and it promised a whole universe created by an algorithm, leaving you on one (hopefully unique) of the 18 quintillion planets created (including each planet’s flora and fauna). The problem was that the algorithm did not have enough elements to prevent similar worlds and did not deliver the promised experience, so the player needed to pass through several similar planets in order to find a completely different experience [6].

The adaptation of the game environment, difficulty and mechanics to certain unknown events is still a work in progress, used especially in research and in the game industry. This concept, often referred to as "dynamic difficulty adjustment" or "procedural content generation," aims to enhance player engagement and provide personalized experiences by tailoring the game to individual play styles and preferences [64, 65]. While still in its nascent stages, this approach holds significant promise for creating more immersive and adaptive game experiences, particularly in conjunction with real-time player emotion recognition. Games like Max Payne [7] change the game’s difficulty dynamically to provide assistance to players that need it. In games like Mario Kart [8], algorithms give out different items based on the player’s position and rank: players in higher ranks receive lower-tier items, while low-ranked players have a higher probability of getting a high-tier item.

The intrinsic connection between emotions and video games is evident in various aspects of the gaming experience, from the narrative arcs that evoke empathy and suspense to the carefully crafted soundtracks that amplify the emotional impact of key moments. Games like Unravel [9] leverage environmental storytelling and evocative music to create a deeply moving experience, while titles like The Last of Us Part II [10] utilize narrative and gameplay elements to elicit a wide range of emotions in players. Emotions, together with adaptive video games, can help set up a different scenario that enhances the player’s experience, instead of using static parameters to create new worlds.

1.1 Problem Statement

Adapting a video game using procedurally generated content does not necessarily mean a new and better experience per game session.
A problem for most major video game publishers working with procedurally generated content is the use of repetitive strategies and game dynamics that affect the overall player experience. As Johnson and While stated [11], “if a game does not generate positive emotions in the user it is unlikely to succeed”; generating positive emotions is crucial to a better player experience and, even more important to the game itself, to its success.

There are studies that show how emotions can be used to change different parameters in a game and how this affects the player’s emotional state. These studies propose models to be used in games like Super Mario (1983), Half-Life (1998), or even Pac-Man (1980) [12, 13], but they are mostly applied to renowned games not necessarily made to impact the player’s emotions on their own. Positive emotions and adaptations in a game can be further improved using affective systems, with the help of emotion recognition algorithms, to create an affective video game.

Affective video games are still a subject of active research [12, 13, 64, 65]; a smoother and more personal interaction between the player and the game is still a problem to be investigated. This investigation worked on video games that adapt to different conditions by first providing an overview of PCG approaches commonly used in practice and of emotion recognition methods, and then building a new playable experience that uses game design as its root and adapts to different configurations of parameters set by emotions as input. The idea is to build a video game that uses game design theory to adapt challenges, design and content with the help of an affective system, enhancing the player’s game experience using their emotions and reactions to in-game dynamics.

1.2 Justification

PCG tries to assist with massive content creation in games or to ease the game developer’s work, but sometimes at the cost of game design decisions that could be better for the general experience. Emotions can be used to improve the game mechanics and challenges, or even modify the world to adapt to the player’s abilities. By game design, difficult challenges are created to cause the emotion of anxiety in the player [14], but giving the player overly powerful tools or abilities may make the challenge too easy and cause boredom. On the other hand, surprise is an emotion that enhances or participates in other emotions and can be used to define a profile of what the player likes; this profile, in turn, can be used to modify the game’s environment or mechanics.

There are different methods to recognize and analyze emotions in real time, including new studies that use headsets, or that use just a part of the face to try and detect a person’s emotions. Other information can come from what the player is doing or how the character behaves in game, and this information can be used to analyze and determine an emotion.

Using this as the basis for our investigation, the question arises: Can recognized emotions be used to procedurally change the environment and challenge of a video game without sacrificing good game design practices, and create a better experience for the player?

This investigation will provide an overview of the current state of adaptive video games, and then create an implementation to analyze emotions in a new adaptive and affective video game. The aim of the investigation comes in the next section, where the objectives are described.
1.3 Objectives

After exploring this interdisciplinary and broad field of investigation, this section includes the general objective, followed by the description of the specific objectives of this work.

1.3.1 General Objective

The main objective of this investigation is to evaluate the effects on the player’s experience of adapting a video game using PCG, taking into account the player’s emotions and game design theory.

1.3.2 Specific Objectives

The following objectives have been set in order to fulfill the general objective:

1. To analyze existing theories of emotions and procedural content generation methods used in video games.
2. To design and develop a video game that uses PCG to alter its content.
3. To define an affective system for the video game that uses emotions as input to adapt the game parameters according to the player’s current mood.
4. To evaluate the player’s experience between game sessions using PCG with and without emotion detection through facial recognition.

1.4 Scope and Delimitation

This investigation focuses on the creation of a game using a known game engine. The game uses an emotion recognition algorithm feeding a PCG system, and evaluates the impact on the game’s overall experience. The emotion recognition hardware was limited to a single camera pointed at the player, together with a known library for recognizing emotions. The scope of this investigation does not cover the creation of emotion recognition algorithms in detail.

Video games use many optimized systems to enhance the overall performance of the game, and are known to consume most of a machine’s computational resources. Adding more layers to the game can directly impact the game experience; for this reason, both the PCG and emotion recognition algorithms were limited to the capabilities of the game engine.

The video game was created for people in the age range of 18 to 64 who have played and liked games similar to the one proposed. The evaluation process of emotion recognition can have a lot of noise if the person does not have any experience with games and/or does not like a similar game genre. With the delimitations and scope defined, in the next chapter we discuss the most important definitions and theories for the investigation.

1.5 Related Work

The exploration of adaptive video games and the utilization of player emotions to enhance gameplay experiences has been an ongoing area of research. Several studies have investigated the use of biofeedback and physiological signals to adapt game elements, aiming to create more personalized and engaging experiences. For instance, researchers have explored the use of heart rate, skin conductance, and facial expressions to dynamically adjust game difficulty, narrative, and even the behavior of non-player characters (NPCs) [66].

Figure 1. Responsive gaming experiences in Nevermind

Nevermind [78] is a game that uses biofeedback technology to adapt gameplay based on player physiology. By directly measuring physiological signals like heart rate, eye movement, and facial expressions, this work demonstrated the potential for creating truly personalized and responsive gaming experiences. Our investigation builds upon this foundation by focusing on facial expression analysis as a primary input for emotion recognition, allowing for a less intrusive and more accessible approach to player affect sensing.

Aulden Carter, K. Ludvig and S. Oliver [79] explored the use of PCG to create game levels that evoke specific emotions in players.
By leveraging color-emotion theory and the Circumplex Model of Affect, this work provided a framework for mapping emotional parameters to game content. We expand upon this concept by integrating real-time emotion recognition to dynamically adjust not only the level design but also game mechanics and difficulty, creating more responsive gameplay.

Figure 2. Color-emotion theory in the game To the Skies! [79]

The dissertation by Jeroen van Lankveld [80] provides a comprehensive overview of Dynamic Difficulty Adjustment (DDA) techniques and their impact on player experience. It explores various approaches to DDA, including methods based on player performance, physiological signals, and emotional state. Our study aligns with van Lankveld's work by focusing on emotion-driven DDA; however, while van Lankveld's research primarily focuses on adjusting difficulty parameters within a pre-designed game structure, we utilize PCG to dynamically alter the game's content and challenges, offering a more flexible and responsive approach to DDA.

This investigation further explores the concepts introduced above by focusing on the integration of real-time emotion recognition through facial expressions with procedural content generation (PCG). While previous studies have explored the use of emotions in adaptive games [66, 67, 68], this research aims to create a seamless and personalized experience by leveraging PCG to dynamically generate new game content based on the player's emotional state.

Figure 3. PCG actors in Unreal Engine 5

The research by Yannakakis and Togelius (2011) [46] proposed a framework for adapting game content based on player emotions, aiming to enhance engagement and immersion. Building on their study, we implement and evaluate a specific instance of this framework within a game prototype, and extend the scope of adaptation to include not only content generation but also dynamic difficulty adjustment and character behavior modification.

Figure 4. Yannakakis and Togelius research on game adaptation

The incorporation of Unreal Engine 5's advanced toolsets, such as Lumen, Nanite, and the PCG framework, presents exciting opportunities for further enhancing the adaptive capabilities of the game and the player experience.

Furthermore, this investigation emphasizes the importance of game design principles in the development of affective video games. By incorporating elements of challenge, skill, and flow, the aim is to create an adaptive system that not only responds to the player's emotions but also maintains a balanced and engaging gameplay experience. This focus on game design distinguishes this research from previous studies, which may have prioritized the technical aspects of emotion recognition and adaptation over the core principles of game design.

We combine and build upon these existing research areas, aiming to contribute to a deeper understanding of how real-time emotion recognition and PCG can be effectively integrated to create more engaging and personalized video game experiences.

CHAPTER II. THEORETICAL FRAMEWORK

This chapter explains terms crucial for the understanding of this investigation and is structured to fulfill the first objective described in section 1.3. As shown in Figure 5, there are different categories or disciplines important to describe what affective video games are.

Figure 5. Affective Video Games [11]

Video games are an increasingly important source of entertainment.
They exist on all types of platforms: Smart TVs, consoles, PCs, cellphones, and even watches offer the possibility of gaming on their screens. Video games are a multi-billion dollar industry that is on the rise [15]. Video games can change and adapt to situations or parameters. They, as a source of entertaining experiences, can evoke or use emotions via music, content, or even a story, as we explain in the next section.

2.1 The Player is Emotional

According to Plutchik [16], an emotion is a complex chain of loosely connected events that begins with a stimulus and includes feelings, psychological changes, impulses to action and specific, goal-directed behavior. This means that feelings are responses to significant situations: they do not happen in isolation, and they often motivate actions.

There are different models to analyze basic emotions and how they relate to each other. To evaluate and describe the emotions needed for this investigation, we used Robert Plutchik’s wheel of emotions [16], where he describes these relationships accurately. For this investigation we select and analyze inputs, using an emotion recognition algorithm, and create different outputs for the game using procedurally generated content. The main idea is to maintain the player’s interest in the game, so two important emotions to analyze were anxiety/annoyance and boredom, to keep the player in the flow channel. Another emotion important to evaluate was surprise.

Anxiety is by definition a feeling of worry, nervousness, or unease, typically about an imminent event or something with an uncertain outcome. Anxiety is one of the negative emotions in game design, and is mostly accompanied by a difficult challenge that the player is facing in the game.

Boredom is a feeling of weariness because one is unoccupied or lacks interest in one's current activity. This emotion might appear if the challenge of the game is too low for the player’s skill [14].

Surprise is an unexpected or astonishing event, fact, or thing. This is one of the most important emotions in games, since it improves the player experience and might enhance other emotions such as fear or awe.

To build empathy for the player we can use different psychological theories. For example, creating a setting where the character fights alone can create a feeling of insecurity: as Allport [17] describes, socialization generally leads to pronounced conflict-creating prejudices that make us feel uncomfortable, while integrating the individual in the common world can lead to a more secure feeling.

2.1.1 Plutchik’s Wheel of Emotions

This author proposes a three-dimensional circumplex model, as seen in Figure 6, that describes the relations among emotion concepts, representing the intensity and similarity among these emotions.

Figure 6. Plutchik’s Wheel of Emotions [16]

This model was considered in the project to assess the categories that were evaluated in the emotion recognition system. For example, an emotion like contempt is the intersection between disgust and anger, so the system can analyze both emotions within contempt to increase the probability of predicting the emotion. Awe is a mixture of fear and surprise; both emotions have a similar apex but mean something completely different in a game context, so emotions like these need to be analyzed separately.

Emotions can be evoked by objects, memories or actions. Several studies demonstrate how emotions can be evoked by colors [18], music or even architecture [19, 20].
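To illustrate the dyad analysis just described, the following minimal Python sketch combines the output probabilities of a hypothetical facial-expression classifier into Plutchik dyads. The probability values, the emotion names and the averaging rule are illustrative assumptions, not the system's final implementation.

# Hypothetical sketch: scoring Plutchik dyads from basic-emotion
# probabilities. Values and the averaging rule are illustrative only.
basic = {"anger": 0.30, "disgust": 0.25, "fear": 0.10,
         "happiness": 0.15, "sadness": 0.05, "surprise": 0.15}

# Dyads mentioned in this section: contempt = anger + disgust,
# awe = fear + surprise.
DYADS = {"contempt": ("anger", "disgust"), "awe": ("fear", "surprise")}

def dyad_scores(probs):
    """Score each dyad as the mean probability of its two primary emotions."""
    return {name: (probs[a] + probs[b]) / 2.0
            for name, (a, b) in DYADS.items()}

print(dyad_scores(basic))  # {'contempt': 0.275, 'awe': 0.125}

Combining primaries this way can raise the confidence of a prediction such as contempt, while a dyad like awe, whose components mean different things in a game context, would still be inspected separately.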
2.1.2 The Psychology of Color

Colors are everywhere; they are part of our everyday actions. It is known that colors can affect our emotions [18]. One color can evoke different emotions, and each emotion can be associated with different colors [21]. Colors consist of hue (the color itself), saturation (the color’s intensity) and value (the lightness or darkness of the color), each of which has a different effect on the evoked emotions.

As Risk states [22], colors can affect us emotionally, psychologically and even physically, and this is a reason why color themes are so important when telling a story. A story can be told in different fashions: images like comics, videos like short films or movies, through books or even blogs. Some examples of how colors can affect the story told:

● Elicit psychological reactions in the audience.
● Draw focus to significant details.
● Set the tone of a short film or movie.
● Represent character traits and more.
● Show changes or arcs in the story.

Colors can evoke emotions and set the tone for a movie or game, and these effects provoke predictably similar reactions. Table 1 shows which emotions or moods are typically evoked by seven different colors [22, 23]:

Table 1. Colors in Films

Red — Love, passion, violence, danger, anger, power.
Pink — Innocence, sweetness, femininity, playfulness, empathy, beauty.
Orange — Warmth, friendliness, happiness, the exotic, youth.
Yellow — Madness, sickness, insecurity, obsessiveness, the idyllic, the naive.
Green — Nature, immaturity, corruption, the ominous, darkness, danger.
Blue — Cold, isolation, the cerebral, melancholy, passivity, calm.
Purple — Fantasy, the ethereal, eroticism, the illusory, the mystical, the ominous.

There are different relations between color and emotions: we can classify emotions as positive or negative and define how “positive” or “negative” a color is based on the emotions it evokes. Kaya, Epps and Hall studied the correlation between colors and emotions [23]. We can set a color scheme to approach a mood with more than just one color, and communicate a theme to the player, or viewer, as described in Figure 7.

Figure 7. Balanced Colors [22]

Monochromatic and analogous schemes can create a deeply harmonious feeling, as achieved in The Matrix with a green monochromatic scheme [24]. Complementary schemes are used to create contrasting ideas, as in Mad Max [25], contrasting the desert with the sky and water. Triadic schemes are the least used, but can work in a more vibrant scene [22]. Colors can be used as an associative representation of characters, topics, ideas, transitions and other symbology, but should not be seen as a limitation on the creator [22]. Colors can create textures and materials, and can be combined in limitless forms; this sets up an important area in emotional architecture.

2.1.3 Architectural Atmospheres

Architecture can be described with a simple dictionary definition as the art of projecting and constructing buildings, but architecture is actually a complicated, multi-faceted subject. As Peter Zumthor states: “Architecture is always concrete matter. Architecture is not abstract, but concrete. A plan, a project drawn on paper is not architecture but merely a more or less inadequate representation of architecture, comparable to sheet music. Music needs to be performed. Architecture needs to be executed. Then its body can come into being. And this body is always sensuous” [20].

As stated before, games are an interdisciplinary subject, and architecture has an important role in the creation of video games.
Architecture and video game environment design are related, but they are not the same. Video games do not need to follow every rule of architecture, nor even apply real physics to the game’s architecture. Unused spaces, impossible designs and unreal intersections of areas are some of the possibilities of architecture in games. The creation of environments, buildings and other “man-made” structures in a game is mostly limited by creativity and game design. Even if video game architecture and environment design work differently, we can use certain techniques from real design as the basis of emotional architecture.

Peter Zumthor [19] describes architecture as a medium to perceive atmospheres through our emotional sensibility and describes nine principles, or guides, to keep in mind when building atmospheres in architecture.

As described in Table 2 (which collects the guides from Zumthor most applicable to our investigation), concepts and different theories of architecture can be used to set moods or atmospheres that surround the environment, creating a more emotional setup. The video game perspective uses Zumthor’s theories to help create a more suitable environment for a requested emotion.

Table 2. Building Atmospheres [19]

Material Compatibility — Materials react with one another and have their radiance, so that the material composition gives rise to something unique. In games: the combination of materials and colors gives rise to stimuli for different emotions.

The Temperature of a Space — Temperature in this sense is physical, but presumably psychological too. In games: using materials with different temperatures (like metal as a cold material) can help set a mood for a space.

Surrounding Objects — Things can come together in a very caring, loving way, with a deep relationship. In games: mixing objects with similar properties to evoke a certain emotion through their relation with the environment.

Between Composure and Seduction — How architecture involves movement to seduce, direct or lure people away. In games: seducing the player into traps, or setting the architecture to a more luring and stressful environment.

Tension Between Interior and Exterior — One can be inside or outside; the almost imperceptible transition between spaces. In games: relaxing the player by showing a reward outside their current position.

Levels of Intimacy — Proximity and distance, referring to aspects like size, dimension, scale, and the building’s mass in contrast to our own. In games: the feeling of intimidation from enormous thresholds, or of security from a smaller threshold filled with light.

The Light of Things — Put in the light as if you were hollowing out the darkness, as if the light were a new mass seeping in. In games: light can have a direct impact on how the environment feels; it can set the mood for the whole room or just a small area.

Other components such as natural lighting and ventilation can generate a healthier or happier experience; combined with natural systems and open spaces, stress can be reduced. Amid crowded spaces and busy environments, a designated quiet zone in between sceneries can boost a person’s confidence. Games like Hades [26] use a similar approach, setting an open space with relaxing fountains and calmed waters to reduce the intensity in between battles, as shown in Figure 8.

Figure 8. The Fountain Chamber, Hades [26]
Emotional architecture is completely subjective: people do not always respond with the same emotion to the combination and distribution of elements in the environment. This is where adaptive video games and emotional design make their appearance, to help adapt the game environment.

2.1.4 Emotional Design in Video Games

The design process in a video game starts when the idea of the game is conceived; it continues as the mechanics, dynamics, setting and character appearance are done, and it is finalized when the project finds its end. Norman defines good emotional design with three major components: visceral, how you respond to a stimulus; behavioral, whether something is actually useful and whether it influences the user’s behavior; and reflective, whether the user thinks of an object after it is gone [27].

Visceral components are easy to achieve, but reactions to these components are difficult to predict (positively or negatively) because these reactions are subjective. Visceral components can have a spontaneous response, but tend to fade away as time goes by or as the player becomes used to the situation. For example, in a horror game filled with jump scares, each subsequent scare will evoke less of a response from the player as they get accustomed to the “shock”. Game systems, on the other hand, are designed to affect the player over long periods of time. As shown in Figure 9, game systems can affect a player’s emotions, and that can affect the player’s behavior. Game systems can respond to these emotional and behavioral responses with different in-game design.

Figure 9. Game Systems Emotional Loop. Based on [28]

Take League of Legends killing sprees as an example [29]: when a player manages to eliminate an enemy, a message visible to every player in the match appears, telling who performed the action. For each consecutive elimination (in a short period of time) the same player achieves, a new message is generated, up to a 5-elimination spree (Pentakill), accumulating the momentum of the player in every new message. This creates feedback for the players and generates an emotion in them. These emotions can then be translated into behaviors in game, like acting more aggressively following the killing spree in search of more kills when the player normally wouldn’t.

Norman [27] describes physical objects, like the keyboard or controller for a game, as the delimiters of what we can actually do, but this is overshadowed by the rational decisions of what one should be doing based on those physical limitations. The emotional decisions of what we want to do are described at a higher level, even if they are not the correct choice. This theory describes why emotional design in video games is so important, and how affective video games can retain a player's emotions and interest in a game. Mixing color theory, emotional architecture and even music to evoke emotions in games can change the way players perceive their experience in each playthrough of the game.

2.1.5 Emotional Music in Video Games

Music is a complex acoustic and temporal structure rich in context and expressivity [30], and it plays an important part in our everyday lives. Open libraries with large collections of music have increased the potential for research in different areas, like video games or emotion recognition.
According to Juslin and Sloboda [31], when a person engages with music, a very broad range of mental processes are involved, including representational ones (the perception of rhythm or harmony, for example) and evaluative ones (including mood and emotion). As these big collections of music are usually labeled per emotion, choosing an appropriate emotional track for a game becomes an easier process. Different epic songs can help enhance the emotions in a final boss scene, while a relaxing track can help make exploring the game more enjoyable.

Simple sound effects can come with different emotions attached to them: the sound of a powerful weapon, the scream of the commentator narrating the killing spree, the sound of a chest opening, or the sound of coins falling into your character’s pockets all define positive patterns in games, and are followed by the evocation of a positive emotion. Frogger, released by Konami in 1981, was one of the first video games to use adaptive music, which would change abruptly depending on whether the player was safe at a checkpoint or facing danger while crossing a street with the frog.

So far we have defined concepts that help us evoke emotions indirectly by different methods, and, as explained earlier, video games are a source of entertainment directly related to emotions. There are several methods to predict emotions; in the next section we talk about how emotions can be predicted using a few important algorithms.

2.2 Emotions Can be Predicted

In our research we worked with algorithms capable of recognizing emotions via facial recognition. These algorithms were tested and compared using different datasets, kernels and parameters in order to determine the best performing algorithm for use in real time in a video game. There are different emotion recognition methods and systems used in affective video game research (as defined in Table 3). We used facial expression recognition for this investigation, as implemented in the Dlib [32], OpenFace [33] and OpenCV [34] libraries.

Table 3. Emotion Recognition Methods [36]

Vision Based — Facial Expressions: systems that exploit facial expression information to recognize the visual manifestation of a person’s emotion [35].
Vision Based — Body Actions: systems that recognize body actions/postures to predict an emotion, usually mixed together with facial recognition.
Brain Computer Interfaces — EEG: electroencephalography measures voltage fluctuations resulting from ionic current flows within the neurons of the brain.
Brain Computer Interfaces — NIRS: uses near-infrared spectroscopy to extract features (measures brain activity).
Haptics — Devices that exploit the sense of touch by applying forces, vibrations, or motions to the user.
Physiological Measurements — Sensors: sensors that measure activity in the brain, heart or respiratory system. Skin or muscles can be used to extract physiological cues.
Physiological Measurements — Mocap Systems: markers are strategically placed on the joints to measure the movement of the body.
Physiological Measurements — Wearables: specialized devices incorporating computers and advanced electronic technologies.

Facial recognition opens different research areas; one of these is emotion recognition. Intelligent or affective systems are becoming more common nowadays, and better system interaction can be achieved with emotion recognition. With the advance of artificial intelligence, the creation of systems that can detect emotions or behave like a human has increased.
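As a concrete illustration of how such a vision-based method can run alongside a game, the following minimal Python sketch samples the player's single webcam and prints an emotion label using the OpenCV and DeepFace libraries named above. This is a sketch under assumptions — the sampling rate, error handling and the bridge to the game engine are omitted, and DeepFace's return format varies across library versions — not the final integration used in this thesis.

import time
import cv2
from deepface import DeepFace

cap = cv2.VideoCapture(0)  # the single player-facing camera
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # enforce_detection=False keeps the loop alive when no face is visible.
        result = DeepFace.analyze(frame, actions=("emotion",),
                                  enforce_detection=False)
        if isinstance(result, list):  # newer DeepFace versions return a list
            result = result[0]
        print(result["dominant_emotion"])  # e.g. "happy", "angry", "neutral"
        time.sleep(0.5)  # sample roughly twice per second
finally:
    cap.release()

Sampling a couple of times per second, rather than on every frame, is one way to keep the recognition cost from competing with the game's own rendering budget.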
2.2.1 Facial Expression Recognition

Facial expression recognition uses feature-based approaches for facial observation to classify different facial expressions, by using facial feature trackers and analyzing a small number of features such as the jaw line, eyes, eyebrows, mouth, nose and facial distances [35]. Facial expressions are the most expressive way in which humans display emotion; they are responsible for 55% of the effect of a speaker’s message [37]. Considering this, there is no doubt that facial emotion recognition (the process of identifying human emotions based on facial expressions) plays an important role in the communication dynamic.

Despite being an effortless ability for humans, it is not an easy task for a machine. Multiple algorithms and techniques have been proposed to accomplish this objective with different levels of granularity and precision, to improve human-machine interaction in applications such as psychological analysis, medical diagnosis, forensics (lie detection) and studying the effectiveness of advertisements, for example [35].

2.2.1.1 Landmarks

The center of the eyes, the tip of the nose, the corners of the mouth and other salient facial points are called facial landmarks [38]. These landmarks can be used to detect facial expressions, which is usually done by three subsystems: facial landmark tracking, building features from the landmark tracking result, and classification of the extracted features [39].

For example, we can have an image or video as input that can be normalized to always have the same size, and optimized for the detection of landmarks by setting it to grayscale and increasing the contrast of the image. Depending on the algorithm, a set of n points (68 different points in the example of Figure 10) around the most distinguished facial features is calculated. Each point is a vector in a coordinate system, which can be further processed to include more features, like a mean point as the center of gravity and the distance of each landmark to this center of gravity.

Figure 10. Landmarks

These features can be used to categorize the image by a classifier trained with different images per emotion. The classifier then predicts the percentage per emotion with the landmarks given. Table 4 has the results for the example in Figure 10, classified as happiness:

Table 4. Predicted Emotion of Figure 10

Anger — 1.53%
Contempt — 12.2%
Happiness — 75.4%
Sadness — 0.31%
Surprise — 10.56%

The facial landmark tracking can be done using multiple algorithms. One of them is the Ensemble of Regression Trees (ERT) [40], used by the Dlib library. This method uses pixel intensity differences to directly estimate each landmark’s position. These estimated positions are subsequently refined with an iterative process using a cascade of regressors. The identified landmarks are the inputs used to build the features, which can consider geometrical (coordinates, distances, etc.) and texture information [39]. After this, the features can be classified by a supervised machine learning technique, such as Support Vector Machines (SVMs) [41].
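As a hedged sketch of the landmark pipeline just described — grayscale input, 68 detected points, then simple geometric features built around the center of gravity — the following Python fragment uses the Dlib library mentioned above. The model file name refers to Dlib's publicly distributed 68-point predictor; the exact feature layout is an illustrative assumption rather than the feature set used in this project.

import math
import cv2
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def landmark_features(image_path):
    """Distances from each of the 68 landmarks to their center of gravity."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    faces = detector(gray)
    if not faces:
        return []  # no face detected in this frame
    shape = predictor(gray, faces[0])
    points = [(shape.part(i).x, shape.part(i).y) for i in range(68)]
    cx = sum(x for x, _ in points) / len(points)  # center of gravity
    cy = sum(y for _, y in points) / len(points)
    return [math.hypot(x - cx, y - cy) for x, y in points]

# Feature vectors like these could then be fed to a classifier such as an SVM.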
2.2.1.2 Fisherfaces

Belhumeur et al. [42] propose a face recognition approach based on Fisher’s Linear Discriminant (FLD) [43], which maximizes the ratio of between-class scatter to within-class scatter. It consists of finding a linear projection of the faces from the high-dimensional image space to a significantly lower-dimensional feature space that is insensitive both to variation in lighting direction and to facial expression, in such a way that the projection directions are nearly orthogonal to the within-class scatter, maintaining discriminability. Figure 11 shows how LDA separates different emotions based on image similarities. Eigenfaces is a simpler algorithm that can be used to separate and classify emotions in a similar fashion.

Figure 11. Linear Discriminant Analysis (LDA) [42]

2.2.1.3 Eigenfaces

According to Belhumeur et al. [42], the main difference is how the optimal projection is calculated by this algorithm. The Eigenface projection maximizes the total scatter, which includes both the between-class scatter and the within-class scatter; the latter, however, represents unwanted information for classification purposes. As described by Etemad and Chellappa [44], in these methods the set of all face images is considered a vector space, and the eigenfaces are simply the dominant principal components of this face space; they are computed as the eigenvectors of the covariance matrix of the data, with the eigenvalues as the corresponding scale factors. The name eigenfaces comes from the fact that the eigenvectors have the same dimension as the original images.

2.2.1.4 Deepface

DeepFace's architecture is composed of several layers of convolutional neural networks, which are designed to process and analyze image data. The process begins with a face alignment step, where the input image is normalized, ensuring that the face is centered and scaled appropriately. This alignment significantly enhances the model's ability to recognize faces under varying poses and lighting conditions [59].

Figure 12. Deepface analysis [60]

Once the image is aligned, it is passed through multiple convolutional layers. These layers automatically extract hierarchical features from the image, starting with simple patterns like edges and progressing to more complex structures such as facial features. The final layer of the network produces a high-dimensional vector representation (embedding) of the face. This serves as a unique identifier for the face, capturing its essential characteristics in a form that can be easily compared to other embeddings [59], as seen in Figure 12.

Landmarks, fisherfaces and eigenfaces are just three examples of emotion recognition based on facial recognition, among many others. For this project, we were looking for an algorithm with appropriate robustness that could also reasonably be integrated into the desired environment and used in real time during gameplay.

A video game can evoke emotions, and these emotions can be predicted by algorithms like the ones explained in this section. To take it a step further, the video game can use algorithms to adapt to different situations, as we explain in the next section.

2.3 The Video Game Adapts

Video games have been evolving over time, and game adaptation is a relatively new field of investigation being applied in video games. Adaptive gameplay has advantages compared to non-adaptive gameplay as far as matching essential cognitive features of the player [45]. This shows how important adaptation can be in designing and developing a video game that enhances the player experience.
2.3.1 Adaptive Video Games

Adaptation in games can come in different scenarios. Bontchev explains that there are three main mechanisms for game adaptation and describes their adaptation methods in Table 5. Video games can be adapted to certain conditions using methods as previously defined, procedurally generating new content or changing gameplay mechanics and dynamics to improve the overall game feel. We explain what PCG is, and how it can be applied to video games, in the next section.

2.3.2 Procedural Content Generation

As explained in the paper “Experience-Driven Procedural Content Generation”, PCG refers to the creation of content automatically, i.e. through algorithmic means [46]. PCG can create several types of content: stories for a dungeon adventure [47], art content for a museum (as shown in Figure 13) [48], or even environments, rigs and animations for a video game [2, 5].

Table 5. Adapting Mechanisms [45]

Tasks — Explicit Tasks: objectives, goals and missions posed to the player as part of the gameplay.
Tasks — Implicit Tasks: tasks not explicitly stated by the game interface but expected to be fulfilled, such as “staying alive”.
Tasks — Player-Driven Tasks: tasks created by the player through her/his creativity within the limitations of the given mechanics (emergent gameplay).
Difficulty — Level Generation: uses methods to algorithmically create in-game content.
Difficulty — Artificial Intelligence: dynamically adjusts the difficulty of an intelligent non-player character, based on the current state of the player’s abilities.
Difficulty — Level Content: dynamically adjusts the inventory the player interacts with for a specific game context, according to the player’s skill level.
Audio-Visual — Game Properties: dynamically adapts the game’s audio or visuals, like ambient lighting or post-processing.

PCG in video games can be defined as the algorithmic creation of game content with limited or indirect user input [49]. One of the most important reasons for using PCG is to alleviate the need for many artists and designers on the game and to help with the project's resources. PCG is not only about creating an environment; procedural content can span different subjects, for example soundtracks, assets, level generation, gameplay mechanics or difficulty adaptation. We can further delimit the definition by looking into what can be considered a PCG algorithm, as defined by Shaker, Togelius and Nelson [50]:

Figure 13. Design your Wig, V&A [48]

● A software tool that creates dungeons for an action adventure game, without any human input in-game.
● A system that creates new weapons in response to what the collective of players do.
● A program that generates complete, playable and balanced board games on its own.
● Game engine middleware that rapidly populates a game world with vegetation.
● A graphical design tool that lets a user design maps for a strategy game, while continuously evaluating the designed map for its gameplay properties and suggesting improvements to the map to make it better balanced and more interesting.

And a few things not considered PCG by the same authors [50]:

● A map editor for a strategy game that lets the user place and remove items, without taking any initiative or doing any generation on its own.
● An NPC (non-player character, including neutral characters and enemies) for a board game.
● A game engine capable of integrating automatically generated vegetation.
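To make the delimitation concrete, the following minimal Python sketch falls on the PCG side of the line drawn above: it creates room content algorithmically with only indirect user input (a seed). The tile names and weights are illustrative assumptions, not content from the game developed here.

# An illustrative, minimal generator: algorithmic content creation with
# only indirect user input (the seed). Tile pool and weights are assumed.
import random

TILES = ["floor", "floor", "floor", "wall", "trap", "treasure"]  # weighted pool

def generate_room(width, height, seed):
    """Fill a width x height room with tiles; the same seed reproduces it."""
    rng = random.Random(seed)
    return [[rng.choice(TILES) for _ in range(width)] for _ in range(height)]

for row in generate_room(8, 4, seed=42):
    print(" ".join(f"{tile:8}" for tile in row))

Seeding the generator is what makes such content reproducible: the designer ships an algorithm rather than hand-placed data, yet any given playthrough can be recreated exactly.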
Two good examples within the stated delimitation are Rogue (1980), which, as explained earlier, randomly generated levels every time a new game started, and Spore (2008), a life-simulator game with real-time strategy where you can create your own character based on a set of modular pieces, as shown in Figure 14. Spore is the perfect example of a successful PCG algorithm used to create the user's own character from scratch using different professional-quality pieces [51].

Figure 14. Character Creator, Spore [51]

Sinking City, a third-person action/adventure shooter [52], uses another type of PCG, where the cities are procedurally generated during the development process. This helps with development time, and the final output can be further improved upon manually.

PCG implementations might come with different problems and solutions. Shaker, Togelius and Nelson [50] describe five desirable properties in a PCG algorithm:

● Speed: performing faster is important depending on whether the algorithm is creating content in real game time (usually measured in milliseconds) or during development (which could even take months).
● Reliability: guaranteeing that the content generated satisfies some given quality criteria.
● Controllability: controlling the algorithm to a certain extent, so that the player, developer or algorithm can specify different aspects of the content.
● Expressivity and diversity: avoiding minor variations on a single theme, or completely randomizing everything.
● Creativity and believability: creating the output so that it does not look like it was made by a computer algorithm.

Depending on the PCG, the game criteria and how the algorithm is applied, these properties can be essential or unnecessary, and usually there are trade-offs, like speed versus quality. There are different kinds of possible PCG algorithms depending on what the needs of the project are. We discuss a few of them below, selected by their importance to the investigation and testing in this project.

One important category of PCG algorithms is additive in nature, which includes tiles, distribution, parametric and interpretive methods. There are other additive algorithms, like grammars and simulations (or even the subtractive methods), that we will not cover in this document. They are considered additive because they constantly generate or add new data into a space.

Tiles are a common method used when generating procedural environments, as in the earlier-explained Diablo [2]. Tiles can work well with environments that can be broken into regions, and the placement of the tiles does not necessarily need to be constrained by various limiters. For example, consider a set of empty tiles with defined orientations and spaces for different objects. Transitions between tiles can be set based on their content and thresholds, and even to represent an emotion, and the tiles can later be filled with similarly themed objects.

Distribution is a system that places data into a space. In games this can help to put down many objects in the scene, although most of the time the placements are randomly generated, so the use of clusters or a hierarchical spacing of elements might be necessary to achieve a real, and more natural, distribution of elements.

Parametric PCG is a system where a compound of parameters affects the output of the final content; this can even be combined with different methods. Spore [51] and No Man’s Sky [5] used parametric systems to optimize their final content results.

Interpretive algorithms are used to process data (like a map, a human movement, or even an emotion) into different data and change the environment or models with additive procedures. An example of this is to adapt a region into a curve as a delimiter; Civilization games [53] use similar PCG algorithms for map generation.
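As a small illustration of how a tile method could represent an emotion, the following hedged sketch maps detected emotional states to themed tile pools, loosely in the spirit of the Base/Anxiety/Flow/Boredom tiles listed among the figures of this work. All names and pools here are assumptions made for the example, not the game's actual tile sets.

# Hypothetical sketch: choosing the next tile from a pool themed for the
# detected emotional state. Themes and tile names are illustrative only.
import random

TILE_THEMES = {
    "anxiety": ["narrow_corridor", "dark_room", "trap_room"],
    "boredom": ["arena", "puzzle_room", "treasure_vault"],
    "flow":    ["open_hall", "garden", "bridge"],
}

def pick_tile(emotion, rng):
    """Pick a tile for the detected emotion, defaulting to the flow theme."""
    pool = TILE_THEMES.get(emotion, TILE_THEMES["flow"])
    return rng.choice(pool)

rng = random.Random(7)
print([pick_tile(e, rng) for e in ["boredom", "anxiety", "flow"]])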
Sometimes PCG can work as a filler, adding elements to the environment that look distinct but are not really unique. As described by Eriksen and Eriksen [53], the visual system integrates stimulation over intervals, and sequentially presented patterns thus tend to summate into a single percept. This is the definition of perceptual differentiation, which is used in blockbuster movies like The Matrix, as shown in Figure 15.

Figure 15. The Woman in the Red Dress, The Matrix [24]

In this scene everyone has a twin or even a triplet, but most of this is ignored: the focus of the viewer is set on the main actors and the woman in the red dress, achieving perceptual differentiation through perceptual uniqueness. PCG can use this concept to create background elements without the necessity of them being unique objects.

The player is the final user, and the experience of the game is crucial for this investigation. With this in mind, we can improve the game itself and the PCG content by following certain game design theory and criteria, presented in the next section.

2.4 Improving the Experience

A video game can be described as a compound of mechanics, dynamics and aesthetics. As Hunicke, LeBlanc and Zubek say [54]: “By understanding how formal decisions about gameplay impact the end user experience, we are able to better decompose that experience, and use it to fuel new designs, research and criticism respectively.” Understanding the mechanics, and searching for new dynamics in game, can result in completely new implementations that enhance the game experience. Game design and decision making are an essential part of game development; we discuss them in the next subsection.

2.4.1 Game Design

As Jesse Schell says, “game design is decision making, and decisions must be made with confidence” [14]. Game design is not a single action in game development, but the compound of decisions taken from the start until the last update of the game is finally done. Some decisions are more important than others based on the type of game and the principles of game design; the following definitions are some of the most important for this project.

2.4.1.1 Challenge

Challenge is sometimes defined as the general difficulty of the game, but the challenge in a game is defined by certain game events and goals the player needs to achieve in order to progress at a certain point of the game. The challenge is always changing as the skill of the player improves [14].

2.4.1.2 Skill

It is crucial to keep the skill of the player in check, as the player’s skill improves while the game progresses, making the game easier for them. Jesse Schell defines three different skills to keep in mind [14]:

1. Physical Skills. These include skills involving strength, dexterity, coordination, and physical endurance. Effectively manipulating a game controller is a kind of physical skill, but some video games (such as Rayman Raving Rabbids, or Wii Sports) require better physical skills from players.

2. Mental Skills. These include the skills of memory, observation, and puzzle solving.
Although some people shy away from games that require too much mental skill, all games involve making interesting decisions, and decision making is a mental skill.
3. Social Skills. These include, among other things, reading or fooling an opponent and coordinating with teammates. Typically we think of social skills in terms of the ability to influence people, but the range of social and communication skills in games is much wider. MOBA games, for example, are social: they focus on teamwork and on intimidating your opponents.

2.4.1.3 The Flow

There are certain emotions that are not desired in game design. Anxiety and boredom, for example, should be kept to a minimum. The system should therefore adapt the gameplay, based on a set of game design rules and game mechanics, to try to reduce these emotions. As illustrated in Figure 16, when a challenge is too difficult compared to the player's skill level it brings out anxiety, but if the player has more than enough skill for the challenge, boredom shows up. Neither emotion is generally desirable, so the flow is defined as the space in which the player feels comfortable, where the challenge is neither too difficult nor too easy.

Figure 16. The Flow Channel [14]

The experience of playing a video game is emotional; the player is constantly feeling emotions that can be predicted by algorithms such as the facial recognition methods studied. On the other hand, a video game can be adapted to new conditions using PCG algorithms. We can use the predicted emotions as parameters for the adaptive algorithm, changing the gameplay and look of the game to create a more immersive experience. Using emotions as the essential engine of an adaptive algorithm in a video game creates an affective video game, which we discuss in the next section.

2.4.2 Affective Video Games

Affective computing is an interdisciplinary field that recognizes and processes human emotions, and is nowadays an important research area in computer science. Affective systems can be used to adapt a situation to the user's perspective and mood. As such, affective systems have also been a topic of interest in video game research. The term affective computing was proposed by Picard [55] as a type of computing that “relates to, arises from, or influences emotions”. Affective games use emotions as the core driver that generates the environment and sets the core mechanics, adapting the player's experience to a specific profile. We can use Figure 17 to define our affective system.

Figure 17. Affective game feedback loop [28]

The player creates and experiences a story through a video game, which evokes different emotions. These emotions can be detected and predicted by algorithms such as Landmarks, Fisherfaces or Eigenfaces and used as fuel for the game's adaptation. Consequently, in an affective system, we use emotions as parameters for a PCG algorithm that adapts the game's content and gameplay. Changes in the story create new experiences that can be measured through the emotions they evoke, and here the system starts again. This is defined as the affective game feedback loop [28] and is used as the basis for creating an affective game.
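As a rough illustration of this loop, the following Python sketch (with placeholder functions standing in for a real facial recognition model and a PCG-driven adaptation step; none of these names come from the project itself) shows how detection and adaptation feed each other:

    import random

    def detect_emotion():
        # Placeholder: a real system would predict this from webcam frames.
        return {"happy": random.random(), "angry": random.random()}

    def adapt_game(emotions):
        # Placeholder: predicted emotions become parameters for a PCG step.
        if emotions["angry"] > emotions["happy"]:
            return "generate calmer content"
        return "generate more challenging content"

    for iteration in range(3):          # each pass is one turn of the loop
        emotions = detect_emotion()     # the experience evokes emotions
        action = adapt_game(emotions)   # emotions fuel the adaptation
        print(iteration, action)        # the new content evokes new emotions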
Video games, including affective ones, can be created using different specialized software or environments. In the next section we describe the environment used for our research.

2.5 Development Environment

Game engines are software environments for video game creation. Affective and adaptive video game research uses different APIs or tools to change the game content, but some of these are game-specific or engine-specific. For this project we used Blender as a 3D design tool, Unreal Engine as the game design environment, and C++ and Unreal Engine Blueprints as the programming languages for the game. As defined by Epic Games [56], “Unreal Engine is a complete suite of creation tools designed to meet ambitious artistic visions while being flexible enough to ensure success for teams of all sizes. As an established, industry-leading engine, Unreal delivers powerful, proven performance that you can trust”. The engine, together with different libraries for emotion recognition and Blender [37] for 3D modeling, was used to create the affective system for this investigation. In the next chapter, we set out the delimitations of the research and the evaluation mechanisms planned for this environment.

CHAPTER III. METHODOLOGY

This research depends on an emotion recognition algorithm for predicting the emotions of the player, a suitable PCG algorithm for creating the environment and positioning the game elements, and a video game that combines all of these elements into an affective video game. In this section the bases and delimitations of this investigation are set, including the evaluation mechanism for the experiment. The following subsections describe the process behind the methodology of this investigation.

3.1 Investigation and Analysis

The initial phase of this investigation involved selecting a suitable game engine for prototyping. Unreal Engine 5 was chosen due to its powerful toolset, real-time rendering capabilities, and proven effectiveness in both research and practical applications of Procedural Content Generation (PCG). Cutting-edge features like Lumen (real-time global illumination) and Nanite (virtualized micropolygon geometry) were instrumental in bringing the game's vision to life [50, 56, 61]. After choosing the game engine, we explored various emotion recognition solutions, focusing on those compatible with Unreal Engine 5 and capable of real-time performance during gameplay. Landmark-based and other emotion recognition algorithms (like DeepFace), utilizing a single camera, emerged as potential candidates for emotion prediction. Another important step was to study and analyze existing PCG techniques, including those that have not necessarily been used in affective games. This informed which PCG algorithms could take the recognized emotions as input for the adaptation. Since affective video games are an interdisciplinary subject, investigating the areas of cinematography, color theory, game design, architecture and psychology in more detail was necessary for a better understanding of affect.

3.2 Image Perception Questionnaire

The Image Perception Questionnaire [Appendix A] was administered to a group of 135 participants, ranging in age from 18 to 64. The questionnaire aimed to measure the emotional impact of various visual stimuli by presenting participants with a series of images and asking them to select the emotion that best matched their initial impression. The questionnaire was instrumental in establishing a solid foundation for the game's design, particularly in terms of selecting objects and environments that effectively conveyed the intended emotions for the affective system. The questionnaire results were analyzed by comparing the intended emotion for each image with the strongest emotion reported by the participants. The images were also categorized into three emotional blocks: Anxiety, Flow, and Boredom, based on the theoretical framework of the Flow Channel.
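As an illustration of this kind of analysis, the following Python sketch (with invented response data; the real questionnaire results are reported in Chapter 4) tallies the strongest emotion per image:

    from collections import Counter

    # Hypothetical responses: one selected emotion per participant per image.
    responses = {
        "image_01": ["angry", "angry", "disgust", "fear"],
        "image_02": ["happy", "surprise", "happy", "happy"],
    }

    for image, votes in responses.items():
        strongest, count = Counter(votes).most_common(1)[0]
        share = 100.0 * count / len(votes)
        print(f"{image}: strongest emotion = {strongest} ({share:.1f}%)")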
The analysis presented in Chapter 4, combined with the knowledge acquired in Chapter 2, offered invaluable insights that directly influenced the design choices for colors, textures, and the overall aesthetics of the emotionally evocative tiles within the game. We now outline the specific video game design decisions that were made based on this understanding.

3.3 Video Game Design

The game is an adventure game designed in Blender and created in Unreal Engine 5. The player is faced with different obstacles that change based on the emotions they display. Additionally, the difficulty of the game, the mood of the environment and the characters' personalities also change based on the recognized emotion. The setting is a survival environment in which the player has 10 to 15 minutes to survive and pass through individual obstacles in a procedurally changing world. The game world is composed of the player, the NPCs, the environment and the architecture. We discuss each one in the next sections.

3.3.1 Creating the Player and NPCs

The character designs and models for the game were developed in Blender, while their states and artificial intelligence were built in Unreal Engine 5. Certain elements of the design, animations and colors were part of the next phase of the investigation. The NPCs have different states and animations to help achieve a calmer or more aggressive environment. The main character's design follows a similar general structure to the NPCs; the main difference is how the colors are handled. Figure 18 shows an early design for one of the game's NPCs. The use of distinct states and animations for NPCs to influence the game's ambiance aligns with established practices in game development. Studies have shown that the behavior and appearance of NPCs can significantly impact player emotions and immersion [69]. The main character's design focuses on visual differentiation through color, as shown in Figure 18, a technique often employed to enhance character recognition and convey personality traits [70].

Figure 18. First Character Tests.

The NPCs have different states that are assigned to different colors and adaptability processes, which are defined in later sections. Table 6 describes the states defined for the NPCs, mostly for enemies, since they have the most complex AI.

Table 6. NPC States

State | Description
Idle | Walking around the environment, looking at things, performing predefined animations to blend in with the environment. The NPC is not aware of the player's existence or position.
Hunt | Pursuing the player once the NPC is aware of the player's existence, and searching for the player if it loses track of them.
Attack | Engaging the player as soon as the enemy is close enough to strike.
Escape | Running to a safe place away from the player when the enemy's health is running low. This only triggers for some enemies.
Heal | Once in a safe position, the enemy attempts to heal and return to battle.

The behavior of the enemies can be seen in Figure 19, which represents the final behavior of the enemy AI. The states in this state machine can be changed by the game based on the player's emotions, as we discuss in section 3.4.

Figure 19. Finite state machine
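A minimal sketch of this state machine is shown below in Python (the actual game implements it with Unreal Engine behavior trees and Blueprints; the condition names and ordering here are illustrative assumptions):

    from enum import Enum, auto

    class NPCState(Enum):
        IDLE = auto()
        HUNT = auto()
        ATTACK = auto()
        ESCAPE = auto()
        HEAL = auto()

    def next_state(sees_player, in_strike_range, low_health, in_safe_spot):
        # Transitions mirroring Table 6: survival first, then aggression.
        if low_health:
            return NPCState.HEAL if in_safe_spot else NPCState.ESCAPE
        if sees_player:
            return NPCState.ATTACK if in_strike_range else NPCState.HUNT
        return NPCState.IDLE

    print(next_state(sees_player=True, in_strike_range=False,
                     low_health=False, in_safe_spot=False))  # NPCState.HUNT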
For the final version of the game, a behavior tree was developed in Unreal Engine 5 [56], incorporating multiple subtrees to manage the decision-making processes of the enemies. A simplified version of the foundational behavior tree for the enemies is presented in Figure 20. Detailed explanations of its structure and functionality are provided in subsequent sections.

Figure 20. Base behavior tree for the enemy

The design also defines how color is managed based on the NPC's emotional state and what the NPC is doing. These colors give the player a visual structure for reading the NPC; this is further discussed in section 3.3.3. Before setting the colors, we explain how the environment and architecture work in the next section.

3.3.2 Modulating the Environment and Architecture

The game is an adventure set in a roguelike dungeon (a PCG algorithm generates each section of the map differently in real time). The environment is created in real time and managed by tiles in an additive PCG algorithm, as explained in section 2.3.2. Tiles can take parameters that set which kinds of elements are present in each one and help distribute elements inside the tile. Tiles represent rooms, and rooms can be connected to any other based on their thresholds and orientations. Elements are represented by any object that fills the scene, be it a wall, a throne, a bonfire or even a movable trap. Diablo [2], as shown in Figure 21, follows a similar strategy, with tiles and procedurally created content producing a different map in each playthrough.

Figure 21. PCG in Diablo [2]

For the final stage of the experiment, the environment uses a persistent level divided into six distinct sections. These sections serve as the foundation for all procedural changes and are where all additional streamed levels are loaded. Figure 22 illustrates the final persistent map configuration, which is also used as the base for the non-affective version of the system.

Figure 22. Base persistent level

As the affective system registers new emotions, the level dynamically alters in response. The game is divided into six sections, with each section capable of transitioning through four distinct stages: the persistent base stage and one designed version for each emotional state in the flow system, as described in section 2.4.1.3. Figure 23 illustrates three different versions of the same pool area, each design crafted based on insights from the Image Perception Questionnaire.

Figure 23. Changes in persistent level

These sections also used procedurally generated content during the construction stages, particularly for elements such as foliage, ivy, rocks, decals, and other environmental details specific to each area. The architectural design of the level was influenced by Zumthor's principles of emotional architecture [19], incorporating colors, materials, and Brutalist elements, as shown in Figure 24, to enhance the impact of the affective system.

Figure 24. Brutalism in game design.

Other parameters, such as the player's emotions or the emotions to be evoked, can be used to change the environment; we further describe adaptation in the video game in section 3.4. The environment and the player design are adjusted to contain objects that can be changed or adapted to the current mood, specifically through a representative color of the emotion to be evoked.
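The following Python sketch illustrates how such a section-stage selection could look (the section and stage identifiers are hypothetical; the real stage variants are streamed levels in Unreal Engine):

    STAGES = ("base", "anxiety", "flow", "boredom")

    def select_stages(dominant_block, sections=6):
        # Stream in the stage variant matching the detected emotional block,
        # falling back to the persistent base stage for unknown input.
        stage = dominant_block if dominant_block in STAGES else "base"
        return ["section_{}_{}".format(i, stage) for i in range(1, sections + 1)]

    print(select_stages("anxiety"))
    # ['section_1_anxiety', 'section_2_anxiety', ..., 'section_6_anxiety']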
The next section explains the use of color to evoke emotions.

3.3.3 Setting the Colors

Figure 25. A dark night with multiple colors in the game.

As described in section 2.1.2, color is a powerful tool for evoking emotions. From the characters' designs to the dungeon's materials and textures, even the film grain and overall color scheme can change the whole emotive portrayal of the game. An example of an in-game dark scene with multiple colors is shown in Figure 25. The main character is designed to show different emotions and states driven by the player's emotions. The idea is to show a contrast between what the player is currently feeling and how the game changes later on based on these emotional states. This works as a live visual representation of how the game adapts to the player. As explained in section 3.3.1, the NPCs have various states, and these states can have colors assigned to help the player better interpret the situation. Table 7 shows an example of how colors can be assigned to states, and Figure 26 provides a more graphical example. Color can be used to subtly communicate the emotional state or intentions of the NPCs, providing players with visual cues that aid their decision-making and interaction [70]. These colors are set as schemes, not only for the characters, but also for the environment and the post-processing effects that set the overall mood.

Table 7. Emotional States

State | Colors | Representation
Idle | Pink, Orange, Blue | Innocence, Friendly, Calm
Hunt | Red, Yellow | Danger, Madness, Darkness
Attack | Red | Danger, Violence, Anger
Escape | Yellow | Sickness, Insecurity
Heal | Green, Purple | Nature, Delicate, Ethereal

The use of post-processing effects to dynamically adjust the color scheme in response to the player's emotional state is a subtle yet powerful technique for enhancing the affective experience in games. Manipulating ambient light, fog density, and film color grading can significantly impact the overall mood and atmosphere of the game world, creating a more immersive and emotionally resonant experience for the player. The concept of using color to evoke specific emotions has been extensively studied in various fields, including psychology and visual communication [71, 72]. In the context of video games, color has been shown to influence player perception, mood, and even performance [73, 74]. By dynamically adjusting the color scheme in real time based on the player's emotional state, the game can create a more personalized and emotional experience. Ambient light sets the overall tone and mood of the scene, while fog creates a sense of mystery, suspense, or even claustrophobia. Film color grading, on the other hand, allows for more refined adjustments to the color palette, enabling the game to evoke specific emotions. The final image of the game applies this theory, configuring the post-processing to adapt the color scheme. Changes to these settings are subtler in the game, but this is where the main emotion to evoke is selected. These changes configure the ambient light, fog and film color as previously described.

Figure 26. Enemies color changes.

Once the system has encountered an emotion that can trigger a change in the game, the respective color scheme is taken into consideration as a parameter for the next procedural change. Figure 27 describes an example of color schemes configured to evoke an emotion.

Figure 27. Color Symbolism
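A small sketch of how Table 7's state colors and a per-emotion post-processing scheme could be paired is given below (the RGB tints and fog densities are illustrative assumptions, not the values tuned in the game):

    STATE_COLORS = {
        "Idle": ["pink", "orange", "blue"],
        "Hunt": ["red", "yellow"],
        "Attack": ["red"],
        "Escape": ["yellow"],
        "Heal": ["green", "purple"],
    }

    POST_PROCESS = {
        # Hypothetical grading parameters per emotion to evoke.
        "anxiety": {"tint": (0.9, 0.2, 0.2), "fog_density": 0.8},
        "flow":    {"tint": (0.9, 0.8, 0.4), "fog_density": 0.3},
        "boredom": {"tint": (0.4, 0.4, 0.6), "fog_density": 0.5},
    }

    def scheme_for(npc_state, emotion_to_evoke):
        # Combine the NPC color cue with the scene-wide post-process settings.
        return {"npc_colors": STATE_COLORS[npc_state], **POST_PROCESS[emotion_to_evoke]}

    print(scheme_for("Hunt", "anxiety"))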
Selecting an emotion for the next procedural change alters not only the overall color in the game, but its look as well. Choosing a correct set of parameters for the environment is important to evoke a particular emotion. Evoking emotions can also be achieved with music, as studies suggest [31] (and as further described in section 2.1.5), so the next section covers managing the music in the game.

3.3.4 Emotional Music

Video games use music and sound effects as an important part of storytelling, which helps transmit or evoke emotions. From an epic soundtrack played by a full orchestra to raise your pulse in a giant battle, to a single classical guitar describing the calm of a forest with its tunes, music can change the pace of the gameplay or the mood of the game. This branch of the research was simpler and more direct in development: simple, slow soundtracks help the player stay immersed in the environment and its sound effects during calmer or more subtle emotional moments, while a more aggressive soundtrack accompanies fights against hordes of enemies. Music helped achieve the final objective of the investigation but was not a central topic to explore in depth. In the next section we define how emotions can be predicted for the game's adaptation system.

3.4 Emotion Detection

The affective system we are building starts with the game design; once the game is at a more defined stage, we need to predict the player's emotions in order to adapt the game. As described in Figure 28, the video game creates an experience for the player, and the experience evokes emotions that we can predict and collect with an emotion recognition algorithm.

Figure 28. Detecting Emotions

For the affective system and the emotion recognition algorithm, we used a facial recognition system, described below.

3.4.1 Facial Recognition

As defined in section 2.2, facial recognition is a vision-based method that can be used for emotion recognition. We described three possible algorithms that could help predict emotions via facial recognition. The algorithm was selected based on its robustness and how it performs alongside UE5. The facial recognition algorithm was trained with datasets that are defined later on. Table 8 compares the candidate libraries. We decided to go with DeepFace, which, while computationally demanding, offers a compelling advantage in facial recognition tasks where precision is critical. Its ability to capture subtle facial details and its robustness to varying conditions make it the preferred choice for an affective system that requires high accuracy, even at the cost of some speed. DeepFace was used to collect a set of possible emotions, each with an assigned percentage, that best describes the analyzed image. Frames sampled over time from the live webcam video were analyzed and parameterized as described in the next section.

Table 8. Comparison of Emotion Recognition Methods

Feature | DeepFace | Fisherfaces | Landmarks
Technique | Deep learning with CNNs | Linear Discriminant Analysis (LDA) | Feature-based approach with landmarks
Accuracy (cross validation) | High (~97%) | Moderate (~80%) | Variable, depends on the accuracy of the landmarks
Speed | Slower due to deep learning computation (~4.5 fps) | Fast (~12 fps) | Fast, but depends on the number of landmarks
Emotion Recognition | Recognizes multiple emotions with fine detail | Limited to simpler emotions due to linear methods | Relies on the position and movement of landmarks
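As a sketch of the per-frame analysis (assuming the deepface Python package together with OpenCV; exact return formats vary between deepface versions, and frame sampling and error handling are omitted):

    import cv2
    from deepface import DeepFace

    cap = cv2.VideoCapture(0)  # live webcam feed
    ok, frame = cap.read()
    if ok:
        # analyze() returns one entry per detected face, each containing
        # per-emotion percentages and the dominant emotion label.
        results = DeepFace.analyze(frame, actions=["emotion"],
                                   enforce_detection=False)
        print(results[0]["emotion"], results[0]["dominant_emotion"])
    cap.release()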
3.4.2 Parametrizing Emotions

With the help of emotion recognition algorithms, we can categorize a live webcam feed into emotions, collecting a span of the video and averaging the percentage per emotion over that moment in time. We can use an approach similar to the Landmarks example described in section 2.2.1.1: with this proposition we obtain the predominant emotion's percentage per time span and use it as a parameter for the game adaptation, as described in the next section.

3.5 Video Game Adaptation

Once the video game is designed and the emotion recognition algorithm is selected, built, and able to predict and parameterize emotions, the affective system can start adapting the game to a selected emotion, as shown in Figure 29.

Figure 29. Adapting the Game

The first step in adapting the video game to the selected parameters was to define the correct inputs for the system. The next section describes the use of emotions as input for the affective system.

3.5.1 Input: Player's Emotions

As described previously in section 3.4.2, we can collect a set of percentages per emotion from live video and classify them into different emotions. We analyzed around 4.5 images per second from the live stream in real time. Images with a clearly identified dominant emotion are categorized into three groups: anxiety (anger and disgust), flow (fear, surprise, and happiness), and boredom (neutral and sadness). These are then transformed into positive and negative inputs for simpler communication between the game and the emotion recognition software. Positive inputs signal that the player is enjoying the current game variables, so the game tries to recreate similar outputs using the PCG algorithms and by changing the game elements. Negative inputs, on the other hand, alert the affective system to change the current setup and try to enhance the player's experience, either by making it more challenging or, on the contrary, by setting up a more comfortable zone. The game needs to adapt to distinct situations; these situations are triggered by different emotions, which are mapped to positive and negative inputs. If a negative input is encountered, the game tries to evoke a positive emotion, and vice versa. A positive input does not necessarily mean positive emotions, but rather whatever the game is actually looking for with the presented scene. If the algorithm creates an environment meant to evoke calm but the player gets frustrated, the environment and settings need to adapt to a new set of parameters and learn from the player's profile. The same logic applies analogously to negative inputs. The server periodically sends updates (approximately every 25-50 seconds) containing the emotional classification of 100 processed images, grouped into the three defined categories. The game receives these updates through an event dispatcher, triggering changes in the environment, game mechanics, difficulty, color schemes, and the behavior of other actors. Changes to the game are gradual and persistent. The game tracks the dominant emotion over time to inform long-term design adjustments, while the environment and actors adapt in real time based on the current emotional state, using the persistent level as a foundation.
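A minimal sketch of this grouping and its reduction to a positive or negative input is shown below (the sample percentages and the intended-target logic are illustrative; the real thresholds and server protocol are those described above):

    BLOCKS = {
        "anxiety": ("angry", "disgust"),
        "flow": ("fear", "surprise", "happy"),
        "boredom": ("neutral", "sad"),
    }

    def classify(emotion_percentages):
        # Fold per-emotion percentages into the three blocks, keep the largest.
        totals = {block: sum(emotion_percentages.get(e, 0.0) for e in names)
                  for block, names in BLOCKS.items()}
        return max(totals, key=totals.get)

    def to_input(observed_block, intended_block):
        # Positive when the evoked block matches what the game intended.
        return "positive" if observed_block == intended_block else "negative"

    sample = {"angry": 5.0, "happy": 60.0, "surprise": 20.0, "neutral": 15.0}
    block = classify(sample)
    print(block, to_input(block, intended_block="flow"))  # flow positive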
3.5.2 Output: Mechanics and Dynamics

The concept of dynamically adjusting game difficulty to match player skill and emotional state is central to the design of adaptive games. The idea that excessive challenge can lead to feelings of overwhelm or frustration, while overly simple gameplay can induce boredom, aligns with the core principles of flow theory [75]. The challenge-skill balance is important for maintaining player engagement and motivation [76]. As explained earlier in section 2.4.1, challenge and skill are two different parameters to adjust when planning a change in the pace and experience of the game. As Figure 30 exemplifies, increasing the player's skills by giving them tools in the game can make the game feel easier, and thus the player can be more comfortable. But giving the player too many tools while setting the difficulty too low can also make the game and the experience boring.

Figure 30. Increasing difficulty

Enemy difficulty is increased to achieve a more challenging game experience. On the other hand, giving the player improved stats and empowering the character (like increased speed, higher damage, or more dashes per cooldown) sets a more trivial and comfortable path. Using different parameters in the game, the system can evoke multiple emotions in the player with the intention of keeping the player interested in the experience. The environment is the other output of the adaptation, described in the next section; the evaluation of each playthrough's impact on the player's experience is defined later in this chapter.

3.5.3 Output: Environment

The environment itself comes with different problems. As stated in section 2.1.3, evoking an emotion through the architecture or atmosphere of an environment is very subjective. Selecting a perfect set of parameters that triggers a certain emotion in the player is not possible, so the system starts the game with predefined parameters for each emotion to be evoked.

Figure 31. Base/Anxiety/Flow/Boredom tiles.

Every level section, object and action has a percentage per emotion, and the system chooses which of these to use on every building iteration. The percentage per emotion of each block adapts based on the reaction of the player: if the player reacts with a certain emotion while on a tile, the respective emotion percentage increases, and the next tiles take on a different aspect, with the changes applied. Figure 31 shows an example of new tiles over the persistent level. In a similar manner to the previous section, the landscape is created for the next procedural iteration, managed by the emotional parameters, with the intention of evoking an emotion. The Image Perception Questionnaire, together with what we learned in section 2.1.3, was the foundation for the creation of each tile in the game.
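The per-tile adaptation just described can be sketched as follows (the initial weights and the learning step are illustrative assumptions, not the values used in the game):

    def update_tile_weights(weights, observed_block, step=0.05):
        # Increase the observed emotion block's share, then renormalize so
        # the tile's emotion percentages still sum to one.
        updated = {block: w + (step if block == observed_block else 0.0)
                   for block, w in weights.items()}
        total = sum(updated.values())
        return {block: w / total for block, w in updated.items()}

    tile = {"anxiety": 0.2, "flow": 0.5, "boredom": 0.3}
    print(update_tile_weights(tile, observed_block="anxiety"))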
Additionally, the game can be further adapted by changing several general game parameters. The next section covers these parameters and how they were implemented in the game.

3.6 Affective Loop

The video game design is now defined; the NPCs, as well as the environment, music and player character, have parameters that can be changed and adjusted to a given condition. The game sets up an experience that leads the player to project emotions; these emotions can be detected and categorized by a facial recognition algorithm, then parameterized and used to adapt the system to whatever mood best suits the game, promoting a better experience. Figure 32 describes the affective system we want to create for this investigation.

Figure 32. The Affective System

The system works in a loop: every time the game creates an experience the player can be emotional about, we can detect this emotion and use it as emotional feedback for the system.

3.6.1 Emotional Feedback

As defined earlier, the affective system works in a loop. Each time the loop ends, the feedback from the past iteration is analyzed, changing the next iteration based on two layers. The first, the emotional layer, uses positive and negative emotions to decide which changes need to be made and whether they should be direct or indirect (Table 9). The approach of linking the intensity of emotional responses to the degree of game adaptation aligns with the concept of dynamic affective modeling [77]. The idea that stronger emotional signals trigger more indirect or subtle changes, while weaker signals prompt more direct and noticeable adaptations, reflects an attempt to maintain a sense of player agency and to avoid overwhelming the player with drastic alterations to the game world. This strategy also acknowledges the potential for misinterpreting emotional signals, as subtle changes allow a more gradual and refined response to the player's affective state.

Table 9. Emotional Adaptation in Game

Change | Main Character | NPCs | Environment | Post Processing
Positive Emotions: Direct Changes | Animations shift to a weaker state. | Model's physical appearance and animations shift to a more aggressive state. | Aggressive architecture, a more industrial design. | Aggressive color scheme.
Positive Emotions: Indirect Changes | Model colors match the current live emotion, with saturated colors. | Model colors match state and aggressiveness level, with saturated colors. | — | Saturating colors, changing hues to a more aggressive scheme.
Negative Emotions: Direct Changes | Animations shift to a more powerful state. | Model's physical appearance and animations shift to a calmer state. | Calmer architecture, a more natural design. | Calmer color scheme.
Negative Emotions: Indirect Changes | Model colors match the current live emotion, with desaturated colors. | Model colors match state and aggressiveness level, with desaturated colors. | — | Desaturating colors, changing hues to a calmer scheme.

To define the type of change and how aggressive it must be, we take the positive and negative averages into consideration. At a general level, the higher the average of the positive or negative emotion, the more indirect the changes made; the lower this average, the more direct the changes. The other layer modifies the difficulty of the game and uses emotions as a secondary deciding parameter, but two main factors are considered first when deciding what the change will be: the player's performance in gameplay (how skilled the player has proven to be) and how frustrated the player seems to be (whether as a detected emotion or through factors such as how many times the player has died in the last iteration of the system). Table 10 lists changes that affect the game's difficulty to keep the player in the flow, as defined in section 2.4.1.3.

Table 10. Flow Adaptation

Condition | Main Character | NPCs | Environment | Post Processing
Enhanced Skills | Limiters, fewer dashes, more damage. | Aggressiveness, speed, attack speed and damage increased. | Number of enemies increased; more traps and narrowing spaces. | Fog increased.
Frustration Increased | Perks, more dashes, more damage. | Aggressiveness, speed, attack speed and damage increased. | Number of enemies decreased; fewer traps and opening spaces. | Fog decreased.
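The decision logic of the two layers can be sketched as follows (the thresholds are illustrative assumptions; the in-game values were tuned during testing):

    def emotional_layer(avg_emotion_strength, polarity):
        # Stronger signals lead to indirect (subtle) changes; weaker signals
        # lead to direct (noticeable) changes, per Table 9.
        kind = "indirect" if avg_emotion_strength >= 0.5 else "direct"
        return "{} changes for {} emotions".format(kind, polarity)

    def difficulty_layer(skill, frustration):
        # Performance and frustration are weighed before emotions, per Table 10.
        if frustration > 0.6:
            return "lower difficulty: fewer enemies, fewer traps, less fog"
        if skill > 0.6:
            return "raise difficulty: limiters, more enemies, more traps, more fog"
        return "keep current difficulty"

    print(emotional_layer(0.7, "positive"))
    print(difficulty_layer(skill=0.8, frustration=0.2))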
These two layers define which change must be applied in the next iteration based on the feedback from the previous loop. However, the parameters for selecting certain changes must themselves change over time to adapt to the player's profile, as we describe in the next section.

3.6.2 Reevaluating Parameters

In each iteration we have two averages: the percentage of emotions detected over the whole game session, and the percentage of emotions detected in the past iteration. The main idea is to increase the positive emotions and, as a result, improve the player's experience, while keeping the flow of the game challenging. These two sets of averages are used to evaluate the parameters for constructing the next tile, or level. Some tiles or objects have an emotion to evoke, determined by the non-neutral emotion felt for the longest time while in them; these parameters adapt in each iteration to account for the player's unique profile. The parameters and the evaluation methods for each emotion and game element may change as further tests are conducted. In the next section, we discuss how to evaluate the given affective system.

3.6 Evaluation Mechanism

Emotion recognition using DeepFace was employed, as defined in earlier sections, to achieve a robust system that can run in parallel with a game without overly impacting performance. The experiment used a camera in an environment that needed to be well lit and free of distractions, with the player correctly positioned in front of the camera. Along with the emotion recognition algorithm that analyzes the player's ongoing emotions during the game session, different libraries are used to create the affective game experience. For the adaptive settings, every time an expression approaches or reaches its apex, the game analyzes it and categorizes it as positive or negative input for the game's affective system. For example, if the player reacts to certain conditions in the game with happiness, the system takes this into consideration when adjusting the level the next time it wants to evoke a positive emotion. Neutral, anger and sadness work as negative inputs, while happiness and surprise work as positive inputs. The system tries to evoke emotions; negative and positive are not defined as “bad” or “good” respectively, but as signals for changing the game's experience and pace to make it more fun. To help evaluate the system, we asked volunteer participants to play one of two different game sessions at random, each lasting between 10 and 15 minutes. One session was a normal game, with static parameters and a non-adaptive world; the other setup was an affective prototype that used emotions as input to change and adapt the game settings to the player's state. At the end of each game session the players were presented with a questionnaire measuring the user experience of that session. The game session itself recorded each emotion block, based on the percentage of time during which it manifested, along with the game elements that triggered it. Each element had a score based on how often and how consistently it produced a positive input.

3.7 User Experience Questionnaire

Through a comparative analysis of player feedback on games with and without an affective system, we study the impact these adaptive mechanisms can have on a player's overall enjoyment of the game and how to create personalized, engaging experiences.
The Ubisoft Perceived Experience Questionnaire (UPEQ) and the Player Experience of Need Satisfaction (PENS) questionnaire were employed [Appendix B] to assess the fulfillment of players' psychological needs as defined by Self-Determination Theory (SDT) [57, 63]. The UPEQ focuses on autonomy (the extent to which players feel they can make meaningful choices and influence the game's outcomes) and competence (the player's sense of efficacy and mastery within the game), while the PENS adds an assessment of immersion (the degree to which players feel present and absorbed in the game world), crucial for understanding the sense of presence and emotional connection within the game world. Both questionnaires employ Likert-scale items in which participants rate their level of agreement with statements about their experiences within the game.

The participants in the study were aged between 18 and 36 years old, representing a young adult demographic. The group consisted of 17 men and 5 women. The only prerequisite for participating in the experiment was some prior experience with video games. This criterion ensured that all participants were familiar with gaming mechanics, allowing them to engage meaningfully with the tasks provided. The 22 participants engaged in two game sessions using the same PC setup and webcam. The sessions featured two versions of the game: one with the integrated affective and PCG system, and one without. The order of the game versions was randomized for each participant. Following each session, participants completed a questionnaire based on their experience in that specific session. They were not informed about the presence or absence of the system in either version. A total of 44 game sessions were conducted, with each session lasting approximately 5 to 25 minutes. The experiment had certain limitations. It needed to run two systems in parallel, the game and the emotion recognition system, which communicated via a local server. This was computationally demanding, so only one setup could be active at a time, and the sessions were conducted with one participant at a time. These limitations added considerably to the time required to carry out the sessions. The following chapter analyzes the questionnaire results, examining how the presence or absence of the real-time affective system and procedural generation impacted player experiences.

CHAPTER IV. RESULTS

This chapter provides a detailed analysis of the effect of visual stimuli and the adaptive system on player emotions and experiences within the game. The chapter is separated into three sections. The perception results explore the findings from the Image Perception Questionnaire, designed to examine the relationship between visual elements and the emotions they evoke. The user experience results discuss the influence of the adaptive affective system on gameplay. Finally, the emotion recognition results present an analysis of the emotional responses captured during gameplay.

4.1 Perception Results

The analysis revealed instances where the intended emotion aligned with the dominant perceived emotion, as well as cases of divergence, offering valuable insights into the subjective nature of emotional interpretation. By understanding the emotional associations triggered by specific visual elements, we created a better foundation for the game design.
An analysis of the most significant results allows us to further define each of these emotional sectors within the context of the game. The tables provide a breakdown of each image's intended emotion, the predominant emotion it elicited, and the distribution of emotions within the corresponding Flow Channel block.

4.1.1 Anxiety

This sector encompasses images of textures, colors and objects that elicited feelings of anger or disgust from participants. In-game, these elements, shown in Table 11, were used as a basis to create enemies, objects and environments for more suspenseful and challenging moments.

Table 11. Anxiety Perception

Intended emotion: Angry | Strongest emotion: Angry | Resulting block: Anxiety 45.9% (1. Angry: 31.1%; 2. Disgust: 14.8%)
Intended emotion: Disgust | Strongest emotion: Disgust | Resulting block: Anxiety 38.7% (1. Disgust: 28.5%; 2. Angry: 10.2%)

4.1.2 The Flow

The flow represents the ideal emotional state where players are fully immersed and engaged in the game [14]. It is associated with images that evoked feelings of enjoyment or happiness, fear and surprise. As for the game, objects and environments falling into this category were central to core gameplay loops, encouraging exploration, challenging gameplay, and changes in game design, charac