
Aalborg University Copenhagen

Semester: 4th Semester (MSc) – Spring 2020
Semester No.: MED10
Semester Theme: Master’s Thesis
Title: The Impact of Basic and Active Interaction Schemes on Player Involvement

Aalborg University Copenhagen
Frederikskaj 12,
DK-2450 Copenhagen SV
Semester Coordinator: Stefania Serafin
Secretary: Lisbeth Nykjær
Phone: 99402470
Email: [email protected]

Supervisor(s): Henrik Schønau Fog
Project group no.: N/A
Members: Christoffer Nicolai Kiaby Kjær, Ervin Hamulic

This study investigates how basic and active action-based interaction schemes can be applied in mobile games, and how they impact player involvement in 2D-platformers. The two schemes differ in their input gestures: one is built around basic tap input, the other around more mobile-centric gestures such as active swiping and dragging. An evaluation was performed using a methodology centered on questionnaires and card sorting. The findings show a significant difference in kinesthetic and overall involvement between the basic and active action-based interaction schemes, in favor of the basic one. However, a sudden need to change the evaluation strategy due to ethical concerns forced important compromises, which lessened the legitimacy of the findings. Further testing should therefore be conducted in order to strengthen the validity of the results.
Keywords: Player Involvement, Active Interaction, Basic Interaction, Mobile Interaction, 2D-platformer, Playability Heuristics, Player Engagement

Copyright © 2006. This report and/or appended material may not be partly or completely published or copied without prior written approval from the authors. Neither may the contents be used for commercial purposes without this written approval.

The Impact of Basic and Active Interaction Schemes on Player Involvement
Master’s Thesis
Members: Christoffer Kjær ([email protected]), Ervin Hamulic ([email protected])
Supervisor: Henrik Schønau Fog ([email protected])

Table of Contents



1. Introduction


1.1. Initial Problem Statement


2. Analysis


2.1. Mobile Interactions


2.1.1. Mobile Interaction Techniques


2.2. Mobile Touch Interaction


2.2.1. Active Actions


2.2.2. Basic Actions


2.2.3. Virtual Controllers


2.3. Design Principles in 2D-platformers


2.3.1. Building blocks


2.3.2. Design patterns


2.4. Measuring the User Experience in Games


2.4.1. Playability Heuristics


2.4.2. Player Focus and Flow Theory


2.4.3. Player Engagement


2.4.4. Player Involvement

The Six Dimensions of Player Involvement


2.5. State of the Art


2.5.1. Virtual Controller Designs


2.5.2. Active Action Controls


2.6. Analysis Conclusion


3. Final Problem Statement


3.1. Design Requirements


4. Methods and Evaluation Strategies


4.1. Ethical Concerns


4.2. Usability Testing


4.3. Pilot Testing


4.4. Final Evaluation


4.4.1. T-test and Questionnaire


4.4.2. Card Sorting


4.4.3. Observations


5. Design


5.1. The Concept and Visuals of Dark Squares


5.2. Components


5.3. Level Design



5.3.1. Rhythmic Jumping


5.4. Input Design


5.4.1. Basic Action Input


5.4.2. Active Action Input


5.5. Sound Design


6. Implementation


6.1. Detecting Interaction


6.1.1. Detecting the Gestures

Tapping

Swiping

Dragging

Buttons


6.2. Player Actions


6.3. Changing the Environment


7. Results


7.1. Usability Test


7.1.1. Setup and Procedure


7.1.2. Results


7.2. Pilot Test


7.3. Final Evaluation


7.3.1. Demographics


7.3.2. Procedure


7.3.3. Questionnaire and Statistical Test


7.3.4. Card Sorting Results

Card Frequency Results

Card Sorting Score


8. Discussion


8.1. Sources of Error


9. Re-design


10. Conclusion


11. Future Works





1. Introduction
Video games have long been a staple form of digital entertainment. This is not only because they allow the player to take an active part in the experience, but also because they can reach a wide audience through many different platforms, such as computers, consoles, tablets, and mobile devices. The mobile platform in particular, which includes smartphones, smartwatches, and other handheld devices, has over the last decade become the most widely used digital platform available (StatCounter, 2020).
Along with the rise in popularity of the smartphone, a growing number of video games tailored to the mobile platform have also seen the light of day. While some of these mobile games present input controls carefully tailored to the mechanics of the experience, others rely on traditional input schemes heavily inspired by console controllers. By merely adopting these traditional input schemes, mobile games exclude some of the platform’s many available features, such as the gyroscope, the camera, and a wide range of touch gestures. This is unfortunate, since making use of these features could potentially enhance the player experience.
Certain developers have attempted to tailor the input scheme to fit the mechanics of their game while also adhering to the traditional scheme. This approach can, however, result in a range of issues, such as overuse of UI elements, which can become too intrusive and thus negatively affect the player’s experience with the game. While certain games hold onto these traditional input schemes, others have adapted to the mobile platform by utilizing the many features it possesses. This raises the question: is simply adopting the input scheme from console controllers really the optimal way of designing input controls, or does embracing the alternative features of the smartphone enhance the player experience?
This study aims to investigate how the level of player involvement is influenced when comparing certain mobile interaction schemes. To achieve this, the study explores how mobile interaction schemes are built and applied to a game. In order to measure player involvement, the study also examines player involvement theory along with related topics such as engagement and flow. With this knowledge, a 2D-platformer and an evaluation methodology are designed to answer how different mobile interaction schemes influence player involvement.

1.1. Initial Problem Statement
This leads to the following initial problem statement: Which type of interaction scheme induces a higher sense of player involvement in terms of basic gameplay actions in a mobile 2D-platformer?

2. Analysis
The following chapter researches several aspects related to the initial problem statement in order to acquire the knowledge needed to develop a 2D-platformer suitable for different types of mobile interaction schemes. The analysis covers mobile game interaction techniques, touch interaction techniques, how to design a 2D-platformer, player involvement and other models of user experience, how user experience is measured, and the current state of the art, as all of these are considered relevant to the evaluation performed in this study.
2.1. Mobile Interactions
In order to determine which type of mobile interaction technique should be evaluated, it is important to understand which ones exist, how they are used, and how widely used each of them is. Game interaction, as a concept, refers to the relationship between the user’s input and the response output by the smartphone. In general, the smartphone continuously takes haptic input, through finger touches on the screen or the device being shaken, or auditory input through the built-in sound recorder. The smartphone can then output a wide range of visual feedback, audio cues, or haptic vibrations, which the user can act upon before providing the smartphone with new input (Chaichitwanidchakol and Feungchan, 2018).
Figure 1. The relationship between user and device (Chaichitwanidchakol and Feungchan, 2018).
Savari et al. (2016) state that game interactions can be divided into two types. The first type is known as natural interactions; this concept refers to techniques that attempt to mimic real-life interactions through body movements similar to those performed in the physical world. These movements are captured by hardware which can then simulate said movement in a virtual environment or on a screen (ibid.). Virtual reality and the Xbox Kinect are common pieces of hardware that utilize natural interactions. The difference between the two is that virtual reality uses physical controllers and sensors to capture and simulate body movements, while the Xbox Kinect eliminates the need for a physical controller and instead relies on a camera with sensors to capture body movements. The second type is called non-natural interactions; this form of interaction revolves around techniques performed through more conventional means, such as a mouse, keyboard, joystick, or console controller (ibid.).
Both of the aforementioned types of interaction can be utilized for mobile games. Many of today’s smartphones do not have any physical buttons, since on-screen buttons are used for input commands (Chaichitwanidchakol and Feungchan, 2018). However, physical buttons or controllers, such as a keyboard or a joystick, can be connected via Bluetooth and then used for non-natural interactions. Moreover, most of today’s smartphones are equipped with many different sensors that allow for the implementation of more natural forms of interaction (ibid.).
2.1.1. Mobile Interaction Techniques
Chaichitwanidchakol and Feungchan (2018) view the concepts of natural and non-natural interaction as more general categorizations of game interaction. They argue that mobile interaction could be categorized into many interaction techniques that each utilize a single (or multiple) features of today’s smartphones.
Touch
The touchscreen is one of the most standard methods of providing input on today’s smartphones, and it allows users to perform various actions with their fingers. These actions include a wide range of touch gestures such as tapping, double tapping, long tapping (holding), dragging, and flicking, all of which can be used for unique gameplay purposes (Kim and Lee, 2014). Multi-touch provides even more variation in touch gestures by allowing users to perform actions with two or more fingers on the screen simultaneously (Chaichitwanidchakol and Feungchan, 2018). Though not seen on many of today’s smartphones, innovative touch technology such as pre-touch (which can detect users’ fingers before they touch the screen) (Hinckley et al., 2016) and rich touch (which can detect whether the player is touching the screen with their fingers, nails, or knuckles) (Harrison, 2014) does exist and can allow for even more unique touch gestures.
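To illustrate how the touch gestures mentioned above can be told apart, the following is a minimal sketch (in Python, not from the thesis) that classifies a single-finger touch from its duration and start/end positions. The thresholds are illustrative assumptions, not values from any particular platform or from this study:

```python
import math

# Hypothetical thresholds (illustrative only): a short, stationary touch
# is a tap; a long stationary touch is a hold; fast movement is a flick;
# slower movement is a drag.
TAP_TIME = 0.2       # seconds
MOVE_DIST = 20.0     # pixels
FLICK_SPEED = 800.0  # pixels per second

def classify_touch(duration, start, end):
    """Classify a single-finger touch from its duration and endpoints."""
    dist = math.hypot(end[0] - start[0], end[1] - start[1])
    if dist < MOVE_DIST:
        # Barely moved: distinguish tap from hold by duration.
        return "tap" if duration < TAP_TIME else "hold"
    # Moved noticeably: distinguish flick from drag by average speed.
    speed = dist / duration
    return "flick" if speed > FLICK_SPEED else "drag"
```

For example, `classify_touch(0.1, (0, 0), (5, 0))` yields `"tap"`, while the same displacement held for half a second yields `"hold"`. A real gesture recognizer would also need to handle multi-touch and double taps by tracking touches over time.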

Motion
Motion interaction revolves around physically moving the smartphone as a way of providing it with input, which is done through built-in sensors such as the accelerometer or the gyroscope. One of the more popular examples of motion detection on smartphones is tilting, which works by the sensors detecting the declination and inclination of the smartphone (Du et al., 2011; Gilbertson et al., 2008). Tilting is sometimes used in driving games to turn vehicles, or in classic maze-like games where players have to guide a marble through a maze. While tilting-based motion controls might not be optimal for many games, it has been found that motion controls can induce a higher level of player satisfaction than conventional touch controls in games that allow players to use either of the two (Gilbertson et al., 2008). Examples of non-gaming applications that use motion detection include 3D viewing modes and standard camera applications.
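As a sketch of how tilt detection works in principle, the accelerometer of a device held still measures only gravity, so the device’s pitch and roll can be estimated from the ratio of the acceleration components. This is an illustrative Python example, not code from the thesis, and it assumes a stationary device (in practice gyroscope data is fused in to handle movement):

```python
import math

def tilt_angles(ax, ay, az):
    """Estimate pitch and roll (in degrees) from accelerometer readings.

    Assumes the device is held still, so the accelerometer measures
    only gravity; ax, ay, az may be in any consistent unit (e.g. m/s^2).
    """
    # Pitch: rotation about the device's x-axis (tilting forward/back).
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    # Roll: rotation about the device's y-axis (tilting left/right).
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

A device lying flat (gravity entirely on the z-axis) gives pitch and roll of 0°, while a device standing on its side reports a roll of ±90°; a game can map these angles directly to, for example, a steering value.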
Video
Video interaction concerns itself with utilizing the camera as an important tool for interaction. It is often used by applications that aim to combine real-world and virtual imagery into a single digital experience. This concept is more commonly known as augmented reality, and it allows the user to view the combined imagery through the smartphone’s camera (Das et al., 2017). It is, however, important to note that augmented reality applications usually do not rely on video interaction alone: much of the actual interaction still occurs through conventional touch controls, while the video aspect focuses more on how the imagery is presented. Popular examples of mobile games that use augmented reality are Pokémon Go! (2016) and Harry Potter: Wizards Unite (2019) by Niantic.
Location
This type of interaction utilizes the player’s location as a key element of its gameplay through GPS tracking. The aforementioned mobile game Pokémon Go! (2016) is an example of a location-based game; it takes into account where in the world the player is in order to determine which Pokémon they should be able to find in their vicinity. The smartphone’s built-in magnetometer is also used in location-based interaction to determine which direction the player is facing (Chaichitwanidchakol and Feungchan, 2018). However, just like video interaction, location-based games often still rely on touch input from the player.
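A core building block of such location-based gameplay is checking whether the player is close enough to a point of interest. The following Python sketch (illustrative, not from the thesis or from any specific game) uses the standard haversine formula to compute the great-circle distance between two GPS coordinates; the interaction radius is an assumed value:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates (degrees)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def in_range(player, poi, radius_m=40.0):
    """True if a point of interest lies within the interaction radius."""
    return haversine_m(*player, *poi) <= radius_m
```

A location-based game would poll the GPS periodically and call `in_range` against nearby points of interest to decide when an encounter or interaction becomes available.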
Sound
Sound interaction revolves around the player providing the smartphone with input through voice commands detected by its microphone. Sound interaction can be based on concepts such as speech recognition, where the player can, for example, make the tetrominoes in a game of Tetris drop by saying the word “down” (Sporka et al., 2006); or sound processing, which is often used in karaoke games, where the input is judged more on the quality of the sound than on its content (Gokul et al., 2016).
Other techniques
Other techniques highlighted by Chaichitwanidchakol and Feungchan (2018) include social interaction (where multiple users compete or cooperate in the same experience); date and time interaction (where the game takes the real-world time of day into account and mimics it in the game); weather interaction (where, e.g., in Pokémon Go (2018), water-type Pokémon are stronger if it is raining in real life); bioinformatics interaction (where biodata such as heart rate is used as a gameplay feature); and special-purpose interaction (where an external device is connected to the smartphone as an alternative way of interacting), among others (ibid.).
A general pattern among these mobile interaction techniques is that many of them rely on touch controls even though their names do not imply it. For example, while video and location interaction utilize the camera, magnetometer, and GPS in order to function, they still rely on touch input, and the aforementioned features act almost as secondary elements of the interaction (see figure 2). Based on these findings, it is possible to state that touch plays a much larger role in general mobile interaction than the other techniques. It would therefore be most suitable for the techniques evaluated in this study to be influenced by some form of touch input, since touch can be viewed as the most utilized interaction technique.

Investigated Interaction Technique    Utilized Smartphone Features
Touch                                 Touchscreen
Motion                                Gyroscope, accelerometer
Video                                 Touchscreen, camera
Location                              Touchscreen, GPS
Sound                                 Microphone

Figure 2. An overview of the investigated interaction techniques and the smartphone features each of them utilizes for proper interaction.
