Have you ever wondered how game studios test their games? Can a music festival serve as a venue for such testing, and how do you test innovative sustainable concepts in games? Read along as we share our behind-the-scenes insights from our tests!
Behind the Scenes: Testing at The Circular Lab
Hi, I’m Helena, a UX researcher at GAMUCATEX! I had the pleasure of organizing our testing booth at The Circular Lab in the middle of the Roskilde Festival. I worked with many talented team members who helped me run the tests and collect the data. After our first go in 2023, we came back this year with fresh ideas and a game plan to make the experience even better for the users. At our booth we tested our video game Tectonicus, a strategy game with a historical focus and a battle system where you play cards on a tactical board. This blog focuses on our experience testing at the festival and the insights we want to share.
The Circular Lab at Roskilde Festival
The Circular Lab is a vibrant hotspot at the Roskilde Festival, one of Europe's biggest music and arts festivals. It is a place for entrepreneurs to showcase creative and sustainable projects, making it an interesting spot for us to test our game and connect with festival-goers. We brought a couple of green concepts to test, which we will explain in this blog.
Evolving Our Testing Approach
Last year, we had our first experience setting up a booth at The Circular Lab. Back then we tested an early version of the game and a paper prototype. What is a paper prototype? For those unfamiliar, it is a low-fidelity, physical representation of a game's user interface and gameplay elements. We were testing a new in-game shop feature that we wanted to implement, so players could buy items like card bundles directly in the game. Rather than developing anything new in the program we used, which would have taken much longer, we tested the shop with simple materials like paper, cardboard, and markers, creating mock-ups of the game's screens, controls, and interactive components. This allowed us to quickly test and iterate on game concepts and mechanics before investing time and resources into digital development. The event also gave us early feedback from festival-goers, making it easier to identify and address potential issues in the design of the shop.
The shop included beautiful hand-drawn digital illustrations in the deck bundles, showing how the vikings utilized and reused their natural materials. Since that first test at The Circular Lab, we have built a digital version of the shop in the game. Now, we’ve come to test the digital version!
This year, we added fun activities to fit the festival vibe, such as letting visitors try drinking from viking horns. These activities created more buzz around our booth and a more positive, laid-back atmosphere where people were more inclined to play a game. As a result, participants were in a more relaxed state when playing, which we believe gave us more authentic data.
Qualitative Research
There are many different ways to collect data, including questionnaires, surveys, and eye tracking. We went with a qualitative approach to collect our data, which is all about getting a deeper understanding of people's experiences and feelings. Instead of just collecting numbers, we look at how people interact with something and what they think and feel about it. One of the methods we used is the "think-aloud" approach, where we ask participants to verbalize their thoughts and feelings as they interact with the game, which we will elaborate on below.
Key takeaway: It is important to consider what kind of data you are able to and interested in collecting.
Usability testing
Throughout our testing, we used a usability test approach, which is a method to evaluate how intuitive the game is. Elizabeth Goodman, co-author of Observing the User Experience (2012), explains that it can help identify issues; for instance, can the participants complete the tasks we designed for them to solve? Do they understand the language and wording used in the game? Before the happy festival attendees tested our game, we gave them a disclaimer explaining that the purpose was to test the game, not them or their gaming abilities. This is because we know from past usability tests that some participants have felt insecure when they did not understand every element of the game. We wanted to make them feel at ease and see the test as fun.
The think-aloud technique allows us to hear their immediate reactions and understand how they interpret and experience different aspects of the game.
Key takeaway: Find a suitable testing method that is designed to match the participants.
Preparing the participants to play
In order to get useful feedback on areas for improvement, it was important to avoid giving participants directions on how to play the game or influencing their experience in any way. We encouraged them to navigate the game on their own and describe their actions and expectations as they played. This approach helps us understand how users naturally interact with the game without any guidance influencing their experience.
Learnings From Two Years at The Circular Lab
Based on what we learned last year, some methods are simply not suited to how we collect data at a music festival.
1. Participant Sorting List
One major practical lesson we learned last year was how to improve coordination and tracking of participant numbers. For instance, we previously brought a printed table with timestamps like “Participant 7” at “13:45” that we placed at our booth. But this setup became confusing with multiple researchers at the booth conducting tests simultaneously. This year, we streamlined the process by having each researcher announce the identifier of each participant into the audio recording, such as “This is participant T6.” The “T” is the first letter of the researcher's name, and the “6” is the participant's sequence number. This approach made tracking much clearer and more manageable. So my first participant would be H1, and if I tested 50 participants my last participant would be numbered H50.
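To make the convention concrete, here is a minimal Python sketch of how such identifiers could be generated and logged. We did not actually script this at the booth; the researcher names and counts below are made up purely for illustration.

```python
from collections import defaultdict

class ParticipantLog:
    """Assigns IDs like 'H1' or 'T6': researcher initial plus a running sequence number."""

    def __init__(self):
        self._counters = defaultdict(int)  # researcher initial -> last sequence number used
        self.entries = []                  # (participant_id, researcher_name) in test order

    def next_id(self, researcher_name: str) -> str:
        # Take the first letter of the researcher's name and bump their counter.
        initial = researcher_name[0].upper()
        self._counters[initial] += 1
        participant_id = f"{initial}{self._counters[initial]}"
        self.entries.append((participant_id, researcher_name))
        return participant_id

# Hypothetical usage: two researchers testing in parallel.
log = ParticipantLog()
print(log.next_id("Helena"))  # H1
print(log.next_id("Thomas"))  # T1
print(log.next_id("Helena"))  # H2
```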
Key takeaway: Having a clear and easy-to-use mechanism for sorting and organizing participants is valuable both while testing and when analysing the data.
Headsets
Another significant change was our decision to forgo headsets for the in-game audio. Last year, we gave participants headsets to manage the noise from other people and the loud festival music in the background, and to let them hear the game’s sound effects and music. However, we found that headsets isolated participants and limited our ability to interact with them during gameplay.
The data we collected last year showed that most participants enjoyed the audio and background music in our game, so it was not a focus for us to test it this year. By skipping the headsets, we maintained open communication with players, allowing for more dynamic interactions and immediate feedback. This approach enhanced our engagement with participants and streamlined our data collection process, helping us gather more relevant insights. There are other more suitable avenues to test the sound and music of the game.
Key takeaway: Adapt the testing setup to the surroundings, and prioritize what to include and exclude in the test.
Additional Insights and Takeaways
2. Accept Our Limits
Last year we tried out various data collection methods including questionnaires (both qualitative and quantitative), printed-out artifacts, list sorting, observation notes and more. However, with limited team resources, collecting this much data was overly intensive, and there were more data avenues than our team had the ability to organize and analyze. To ensure organization and efficiency this year, we reduced the number of questions and tasks.
3. Presenting Our Sustainability Values
Last year, we included an energy-saving button in our game’s main menu as a feature for testing. This button didn't actually alter how the game performed, which the button itself also communicated, but it was intended to prompt players to think about and discuss energy efficiency in relation to playing a game. We used it as a way to gather opinions on incorporating sustainability into the game.
This year, we took our sustainability efforts a step further by focusing more comprehensively on green features. We introduced new elements such as an in-game shop section containing eco-friendly card bundles, an additional playable chapter in our campaign that emphasizes how sustainability played a role in the viking age, and an eco card bundle highlighting historical sustainability.
These card bundles contained cards the player can obtain in the game that deal with various aspects of sustainability from a historical perspective, such as a card about repurposing and reusing materials, showing how people of the time reused what was already around.
By actively seeking feedback on these updates, we aimed to better understand how players perceive and value these green features.
Key takeaway: We learnt that for participants to consider and discuss something relatively abstract and unfamiliar, it is important to make it as concrete and specific as possible, so that both sides are talking about the same thing and the data is as reliable as possible.
4. Make a Data Collection Plan
We’ve learned that having a broad research scope makes it hard to replicate results and focus on specific insights. In 2023 we ended up testing features that weren’t part of our original plan, which made it challenging to get deep, actionable data.
Our goal is to keep research focused and meaningful, while also balancing the needs and interests of different stakeholders in our company, including people outside UX that might want to test out their new features. This requires us to work closely with each other internally and adapt our research plans to meet diverse goals.
Key takeaway: Working on a long-term project like The Circular Lab has taught us the importance of focusing on what truly matters and being willing to adjust our plans. We’ve learned to let go of preconceived notions about our research approach and be flexible, allowing the practical environment to guide us in gathering the most relevant and useful data.
5. Ensure Successful Audio Recordings
In an environment like The Circular Lab at the Roskilde Festival, where there is a lot of noise and some participants speak with a low voice, we advise researchers to physically hold their microphones closer to themselves and the participant while audio recording. We did not have a big budget so we had to consider tools that were already available to us.
We previously experienced that some audio files were nearly impossible to hear because the recording device was placed on the table, at a distance from the participant talking and the researcher asking questions.
Key takeaway: It may be uncomfortable to hold your arm out straight with the microphone, but it will serve you well when listening to, transcribing and analyzing the recordings afterwards. And finally, audio recording from a laptop can work extremely poorly depending on the model, so we advise you to use better tools such as a phone, a dictaphone or a dedicated microphone connected to the computer.
Final Thoughts
Testing our game at The Circular Lab has been an enriching experience. We’ve learned valuable lessons on how to improve our research methods, engage with participants, and gather meaningful data. By refining our approach, we hope to continue improving our game and delivering a better experience for our players.
If you want to test the game yourself you can find it at: https://gamucatex.itch.io/tectonicus
and if you want to become a part of our internal testing group for exclusive tests, you can contact us at ux@gamucatex.com. We also do testing sessions of our game with bigger groups like schools, where we share our processes with the participants in a tailored experience.
Thank you for joining me in exploring our journey! We look forward to sharing more insights in the future.