M1 Master Somethin.AR  

Team

  • Ruslan Novikov
  • Juri Wiechmann
  • Julia Zamaitat
  • Robin Jaspers
  • Diro Baloska

Supervision

David Strippgen, Robert Meyer, Fabian Quosdorf

Collecting ideas

At the beginning of our project, we had a clear goal in mind: to create a game using the Meta Quest 3 headsets provided by HTW, with a focus on multiplayer elements and mixed reality features. To get our project off the ground, we started with some hands-on exploration. We tested various games that were compatible with the Quest 3, and we even took a trip to Cosmic.VR in Berlin, a virtual reality studio where friends can meet and enjoy VR games together.

[Image: Three people with VR headsets during our excursion at Cosmic.VR]

Developing a Concept

Inspired by what worked well and what didn’t during these experiences, we began to outline the essential requirements for our product. We envisioned a family-friendly game that would offer players a quick yet immersive experience they can share together. The key idea was to introduce an extended world that would leave players with a ‘wow’ feeling, all while keeping them active and moving.

[Image: Board from our brainstorming session]

Understanding our Audience

Once we had a rough concept in mind, we dug deeper to understand our target audience better. We created personas and user experience maps to deepen our understanding of the players and figure out what would work best in the Showtime setting. With these insights we refined our game concept and mapped out the entire game process using a user story map and a flowchart.
[Image: Personas and user experience maps]

Defining Priorities

We started the project with the mindset of trying out these new and fast-evolving technologies and seeing what is currently possible with them. With so many unanswered questions right from the start, we soon realized that we had been a bit too eager with all the features we had listed for the final game, so we moved some of them to future releases while keeping our eyes on what we envisioned for the overall gameplay: a fun, fast-paced mixed reality game to share with friends, built around collaboration, movement, and impressive scenery.

Conducting Technical Research

XR technology is a rapidly evolving field, as we discovered firsthand during our research into the Meta XR SDK, the Oculus Integration package, and the Interaction SDK. We dove deep into XR development with Unity, exploring how we could use these tools for our project. Our focus was on synchronization across multiple headsets and on creating a mixed world within the game that smoothly blends elements of the virtual and the real world, in line with the effect we were aiming for.

Our research also led us to investigate the transfer and sharing of spatial data for the colocation features in our game, as well as the interaction between real-world objects and virtual elements. We aimed to bring these components together on the Meta Quest 3, which required us to understand various concepts of XR technology to build our game.
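To make the colocation idea a bit more concrete, here is a minimal sketch of the underlying concept, written in plain Python rather than our Unity/C# code and using only illustrative names (nothing below is a Meta XR SDK call): once both headsets know the pose of the same shared anchor in their own tracking space, a single rigid transform maps one player’s coordinates into the other’s.

```python
import numpy as np

def pose_matrix(position, quaternion):
    """Build a 4x4 rigid transform from a position and a unit quaternion (x, y, z, w)."""
    x, y, z, w = quaternion
    rotation = np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - z * w),     2 * (x * z + y * w)],
        [2 * (x * y + z * w),     1 - 2 * (x * x + z * z), 2 * (y * z - x * w)],
        [2 * (x * z - y * w),     2 * (y * z + x * w),     1 - 2 * (x * x + y * y)],
    ])
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = position
    return pose

def alignment_transform(anchor_in_a, anchor_in_b):
    """Map points from headset A's tracking space into headset B's, given the
    pose of the same physical anchor as each headset sees it."""
    # Both poses describe one physical anchor, so: p_b = T_b @ inv(T_a) @ p_a
    return anchor_in_b @ np.linalg.inv(anchor_in_a)

# Illustrative values: the shared anchor as seen by two headsets in the same room.
anchor_in_a = pose_matrix([1.0, 0.0, 2.0], [0.0, 0.0, 0.0, 1.0])
anchor_in_b = pose_matrix([-0.5, 0.0, 1.0], [0.0, 0.7071, 0.0, 0.7071])

a_to_b = alignment_transform(anchor_in_a, anchor_in_b)
virtual_object_in_a = np.array([1.5, 0.0, 2.0, 1.0])  # placed by player A
print(a_to_b @ virtual_object_in_a)                   # where player B should render it
```

In the actual game this step is meant to be handled by the SDK’s spatial anchor sharing; the sketch only illustrates why a single shared reference point is enough to colocate two players in the same room.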

Creating a Product Backlog

Having a clear vision in mind, we got down to the task of creating our product backlog. This comprehensive list included all the features needed for our MVP (Minimum Viable Product) as well as those we planned for future releases. Building on the User Story Map we developed during the conceptual phase, we translated it into practical tickets, seen from the perspective of our users. These tickets included User Stories, Features, and specific tasks.

For project management we used GitHub’s project planning tools. We organized our work into weekly or bi-weekly sprints, each with a different focus. Whether it was asset design, network logic, game state implementation, UI design, or another crucial aspect of the project, each task went to the team member who wanted to tackle it in the upcoming week.

[Image: User story map]
[Image: GitHub project board]

Implementing and Testing our Features

It was finally time to get some actual programming work done, and let us tell you, we rolled up our sleeves and got our hands dirty! The implementation phase was quite an adventure, filled with its fair share of challenges. We encountered unexpected bugs in the SDKs we were working with and had to continually test and re-evaluate our concepts to ensure everything played nicely with the Meta Quest 3 and the provided SDK, all within the Unity environment.

Picture this: we were all gathered in a room, each with our headsets on, gesturing and pressing invisible buttons in the air, probably looking quite strange to anyone on the outside. But in the end, all that effort paid off, and we’re thrilled to have created something awesome to share with all of you.