Beat Striker

A low-tech take on Beat Saber, the motion-tracking, rhythm-based virtual reality game, written entirely in Python.

The player uses a bright light source, such as a phone torch (easily accessible), to record their movements in real time via an image source such as a PC webcam. The game uses OpenCV to track the player's movement. It also features a 3D engine, written from scratch, that creates the cubes and accurately slices them into fragments based on the player's interaction with them.

Source Code

15112: Fundamentals of Computer Science and Programming
Computer Science Department
Carnegie Mellon University – Pittsburgh
Fall 2021

A webcam-based 3D motion rhythm game imitating the original Beat Saber seemed just about right: it let me explore new territory, strike a good balance in algorithmic complexity, and, most importantly, enjoy the process.

With my inspiration and focus defined, I broke down the areas of concentration for my project.

1. Light Detection: Using the OpenCV module to explore light detection

2. 3D Graphics: Understanding the projection matrix and simplifying the process of drawing objects on a 2D screen by developing a mini 3D engine of my own

3. Beat Detection: Using the Pygame and aubio modules to sync the music to the appearance of the cubes to be sliced, just like in the VR game

Light Detection – OpenCV

Knowing that the game would require the player to move around rapidly without disruption, using a bright light source as the connection between the player and the screen seemed like the best option.

Another idea was to stick simple red/blue tape to the fingertip or, even better, to use silicone finger covers in red/blue. But the torch was easier to detect reliably, even from a distance, giving the player more freedom of movement (a higher priority). However, to enhance the user experience and come closer to the look of the VR game Beat Saber, an option to change the colour of the light strokes was added.

Eventually, the aim is to add a second light stroke and differentiate between the player's left and right hands to increase interactivity.

3D Engine

The engine translates XYZ coordinates to the 2D screen with the help of a projection matrix. Using the projection matrix, the cube's vertices are plotted on the 2D screen and joined by lines to form the cube's outline. The cube is then put in motion by altering its Z-coordinate value.
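A minimal sketch of such a projection, written as a simple perspective divide (equivalent to applying a basic projection matrix). The class name `Projection` and all constants here are assumptions for illustration, not the project's actual listing:

```python
class Projection:
    def __init__(self, width, height, fov=400, camera_z=-5):
        self.cx = width / 2       # screen centre x
        self.cy = height / 2      # screen centre y
        self.fov = fov            # scale factor ("focal length" in pixels)
        self.camera_z = camera_z  # camera sits behind the z = 0 plane

    def project(self, x, y, z):
        """Map a 3D point to 2D screen coordinates."""
        depth = z - self.camera_z   # distance from the camera
        scale = self.fov / depth    # farther points shrink
        return (self.cx + x * scale, self.cy - y * scale)
```

Projecting all eight cube vertices this way and drawing lines between them yields the wireframe cube; increasing each vertex's Z each frame makes the cube fly toward the player.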

While simplifying the overall UI of the game, I realised there was no need to create all six faces of the cube. Instead, I could create just the front and back faces and connect them with line segments. This worked out just right, as the other faces play essentially no significant role.

Cube Slicing: Once the intersection points between the cube edges and the light stroke are calculated, the same projection matrix gives the on-screen coordinates for the resulting cube fragments.

Below is the code written for the Projection Class:

Option 01

Checks edges 0 and 3 only = the intersection is missed.

Option 02

Checks edges 0 and 3 = no intersection
1 and 3 = intersection
2 and 3 = skipped, as one intersection has already been found
A slice is created, but it is not accurate to the user's movement.

Option 03

Checks edges 0 and 1 = no intersection
1 and 2 = intersection
2 and 3 = intersection
Two distinct intersection points are collected on two edges, forming the fragmented cube.
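Each of these edge checks comes down to a 2D segment-segment intersection test between the light stroke and a cube edge. A hedged sketch using the standard parametric form (the helper name is hypothetical):

```python
def segment_intersection(p1, p2, p3, p4):
    """Intersection point of segments p1-p2 and p3-p4, or None if they miss."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        return None  # parallel or degenerate segments
    # t is the position along p1-p2, u along p3-p4.
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    u = ((x1 - x3) * (y1 - y2) - (y1 - y3) * (x1 - x2)) / denom
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None
```

Running this test against the projected cube edges, as in Option 03, yields the two distinct points needed to split the face into fragments.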

Music Beats – Tempo Detection

Aubio is a collection of algorithms and tools to label and transform music and sounds. From the aubio library, the tempo-detection algorithm was used to record the timestamps at which the cubes should appear.

The detected beat times were manually shifted to accommodate the time difference between a cube being generated and it becoming sliceable (reaching the point where the player can actually slice it).

With the help of the Pygame module, the music was manually synced to play in the background with the game.
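A hedged sketch of the spawn-timing side of that sync; the helper name, the fixed travel time, and the loop structure are illustrative assumptions:

```python
import pygame

TRAVEL_TIME = 1.5  # seconds a cube takes to fly from spawn to the player (assumed)

def cubes_to_spawn(beats, music_pos, spawned):
    """Indices of beats whose cubes should be spawned now, so each cube
    arrives at the slicing plane exactly on its beat."""
    return [i for i, b in enumerate(beats)
            if i not in spawned and music_pos >= b - TRAVEL_TIME]

# In the game loop, the music position acts as the master clock:
#   pygame.mixer.music.load("song.ogg")
#   pygame.mixer.music.play()
#   music_pos = pygame.mixer.music.get_pos() / 1000.0
```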