Mixed-reality app for teaching basic programming honored at Stanford hackathon
Two Mines students were part of the team that developed the first-of-its-kind tangible programming experience for beginners
A pair of Mines students were part of the team that designed and built the best augmented reality/virtual reality platform at the 2020 Stanford TreeHacks.
MusicBlox, the winner of TreeHacks’ AR/VR Challenge, combines mixed reality with introductory computer programming lessons to create a first-of-its-kind tangible programming experience for beginner students.
Nhan Tran and Natalie Kalin, both master’s students in computer science at Mines, teamed up with two undergraduate students from the California College of the Arts and the University of Maryland to form the winning team.
“Our app lets people interact with virtual objects just like physical objects,” Tran said. “Imagine playing with tactile Lego pieces but all of them are computer-generated so you won’t have to clean them up or hurt yourself by stepping on the pieces. Also, users won’t have to worry about misplacing the pieces or buying new hardware as we can easily roll out new updates virtually.”
The AR/VR Challenge, one of six tracks that hackers could choose at the 36-hour event, pushed hackers “to create their own realities and the tools necessary to experience them in order to tackle a breadth of problems and push the boundaries of human experiences.”
Kalin and Tran’s own experience as graduate student instructors in the introductory computer science class at Mines helped inspire the team’s project.
“Students struggle with abstract concepts like loops and methods,” Kalin said. “MusicBlox gives students an understanding of these concepts via real-world objects they are already familiar with.”
“For example, the first level in our platform has three tiles: a music note, a piano and a sound. Students must organize these tiles in the correct order to convey their understanding of a method or function. They must first order the note (which is like an input), then the piano (which is the function), and finally the sound (which is the output from a function),” she said. “This reinforces a rather abstract concept using a musical approach that new programmers are already familiar with.”
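The tile ordering Kalin describes maps directly onto the anatomy of a function in code. A minimal sketch in Python of that mapping (the names `piano`, `note` and `sound` are illustrative, borrowed from the tiles, not from MusicBlox itself):

```python
# Illustrative sketch of the concept the first level teaches:
# the note is the input, the piano is the function, the sound is the output.

def piano(note):
    """The 'function' tile: transforms an input note into a sound."""
    return f"{note} sound"

note = "C"           # the input tile
sound = piano(note)  # applying the function tile produces the output tile
print(sound)         # -> C sound
```

Ordering the tiles note → piano → sound corresponds to passing an input to a function and receiving its output.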
The team also drew inspiration from Google’s Project Bloks, which utilizes hardware for tangible programming, and the reality-based interaction framework, a technique for teaching programming languages without electronics or power supplies.
“This framework has two main principles: First, interaction takes place in the real world, so students no longer program behind large computer monitors where they have easy access to distractions such as games, IM and the Web,” Tran said. “Second, interaction behaves more like the real world. That is, tangible languages take advantage of students’ knowledge of the everyday, non-computer world to express and enforce language syntax.”
MusicBlox was designed for the Magic Leap AR headset – the technology Kalin is currently using for her research – although the team hopes to expand it to more readily available platforms like smartphones and tablets in the future.
“After our research phase, we realized there was no such AR/VR platform for tangible programming on the market, so we decided to start one,” Tran said.