Grace Hopper senior Isabel Hirama demonstrates her AI-assisted salsa dancing lessons that allow players to control the program with body movements and get real-time feedback on their salsa steps.
Movement tracking was achieved using Google's PoseNet machine learning model, which performs real-time human pose estimation in-browser using TensorFlow.js. Overlay graphics were created using HTML Canvas.
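The tracking-and-overlay loop described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the element ids, score threshold, and drawing style are assumptions, while `posenet.load` and `estimateSinglePose` are the standard PoseNet API in TensorFlow.js. It assumes a page with a `<video>` element showing the webcam feed and a `<canvas>` positioned over it.

```javascript
import * as posenet from '@tensorflow-models/posenet';

async function run() {
  // Hypothetical element ids; the real page layout may differ.
  const video = document.getElementById('webcam');
  const canvas = document.getElementById('overlay');
  const ctx = canvas.getContext('2d');

  const net = await posenet.load(); // MobileNetV1 defaults

  async function frame() {
    // One pose per frame; flipHorizontal mirrors the webcam image.
    const pose = await net.estimateSinglePose(video, { flipHorizontal: true });

    ctx.clearRect(0, 0, canvas.width, canvas.height);
    for (const kp of pose.keypoints) {
      if (kp.score > 0.5) { // skip low-confidence joints
        ctx.beginPath();
        ctx.arc(kp.position.x, kp.position.y, 5, 0, 2 * Math.PI);
        ctx.fillStyle = 'aqua';
        ctx.fill();
      }
    }
    requestAnimationFrame(frame);
  }
  frame();
}
run();
```

Because PoseNet runs entirely in-browser, no video ever leaves the user's machine, which suits a dance application where users may be filming themselves at home.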
The project is a prototype for a suite of AI-assisted dance lessons that would give users a choice between fully human instruction (perhaps via video), fully AI instruction, and a synthesis of the two. The prototype demonstrates the synthesis mode: a human instructor gives instructions and occasional specific feedback, while the AI component provides broad, immediate feedback based on the dancer's position and step.
A fully AI-taught iteration would compare users' movements to data collected from professional salsa dancers, and give specific, immediate visual feedback based on the differences between the two.
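One way to make that comparison concrete is to normalize both poses (so camera distance and body size don't matter) and then measure the average distance between corresponding joints. The sketch below is an illustrative assumption about how such a comparison might work, not the project's method; the keypoint format follows PoseNet's `{ part, position: { x, y } }` shape, and the normalization scheme (origin at the hip midpoint, scaled by torso length) is invented for this example.

```javascript
// Normalize keypoints: origin at the hip midpoint, unit length equal
// to the hip-to-shoulder distance, so position and body size cancel out.
function normalize(keypoints) {
  const m = {};
  for (const kp of keypoints) m[kp.part] = kp.position;
  const hipX = (m.leftHip.x + m.rightHip.x) / 2;
  const hipY = (m.leftHip.y + m.rightHip.y) / 2;
  const shoX = (m.leftShoulder.x + m.rightShoulder.x) / 2;
  const shoY = (m.leftShoulder.y + m.rightShoulder.y) / 2;
  const scale = Math.hypot(shoX - hipX, shoY - hipY) || 1;
  const out = {};
  for (const kp of keypoints) {
    out[kp.part] = {
      x: (kp.position.x - hipX) / scale,
      y: (kp.position.y - hipY) / scale,
    };
  }
  return out;
}

// Mean per-joint Euclidean distance between two normalized poses.
// Lower means the user's pose is closer to the professional's.
function poseDistance(userKps, refKps) {
  const user = normalize(userKps);
  const ref = normalize(refKps);
  let total = 0;
  let n = 0;
  for (const part of Object.keys(ref)) {
    if (user[part]) {
      total += Math.hypot(user[part].x - ref[part].x,
                          user[part].y - ref[part].y);
      n += 1;
    }
  }
  return n ? total / n : Infinity;
}
```

Per-joint distances (rather than just the total) would also identify *which* limb is off, which is what specific visual feedback needs.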
Project Members: Isabel Hirama