    Movement Experiment

    This project was an experiment with movement, color, and coding. As a former ballet dancer, I wanted to explore coding in conjunction with movement and combine these two very different fields.

    This project leverages real-time human pose estimation with PoseNet, a machine learning model from TensorFlow.js. Poses are detected in real time from the user's webcam. Machine learning pose estimation makes it possible to locate 17 key body points and gauge human pose and posture from an ordinary 2D image.
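    As a minimal sketch of what such a detection loop can look like in the browser, the snippet below uses the @tensorflow-models/posenet package to read the webcam and log the 17 keypoints each frame. The video dimensions, score threshold, and flipHorizontal setting are illustrative assumptions, not this project's actual configuration.

        // A minimal sketch, assuming the @tensorflow-models/posenet and
        // @tensorflow/tfjs packages and a browser with webcam access.
        import * as tf from '@tensorflow/tfjs';
        import * as posenet from '@tensorflow-models/posenet';

        async function run() {
          // Attach the webcam stream to a <video> element
          const video = document.createElement('video');
          video.width = 640;   // illustrative size
          video.height = 480;
          video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
          await video.play();

          // Load the PoseNet model (default MobileNet backbone)
          const net = await posenet.load();

          // Estimate a single pose on every animation frame
          async function detect() {
            const pose = await net.estimateSinglePose(video, { flipHorizontal: true });
            // pose.keypoints holds 17 entries such as
            // { part: 'nose', position: { x, y }, score }
            for (const kp of pose.keypoints) {
              if (kp.score > 0.5) {  // illustrative confidence threshold
                console.log(kp.part, kp.position.x, kp.position.y);
              }
            }
            requestAnimationFrame(detect);
          }
          detect();
        }

        run();

    In a movement piece like this one, the per-keypoint positions logged above would typically drive drawing or color changes rather than console output.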

    Project Members: Shoshana Rosenfield