Motion tracking with Posenet (research project)

This project idea started with the discovery of PoseNet, a motion-tracking algorithm built on TensorFlow.js. It allows real-time human pose estimation from the browser, using your webcam. One exciting aspect of motion tracking with PoseNet is that it runs entirely in the browser, without any specialised hardware. Below, I explain my idea for a musical application of this algorithm in the context of yoga.
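To make the idea concrete, here is a minimal sketch of how PoseNet's output looks and how a pose might be filtered by confidence. The browser calls are shown as comments (they assume the standard `@tensorflow-models/posenet` package, which is not part of this project's code); the mock pose object and threshold values below are illustrative, not real measurements.

```javascript
// Sketch of the PoseNet browser loop (assumes the @tensorflow-models/posenet
// package, loaded via a <script> tag or bundler in a real page):
//
//   const net = await posenet.load();
//   const pose = await net.estimateSinglePose(videoElement, { flipHorizontal: true });
//
// estimateSinglePose resolves to an object shaped like the mock below: an
// overall score plus named keypoints, each with its own confidence score.
const mockPose = {
  score: 0.92,
  keypoints: [
    { part: 'nose',       score: 0.99, position: { x: 301.2, y: 110.5 } },
    { part: 'leftWrist',  score: 0.45, position: { x: 180.0, y: 320.7 } },
    { part: 'rightWrist', score: 0.12, position: { x: 410.3, y: 318.9 } },
  ],
};

// Keep only keypoints the model is reasonably confident about, so noisy
// detections don't reach the rest of the pipeline.
function confidentKeypoints(pose, minScore) {
  return pose.keypoints.filter((kp) => kp.score >= minScore);
}

const kept = confidentKeypoints(mockPose, 0.4);
console.log(kept.map((kp) => kp.part)); // → [ 'nose', 'leftWrist' ]
```

In practice you would run this filter on every video frame before passing the keypoints on to whatever reacts to the pose.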
An example of the above process, prototyped in Max/MSP and Wekinator. The webcam feed runs through Max, which sends the keypoint data out to Wekinator. Wekinator does the machine learning: it records the pose data from the video feed as its input, then sends its output back to Max via OSC.
The system, built in Max/MSP, detects recognised poses and plays a sound in response.
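The Wekinator side of this pipeline expects a fixed-length vector of numbers on each OSC message. A hedged sketch of that step is below: the `poseToVector` helper and the sample pose are hypothetical, and the OSC send (shown as comments) assumes the `node-osc` package and Wekinator's documented defaults (port 6448, address `/wek/inputs`), not anything from this project's Max patch.

```javascript
// Flatten a PoseNet pose into a fixed-length numeric vector, the shape
// Wekinator expects as one input message. Coordinates are normalised by the
// video dimensions so the mapping is independent of webcam resolution.
function poseToVector(pose, width, height) {
  const vec = [];
  for (const kp of pose.keypoints) {
    vec.push(kp.position.x / width, kp.position.y / height);
  }
  return vec;
}

// Hypothetical two-keypoint pose for illustration.
const pose = {
  keypoints: [
    { part: 'nose',      position: { x: 320, y: 120 } },
    { part: 'leftWrist', position: { x: 160, y: 360 } },
  ],
};

const inputs = poseToVector(pose, 640, 480);
console.log(inputs); // → [ 0.5, 0.25, 0.25, 0.75 ]

// Sending the vector to Wekinator over OSC (hypothetical usage of the
// node-osc package; Wekinator listens on port 6448 by default and reads
// inputs from the /wek/inputs address):
//
//   const { Client } = require('node-osc');
//   const client = new Client('127.0.0.1', 6448);
//   client.send('/wek/inputs', ...inputs);
```

Wekinator's classifier output can then travel back over OSC to Max, which triggers the sound, exactly as in the prototype described above.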