I bought a MYO armband from Thalmic Labs a couple of years ago. It seemed like a good way of getting movement data from a dancer in a form that I can use for controlling a responsive environment. The problem was that while the hardware is good, the developers are very much geared towards gesture control of existing software – things like controlling iTunes with a wave of your hand, changing volume, skipping tracks, etc.
What I needed was a wrapper that takes the data from the MYO and outputs it as useful OSC or MIDI. I experimented with an open source wrapper called myOSC built by a chap called Ben Kuper. It allowed me to get basic movement and orientation data from the MYO into Isadora, but it was not very stable, so I gave up on it, and the MYO stayed in its box for a year or so.
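For anyone curious what "outputs it as OSC" actually involves under the hood: an OSC message is just a null-padded address string, a type-tag string, and big-endian arguments, so a wrapper can pack, say, the armband's orientation quaternion into a UDP packet with nothing but the standard library. This is a minimal sketch of my own, not code from myOSC; the `/myo/orientation` address and the `osc_message` helper are hypothetical names I've chosen for illustration.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    # Encode an OSC message whose arguments are all float32 ("f" type tags)
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))
    for value in floats:
        msg += struct.pack(">f", value)  # big-endian float32, per the OSC spec
    return msg

# A hypothetical identity quaternion from the armband, ready to send
# over UDP with socket.sendto(packet, ("127.0.0.1", 9000)):
packet = osc_message("/myo/orientation", 0.0, 0.0, 0.0, 1.0)
```

On the receiving side, software like Isadora just listens on the matching UDP port and unpacks the same layout, which is why a thin wrapper like this is all that's needed between the armband's SDK and the show-control software.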
Fast-forward to February 2016 and I chance upon a very clever chap called Balandino Di Donato, a PhD candidate living in Birmingham. His specialist area is gesture control of music, and he's built an app that does exactly what I want. I downloaded it and it works beautifully. Now I can really start to explore what the MYO can do.