Thursday, 2 April 2015



SENSOR MOTION TRACKING FOR CONTROL OF ABLETON LIVE AUTOMATION PARAMETERS 

Andres Caicedo, 21199639

The sensor motion tracking for control of Ableton Live automation parameters is a device being designed to control any MIDI-editable parameter inside the software Ableton Live. The project uses an Xbox Kinect as the main sensor and receiver of information, in charge of mapping and tracking the joints and movements of the human body in three-dimensional space. This information is then processed, sent and controlled by Max for Live via OSC (Open Sound Control) messages, giving the user control of parameters such as panning dials, level faders, tap tempo and even controls inside plug-ins and effects.



MAIN DRIVERS 

The drivers that form the brain of the whole system are Synapse and OpenNI. Synapse is in charge of receiving the input data from the Kinect and sending it out to Max MSP or Ableton Live.

OpenNI works alongside Synapse; its main function is to make possible the recognition, tracking and mapping of the different body joints.

Both drivers ship with complete program code; this code is shared with the Max MSP patches in order to achieve the skeleton mapping.




The user stands in front of the Kinect camera in a position that makes it easier for the drivers to identify the body joints and then map the skeleton.

SYNAPSE
As mentioned above, the driver that makes possible the reading of the information received from the Kinect is Synapse. Synapse itself sends and receives a series of valid messages that are programmed within it. It sends these messages through ports 12345 and 12347, and receives data through port 12346. Once the driver is installed together with the OpenNI framework, a new object becomes available inside the Max MSP folder; this object is in charge of receiving and sending the Synapse data, making possible the communication across the whole system: Synapse, OpenNI, Max MSP (Max for Live) and Ableton Live.
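Because Synapse speaks plain OSC over UDP, the port scheme above can be exercised from any language, not only from Max. Below is a minimal sketch in Python that encodes an OSC message by hand and fires it at Synapse's listening port 12346; the joint name and mode value in the usage comment are just examples, and the helper names are mine, not part of Synapse.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """NUL-terminate and pad a byte string to a 4-byte boundary, as OSC requires."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *ints: int) -> bytes:
    """Encode an OSC message whose arguments are all 32-bit big-endian ints."""
    data = osc_pad(address.encode())
    data += osc_pad(("," + "i" * len(ints)).encode())  # type-tag string, e.g. ",i"
    for v in ints:
        data += struct.pack(">i", v)
    return data

def send_to_synapse(msg: bytes, host: str = "localhost", port: int = 12346) -> None:
    """Send one datagram to Synapse's input port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(msg, (host, port))
    sock.close()

# e.g. ask Synapse to track the right hand in body-relative mode:
# send_to_synapse(osc_message("/righthand_trackjointpos", 1))
```

A listener bound to port 12345 with the same socket API would receive the joint data Synapse sends back out.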

JOINT TRACKING
Tracking the different joints, keeping them together and following the body's movements is carried out with a message sent to Synapse, /<joint>_trackjointpos, which takes an integer argument selecting the tracking mode.

The message is focused on mapping the skeleton and on the tracking modes (pos_body, pos_world and pos_screen). Sending it keeps the joints from drifting away from their positions between frames; if it is not re-sent regularly, the data stream drops out. The way to do this is to pick which joint to track and in which mode, and finally to create the main patch in Max specifying the input and output ports used by Synapse.
 
The tracking modes pos_body, pos_world and pos_screen are selected with the values 1, 2 and 3 respectively. The number 1 shown in the message box in the image therefore means that the joint being tracked is measured in pos_body mode, i.e. the joint's distance relative to the torso.
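As a sketch, the mode-to-value mapping above can be captured in a small helper; the joint names passed in are placeholders, since Synapse defines the actual joint vocabulary.

```python
# Synapse tracking modes and their integer message values (from the text above).
TRACK_MODES = {"pos_body": 1, "pos_world": 2, "pos_screen": 3}

def track_message(joint: str, mode: str):
    """Build the OSC address and integer argument that ask Synapse
    to track `joint` in the given tracking mode."""
    return f"/{joint}_trackjointpos", TRACK_MODES[mode]
```

For example, `track_message("righthand", "pos_body")` yields the address/argument pair that selects torso-relative tracking of the right hand.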


ABLETON LIVE CONTROL OF PARAMETERS

One of the final steps of the project is to create in Max for Live the patches capable of receiving the data from the Kinect drivers, and to find a way to edit that data so that the mapped joints gain control inside Ableton Live.

There will be two different patches in charge of two different actions. The first will be focused on controlling most of the continuous parameters in Ableton, including level faders, panning dials, and faders and dials inside plug-ins and effects. The second patch will be in charge of triggering actions with quick movements or hits made by the user; these actions are generally focused on pressing buttons such as play, stop, bypass, on, off, etc.
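The two behaviours can be sketched outside Max as well; in the Python sketch below, the coordinate window and the hit threshold are made-up illustration numbers, not figures taken from the patches.

```python
def scale_axis(value: float, lo: float, hi: float) -> float:
    """First-patch behaviour: map a joint coordinate in the window
    [lo, hi] onto the 0.0-1.0 range a continuous parameter expects,
    clamping anything outside the window."""
    t = (value - lo) / (hi - lo)
    return min(max(t, 0.0), 1.0)

def detect_hit(prev: float, curr: float, threshold: float = 200.0) -> bool:
    """Second-patch behaviour: trigger a button press when the joint
    jumps more than `threshold` units between consecutive frames."""
    return abs(curr - prev) > threshold
```

Slow hand movements then ride a fader smoothly through `scale_axis`, while a sharp hit clears the `detect_hit` threshold and fires a one-shot action.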

The next picture shows in more detail the process and the different objects used in Max for Live to map and control any moving parameter in Ableton.



As mentioned before, the "udpreceive 12345" object is in charge of receiving the data coming from Synapse, while "udpsend localhost 12346" sends messages back to it. The picture also shows the option of choosing between the tracking modes (Body relative, World relative, Screen position) and the x, y and z axes.
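On the receiving side, a joint-position message carries the x, y and z coordinates as 32-bit floats. A minimal Python decoder for such a packet, assuming a single non-bundled OSC message with only float arguments, might look like this:

```python
import struct

def parse_osc_floats(packet: bytes):
    """Decode one OSC message whose arguments are all float32
    (e.g. a joint's x, y, z coordinates). Returns (address, values)."""
    end = packet.index(b"\x00")                 # end of the address string
    address = packet[:end].decode()
    offset = (end // 4 + 1) * 4                 # skip address padding
    tag_end = packet.index(b"\x00", offset)     # end of the type-tag string
    tags = packet[offset:tag_end].decode()      # e.g. ",fff"
    offset = (tag_end // 4 + 1) * 4             # skip type-tag padding
    n = len(tags) - 1                           # number of float arguments
    values = struct.unpack(f">{n}f", packet[offset:offset + 4 * n])
    return address, list(values)
```

Selecting an axis in the patch then corresponds to picking one element of the returned value list.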

The next video shows the functioning of the whole project more completely. Because, unfortunately, the orchestral composition has not been finished yet, some audio and sound effects have been taken from Ableton Live in order to show how the audio is controlled and modified by the device.