Cycling '74 Max Projects

Ryouteutau (Two-Handed Singing)

Musical performance with the flick of a finger.

Using the Leap Motion controller, this project combines musical performance, improvisation, visualization, and non-traditional scoring all at once. Naturalistic controls remove the constraint of keyboard input and extend the possibility of musical creation equally to all users, regardless of theoretical knowledge or prior musical experience. Accompaniment tracks can be selected from a bank of reimagined classical pieces such as Debussy's Clair de Lune, Barber's Adagio for Strings, and Beethoven's Sonata No. 8 (Pathétique). By pressing fingers together, users can access the project's different functions: selecting an accompaniment track, capturing an image of their unique visualization, or resetting the project for a new user.

The Mechanics

This project relies on a constant concert of interaction between the user, the Leap Motion controller, and Cycling '74 Max code. To keep implementation manageable, the mechanics were broken down into individual abstractions of interaction - a separate piece of code for each function the user can access - and then combined into a final project in which they all work together harmoniously. Below are the individual abstractions, followed by the final cumulative project.

Abstraction 1: The Visualization

This piece of code enables the user to create an abstract piece of visual art with the Leap Motion controller. A color randomizer keeps the result visually stimulating with no more than a wave of the finger. The tip of the index finger provides the xy coordinates for the drawing, and when two hands are used an interesting vertical mirroring effect also occurs.
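
As a minimal sketch of the per-frame drawing step - written in Python for illustration, since the actual implementation is built from Max objects, and the function name here is hypothetical - the logic reads a fingertip position, attaches a random color, and plots a point:

    import random

    def paint_point(fingertip_x, fingertip_y):
        """One drawing step: plot the index fingertip's xy position
        with a randomized RGB color, as the patch's color randomizer does."""
        color = (random.random(), random.random(), random.random())
        return (fingertip_x, fingertip_y, color)

    # With two hands, the patch handles one point per hand each frame,
    # which is where the vertical mirroring effect appears.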

Abstraction 2: The Two-Handed Theremin

This piece of code works with the Leap Motion controller to let the user play a two-handed, finger-position-based instrument. The left hand samples from a playlist of cello clips, and the right hand samples from a playlist of violin clips. Height above the sensor determines volume, and left-to-right position determines pitch: the further left the hand is positioned, the lower the note. Also worth mentioning: the two sample sets and their pitch ranges were selected specifically to minimize overlap, so the two instruments sound distinct no matter what the user plays.
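
As a rough sketch of that mapping - Python for illustration, with the interaction ranges as guesses at a usable zone above the sensor - each hand's position is normalized, and one axis is sent to pitch, the other to volume:

    def hand_to_params(x_mm, y_mm,
                       x_min=-200.0, x_max=200.0,
                       y_min=80.0, y_max=400.0):
        """Map a hand's position (in millimetres) to normalized pitch
        and volume: horizontal position drives pitch (further left =
        lower note), height above the sensor drives volume."""
        def clamp01(v):
            return min(max(v, 0.0), 1.0)
        pitch = clamp01((x_mm - x_min) / (x_max - x_min))
        volume = clamp01((y_mm - y_min) / (y_max - y_min))
        return pitch, volume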

This is the "Note Picker" subpatch. This operates as a functionality of the Theremin patch and breaks the data input from the Leap Motion controller into 24 discreet groups, which enables the larger patch to decide which note it should play in the scale. The output messages signal to the playlist which clip to play next for both the cello and violin theremins.

Abstraction 3: Middle Finger Touch Detection

This abstraction works with the Leap Motion controller to detect when a user presses the tips of both middle fingers together. The code relies on the distance formula, so it behaves the same anywhere in the sensor's detectable range. When first switched on, it takes a few seconds to begin tracking the fingers accurately enough to detect a touch collision, but after that it remains quite sensitive. In this project, the tap triggers a screenshot of the entire monitor display via the "shell" tool.
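
The underlying check is just the three-dimensional distance formula; a minimal Python sketch, with the touch threshold as an illustrative guess:

    import math

    def tips_touching(tip_a, tip_b, threshold_mm=15.0):
        """Return True when two fingertip positions (x, y, z) are close
        enough to count as a touch. Because only relative distance is
        measured, the check behaves the same anywhere in the sensor's
        detectable range."""
        dx, dy, dz = (a - b for a, b in zip(tip_a, tip_b))
        return math.sqrt(dx * dx + dy * dy + dz * dz) < threshold_mm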

Abstraction 4: Pinky Touch Detection

This patch indicates when the user's pinkies touch above the Leap Motion sensor. Much like the previous patch, it uses the distance formula and is therefore effective across the Leap's detection range. Also like the previous patch, it takes a few seconds for the Leap to establish enough tracking accuracy to actually detect a collision, but once it has, it functions reliably. In the final version of this project, the patch serves as a "reset" (i.e. erasing the existing visualization and stopping any accompaniment track).

Abstraction 5: Thumb Touch Detection

This piece of code uses the Leap Motion controller to detect when a user presses their thumb tips together. Like the other collision-detection patches, it relies on the distance formula and works across the Leap's sensory range. When the thumbs are pressed together, a playlist cycles to the next accompaniment option. The tracks are drawn from various classical pieces, reduced to act purely as accompaniment and looped so that a user may interact with a single piece for as long as they'd like.
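
The cycling behavior amounts to a counter that wraps around the playlist; a small Python sketch, with placeholder track names:

    TRACKS = ["clair_de_lune", "adagio_for_strings", "pathetique"]

    class AccompanimentPicker:
        """Each detected thumb tap advances to the next looping track."""
        def __init__(self, tracks):
            self.tracks = tracks
            self.index = -1

        def on_thumb_tap(self):
            self.index = (self.index + 1) % len(self.tracks)
            return self.tracks[self.index]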

The Completed Project

This patch is the entirety of the compiled, functional project. Each of the subpatches (i.e. p ThumbTouch, p MiddleTouch, p DrawingStuff, etc.) is a collapsed version of one of the abstractions detailed above. The patch is fully functional and operates exclusively on input from the Leap Motion controller - once it is up and running, no keyboard or mouse input whatsoever is required.

Further Information

Further documentation on this project - including downloadable code for every Max patch detailed - is available here. If questions or concerns persist, please visit my Contact page and reach out to me directly.

Kaleidoscopic Keyboard

This project takes input from a full piano-style MIDI keyboard and combines chord progressions with live camera input to create unique visualizations for each user. The user's face is projected on a screen, and as they interact with the keyboard, not only is sound produced but the camera input is also distorted and warped, repeatedly folded in on itself, and even rotated. The effect can be dizzying and quite surreal, the images abstracted to the point that users may not initially recognize that it is footage of themselves on the screen.

This is the code of the completed project. When active, it takes input from both the built-in camera and an attached MIDI keyboard. Users are not required to interact with the computer keyboard or mouse whatsoever; all input originates from the MIDI keyboard and camera. A foot pedal may be attached to the sustain function of the MIDI keyboard to access an additional video effect on the camera input.

Keyboard Functionality Subpatch

This subpatch serves as the spine of the project's functionality. Here, input from the attached MIDI keyboard is received and routed precisely into various subcategories to produce different effects as the data is processed. In the final version of the project, the subpatch can distinguish major chords from minor chords, and depending on the tonality of the input, different video effects are applied to the camera feed.
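
One way to sketch the major/minor distinction in Python (a simplification that assumes the chord's root is its lowest note - the actual subpatch is built from Max objects and may work differently):

    def chord_quality(midi_notes):
        """Classify held MIDI notes as 'major' or 'minor' by checking
        for a major or minor third above the lowest note."""
        if len(midi_notes) < 3:
            return None
        root = min(midi_notes)
        intervals = {(n - root) % 12 for n in midi_notes}
        if 4 in intervals:
            return "major"   # major third: 4 semitones above the root
        if 3 in intervals:
            return "minor"   # minor third: 3 semitones above the root
        return None

    # e.g. chord_quality([60, 64, 67]) -> "major" (C E G)
    #      chord_quality([60, 63, 67]) -> "minor" (C Eb G)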

Further Information

For further information on this project, please visit my Contact page and reach out to me directly.

In Full Bloom

This project worked with preloaded videos and a web app so that users could interact with the project directly from their phones. When the correct URL was loaded, this patch functioned in conjunction with input from the webpage, letting users produce specific pitches and use those pitches to modify videos running on a loop. The saturation of the videos increased as users carefully puzzled out the correct chord for each video, until, once harmony was reached, the videos were fully saturated with color.
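
The saturation behavior can be sketched as a simple ratio - how much of the target chord the audience has matched - with the pitch sets here as placeholders:

    def saturation(played_pitches, target_chord):
        """Return 0.0..1.0: the fraction of the target chord currently
        sounding. Full harmony means a fully saturated video."""
        matched = set(played_pitches) & set(target_chord)
        return len(matched) / len(target_chord)

    # e.g. saturation({"C", "E"}, {"C", "E", "G"}) -> 2/3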

Further Information

For further information on this project, please visit my Contact page and reach out to me directly.
