Affine Tuning

2021 | iPhone app

Embodied Interaction and Dynamic Composition

Affine Tuning seeks to connect digital embodied interaction with dynamic music composition. In a place of their choosing, the audience can interact with musical pieces through their body movement and posture. Their performance is captured by the phone camera and analyzed with pose estimation software. This image processing happens locally on the device, and the feedback is audio only.
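The text above only states that pose estimation runs locally on the phone; it does not name the software. As a rough sketch, assuming Apple's Vision framework (my assumption, not confirmed by the project), analyzing a single camera frame could look roughly like this:

```swift
import Vision
import CoreVideo

// Sketch only: the project does not name its pose estimation software,
// so Apple's Vision framework is used here purely as an assumption.
func detectPose(in frame: CVPixelBuffer) -> VNHumanBodyPoseObservation? {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cvPixelBuffer: frame, orientation: .up)
    do {
        // All processing stays on the device; no image data leaves the phone.
        try handler.perform([request])
        // Return the most confident detected body, if any.
        return request.results?.first as? VNHumanBodyPoseObservation
    } catch {
        return nil
    }
}
```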

One core idea of the project is to present a deliberately open system built from a few rudimentary principles: the intensity of the rhythm follows the energy of the movement, and the pitch of the sound follows the raising of certain limbs. Further, more specialized interaction methods are specific to the selected composition. A rough sketch of the two basic mappings follows below.
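As an illustration of those two principles only, here is a minimal sketch. The joint selection (wrists and shoulder), the confidence threshold, and the scaling are hypothetical placeholders, not the app's actual parameters:

```swift
import Foundation
import CoreGraphics
import Vision

// Minimal sketch of the two rudimentary mappings described above.
// Joint choices, thresholds, and scaling are assumptions for illustration.
struct MotionMapper {
    private var previousWrists: [CGPoint] = []

    /// Rhythm intensity from movement energy: how far the wrists moved
    /// since the previous frame, clamped to [0, 1].
    mutating func rhythmIntensity(from pose: VNHumanBodyPoseObservation) -> Double {
        let joints: [VNHumanBodyPoseObservation.JointName] = [.leftWrist, .rightWrist]
        let wrists = joints
            .compactMap { try? pose.recognizedPoint($0) }
            .filter { $0.confidence > 0.3 }
            .map { $0.location }
        defer { previousWrists = wrists }
        guard !wrists.isEmpty, wrists.count == previousWrists.count else { return 0 }
        let energy = zip(wrists, previousWrists)
            .map { hypot(Double($0.x - $1.x), Double($0.y - $1.y)) }
            .reduce(0, +)
        return min(energy * 10, 1)
    }

    /// Pitch control from how far a raised wrist sits above the shoulder
    /// (normalized image coordinates; Vision's y axis points upward).
    func pitchControl(from pose: VNHumanBodyPoseObservation) -> Double {
        guard let wrist = try? pose.recognizedPoint(.rightWrist),
              let shoulder = try? pose.recognizedPoint(.rightShoulder),
              wrist.confidence > 0.3, shoulder.confidence > 0.3
        else { return 0 }
        return Double(max(0, wrist.location.y - shoulder.location.y))
    }
}
```

In a setup like this, the two resulting values would then be handed to the running composition to drive its rhythm and pitch, while any composition-specific interactions would be layered on top.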

Visualisations from custom software which I use for inspiration.

The reasoning behind this rather loose model is that both the player and the composer should have freedom for artistic expression without coupling every instance of it in a tight feedback loop. Affine Tuning is not an instrument that can be controlled. Rather, it creates a virtual space for a cycle of interpretations, of questions and answers, in which an ephemeral experience with a lasting impression can take place. Ultimately, it represents an optimistic view of ambient interactions between humans and computers.

Nefrin

The music in Affine Tuning consists of original pieces. The first composition is called "Nefrin", which means "curse" in Farsi. A few more pieces will follow before the work is published.



Sarah-Finja Rost performing "Nefrin"

Stay Tuned

None of these descriptions is really satisfying, and the obvious stopgap, the videos on this page, offers only a shadow of the experience. The focus is deliberately not on this kind of presentation, but on the individual experience.

Affine Tuning will be released on the iOS App Store in the first half of 2021. Initially it will include five different songs and interactive sound experiences.

A deep dive into the project can be found in this master's thesis documentation.