Cine-sthetics

2022, Documentation of the artwork, London

Cine-sthetics is a collection of computational artworks imagining embodied interactions with the moving image.

A set of sensors connected to Arduino micro-controllers allows the user to control a sequence of moving images on a projected screen through gesture, muscle tension, and sound. This work offers users an experimental way to interact with moving images. Cine-sthetics engages with questions around design futures, accessibility, mind, body, and interactivity, informed by prominent theories in embodied cognition. The artist's personal experience of trauma and mental health disability has informed the development of this work.

Cine-sthetics aims to challenge interaction design standards in computing, to consider new ways of interacting that may be more tactile and comfortable for individuals with sensory differences or disabilities, and to create interactions with moving images that are dynamic rather than static.


The outcome of Cine-sthetics is a collection of three micro-controller art projects. Each project uses the Arduino serial data from sensor input as variables for controlling a sequence of moving images in Processing. Each project gathers a different kind of sensor data, which is mapped to the image controls in a way suited to that data type. Clay enclosures hold the infrared and sound sensors for protection and portability.
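The serial values arriving in Processing are raw sensor readings that have to be rescaled before they can drive frame rate or tint. A minimal sketch of that rescaling step, re-implementing Arduino's integer map() in plain C++ so it runs standalone; the 10-bit input range and 1-60 fps output range are illustrative, not the artist's actual values:

```cpp
// Re-implementation of Arduino's map(): rescale a sensor reading from
// one range to another using integer arithmetic, as on the board.
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}
```

For example, a 10-bit analog reading (0-1023) rescaled to 1-60 gives a usable frames-per-second value for the Processing sketch.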

PROJECT 1
Two infrared sensors connected to an Arduino Uno Rev2 collect distance data. The infrared sensors are situated at opposite ends of the clay enclosure to allow for left-right gestural movement of the hand, similar to that of a theremin. In the Processing sketch, the right hand controls the speed of the moving image (frames per second). If the right hand moves beyond a set distance, the sequence of images begins to reverse. The left hand's distance data is mapped to colour controls that influence the tint of the image.
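The control logic described above might look like the following, written as plain C++ rather than a Processing sketch so it runs standalone. The function names, the reversal threshold, and the distance ranges are all invented for illustration:

```cpp
const int REVERSE_THRESHOLD_CM = 60; // right hand beyond this: reverse

// Right-hand distance sets playback speed in fps; a negative value
// signals reversed playback.
int playbackFps(int rightDistanceCm) {
    if (rightDistanceCm > REVERSE_THRESHOLD_CM)
        return -12; // reverse at a fixed illustrative speed
    // closer hand -> faster forward playback (60 fps down to 5 fps)
    return 60 - rightDistanceCm * 55 / REVERSE_THRESHOLD_CM;
}

// Left-hand distance mapped to a 0-255 value for Processing's tint().
int tintFromDistance(int leftDistanceCm) {
    int t = leftDistanceCm * 255 / 80; // assume an 80 cm sensing range
    return t > 255 ? 255 : t;
}
```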

PROJECT 2
A clay enclosure houses an Arduino Uno Rev3. Cuts in the lid allow noise to reach the sound sensor. The serial data is mapped to the frame rate in the Processing sketch. The image sequence plays only when the sound reading crosses a threshold, and its frame rate rises and falls with the noise level.
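The sound gating could be sketched as a single function, again in plain standalone C++; the threshold and the 10-bit reading range are assumptions, not the values used in the work:

```cpp
const int SOUND_THRESHOLD = 200; // 10-bit analog reading (0-1023)

// Returns 0 (paused) below the threshold; otherwise 1-31 fps,
// rising with the noise level.
int frameRateFromSound(int soundLevel) {
    if (soundLevel < SOUND_THRESHOLD) return 0;
    return (soundLevel - SOUND_THRESHOLD) * 30 / (1023 - SOUND_THRESHOLD) + 1;
}
```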

PROJECT 3
A MyoWare 2.0 sensor connects to an Arduino Uno Rev2. The MyoWare sensor collects electromyography (EMG) data corresponding to muscle tension. The serial data from the Arduino is used in the Processing sketch, like the others, to control the frame rate of the image sequence. If the sensor detects muscle tension above a certain value, the animation continues. If the sensor does not detect any muscle tension, the animation stops until muscle tension is detected again.
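The EMG gating amounts to advancing a looping frame counter only while tension exceeds a threshold. A standalone C++ sketch of that step, with an invented threshold value:

```cpp
const int EMG_THRESHOLD = 150; // illustrative 10-bit EMG reading

// Advance through a looping image sequence only under muscle tension;
// otherwise hold the current frame until tension is detected again.
int nextFrame(int currentFrame, int frameCount, int emgValue) {
    if (emgValue < EMG_THRESHOLD)
        return currentFrame;                // no tension: hold
    return (currentFrame + 1) % frameCount; // tension: advance, looping
}
```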


In Cine-sthetics, tactility is a focus, not for novelty but need. The experience of sitting at a desk with a laptop is primarily stagnant aside from scanning eyes and tapping-clicking fingertips. The values embedded in the design of the modern PC split the experience of human interaction in half, separating mind and body in the fashion of dualists.

This work places the body at the centre of human-computer interaction. The experience of using a personal computer with commonplace controllers such as keyboards, mice, and touchscreens is efficient and familiar to most, but it could be argued that this experience lacks tactility. This lack of meaningful engagement with the body may leave some users feeling disconnected.

Unsoldered wires, modularity, and temporality are not flaws but precisely the point of this work. In each work, users can either directly see the circuits or 'uncover' them. Giving transparency to 'what is inside' the controller might allow the audience to entertain the possibility of creating controllers for their individual use and the use of their community.