CONTACT

CONTACT was a series of projects during my MSc in Adaptive Architecture and Computation at the Bartlett School of Architecture exploring the use of structure-borne sound as an interaction medium. The work involved listening to and digitally analysing how sound travelled through various materials to work out where an object had been touched and how it had been interacted with. Combining this acoustic system with projection mapping allowed ordinary objects like wooden tables to become touch-sensitive visual interfaces.


The first instance of the CONTACT project turned a wooden table into a touch-sensitive audio-visual instrument, such that every tap, stroke or bump resulted in expressive and meaningful audible and visual feedback. Two contact microphones (traditionally used for acoustic instrument recording) were attached at either end of the table to pick up vibrations across the whole surface. The high temporal resolution of the audio/vibration data from the microphones meant that haptic interactions could carry far more nuance and expression than traditional interfaces such as buttons or touch screens. Gestures such as thumping with the wrist could trigger deep kick drum sounds, while taps with the fingernails could become sharp snare or clap sounds. In addition to using the frequency data to trigger different sounds, the raw audio signal from the table was processed through several audio effects to resonate its natural sound at different pitches, effectively 'tuning' the instrument digitally.
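The idea of distinguishing a dull wrist thump from a sharp fingernail tap by its frequency content can be sketched roughly as follows. This is a minimal illustration, not the project's actual signal chain: the amplitude threshold, refractory period and 1 kHz centroid split are all assumed values chosen for the example.

```python
import numpy as np

def detect_onsets(signal, threshold=0.2, hold=512):
    """Simple amplitude-threshold onset detector with a refractory
    period so the ringing tail of one hit is not counted twice.
    (Illustrative parameters, not those used in CONTACT.)"""
    onsets, i = [], 0
    while i < len(signal):
        if abs(signal[i]) > threshold:
            onsets.append(i)
            i += hold  # skip the decay of the same hit
        else:
            i += 1
    return onsets

def classify_hit(frame, sample_rate=44100):
    """Label a transient frame from a contact mic by its spectral
    centroid: energy concentrated low suggests a wrist thump (kick),
    energy concentrated high suggests a fingernail tap (snare/clap).
    The 1 kHz threshold is an assumption for the sketch."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    return "tap" if centroid > 1000.0 else "thump"
```

In a live setting the onset detector would gate short frames out of the microphone stream and hand each one to the classifier, which then selects the sample to trigger.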


The instrument required a way to control the artificially resonated pitch (as well as the additional reverb and filter effects); however, any contact with the table surface produced additional sound. A hand-tracking camera therefore let the user manipulate the sound live using spatial gestures above the table, without making physical contact with the instrument. I reused some audio-spatial gestures from my previous research that I had found particularly effective (such as the pitch-to-height metaphor) and created visualisations of the interactions that also responded visually to the audio effects. For example, raising your hand high above the table would increase the reverb on the sounds, so the visuals would leave long decaying trails of motion to match the extended lifetime and envelope of the sounds.
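The pitch-to-height mapping described above amounts to normalising the tracked hand height into a control range and scaling it onto the effect parameters. A minimal sketch, assuming a working range of 100–600 mm above the table and a one-octave pitch span (both illustrative values, not from the original project):

```python
def map_height(height_mm, lo=100.0, hi=600.0):
    """Map tracked hand height above the table to two normalised
    controls: reverb wet mix (0..1) and a pitch offset in semitones
    (0..12). Range and octave span are assumptions for the sketch."""
    t = min(max((height_mm - lo) / (hi - lo), 0.0), 1.0)
    reverb_mix = t               # higher hand -> wetter, longer reverb
    pitch_semitones = t * 12.0   # pitch-to-height: up to one octave up
    return reverb_mix, pitch_semitones
```

Clamping to the 0..1 range keeps the mapping stable when the tracker briefly reports the hand outside the working volume.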

Developments and adaptations of the project included a public installation at the Royal Academy of Arts 'Sensing Spaces' Late exhibition (see below).

The CONTACT research project culminated in my MSc thesis, which further explored structure-borne sound as an expressive interaction medium. The number of microphones was expanded to four to explore precise location through sonic triangulation, then reduced to one by exploring how neural networks and machine learning could map and classify different interactions with asymmetric objects. The research concluded that everyday objects could be 'taught' to listen to different haptic interactions by analysing the vibrations passing through them, opening up the possibility of creating new interactive interfaces from previously static objects.
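The triangulation idea rests on time-difference-of-arrival (TDOA): a tap's vibration reaches the nearer microphone first, and the lag between two channels fixes its position along the line between them (with more mic pairs, in 2D). A sketch of the one-pair case, assuming an idealised constant wave speed in wood; the 3500 m/s figure is a rough textbook value, not a measurement from the thesis:

```python
import numpy as np

def tdoa_position(sig_a, sig_b, sample_rate, mic_distance, wave_speed=3500.0):
    """Estimate where a tap landed on the line between two contact mics.
    Cross-correlation gives the lag (in samples) of channel A relative
    to channel B; converting lag -> seconds -> path-length difference
    places the tap between the mics. Assumes a constant wave speed."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)  # >0: A arrived after B
    dt = lag / sample_rate                    # arrival-time difference (s)
    dd = dt * wave_speed                      # d_A - d_B in metres
    # d_A + d_B ~= mic_distance for a tap on the line between the mics.
    x = (mic_distance + dd) / 2.0             # distance from mic A
    return min(max(x, 0.0), mic_distance)
```

Real wood is dispersive and anisotropic, which is partly why the thesis moved from geometric triangulation towards learned classification of interactions.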


Copyright Felix Faire 2020 ©