Sonus is a sound- and location-based social network.
Recordings are posted in physical locations, to be encountered later by others when they visit those locations. As these posts accumulate in space, they form visible “clouds.” As with atmospheric clouds, these assume abstract but classifiable forms reflective of dynamic and emergent conditions: in this case, social rather than meteorological.
Fundamentally, this sound-and-space project aims to center accessibility for the visually impaired in its functionality, culminating in cloud-like traces of a spatialized auditory embodiment of place.
https://vimeo.com/blakeshae/aural-clouds
https://www.blakeshaekos.com/work/sonus
To provide public access to my work, I created an interactive exhibition.
I coded audio effects for publicly recorded audio in SuperCollider and prototyped an interactive interface between SuperCollider and a Raspberry Pi using Python and OSC. Ultimately, the interactive element was not used in the final exhibition due to time constraints and a lack of buy-in from my thesis advisor. Instead, I used the Raspberry Pi to play back audio created in SuperCollider alongside video recorded at various locations around Los Angeles.
https://vimeo.com/867293963
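To illustrate the prototype's plumbing, here is a minimal sketch of a Python-to-SuperCollider OSC bridge of the kind described above. It assumes the python-osc and gpiozero libraries, sclang listening on its default language port (57120), a button wired to GPIO pin 17, and a hypothetical /sonus/trigger OSC address; these specifics are placeholders, not details from the project itself.

```python
# Minimal sketch of a Raspberry Pi -> SuperCollider OSC bridge.
# Assumptions: python-osc and gpiozero are installed, sclang runs on
# the same Pi and listens on its default language port (57120), and
# an OSCdef in SuperCollider handles the hypothetical '/sonus/trigger'
# address. The GPIO 17 button wiring is likewise assumed.
from signal import pause

from gpiozero import Button
from pythonosc.udp_client import SimpleUDPClient

SC_HOST = "127.0.0.1"   # sclang running locally on the Pi
SC_LANG_PORT = 57120    # SuperCollider's default language port

client = SimpleUDPClient(SC_HOST, SC_LANG_PORT)
button = Button(17)     # hypothetical wiring: momentary button on GPIO 17

def trigger_effect():
    # Notify SuperCollider; the address and argument are placeholders
    # for whatever the receiving OSCdef expects.
    client.send_message("/sonus/trigger", 1)

button.when_pressed = trigger_effect
pause()  # keep the script alive, waiting for button presses
```

On the SuperCollider side, an OSCdef bound to the same address would receive each message and start or modulate the effect synths accordingly.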