
# Sonus

## Project Description

Sonus is a sound-and-location based social network.

Recordings are posted in physical locations, to be encountered later by others when they visit those locations. As these posts accumulate in space, they form visible “clouds.” As with atmospheric clouds, these assume abstract but classifiable forms that reflect dynamic and emergent conditions: in this case, social rather than meteorological.

Fundamentally, this sound-and-space project centers accessibility for the visually impaired in its functionality, and it culminates in cloud-like traces of a spatialized auditory embodiment of place.
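As a hypothetical illustration of this mechanic, the sketch below models geotagged recordings and one simple density rule for classifying an accumulation of posts into the four cloud forms shown below. Every name and threshold here is an assumption for illustration, not the app's actual logic.

```python
from dataclasses import dataclass

@dataclass
class Recording:
    """One sound post anchored to a physical location (hypothetical model)."""
    audio_url: str
    lat: float
    lon: float
    timestamp: float  # seconds since epoch

def classify_cloud(recordings: list[Recording]) -> str:
    """Classify an accumulation of co-located posts into a cloud form.

    Hypothetical rule of thumb: the denser the accumulation, the heavier
    the cloud. The real classification is described in the project only
    as reflecting dynamic and emergent social conditions.
    """
    n = len(recordings)
    if n < 5:
        return "Cirrus"   # sparse and wispy
    if n < 20:
        return "Stratus"  # an even layer
    if n < 50:
        return "Cumulus"  # a dense heap
    return "Nimbus"       # heavy and saturated
```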

Animated renderings of the four cloud forms: Cirrus, Cumulus, Stratus, and Nimbus.

Full video of the experience:

https://vimeo.com/blakeshae/aural-clouds

Full Project Documentation:

https://www.blakeshaekos.com/work/sonus

## Interactive Exhibition

To provide public access to my work, I created an interactive exhibition.

I coded the audio effects for publicly recorded audio in SuperCollider and prototyped an interactive interface between SuperCollider and a Raspberry Pi using Python and OSC. Ultimately, the interactive element was not used in the final exhibition due to time constraints and a lack of buy-in from my thesis advisor. Instead, I used the Raspberry Pi to play back audio created in SuperCollider alongside video recorded at various locations around Los Angeles.
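The prototype code is not included here, but a minimal sketch of the kind of Python/OSC bridge described above might look like the following. It assumes the python-osc and gpiozero packages and SuperCollider's default language port (57120); the OSC address "/sonus/trigger" and the GPIO wiring are hypothetical, not the project's actual code.

```python
from signal import pause

from gpiozero import Button                      # physical trigger on the Pi
from pythonosc.udp_client import SimpleUDPClient # OSC messages to sclang

client = SimpleUDPClient("127.0.0.1", 57120)  # sclang's default OSC port

def trigger_playback() -> None:
    # Tell SuperCollider to start one of the processed recordings;
    # an OSCdef on the SuperCollider side would receive this message.
    client.send_message("/sonus/trigger", 1)

button = Button(17)  # hypothetical: a button wired to GPIO pin 17
button.when_pressed = trigger_playback

pause()  # keep the script alive, waiting for button presses
```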

Demo of the SuperCollider interface:

https://vimeo.com/867293963?share=copy

Work-in-progress UI, showing the Raspberry Pi interaction:

https://vimeo.com/blakeshae/rpi-interaction?share=copy

## Final Gallery Installation

Photo of the final gallery installation.