Musicking robot that interacts in real time with an ensemble of musicians by drawing a digital score, designed to enhance creativity using audio captured by a microphone and physiological measures (EEG and EDA). Currently supports the Dobot Magician Lite and UFactory xArm robot arms, and uses a BrainBit EEG and a BITalino EDA sensor. Tested primarily on Windows 10.
- Install Python 3.10 or higher
- Install the requirements: `pip install -r requirements.txt`
- Download the Embodied Musicking Dataset into the `nebula/ai_training/dataset/` folder
- Run `nebula/ai_training/src/train_feature2feature.py`
- Connect the robot to the computer
- Connect the BrainBit and BITalino to the computer via Bluetooth
- Run `main.py`
Note: if the BrainBit EEG is connected but the streaming session cannot be started, try unpairing the device, restarting the computer, and re-pairing it. The BrainBit should normally display a blinking light when connected but not in use, and a solid light when actively being used by the script.
`main.py` is the main script that starts the robot arm's digital score drawing work. Digibot calls the local interpreter for project-specific functions and communicates directly with the pydobot library. Nebula kick-starts the AI Factory, which generates NNet data and affect flows. This script also controls the live mic audio analyser.
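The live mic analyser's core job is turning incoming audio blocks into a slowly varying level that downstream processes can consume. A minimal sketch of that idea in plain Python follows; the function names, the smoothing constant, and the block size are all invented for this illustration and are not the project's actual code:

```python
import math

def rms_level(samples):
    """Root-mean-square amplitude of one block of audio samples (floats in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def smooth(prev, new, alpha=0.2):
    """Exponential smoothing so the level doesn't jitter from block to block."""
    return (1 - alpha) * prev + alpha * new

# Example: a quiet block followed by a loud block.
quiet = [0.01] * 512
loud = [0.5] * 512
level = rms_level(quiet)            # 0.01
level = smooth(level, rms_level(loud))  # moves part-way toward 0.5
```

In a real loop, `level` would be updated once per captured audio block and fed into whatever generates the affect flows.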
Arguments in `config.py`:

- `duration_of_piece`: the duration of the drawing in seconds
- `continuous_line` (bool): `True` = will not jump between points
- `speed` (int): the dynamic tempo of all processes; `1` = slow, `5` = fast
- `dobot1_port`: port for the Dobot, may need to be changed depending on the machine
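Put together, a `config.py` matching the arguments above might look like this (the values are purely illustrative, and the port string in particular will vary from machine to machine):

```python
# Hypothetical config.py sketch -- values are examples, not project defaults.
duration_of_piece = 180    # seconds the drawing runs for
continuous_line = True     # True = pen does not jump between points
speed = 3                  # dynamic tempo of all processes: 1 = slow ... 5 = fast
dobot1_port = "COM4"       # e.g. "COM4" on Windows, "/dev/ttyUSB0" on Linux
```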
- Clone this repository using git
- Create a branch and work on one task at a time
- Once tested, create a pull request from that branch; it will be reviewed and eventually merged
Note: pull regularly!!
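The branch-and-pull-request workflow above can be sketched end to end with generic git commands. In this self-contained illustration a local bare repository stands in for the GitHub remote, and the branch name is invented; in practice you would clone this repository's actual URL and open the pull request on GitHub:

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q --bare origin.git          # stands in for the GitHub remote
git clone -q origin.git work && cd work
git config user.email "dev@example.com"
git config user.name "Dev"
git commit -q --allow-empty -m "initial commit"
git push -q origin HEAD
git checkout -q -b fix-drawing-speed   # one task per branch
echo "work" > notes.txt
git add notes.txt && git commit -q -m "work on one task"
git push -q -u origin fix-drawing-speed   # then open a pull request for review
```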