Noman Mohammad & Nyx Zhang
-
presentation: https://youtu.be/oEG8zVlS5q4
-
- Users can choose any three notes
- The notes will be sent to the music generator to be parsed into playable music
- The user can then select play in order to play the music generated from their note selection
-
- Analyze the input three notes.
- Generate a chord progression based on the analysis results.
- Generate rhythm and beats.
- Generate a melody based on the three notes and chords.
- Mix the results into an audio file along with the metadata (a toy end-to-end sketch follows this list).
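To make these steps concrete, here is a small self-contained toy sketch of the same pipeline using music21. The fixed I-V-vi-IV progression, the random duration choices, and the output file name are assumptions for demonstration only; the project's own MusicAnalyzor and MusicGenerator classes (described below) implement these steps with their own logic.

```python
# Toy illustration of the five pipeline steps; not the project's actual code.
import random

from music21 import meter, note, roman, stream, tempo


def toy_pipeline(input_notes=("C", "E", "G"), out_path="toy_output.mid"):
    # 1. Analyze the three input notes to guess a key.
    seed = stream.Stream([note.Note(n) for n in input_notes])
    main_key = seed.analyze("key")

    # 2. Chord progression in that key (a fixed I-V-vi-IV here).
    progression = [roman.RomanNumeral(f, main_key) for f in ("I", "V", "vi", "IV")]

    # 3. Rhythm / beats: pick a duration for each melody note.
    durations = [random.choice((0.25, 0.5, 1.0)) for _ in range(16)]

    # 4. Melody: start from the input notes, then continue within the key.
    melody = stream.Part()
    scale_pitches = main_key.pitches
    for i, dur in enumerate(durations):
        pick = input_notes[i] if i < len(input_notes) else random.choice(scale_pitches)
        melody.append(note.Note(pick, quarterLength=dur))

    # 5. Mix melody and chords into one score and write it out with metadata.
    harmony = stream.Part()
    for ch in progression:
        ch.quarterLength = 4.0          # one chord per 4/4 bar (assumed)
        harmony.append(ch)
    score = stream.Score([melody, harmony])
    score.insert(0, tempo.MetronomeMark(number=111))
    score.insert(0, meter.TimeSignature("4/4"))
    score.write("midi", fp=out_path)
    return score
```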
The class name is MusicInterface. Its main purpose is to create the underlying GUI for the program. It inherits the Frame class from the tkinter library.
-
- Function 1: init()
- Calls the initialize_interface function, which creates the widgets within our interface
- Function 2: initialize_interface()
- This function will create all the widgets for the interface
- Functions 3 - 7: value_C() through value_B()
- These functions act upon a piano key click. They call a function from the notes module and set values for our notes class
- Function 8: generate()
- Acts upon the Generate button. Calls the MusicJam module to generate music from the user's selected keys, calls the notes module to clear the notes object once the music is generated, and sets the current status in the interface
- Function 9: play()
- Acts upon the Play button and plays the generated .mid file from the MusicJam module. Also calls the notes module to clear the notes object when the generated music is played, and sets the current status in the interface
- Function 10: check_notes_status()
- returns True/False depending on whether the target note count has been reached
- Function 11: check_sound()
- stops music depending on the current state
- Function 12: run_interface()
- instantiates the interface class and starts the main loop; called from the main module (a minimal skeleton sketch follows this function list)
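A minimal, runnable sketch of this interface skeleton is shown below. The per-key value_C() .. value_B() handlers are collapsed into a single record_note helper, and the MusicJam and Notes calls are placeholders, so this illustrates the structure rather than the actual implementation.

```python
import tkinter as tk


class MusicInterface(tk.Frame):
    def __init__(self, master=None):
        super().__init__(master)
        self.status = tk.StringVar(value="Select three notes")
        self.initialize_interface()

    def initialize_interface(self):
        # One button per piano key plus Generate / Play controls and a status label.
        for name in ("C", "D", "E", "F", "G", "A", "B"):
            tk.Button(self, text=name,
                      command=lambda n=name: self.record_note(n)).pack(side="left")
        tk.Button(self, text="Generate", command=self.generate).pack(side="left")
        tk.Button(self, text="Play", command=self.play).pack(side="left")
        tk.Label(self, textvariable=self.status).pack(side="left")
        self.pack()

    def record_note(self, name):
        # Placeholder for value_C()..value_B(): would forward the note to the Notes object.
        self.status.set(f"Selected {name}")

    def generate(self):
        # Placeholder: would call the MusicJam module, then clear the Notes object.
        self.status.set("Music generated")

    def play(self):
        # Placeholder: would play the generated .mid file.
        self.status.set("Playing")


def run_interface():
    root = tk.Tk()
    MusicInterface(root)
    root.mainloop()


if __name__ == "__main__":
    run_interface()
```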
The class name is Notes. Its main purpose is to parse the data coming from the user. This includes note duration, speed, and error checking for the expected note count.
-
- Function 1: init()
- Sets empty lists for notes and time
- Function 2: calculate_duration()
- returns the ratio of the elapsed time between note selections, approximated to the nearest preset value
- Function 3: generate_note()
- appends the selected note to the initialized list, along with the time it was clicked, so that speed and duration can be calculated later
- Function 4: calculate_speed()
- calculates and returns the total elapsed time from first to last note click
- Function 5: convert_duration()
- converts a numeric value to its associated fraction
- Function 6: clear_notes()
- clears the current attributes set for the notes object. Called when music is generated/played
- Function 7: export_notes()
- packages the data in a form the MusicJam module can read and turn into music. Calls multiple helper functions within the class (see the timing sketch after this list)
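Below is an illustrative sketch of the timing logic described above. The snap values and the shape of the exported data are assumptions; the real Notes class may use different values and a different export format.

```python
import time


class Notes:
    SNAP_VALUES = (0.25, 0.5, 0.75, 1.0)   # assumed duration fractions

    def __init__(self):
        self.notes = []   # selected note names, e.g. "C"
        self.times = []   # wall-clock time of each click

    def generate_note(self, name):
        # Store the note with the time it was clicked, so speed and duration
        # can be derived later.
        self.notes.append(name)
        self.times.append(time.time())

    def calculate_speed(self):
        # Total elapsed time from the first to the last note click.
        return self.times[-1] - self.times[0] if len(self.times) > 1 else 0.0

    def calculate_duration(self):
        # Ratio of each click-to-click gap to the total elapsed time,
        # approximated to the nearest snap value.
        total = self.calculate_speed() or 1.0
        gaps = [b - a for a, b in zip(self.times, self.times[1:])]
        return [min(self.SNAP_VALUES, key=lambda v: abs(v - g / total)) for g in gaps]

    def clear_notes(self):
        self.notes.clear()
        self.times.clear()

    def export_notes(self):
        # Package the data in a simple dict; the real format is whatever
        # the MusicJam module expects.
        return {"notes": list(self.notes), "durations": self.calculate_duration()}
```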
class MusicAnalyzor:
The class name is MusicAnalyzor. It analyzes the input Notes object and sets the key and rhythm of the music.
- Stores the input notes from the music player as its initial state.
- Analyzes and extracts their features.
- Uses the extracted information to set the key and rhythm.
-
- Function 1: key_analyze
- Analyze possible music keys and determine the main key of the music by using empirical probability and music theory.
- Function 2: rhythm_analyze
- Determine the starting rhythm of the music based on the result of analyzing the input rhythm
- Function 3: rhythm_setting
- Generate the whole rhythm from the analyzed starting rhythm, using empirical probability and music theory.
- Function 4: chord_setting
- Generate the whole chord progression from the analyzed starting rhythm, using empirical probability and music theory (a small probability-table sketch follows this function list).
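Since the document describes chord_setting only as using "empirical probability and music theory", the sketch below shows one plausible shape for it: drawing a Roman-numeral progression from a hand-made transition table. The table and its weights are invented for demonstration and are not the project's actual probabilities.

```python
import random

# Common diatonic transitions (Roman numerals) with made-up weights.
TRANSITIONS = {
    "I":  (["IV", "V", "vi"], [0.4, 0.4, 0.2]),
    "IV": (["V", "I", "ii"],  [0.5, 0.3, 0.2]),
    "V":  (["I", "vi", "IV"], [0.6, 0.3, 0.1]),
    "vi": (["IV", "ii", "V"], [0.5, 0.3, 0.2]),
    "ii": (["V", "IV"],       [0.7, 0.3]),
}


def chord_setting(length=8, start="I", seed=None):
    """Generate a Roman-numeral chord progression of the given length."""
    rng = random.Random(seed)
    progression = [start]
    while len(progression) < length:
        choices, weights = TRANSITIONS[progression[-1]]
        progression.append(rng.choices(choices, weights=weights, k=1)[0])
    return progression


if __name__ == "__main__":
    print(chord_setting(seed=0))   # e.g. ['I', 'V', 'vi', 'IV', ...]
```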
class MusicGenerator(MusicAnalyzor):
The class name is MusicGenerator. It inherits the MusicAnalyzor class.
- Initializes with all the parent class's features.
- The primary model that generates the music.
- Output the music audio, music score picture, and other metadata to the music player.
-
- Function 1: chords_generate
- Transforms the generated chords into music21 objects.
- Function 2: melody_generate
- Generates a stream of note objects (pitches and durations) based on the beat settings and the generated chord progression, using empirical probability and music theory.
- Function 3: melody_str2notes
- A helper for melody_generate that converts Roman numerals into chord objects (see the chord-conversion sketch after this list).
- Function 4: melody_f3notes
- A helper for melody_generate that replaces the first several generated notes with the user's input notes.
- Function 5: mix_melody_chords
- Mix the stream of notes and stream of chords.
- Mix the score of the music.
- Return the chords, score, and other metadata.
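As a small illustration of the chord handling in chords_generate / melody_str2notes, the helper below converts Roman-numeral symbols into music21 chord objects. The symbols, key, and one-chord-per-bar duration are assumptions for the demo, not the project's actual values.

```python
from music21 import roman, stream


def numerals_to_chords(figures, key_name="C"):
    """Turn Roman-numeral symbols (e.g. ["I", "V", "vi", "IV"]) into a
    music21 Part of chord objects in the given key."""
    part = stream.Part()
    for figure in figures:
        rn = roman.RomanNumeral(figure, key_name)  # RomanNumeral is a Chord subclass
        rn.quarterLength = 4.0                     # one chord per 4/4 bar (assumed)
        part.append(rn)
    return part


if __name__ == "__main__":
    for ch in numerals_to_chords(["I", "V", "vi", "IV"]):
        print(ch.figure, [p.name for p in ch.pitches])
```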
Reference: sample of the music-related metadata
```
metadata_sample = {
    // beats per minute // default and flexible
    "bpm": 111,
    // the major or minor scale around which a piece of music revolves // default and fixed
    "Key": "C",
    // rhythmic pattern formed by grouping basic temporal units (beats) into regular measures, or bars // default and fixed
    "meter": "4/4",
    // music genre, such as Blues, Jazz, Metal // default and fixed
    "genre": "pop",
    // chords of the music melody in this key
    // the model generates the whole melody's chords, divided by bars
    "chords": [
        {
            // chords
            "chords": ["C", "G", "Am", "B-"],
            // offset of each chord in the unit beat
            "chordRythm": ["1", "2", "3", "4"]
        },
        {
            "chords": ["G", "A", "C", "D"],
            "chordRythm": ["1", "2", "3", "4"]
        }
    ],
    // notes of the music melody in this key
    // the user inputs several notes at random, then the model generates the whole melody's notes, divided by bars
    "notes": [
        {
            // notes
            "notes": ["C", "E", "A", "G", "C", "D", "F", "D"],
            // offset of each note in the unit beat
            "notesRythm": ["0.25", "0.25", "0.125", "0.75", "1", "0.25", "0.5", "1"]
        },
        {
            "notes": ["G", "A", "F", "G", "C", "D", "G", "D"],
            "notesRythm": ["0.75", "1", "0.5", "0.25", "0.5", "1", "0.5", "1"]
        }
    ]
}
```