Cool. Here are some compositions I just made with it:
It’s so freeing to be able to cut, copy, and paste, rather than having to re-enter each note individually on the phone’s keypad. Being able to separate measures/sections with blank lines is handy too.
I made another composition I’m proud of, one that simulates three-voice chords by arpeggiating (cycling through) 64th notes. Here’s my Nokia Composer arrangement of the chorus (0:52–1:09) of “Countryside 2 (Lee Brothers – Glad I Am)” from Double Dragon Neon by Jake Kaufman.
Something I learned while making this: arpeggiated chords for three voices seem to sound better when you cycle [High, Low, Mid] rather than [High, Mid, Low].
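To make the cycling order concrete, here is a small sketch of the idea: a one-voice synth fakes a three-note chord by repeating its notes as fast 64th notes, in the [High, Low, Mid] order described above. The note names and the `arpeggiate` helper are hypothetical, just for illustration.

```javascript
// Simulate a three-voice chord on a single-voice instrument by
// cycling through the chord tones as 64th notes.
// Cycling [High, Low, Mid] (rather than [High, Mid, Low]) is the
// ordering the post found to sound smoother.
function arpeggiate(chord, cycles) {
  const [high, mid, low] = chord; // chord given as [High, Mid, Low]
  const order = [high, low, mid]; // emit High, Low, Mid per cycle
  const notes = [];
  for (let i = 0; i < cycles * order.length; i++) {
    notes.push(order[i % order.length]);
  }
  return notes;
}

// Two cycles of a chord written top-down as [high, mid, low]:
console.log(arpeggiate(["c2", "a1", "f1"], 2));
// → ["c2", "f1", "a1", "c2", "f1", "a1"]
```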
These are nice 👍
This is fun. The syntax is actually very similar to what lilypond uses.
Who recognizes the melody I transcribed here?
Well that’s lovely. I appreciate that the code isn’t even golfed excessively hard, it’s just a nicely designed regular language.
Is it possible to render the output of a Web Audio thing to a WAV file, like you can render the output of an HTML canvas to a PNG?
“Light My Fire” is a great choice of example song. Very 90s retro. :)
It doesn’t look like AudioContext has a one-step solution like the canvas has with HTMLCanvasElement.toDataURL(). However, the example on the MDN page for AudioContext.createMediaStreamDestination() shows how to hook up a MediaRecorder to an AudioContext, ultimately producing an Opus-encoded Blob that can be played by an <audio> element.

Nice! Thank you.
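For reference, the MDN recipe described above looks roughly like this in browser JavaScript. The oscillator is a stand-in for whatever node graph you actually want to capture; this runs in a browser, not Node, and produces Opus in a WebM/Ogg container rather than a true WAV.

```javascript
// Record the output of a Web Audio graph via a MediaStream, following
// the MDN createMediaStreamDestination() example.
const ctx = new AudioContext();
const dest = ctx.createMediaStreamDestination();

const osc = ctx.createOscillator(); // placeholder source node
osc.connect(dest);                  // route into the recordable stream
osc.connect(ctx.destination);       // also play it aloud

const recorder = new MediaRecorder(dest.stream);
const chunks = [];
recorder.ondataavailable = (e) => chunks.push(e.data);
recorder.onstop = () => {
  // Opus-encoded Blob, playable by an <audio> element.
  const blob = new Blob(chunks, { type: recorder.mimeType });
  const audio = document.createElement("audio");
  audio.src = URL.createObjectURL(blob);
  audio.controls = true;
  document.body.appendChild(audio);
};

osc.start();
recorder.start();
setTimeout(() => { osc.stop(); recorder.stop(); }, 2000); // record ~2 s
```

If an actual WAV is required, a different route (not covered in the thread) is rendering the graph with OfflineAudioContext and encoding the resulting PCM buffer to WAV yourself.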