Open Sound Control is just about the worst possible name for a protocol: for a long time I assumed it could only be used to control sounds. Nothing could be further from the truth. OSC should be renamed OAC – Open Anything Control – which would be a far better name, since it can be used to control anything.
It’s kind of interesting that this happened, since the original intent really was sound-specific. OSC was developed by the Center for New Music and Audio Technologies (CNMAT) at UC Berkeley more or less as a replacement for MIDI, the previous standard for chaining together synths and other audio equipment. But decoupled from MIDI’s connection to a specific hardware interface, and generalized to a much wider range of musical and performance data, the end result is more or less a general protocol for connecting programs. Probably that’s because “anything music programs might want to communicate with each other” ends up being a pretty large subset of “anything programs might want to communicate with each other”.
There are plenty of cool things about OSC. One of the things you find in SuperCollider (and, I think, in CMIX, Csound, ChucK, Max/MSP, and synths in general) is the idea of audio rate versus control rate. OSC was and is generally meant for control-rate data, and it usually works fine over lossy connections, but it can falter with binary on/off controls, where a single dropped packet leaves you in the wrong state rather than briefly off-value. What's interesting is that MobMuPlat, TouchOSC, Lemur, TUIO, etc. have used OSC (and MIDI) to abstract control surfaces in a way that hasn't been done in standard UI/UX for general-purpose programs.
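To make the wire format concrete, here's a minimal sketch of encoding a single control-rate OSC message per the OSC 1.0 binary layout, using only the standard library (no OSC package assumed; the `/fader/1` address is made up for illustration). The address and type-tag strings are null-terminated and padded to a 4-byte boundary, and float arguments are big-endian 32-bit:

```python
import struct

def osc_string(s):
    """Encode an OSC string: UTF-8 bytes, null terminator, padded to 4 bytes."""
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, *floats):
    """Encode an OSC message whose arguments are all 32-bit floats ('f' tags)."""
    type_tags = "," + "f" * len(floats)          # type-tag string starts with ','
    payload = b"".join(struct.pack(">f", f) for f in floats)
    return osc_string(address) + osc_string(type_tags) + payload

# A typical control-rate update: one float per message, sent as the knob moves.
msg = osc_message("/fader/1", 0.5)
```

Losing one of these packets is harmless (the next fader update supersedes it), which is exactly why lossy transport is tolerable at control rate but risky for on/off toggles.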
One thing that isn’t touched on here is the endpoint ‘address’ definition, which is largely why OSC has spawned a bunch of competing implementations that usually drop down to MIDI, why everyone generally still uses MIDI, and why NAMM 2016 was full of bluetooth/usb MIDI controllers rather than bluetooth/usb OSC controllers. Since everything (generally) defines its own address space, there was no standardization around where “notes” or “pitch bend” live, so the basic stuff MIDI handled in 7 bits (or 14) got no boost from 32-bit representations when there was no standard address at which you could reach grand piano middle C. Whereas with the old MIDI standard and its add-ons like General MIDI, you could address that concept more or less uniformly.
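The contrast is easy to show. A MIDI note-on is three fixed bytes that every General MIDI device interprets the same way; in OSC, the same event has no standard address, so every application invents its own scheme (both address schemes below are hypothetical, but typical of what you see in the wild):

```python
# MIDI note-on for middle C (note 60) at velocity 100 on channel 1:
# status byte 0x90, then two 7-bit data bytes. Universally understood.
midi_note_on = bytes([0x90, 60, 100])

# OSC: richer types, but no agreed-on address for "play middle C".
# Two apps might reasonably pick either of these incompatible schemes:
scheme_a = "/keyboard/note 60 100"        # integer note + integer velocity
scheme_b = "/synth/1/noteOn/60 0.787"     # note in the address, velocity as 0-1
```

So despite 32-bit ints and floats, a generic OSC controller can't target an arbitrary synth the way a generic MIDI controller can.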
Protobufs, avro and capnproto are all interesting ways of doing this as well.
http://opensoundcontrol.org/ seems to be down.
Wikipedia Page - Google Cache Link for the intro
Reading the spec, the concept of bundles – grouping events that are required to happen at the same time – looks appealing.
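A rough sketch of how a bundle looks on the wire, per the spec: the literal string “#bundle”, a 64-bit NTP-style timetag (the special value 1 means “execute immediately”), then each contained message prefixed with its int32 size. A receiver is supposed to apply all elements atomically at the tagged time. The addresses below are made up for illustration, and the helpers repeat the stdlib-only message encoding sketched earlier:

```python
import struct

def osc_string(s):
    """OSC string: UTF-8 bytes, null terminator, padded to 4 bytes."""
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, *floats):
    """OSC message with all-float32 arguments."""
    payload = b"".join(struct.pack(">f", f) for f in floats)
    return osc_string(address) + osc_string("," + "f" * len(floats)) + payload

IMMEDIATELY = 1  # reserved timetag value: process as soon as received

def osc_bundle(timetag, *messages):
    """OSC bundle: '#bundle' + 64-bit timetag + size-prefixed elements."""
    out = osc_string("#bundle") + struct.pack(">Q", timetag)
    for m in messages:
        out += struct.pack(">i", len(m)) + m
    return out

# Two parameter changes that should land together, not one packet apart:
bundle = osc_bundle(
    IMMEDIATELY,
    osc_message("/note/on", 60.0),
    osc_message("/filter/cutoff", 0.3),
)
```

That timetag is the part MIDI never had: you can schedule a chord a beat in the future and let the receiver handle the timing.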
(end meandering rant)