JavaScript audio synthesis with HTML 5
HTML5 gives us a couple of new toys to play with, such as the <AUDIO> and <VIDEO> tags. On the visual side, we've already seen live green-screening with Canvas and JS, and in terms of audio there have been several JS drum machines already. But the question I was interested in was: can you use JavaScript to stream live data into these media tags?
Enter the JavaScript audio synth. It generates a handful of samples using very basic time-domain synthesis, wraps them in a WAVE file header and embeds them in <AUDIO> tags using base64-encoded data URIs. Each sample is then triggered on a timer to play the drum pattern. It's quite simple to do and runs fast enough in HTML5-capable browsers to be unnoticeable. Yes, it sounds tinny, but that's just because I'm too lazy to design proper filters for toys like this. Unfortunately, while the synthesis is fast enough to run in real time, you can't actually use it for a full live audio stream, as there is no way to queue up chunks of synthesized audio for seamless playback. I tried triggering multiple <AUDIO> tags in parallel to address this, but that didn't work either.
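The sample-to-data-URI trick above can be sketched in a few lines: render raw 8-bit PCM, prepend a minimal 44-byte WAVE header, and base64-encode the whole thing into something an <AUDIO> tag will accept. This is a simplified sketch, not the actual synth's code; the function name and the 8 kHz mono format are illustrative choices.

```javascript
// Wrap raw unsigned 8-bit mono samples in a WAVE header and return a
// base64 data URI suitable for the src of an <audio> element.
function buildWavDataURI(samples, sampleRate) {
  var dataLen = samples.length;
  var bytes = [];
  // Append an ASCII string as raw bytes.
  function str(s) { for (var i = 0; i < s.length; i++) bytes.push(s.charCodeAt(i)); }
  // Append little-endian 32-bit / 16-bit unsigned integers.
  function u32(n) { bytes.push(n & 255, (n >> 8) & 255, (n >> 16) & 255, (n >> 24) & 255); }
  function u16(n) { bytes.push(n & 255, (n >> 8) & 255); }

  str('RIFF'); u32(36 + dataLen); str('WAVE'); // RIFF container
  str('fmt '); u32(16);                        // format chunk is 16 bytes
  u16(1);                                      // audio format: PCM
  u16(1);                                      // channels: mono
  u32(sampleRate);                             // sample rate
  u32(sampleRate);                             // byte rate (1 byte/sample, mono)
  u16(1);                                      // block align
  u16(8);                                      // bits per sample
  str('data'); u32(dataLen);                   // data chunk
  for (var i = 0; i < dataLen; i++) bytes.push(samples[i] & 255);

  // btoa expects a "binary string", so build one from the byte values.
  var bin = bytes.map(function (b) { return String.fromCharCode(b); }).join('');
  return 'data:audio/wav;base64,' + btoa(bin);
}

// Synthesize 100 ms of a 440 Hz sine at 8 kHz, centered on 127 (unsigned 8-bit).
var rate = 8000, n = rate / 10, samples = [];
for (var i = 0; i < n; i++) {
  samples.push(Math.round(127 + 127 * Math.sin(2 * Math.PI * 440 * i / rate)));
}
var uri = buildWavDataURI(samples, rate);
// In the browser: new Audio(uri).play();
```

Each drum sound becomes one such URI, generated once up front, and the timer only has to call `play()` on the matching element.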
My final attempt was to generate tons of periodic audio loops only a couple of milliseconds long, and to play them back with looping turned on while altering each tag's volume in real time, in effect doing a sort of additive wavetable synthesis. Unfortunately, looping is not a fully supported feature, and the only browser I found that does it (Safari) doesn't loop seamlessly at all.
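Had looping worked, the idea would have been this: each looping <AUDIO> tag holds one single-cycle wavetable, and its `volume` property acts as the per-voice gain. The mixing that the browser would have done can be sketched numerically; `mixVoices` and `sineTable` are hypothetical names, and the real attempt used audio elements rather than arrays.

```javascript
// Additive wavetable mixing: each voice is a short table that loops
// forever, scaled by a per-voice gain (what element.volume would do).
function mixVoices(tables, gains, length) {
  var out = new Array(length);
  for (var i = 0; i < length; i++) {
    var s = 0;
    for (var v = 0; v < tables.length; v++) {
      s += gains[v] * tables[v][i % tables[v].length]; // looped playback
    }
    out[i] = s;
  }
  return out;
}

// A single cycle of a sine wave, len samples long.
function sineTable(len) {
  var t = [];
  for (var i = 0; i < len; i++) t.push(Math.sin(2 * Math.PI * i / len));
  return t;
}

// Fundamental (64-sample cycle) plus its second harmonic at half gain.
var mixed = mixVoices([sineTable(64), sineTable(32)], [1.0, 0.5], 128);
```

Changing the `gains` array over time shapes the timbre, which is why seamless looping matters: any gap at the loop point turns the steady harmonics into clicks.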
All in all, my first brush with the <AUDIO> tag was a major disappointment. The <VIDEO> tag's high-level approach leads to similar limitations, but they are offset by the flexibility and power of the <CANVAS> tag. Unfortunately, there is no 'audio canvas' to solve similar problems with audio.