Last night I took a look at a graphics programming system called Processing. It’s quite groovy. I had been dimly aware of it before, but I think I sort of rolled my eyes and muttered, “Just what I need — another software toy.” But I enjoy hobbyist-level computer programming, and this week I’ve been pondering what I might want to do with it. Processing offers some intriguing possibilities.
Csound is programming, but it’s not a very good fit for my own music composition preferences. Interactive fiction is programming, but I’ve become disenchanted with both the traditional IF delivery systems and the possibilities for meaningful storytelling within an interactive framework. JavaScript running in a browser is an extraordinary resource, but what on Earth would I do with it?
Dave Phillips posted a link on the Csound mailing list to a new piece that he did using a system called AVSynthesis. I liked the piece — it’s not my style, but it evokes a definite mood. But when I gazed upon the web page for AVSynthesis, it was pretty clear I would never be able to fight my way through what might loosely be called the documentation.
Processing seems to be very well documented. It’s in active development, has a large user community, and does some spiffy things. Basically, you use it by writing code in Java. The code itself is easy to write and easy to understand. You can display and animate geometric shapes in a window, or import graphics files and do stuff with them. While your program is running, it will respond to the usual range of computer inputs — mouse moves, keypresses, and so on. Processing will even transform your program into a cross-platform app with a single menu command.
For my nefarious purposes, I needed to ask two fundamental questions.
First, can the activity in the Processing window be captured as a movie? Answer: Yes, it can. A few minutes of online research, a few lines of code, and I was able to test it and verify that I had a movie file.
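One simple capture route is Processing’s saveFrame() call, which writes the current frame to a numbered image file. A minimal sketch along those lines (the moving-circle animation is just filler of my own):

```java
// Minimal Processing sketch: animate a circle and save every frame
// as a numbered still, to be assembled into a movie afterwards.
void setup() {
  size(640, 360);
}

void draw() {
  background(0);
  ellipse(frameCount % width, height / 2, 40, 40);
  saveFrame("frames/####.png"); // #### becomes the frame number
  if (frameCount == 300) exit(); // stop after 300 frames
}
```

The numbered stills can then be stitched into a movie file with whatever tool you prefer.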
Second, will Processing accept real-time input via OSC? This is important to me not because I’m planning to build a real-time interactive installation piece (way too much work — and anyway, my opinion of real-time interactive installations is that they’re gee-whiz consumer toys for kids, not a medium for any sort of serious artistic work), but because I’d like to be able to generate a Processing video that would sync with a pre-recorded music track that I’ve created in FL Studio. OSC communications require a Processing plug-in, and I haven’t tested it yet — but it exists.
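The plug-in in question is presumably oscP5, Andreas Schlegel’s OSC library for Processing. Untested on my end, but a receive handler would look roughly like this; the /note address and the argument layout are placeholders of my own:

```java
// Untested sketch of OSC input via the oscP5 library; the /note
// address and argument layout are placeholders, not a standard.
import oscP5.*;

OscP5 osc;
float freq = 440;

void setup() {
  size(640, 360);
  osc = new OscP5(this, 12000); // listen for OSC on UDP port 12000
}

void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/note")) {
    freq = msg.get(0).floatValue(); // first argument: frequency in Hz
  }
}

void draw() {
  background(0);
  ellipse(width / 2, height - (freq % height), 30, 30); // crude visualization
}
```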
The signal flow might be a little tricky. I don’t think FL Studio will transmit OSC. But it will transmit MIDI to Pd, and Pd can very easily translate MIDI messages into arbitrary OSC messages. To conserve CPU bandwidth, Processing could be running on a separate computer and receive the OSC messages over a network. Capture the OSC-controlled movie from Processing, open it up in iMovie, import the mp3 created by FL Studio, and presto! An abstract, animated music video. Just add creativity, stir vigorously, and upload.
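The Pd side is just patching, but the translation itself is worth spelling out. Here is a sketch of one possible note-on mapping in plain Java; the /note address, the message layout, and the choice to send frequency rather than the raw note number are my own inventions, not anything FL Studio or Pd dictates:

```java
// Illustration only: how a MIDI note-on might be rendered as an OSC
// message. In practice Pd's [notein] plus an OSC-send object would
// do this; the /note address and argument layout here are invented.
public class MidiToOsc {

    // MIDI note number -> frequency in Hz, equal temperament, A4 = 440.
    static double midiToHz(int note) {
        return 440.0 * Math.pow(2.0, (note - 69) / 12.0);
    }

    // Format a note-on as a readable OSC-style message:
    // address, channel, frequency, normalized velocity.
    static String noteOnToOsc(int channel, int note, int velocity) {
        return String.format(java.util.Locale.ROOT, "/note %d %.2f %.3f",
                channel, midiToHz(note), velocity / 127.0);
    }

    public static void main(String[] args) {
        System.out.println(noteOnToOsc(1, 69, 127)); // /note 1 440.00 1.000
        System.out.println(noteOnToOsc(1, 60, 64));
    }
}
```

The pitch formula is the standard f = 440 * 2^((n - 69)/12); sending frequency rather than the note number keeps the Processing sketch ignorant of MIDI.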
On the subject of Processing, Csound, and other software toys, you might also like Quil (https://github.com/quil/quil) and Overtone (http://overtone.github.com/).
Could be good stuff, George. Thanks for the tip. Right now I’m just starting to get my brain around Processing. Not sure I want to try learning Clojure just yet. (For one thing, I’m partial to C-type languages, which at first glance Clojure doesn’t appear to be.)
Yes, Clojure is definitely a Lisp, though a well-designed one.
Really enjoyed your little sketches, and I’m currently playing with Processing myself, modelling a harmonograph. It occurred to me that you could map the pendulum locations/velocities to pitch/modulation/timbre; might be an interesting toy.
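For reference, the classic two-pendulum harmonograph traces a sum of damped sinusoids on each axis. A quick plain-Java sketch; the frequencies, damping constants, and the semitone mapping are arbitrary illustration values of mine:

```java
// Two-pendulum harmonograph: each axis is a sum of damped sinusoids.
// All constants below (frequencies, phases, damping, the semitone
// mapping) are arbitrary illustration values.
public class Harmonograph {

    static double axis(double t,
                       double a1, double f1, double p1, double d1,
                       double a2, double f2, double p2, double d2) {
        return a1 * Math.sin(t * f1 + p1) * Math.exp(-d1 * t)
             + a2 * Math.sin(t * f2 + p2) * Math.exp(-d2 * t);
    }

    public static void main(String[] args) {
        for (double t = 0.0; t < 1.0; t += 0.25) {
            double x = axis(t, 1, 2.0, 0, 0.02, 1, 3.0, Math.PI / 2, 0.02);
            double y = axis(t, 1, 3.0, 0, 0.02, 1, 2.0, 0, 0.02);
            // Toy sonification: x position -> pitch offset in semitones.
            double semitones = 12.0 * x;
            System.out.printf(java.util.Locale.ROOT,
                    "t=%.2f  x=%+.3f  y=%+.3f  pitch=%+.2f st%n",
                    t, x, y, semitones);
        }
    }
}
```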
Processing has, as it turns out, not only MIDI I/O (via the proMIDI library) but also an audio synthesis library called Beads. I haven’t tried either of them yet. Right now I’m learning to work with .jpgs in Processing — filtering, moving, altering the colors.
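Processing’s loadImage(), filter(), and tint() handle that natively, but the pixel-level idea is easy to prototype in plain Java too. A sketch of one color alteration, using an in-memory test image rather than a loaded .jpg (the red/blue channel swap is just an arbitrary example):

```java
import java.awt.image.BufferedImage;

public class ChannelSwap {
    // Swap the red and blue channels of an image -- the kind of
    // per-pixel color alteration Processing does with filter()/tint().
    static BufferedImage swapRedBlue(BufferedImage src) {
        BufferedImage out = new BufferedImage(
                src.getWidth(), src.getHeight(), BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < src.getHeight(); y++) {
            for (int x = 0; x < src.getWidth(); x++) {
                int rgb = src.getRGB(x, y);
                int r = (rgb >> 16) & 0xFF;
                int g = (rgb >> 8) & 0xFF;
                int b = rgb & 0xFF;
                out.setRGB(x, y, (b << 16) | (g << 8) | r); // red <-> blue
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // Build a 2x1 test image in memory instead of loading a .jpg.
        BufferedImage img = new BufferedImage(2, 1, BufferedImage.TYPE_INT_RGB);
        img.setRGB(0, 0, 0xFF0000); // pure red
        img.setRGB(1, 0, 0x0000FF); // pure blue
        BufferedImage swapped = swapRedBlue(img);
        System.out.printf("%06X %06X%n",
                swapped.getRGB(0, 0) & 0xFFFFFF,
                swapped.getRGB(1, 0) & 0xFFFFFF); // prints 0000FF FF0000
    }
}
```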