How to render synchronous audio and video in Processing in 2021
About a decade ago I wrote a blog post about rendering synchronous audio and video in Processing. I posted it on my now-defunct blog, computermusicblog.com. Recently I searched for the same topic and found that my old post was one of the top hits, but the blog itself was gone.
So in this post I want to give searchers an updated guide to rendering synchronous audio and video in Processing.
It's still a headache to render synchronous audio and video in Processing, but with the technique described here you should be able to copy my work into a simple two-step process that gets you the results you want in under 100 lines of code.
Prerequisites
You must install Processing, Minim, VideoExport, and ffmpeg on your computer. Processing can be installed from processing.org. Minim and VideoExport are Processing libraries that you can add via the Processing menus (Sketch > Import Library > Add Library). You must also add ffmpeg to your path; google how to do that.
The final, crappy prerequisite for this particular tutorial is that you must be working with a pre-rendered wav file. In other words, this will work for generating Processing visuals that are based on an audio file, but not for Processing sketches that synthesize video and audio at the same time.
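If your audio isn't already a 16-bit wav file, Audacity can convert it, and so can ffmpeg, which you'll be installing anyway. Here's a sketch of the conversion, assuming a source file named input.mp3 (substitute your own filename):

```shell
# Convert input.mp3 to a 16-bit PCM wav file at 44.1 kHz named audio.wav.
# pcm_s16le is the 16-bit little-endian PCM encoder; -ar sets the sample rate.
ffmpeg -i input.mp3 -acodec pcm_s16le -ar 44100 audio.wav
```

Put the resulting audio.wav in your sketch's data folder.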
Overview
Here's what the overall process looks like.
- Run the Processing sketch. Press q to quit and render the video file.
- Run ffmpeg to combine the source audio file with the rendered video.
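Concretely, step 2 is a single ffmpeg invocation (the same one that appears in the comment at the top of the sketch). The filenames here assume the defaults used in the code below:

```shell
# Mux the rendered (silent) video with the source audio.
# -c:v copy passes the video stream through untouched, -c:a aac re-encodes
# the audio, and -shortest trims the output to the shorter of the two streams.
ffmpeg -i render.mp4 -i data/audio.wav -c:v copy -c:a aac -shortest output.mp4
```

I like to drop this command into a batch file so the whole workflow stays a two-click affair.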
Source Code
Without further ado, here's the source. This code is a simple audio visualizer that paints the waveform over a background image. Notice the ffmpeg instructions in the long comment at the top.
/*
This is a basic audio visualizer created using Processing.
Press q to quit and render the video.
For more information about Minim, see http://code.compartmental.net/tools/minim/quickstart/
For more information about VideoExport, see https://timrodenbroeker.de/processing-tutorial-video-export/
Use ffmpeg to combine the source audio with the rendered video.
See https://superuser.com/questions/277642/how-to-merge-audio-and-video-file-in-ffmpeg
The command will look something like this:
ffmpeg -i render.mp4 -i data/audio.wav -c:v copy -c:a aac -shortest output.mp4
I prefer to add ffmpeg to my path (google how to do this), then put the above command
into a batch file.
*/
// Minim for playing audio files
import ddf.minim.*;
// VideoExport for rendering videos
import com.hamoid.*;
// Audio related objects
Minim minim;
AudioPlayer song;
String audioFile = "audio.wav"; // The filename for your music. Must be a 16 bit wav file. Use Audacity to convert.
// image related objects
float scaleFactor = 0.25f; // Multiplied by the image size to set the canvas size. Changing this is how you change the resolution of the sketch.
int middleY = 0; // this will be overridden in setup
PImage background; // the background image
String imageFile = "background.jpg"; // The filename for your background image. The file must be present in the data folder for your sketch.
// video related objects
int fps = 24; // Named fps to avoid shadowing Processing's built-in frameRate field. This frame rate MUST be achievable by your computer; consider lowering the resolution if it isn't.
VideoExport videoExport;
public void settings() {
  background = loadImage(imageFile);
  // set the size of the canvas window based on the loaded image
  size((int)(background.width * scaleFactor), (int)(background.height * scaleFactor));
}
void setup() {
  frameRate(fps);
  videoExport = new VideoExport(this, "render.mp4");
  videoExport.setFrameRate(fps);
  videoExport.startMovie();
  minim = new Minim(this);
  // the second param sets the buffer size to the width of the canvas
  song = minim.loadFile(audioFile, width);
  middleY = height / 2;
  if (song != null) {
    song.play();
  }
  fill(255);
  stroke(255);
  strokeWeight(2);
  // tell Processing to draw images semi-transparently
  tint(255, 255, 255, 80);
}
void draw() {
  image(background, 0, 0, width, height);
  if (song != null) { // guard against a failed audio load
    // draw the waveform for the current audio buffer
    for (int i = 0; i < song.bufferSize() - 1; i++) {
      line(i, middleY + (song.mix.get(i) * middleY), i + 1, middleY + (song.mix.get(i + 1) * middleY));
    }
  }
  videoExport.saveFrame(); // render a video frame
}
void keyPressed() {
  if (key == 'q') {
    videoExport.endMovie(); // finalize the silent mp4 video
    exit();
  }
}