# Giving QCRenderer a breather

A big regret I have these days is that I can’t seem to shake off interruptions for long enough to do more than mere snippets of documents or code.

Still, there has been a considerable amount of hacking during the past week - besides spending quite a bit of time inside Quartz Composer to do some fancy stuff that, alas, might never be public and a little Objective-C to go alongside it, there has been quite a bit of Python and JavaScript as well.

For starters, I’ve been cleaning up Yaki on Github a bit and tracking down an HTTP redirect loop that surfaced only on the public branch for some reason - since I stripped away some of the site-specific bits in order to publish the core codebase, it seems that I forgot some of the more esoteric ones.

And there’s been quite a bit of JavaScript and WebGL, but the results are too incipient to share as yet.

But coming back to Objective-C, if you ever feel the need to wrap Quartz Composer to capture video output from it, and you need not just to pass parameters to the composition’s input ports (the bit that is properly documented) but also to actually read from its output ports, here’s the bit that will save you hours of frustration[^1]:

```objc
- (void)start
{
    NSError *error = nil;
    movie_ = [[QTMovie alloc] initToWritableFile:exportedMoviePath_ error:&error];
    if(!movie_) {
        NSLog(@"Could not create movie file: %@", error);
        return;
    }

    // Cache the frame attributes outside the loop -- a bit more efficient this way
    NSDictionary *frameAttributes =
        [NSDictionary dictionaryWithObject:@"jpeg" forKey:QTAddImageCodecType];

    if(havePorts_) {
        // We've tested before whether the composition ports existed, but we have to set
        // the input ports on the renderer, not the composition.
        // Let's assume that you need to send in a single float value, for the sake of argument.
        NSLog(@"Setting value: %f", theValue_);
        [renderer_ setValue:[NSNumber numberWithFloat:theValue_] forInputKey:QCCompositionInputTheValue];
    }

    NSTimeInterval time = 0.0;
    int haveFeedback = 0;
    for(time = 0.0; time <= exportDuration_; time += 1.0/exportFPS_) {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        NSImage *frame = [self renderFrameAtTime:time];

        if(havePorts_ && !haveFeedback) {
            // This is the first part of the tricky bit -- you need to get something out of
            // the composition _after_ it's started running, but not necessarily after the
            // first frame.
            feedbackNumber_ = [[renderer_ valueForOutputKey:kQCCompositionOutputFeedbackNumberKey] floatValue];

            if(feedbackNumber_ > 0) {
                // Do something with the feedback (for simplicity's sake, let's assume this is a one-off)
                haveFeedback = 1;
            }
            else {
                // This is the REALLY tricky bit. If you're just looping in a CLI executable,
                // you'll starve the Composer threads and they'll never get the chance to
                // compute anything (which is why the Apple dev samples use a timer, incidentally...)
                [[NSRunLoop currentRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:0.1]]; // artificially high yield() time
                // Force re-rendering the same frame until the composition tells us something
                [pool drain];
                continue;
            }
        }

        if(frame) {
            // Append the frame to the movie, one frame's worth of duration at a time
            [movie_ addImage:frame
                 forDuration:QTMakeTime(1, (long)exportFPS_)
              withAttributes:frameAttributes];
        }
        [pool drain];
    }
    if([movie_ canUpdateMovieFile]) {
        [movie_ updateMovieFile];
    }
    else {
        NSLog(@"Could not write movie file");
    }

    NSLog(@"Finished exporting movie");
}
```


Note that the goal here was to create a CLI app (not a desktop app) where we wanted to feed something in and get something back out immediately, capturing only the video frames that made sense.

However, since reading composition output ports and capturing them to video isn’t as well documented as I’d like[^2], I thought I’d share this particular snippet here.
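Since the `renderFrameAtTime:` helper is elided above, here is a minimal sketch of what it might look like. It assumes `renderer_` is a `QCRenderer` created offscreen (e.g. via `initOffScreenWithSize:colorSpace:composition:`); the `renderAtTime:arguments:` and `createSnapshotImageOfType:` calls are standard `QCRenderer` API, while the instance variable names match the snippet above:

```objc
- (NSImage *)renderFrameAtTime:(NSTimeInterval)time
{
    // Render the composition at the given timestamp; passing nil for arguments
    // means no mouse/keyboard events are injected into the composition.
    if(![renderer_ renderAtTime:time arguments:nil]) {
        NSLog(@"Rendering failed at %f", time);
        return nil;
    }
    // Grab the rendered frame as an NSImage snapshot (caller releases it).
    return [renderer_ createSnapshotImageOfType:@"NSImage"];
}
```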

The rest of the code is not in the public domain, although it might be some day.

And now, if you’ll excuse me, I have to catch some sleep. And commit a bunch of changes to a few Trac plugins, too.

[^1]: The original code was a colleague’s and worked fine until I realized I had failed to explain what we really needed (a classic case of extending the specs), so I went in and hacked it myself.

[^2]: This sample was where I managed to figure out part of it - but it was not written for a batch/CLI environment and is timer-based, so thread starvation was not on Apple’s mind at the time.