I have a storage box filled to the brim with all manner of pointing devices, and I just might be about to toss another one in.
This because I recently got a Leap Motion. The hows and wherefores are not of consequence here, but the whys and the likely outcome are, even if it all turns out to be nothing more than a blip in the inexorable stampede of technological progress.
The whys are simple – I wanted to explore 3D input a bit further, partly because I missed out on all the Kinect hacking and partly because I think there’s room for something else between the touchscreen and the trackpad in today’s interaction models.
I made the purchase quite consciously and wrote it down under “acceptable losses” in my personal gadget budget, thanks to that (possibly misguided) faith in the feasibility of alternate input models.
Most line-of-sight tracking solutions are a royal pain due to occlusion and the computational power required, but the Leap Motion appeared to be a decent compromise because it focuses on the immediate environs of a screen – which is still the most common use case for computer input, regardless of what console manufacturers are trying to tell you this year.
I had previously futzed about with a Nintendo Wii controller and accelerometer input (which actually worked pretty well as far as tracking was concerned and required very little – if any – computing power), but touch surfaces have made controllers largely obsolete except when you’d like some depth perception.
Then came my digital signage hobby, and the search for something a little more sophisticated. I’ve been looking at touch frames for a good while – even though it would be insane for me to get one just for playing around at home – and the Leap seemed a suitable replacement for a touch frame without all the hassle (and expense) involved.
In a Nutshell
It doesn’t work as advertised. Yet.
Tracking is extremely responsive, and they deserve substantial credit there – latency is very low, even on an older Mac. But even with a wealth of data coming out of the sensor, it’s incredibly hard to get meaningful input (such as, say, the equivalent of a tap or any other deliberate motion), and all the demo apps I’ve tried suffered immensely from that.
Oh, and in case you harbour any delusions (I didn’t), it’s just too fiddly to use as a mouse replacement. If you want to experience the best possible computer desktop, just get a Mac with a Magic Trackpad and go to town on advanced preferences.[1]
The device also turned out to be weirdly sensitive to indoor lighting and to quick motion outside its immediate field of view, and, overall, infuriatingly inconsistent in terms of accuracy, even when coddled and plugged into a relatively modern quad-core Mac in a “safe”, “quiet” office setting with stable lighting and a lot of room to wave arms about.
But yeah, the ergonomics for this just aren’t there yet, and won’t be until gesture processing lets you perform a single, unambiguous gesture once and have it register reliably.
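For reference, here’s roughly what it takes to get even a basic “tap” out of the device programmatically – a minimal sketch using the v1 SDK’s Python bindings, modelled on the bundled samples as I recall them; the tuning values are guesses on my part, not recommendations:

```python
import sys

import Leap  # v1 SDK Python bindings, shipped with the Leap Motion installer


class TapListener(Leap.Listener):
    """Print a line whenever the SDK thinks it saw a key tap."""

    def on_connect(self, controller):
        # Gestures are opt-in; ask for key taps (a quick downward finger motion).
        controller.enable_gesture(Leap.Gesture.TYPE_KEY_TAP)
        # Tuning knobs from the gesture documentation -- the values are guesses.
        controller.config.set("Gesture.KeyTap.MinDownVelocity", 40.0)
        controller.config.set("Gesture.KeyTap.HistorySeconds", 0.2)
        controller.config.save()

    def on_frame(self, controller):
        frame = controller.frame()
        for gesture in frame.gestures():
            if gesture.type == Leap.Gesture.TYPE_KEY_TAP:
                tap = Leap.KeyTapGesture(gesture)
                pos = tap.position  # millimetres, relative to the device
                print("tap at x=%.1f y=%.1f z=%.1f" % (pos.x, pos.y, pos.z))


def main():
    listener = TapListener()
    controller = Leap.Controller()
    controller.add_listener(listener)
    print("Waving... press Enter to quit.")
    try:
        sys.stdin.readline()
    finally:
        controller.remove_listener(listener)


if __name__ == "__main__":
    main()
```

Even then, all you get is a stream of candidate gestures per frame – deciding which of them was deliberate is left entirely up to you, which is exactly where the demo apps fall down.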
Also, I have a number of gripes with the built-in software. It has its high points (like notifying you when it detects a smudge on the device’s surface, which speaks to the degree of fine-tuning involved), but it grates somewhat as far as basic integration is concerned.
For starters, it takes up a significant amount of processing power when running, which makes my computers sluggish. I originally put that down as much to glitzy graphics my aging hardware can’t fully GPU-accelerate as to pure CPU-bound input analysis, but we’re talking about 17% CPU time in an idle state, so that was a big downer.
Also, I hate that the software auto-starts by default and has no visible options[2] to disable that behavior – it might make sense for a kiosk, but it’s simply bad manners on a standard desktop install, and a major nuisance when you’re running out of menu bar space – even though the idle footprint (33MB of RAM without a sensor plugged in) is fairly small by today’s standards.
Finally, I took an immediate dislike to their AirSpace storefront. Call it App Store overdose, but even though I’ll grant that standard software isn’t ready to work with the Leap Motion, I don’t see the point of turning the whole thing into an app launcher.
Fortunately, apps I downloaded from AirSpace went into a private Applications/AirspaceApps folder in my home directory (likely due to my account not having administrative privileges), so it was easy to ignore.
What’s Missing
I’m going to be brutal here – I see very little chance of the Leap Motion replacing the mouse, even if desktop paradigms change twice as much as they have since touch screens came about, because (like every 3D controller ever) there’s little added benefit to using it unless you have a specific application in mind.
As such, I just don’t see it becoming a mainstream product, even if they land some juicy hardware deals – although they might have a fighting chance if they manage to integrate the sensor into a laptop keyboard somehow.
To cater to today’s tinkering crowd, Leap Motion needs a Linux (or, preferably, Android) SDK if it wants to go beyond the “curio” stage, at least for folk like me.
And possibly even to attain a niche. Consider, for instance, that a bunch of people who’re into stuff like digital signage (and I’ve amassed a few interesting contacts in the past few months) would like to use it somehow, but not if they can’t target the kind of low-end embedded hardware that is all the rage these days – otherwise they’d be limited to one-off kiosks, which simply isn’t a tenable business.
Sure, there are companies out there sticking US$500 PCs behind plasma screens all over the place, but most of them (the challengers, at least, not the incumbents with big fat corporate customers) are cutting costs by using some flavour of Linux or looking into severely downsizing hardware.
Hobbyists and researchers can, of course, stick to using PCs or Macs, and I’m positive there are quite a few people out there coding (or, more likely, porting) games to the Leap Motion, but the tinkerer crowd that’s bound to develop something truly innovative for this would love an open SDK of some kind, and I just don’t see that happening given today’s patent-oriented business logic.
But just in case it does, I’ll keep mine around for a while – you never know, really.
[1] In my experience, even Better Touch Tool was too prone to erroneous input, even though it was more immediately useful than the built-in Touchless app.

[2] On a Mac, it doesn’t even set up a login item you can toggle, and you have to futz about with launchd to disable it.