So What Is Rosetta, Anyway?

As I waited for my old Wiki snapshot to be uploaded to this server and planned for the umpteen little fixes I still have to do (my server’s demise couldn’t have come at a worse time, but hey, it’s Monday, and I’m used to Mondays being bad days), I glanced around at the initial reactions, and after laughing myself silly at Melo’s take, I went back to the keynote transcripts and waded through my RSS feeds trying to figure out if “Rosetta” is (or isn’t) something out of Transitive Corporation.

Apparently (and I quote): “Rosetta will allow PowerPC compiled apps to work on an Intel Mac, but not the other way around.”

Maybe there’s more in the webcast, once it’s up.

Update: It’s up, and it’s in H.264 format, which means I finally get some decent video quality – and Steve has just used a Dashboard widget to show that it’s 576 days until Longhorn ships. :) He also mentioned “universal binaries” alongside dual application loaders, so these are “fat” binaries indeed.
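For those who never dealt with the 68k-era originals, a “fat” binary is just several per-architecture images glued together behind a small big-endian header. Here’s a minimal sketch of parsing such a header, using the constant values from Apple’s `<mach-o/fat.h>` headers; the byte string below is synthetic test data, not a real loadable binary:

```python
import struct

# Mach-O "fat" (universal) binary constants, per Apple's
# <mach-o/fat.h> and <mach/machine.h> headers.
FAT_MAGIC = 0xCAFEBABE       # fat header magic; header is big-endian
CPU_TYPE_POWERPC = 18
CPU_TYPE_I386 = 7

CPU_NAMES = {CPU_TYPE_POWERPC: "ppc", CPU_TYPE_I386: "i386"}

def list_architectures(data: bytes) -> list:
    """Return the architecture names found in a fat binary's header."""
    magic, nfat_arch = struct.unpack_from(">II", data, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a fat binary")
    archs = []
    for i in range(nfat_arch):
        # struct fat_arch: cputype, cpusubtype, offset, size, align (20 bytes)
        cputype, _, _, _, _ = struct.unpack_from(">iiIII", data, 8 + i * 20)
        archs.append(CPU_NAMES.get(cputype, hex(cputype)))
    return archs

# Build a synthetic two-architecture header for demonstration
# (offsets/sizes are placeholder values, not a runnable binary).
header = struct.pack(">II", FAT_MAGIC, 2)
header += struct.pack(">iiIII", CPU_TYPE_POWERPC, 0, 4096, 1000, 12)
header += struct.pack(">iiIII", CPU_TYPE_I386, 0, 8192, 1000, 12)

print(list_architectures(header))  # -> ['ppc', 'i386']
```

The loader just picks whichever slice matches the machine it’s running on – which is why a single `.app` can ship for both CPU families at once.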

As to Rosetta, it’s definitely PowerPC to Intel only. There was no specific mention of performance – only that it’s fast (enough), which is just about as good as we’re likely to get, since we’re a year away from general availability. The apps Steve demoed ran speedily on the 3.6 GHz Pentium 4 he was running them on, so it looks to be plenty fast.

Nevertheless, Rosetta’s dynamic binary translation is sure to be a very hot topic in the days ahead. It might be Transitive‘s QuickTransit, some kind of VM, or something else entirely, but since we have the “fat binaries” option there right from the outset, developers have an extra incentive to get things running on Intel.
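Nobody outside Cupertino (and maybe Transitive) knows Rosetta’s internals yet, but the general shape of any dynamic binary translator – translate a guest instruction the first time it’s seen, cache the host-side result, and reuse it on every later execution – can be sketched in toy form. Everything below (the fake `li`/`add` guest ops, the register dict) is invented purely for illustration:

```python
# Toy dynamic binary translator: guest "instructions" are compiled
# into host Python closures once, cached, and reused thereafter.
# Real translators (QuickTransit et al.) work on actual machine code
# in basic blocks, but the translate-once / run-many structure is the idea.

GUEST_PROGRAM = [
    ("li", "r1", 5),             # load immediate
    ("li", "r2", 7),
    ("add", "r0", "r1", "r2"),   # r0 = r1 + r2
]

translation_cache = {}           # guest instruction -> compiled host function

def translate(instr):
    """Compile one guest instruction into a host-side closure."""
    op = instr[0]
    if op == "li":
        _, reg, value = instr
        return lambda regs: regs.__setitem__(reg, value)
    if op == "add":
        _, dst, a, b = instr
        return lambda regs: regs.__setitem__(dst, regs[a] + regs[b])
    raise NotImplementedError(op)

def run(program):
    regs = {}
    for instr in program:
        if instr not in translation_cache:     # translate on first sight
            translation_cache[instr] = translate(instr)
        translation_cache[instr](regs)         # execute cached host code
    return regs

print(run(GUEST_PROGRAM))  # -> {'r1': 5, 'r2': 7, 'r0': 12}
```

The cache is what makes this approach "fast (enough)": the translation cost is paid once per instruction block, so tight loops run at near-native speed after warm-up.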

Update: The WSJ mentions that Apple “acknowledged that Rosetta was based in part on technology developed by Transitive”, and CNET has a piece with more background (via ArsTechnica, thanks Clint), but I guess it will take a while yet to sort out precisely what flavor of QuickTransit was distilled into Rosetta, and what its limitations are. Gruber, as usual, has an overview.

My Take on this “Intel Inside” Thing

I, for one – with my iBook in need of repair, having recently upgraded to an iMac, and having gotten a Mac mini for a home server (which is also, incidentally, being prepared to replace my burnt-out box) – am of two minds about this.

For starters, I’m not going to buy another Mac until the first Intel models come out. That’s pretty damn obvious, and Apple has got to be prepared for that (maybe they’ll drop prices now, maybe only later to clear stock as the time comes – we’ll see). My biggest concern, therefore, is hardware support, but Apple usually has that angle covered.

Of course, trying to sell my iMac to finance a future upgrade is now likely to make me lose quite a bit of money, and the fact that I’m sitting in front of a machine that will depreciate steeply until 2007 is a major pain. That I’m not happy with at all, no matter how good the hardware is.

My iBook wasn’t slated to be replaced until 2006, and I sure as hell hope it’s repairable, since it will have to last me that long – and even then, I will probably be very wary of the early Intel-based stuff too, because no matter how good Rosetta turns out to be, I still have painful memories of the 680x0-to-PowerPC transition (I wasn’t using Macs much myself, but I often had to help people who were, and believe me, it was a pain).

Of course, all bets are off if Apple releases some sort of tablet or an ultra-light laptop, but that’s just because my iBook has become pretty damn indispensable, and I don’t see myself replacing it with some half-baked laptop for any extended period of time.

We’ll just have to see, and try to figure out if Apple is out to get Microsoft or simply carve out a larger niche. Some folk think this is a way to strike back at Microsoft with a vengeance in the IT marketplace, but that’s just too far-fetched, no matter how “mainstream” Intel machines are.

Software and Roadmaps

The transition to Intel, however, does raise some issues as regards third-party software (how many people do you know that are still running old versions of their apps, for instance?) and the OS roadmap.

Being primarily UNIX-oriented and used to either compiling or developing the stuff I’m interested in, I have very little to say on the subject of commercial software. To me, the CPU is largely irrelevant, as long as it’s speedy enough to let me use gcc and its associated menagerie to do my thing. After years of compiling stuff from source code in umpteen places, one more architecture shift isn’t going to bother me much.

And, thanks to their use of Xcode and a largely homogeneous set of tools, I expect the Mac developer community to adjust very easily (Xcode 2.1 is sure to be a tremendous help there).

As to commercial stuff, I for one can live without the big apps being updated to “universal” binaries next year – although I’m skeptical of Rosetta’s performance on a lesser machine (and by “lesser” I mean something a bit less powerful than what Steve used during the keynote) – but given Apple’s penchant for yearly OS upgrades, and until we know more about how the existing PowerPC range fits into the picture, I’d hazard that 10.5 (Leopard) might well be the last PowerPC version – and that 10.6 is likely to be Intel-only.

Of course, it’s far, far too early to tell. One thing’s for sure: PC emulation is going to be one hell of a lot faster, and a lot of speculation is certain to be heading down that road soon.
