my apple speculation

Just for the record:

People making guesses about Apple’s Intel strategy have focused on two possibilities for OS X on Intel:

One, that third-party application vendors would have to recompile their code for the new architecture. Two, that Apple has developed or will develop a reasonably fast PPC emulation layer.

I think there’s a third possibility: that Apple has developed a layer which will, upon the first attempt to execute an incompatible application bundle, disassemble, roughly translate, reassemble, and save the binary for the new architecture. Admittedly, I’m not really qualified to assess how impossible a task this is. Since we’re looking at a pretty constrained set of binaries, compiled for known APIs and with known toolchains, it seems like writing a really smart disassembler wouldn’t be quite as difficult as writing a really fast emulator.
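
Just to make that concrete, here’s a rough sketch (in Python, which obviously isn’t what Apple would use) of the cheap first step such a layer would need: peeking at a Mach-O header on first launch to see whether the binary is a PPC build that needs translating. Everything here is my own hypothetical naming, and the actual disassemble/translate/reassemble machinery is only marked as a comment.

    import struct

    CPU_TYPE_POWERPC = 18  # Mach-O cputype value for PowerPC

    def needs_translation(binary_path):
        """Return True if the file looks like a 32-bit PPC Mach-O binary."""
        with open(binary_path, "rb") as f:
            header = f.read(8)
        if len(header) < 8:
            return False
        magic, = struct.unpack(">I", header[:4])
        if magic == 0xFEEDFACE:  # big-endian Mach-O header, i.e. a PPC build
            cputype, = struct.unpack(">i", header[4:8])
            return cputype == CPU_TYPE_POWERPC
        return False  # little-endian (x86) and fat binaries are ignored in this sketch

    # The expensive part would happen here, once, on first launch:
    #   disassemble the binary -> roughly translate to the new architecture ->
    #   reassemble -> save the result back into the bundle,
    # so every later launch runs native code.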

A fourth possibility: some kind of internet-based distribution mechanism for binaries. Vendors supply Apple with new binaries, or Apple builds new binaries itself, somehow. Since the binary itself makes up such a small chunk of the application bundle, why not a mechanism which checks the old binary’s hash against an online database, downloads a native replacement, and updates the relevant bundle?
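
The lookup itself would be trivial. Here’s a sketch, again in Python and again purely hypothetical (example.com standing in for whatever server Apple would run): hash the old binary, ask the database for a native build with that hash, and write whatever comes back over the old executable inside the bundle.

    import hashlib
    import urllib.error
    import urllib.request

    LOOKUP_URL = "https://example.com/intel-binaries"  # hypothetical database, not a real service

    def swap_in_native_binary(binary_path):
        """Hash the old PPC binary and fetch a native replacement, if one exists."""
        with open(binary_path, "rb") as f:
            digest = hashlib.sha1(f.read()).hexdigest()  # identifies the exact shipping build
        try:
            with urllib.request.urlopen(f"{LOOKUP_URL}/{digest}") as resp:
                native = resp.read()  # the replacement binary, if the database knows this hash
        except urllib.error.URLError:
            return False  # nothing published for this hash (or no network)
        with open(binary_path, "wb") as f:
            f.write(native)  # the rest of the bundle stays untouched
        return True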

Just throwing it out there.
