Mike Schaeffer's Blog

Articles with tag: apple
July 27, 2005

As rumored, Apple just refreshed the iBook. The other rumor, the one about a new chassis and a widescreen display, did not come true. Between that and Apple's desire not to encroach too much on the PowerBooks, there wasn't much headroom for major upgrades:

  • 2-finger trackpad scrolling.
  • Sudden motion sensing for the disk. (Is this done by the disk itself with a built-in motion sensor, or by the motherboard/CPU?)
  • Standard Bluetooth
  • A minor speed bump: the peak CPU is now a 1.42GHz G4 with a 142MHz bus.

I was hoping for more, but given Apple's total lack of maneuvering room in the laptop space, this is an understandable bump. If they upgraded the iBook too much, there'd be little reason to pay extra for the PowerBook. Since they can't upgrade the PowerBook much either (thanks to the stagnant G4), they have a natural cap on the features in the iBook. Thus, Apple is reduced to talking up its five-year-old laptop with slogans like "a fast 133MHz or 142MHz system bus" (fast? Dell's $500 Inspiron 1200 runs its system bus at 400MHz) and "brilliant 1024 by 768 pixel resolution" (maybe it was brilliant five years ago).

Anyway, I've recently come up with a theory on the limited display resolution of Apple's notebooks. It seems obvious in retrospect, but Apple can't scale up the display resolution because they don't have the CPU or memory bandwidth to support higher resolutions as well as they'd like. With modern display stacks like Quartz and Quartz Extreme, pushing pixels around is one of the biggest user-visible performance burdens on a modern machine (hence, "the snappy"). While a GPU can help, there's no getting around the fact that if they doubled the resolution, they'd double the number of bytes the system has to process to render the same-sized desktop. Given that Apple's best G4s have less than half the main memory bandwidth of the lowest-end Centrinos, it's no wonder Apple isn't champing at the bit to eat up more of that bus.
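To put rough numbers behind that, here's a back-of-the-envelope sketch in Python. The bus speeds come from the figures above; everything else (32-bit pixels, a 64-bit front-side bus, a 60Hz full-screen recomposite, and 1440x1050 standing in for a "roughly doubled" panel) is an assumption of mine, so treat the percentages as illustrative only:

    # How much of the memory bus does simply pushing pixels consume?
    BYTES_PER_PIXEL = 4      # assumed 32-bit pixels
    REFRESH_HZ = 60          # assumed one full-screen recomposite per refresh
    BUS_WIDTH_BYTES = 8      # assumed 64-bit front-side bus

    def compositing_load(width, height, bus_mhz):
        frame_bytes = width * height * BYTES_PER_PIXEL
        pixel_bandwidth = frame_bytes * REFRESH_HZ              # bytes/sec spent on pixels
        bus_bandwidth = bus_mhz * 1_000_000 * BUS_WIDTH_BYTES   # peak bytes/sec on the bus
        return pixel_bandwidth / bus_bandwidth

    print(f"iBook, 1024x768, 142MHz bus:     {compositing_load(1024, 768, 142):.0%}")
    print(f"Same bus, roughly 2x the pixels: {compositing_load(1440, 1050, 142):.0%}")
    print(f"400MHz bus, roughly 2x pixels:   {compositing_load(1440, 1050, 400):.0%}")

Even this crude estimate shows why a doubled panel is a much scarier proposition on a G4-class bus than it would be on a 400MHz Centrino bus.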

Since Apple's first wave of Centrino laptops should bring fixes for all of this, the computing community has some pretty amazing hardware to look forward to in a year or so.

June 30, 2005

I ran across this quote the other day from I, Cringely:

The market has stupidly decided that Intel microprocessors are better than Apple's preferred PowerPCs, so Apple will be at a disadvantage trying to sell PowerPC machines into the Intel market. This is what's right now killing Silicon Graphics, which is finding rough going pitting its MIPS processors against Intel. ... Yes, Apple will build computers with Intel processors. Their aim, as in all of these products, is for the high end. Based on Intel's new Merced chip, the new Apple machine will have PCI slots, Universal Serial Bus, Fast Ethernet, IEEE 1394 FireWire, IRDA, DIMM sockets, but no ISA slots and no backwards compatibility to DOS. So this is NOT a PC in the strictest sense, since it will only run Rhapsody, but not System 8 or Windows NT. It will run Mac applications inside Rhapsody. And because Apple is both the author of Rhapsody and the designer of this machine, Jobs believes that more customers will want to buy their Rhapsody wrapped in Apple hardware than not.

Funny thing is... that quote is from October of 1997. A lot has changed since then, but since the core reasoning was sound, it probably shouldn't be too much of a surprise that he was ultimately right.

The other interesting bit was that Cringely wrote that piece around 1997, which is when the NDA for 'Project Star Trek' expired. Star Trek was a project in which a few Apple, Novell, and Intel software engineers got MacOS 7 running on PC hardware. I'm not sure what the business story would've been, but it was a nice technical accomplishment nonetheless.

June 6, 2005

I didn't believe it was possible when I first heard the rumors a few weeks ago, but here it is: Apple will transition to x86, specifically Intel, in 2006. The whole line will go x86 in 2007. Microsoft is behind the switch, as is Adobe. Interestingly, the developer transition kit has an Intel compiler at its core. I wonder why not GCC.

The next question is how well it will be pulled off. In theory it could be seamless. It needs to be.

June 6, 2005

So, the big rumor is that Apple is switching to Intel processors, and Steve Jobs is going to make the announcement during his WWDC keynote address this morning (10:00AM PST). I had been planning on writing a debunking article, but now I'm not so sure. Here's why:

Reason not to switch: If Apple switches to Intel, they introduce another architecture break into their hardware platform.
Counterargument: Emulation can make existing binaries run seamlessly on Intel.

Reason not to switch: But isn't emulation really slow?
Counterargument: Modern emulation technology has gotten a lot better; it can compile code on the fly, just like a modern JVM or Virtual PC. (A rough sketch of the idea appears just after this table.)

Reason not to switch: But I've run virtual machines before, and they're still really slow.
Counterargument: All of the operating system services can be made to run natively, at full speed. The only thing that will be emulated is the application code itself. So, except for very computation-intensive application code, things could still run smoothly.

Reason not to switch: Okay, but a lot of OS X (like Quartz Extreme) is optimized to run on Macintosh hardware.
Counterargument: Macintosh video hardware is exactly the same as PC video hardware these days. In fact, most of the supporting hardware in a Macintosh is the same as in a PC.

Reason not to switch: The PowerPC is part of Apple's 'uniqueness'.
Counterargument: It doesn't matter to most consumers what chip or ISA is running their software. The reason people pay for Apple, their core unique value, is their appealing design and the attention they spend developing a well-integrated system. Even if Apple switches to Intel, there's no reason any of that has to change. (Anyway, they could still do something pretty unusual, like putting a Pentium M in a desktop.)

Reason not to switch: Lots of new stuff in Tiger, like CoreImage, uses AltiVec a great deal.
Counterargument: CoreImage actually compiles dataflow graphs to native code at runtime, picking the approach that runs best on the target hardware. CoreImage could well compile to x86/SSE2 (or whatever else). That means that even a PPC binary running emulated on an Intel Macintosh could have access to full-speed CoreImage services compiled to SSE2. (There's a toy illustration of this after the table as well.)

Reason not to switch: This will alienate existing PowerPC customers.
Counterargument: Why does it have to? If the emulation works well enough, Apple could easily introduce Intel hardware and retain PowerPC as the standard binary format for a while. The common case for ISVs would be to continue developing PowerPC binaries and selling into both the x86/OSX and PPC/OSX markets. The only 'schism' would arise for software vendors who need full performance on x86/OSX; they'd have to worry about shipping some kind of fat binary that ran on both platforms. Even then, PPC/OSX customers wouldn't see a difference.

Reason not to switch: Will Windows run on an Intel Mac? Won't that make it easier for Microsoft to drop Office for OS X?
Counterargument: Apple could easily make it virtually impossible to run Windows on whatever hardware they sell. With respect to Office for OS X, Microsoft doesn't really care what the target architecture is: they just want to sell licenses to Office. They'll go where the money is, and that might end up being an OSX/Intel port.
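As an aside, the "compile code on the fly" counterargument above boils down to a translation cache: translate each block of guest (PPC) code to host (x86) code the first time it runs, then reuse the translation on every later pass. Here's a purely illustrative Python sketch; the class and the translate_block callback are hypothetical, and this describes the general technique, not whatever emulator Apple may actually ship:

    class DynamicTranslator:
        """Translate each guest basic block once, cache it, and reuse it after that."""

        def __init__(self, translate_block):
            self.translate_block = translate_block  # guest address -> callable host code
            self.cache = {}                         # guest address -> translated block

        def run(self, guest_pc, state):
            while guest_pc is not None:
                if guest_pc not in self.cache:             # cold block: translate once
                    self.cache[guest_pc] = self.translate_block(guest_pc)
                guest_pc = self.cache[guest_pc](state)     # warm block: runs at host speed

The point, matching the table, is that only application code ever pays the translation cost; calls into the OS and system frameworks land directly in native libraries.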
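The CoreImage row is a similar story: if image processing is described as a dataflow graph and lowered at runtime, the backend can be picked to match whatever the host machine offers. This toy dispatcher is entirely hypothetical and bears no relation to the real CoreImage API; it just shows why graph-level code can be retargeted without touching application binaries:

    # Hypothetical backend registry: each backend reports whether it's usable on
    # this machine and knows how to lower a dataflow graph to executable code.
    BACKENDS = {}   # name -> (is_available, lower_graph)

    def register_backend(name, is_available, lower_graph):
        BACKENDS[name] = (is_available, lower_graph)

    def compile_graph(graph, preference=("gpu", "altivec", "sse2", "scalar")):
        # Pick the best backend the host actually supports, then lower the graph.
        for name in preference:
            if name in BACKENDS and BACKENDS[name][0]():
                return BACKENDS[name][1](graph)
        raise RuntimeError("no usable backend for this graph")

An emulated PPC application calling into a framework built this way would still get SSE2-speed image processing, because the lowering happens on the native side of the fence.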

Now that I think about it, the switch to Intel would basically boil down to the same story Apple told in 1994, when it switched from the Motorola 680x0 to the PowerPC. Apple pulled that transition off well, and now they have the benefit of experience (they've done this before), better emulation technology, and an already more standard hardware platform. It seems plausible to me. The only thing left is to figure out why they'd do it, and I have some ideas there too:

  • They could finally move their laptops to a faster chip than the G4.
  • x86 is not going away and it's not going to end up marginalized any time soon. This could be a 'final' switch.
  • If IBM is growing cold on the desktop CPU business (and who could blame them?), Apple might be forced to switch away from PPC. Right now, IBM is the only high-performance CPU story Apple has.

Anyway, let's see what Jobs says...
