Advanced Micro Devices' approach to graphics processing could make its chips compelling for some Macs, says Businessweek.com's Arik Hesseldahl
(This story was updated to remove a quotation that was used in error.)
In the pantheon of Apple rumors, some refuse to die. Among them is the idea that Apple (AAPL) should team up with Advanced Micro Devices to put AMD chips inside Macs. That team-up may be coming sooner rather than later, perhaps during the first half of 2011. AMD's forthcoming Fusion line of processors, due near the end of this year, will combine two chips—a main central processing unit and a specialized graphics chip—on a single, fast-performing piece of silicon. If Apple decides to use the chips, breaking Intel's (INTC) lock as the sole supplier of CPUs for Macs, Fusion would give Apple computers graphics muscle for a variety of applications in a small, low-power package. AMD (AMD) would get a key design win and a big psychological boost. One scenario would have Apple including AMD chips in some models of its Mac Mini, MacBook, or 13-in. MacBook Pro. "AMD has a better-than-average chance at landing a chip in an Apple system," says Jim McGregor, an analyst with market research firm In-Stat. "Apple would be crazy not to seriously consider all the options it has available."

Lower End of the Lineup
If Apple uses AMD CPUs, it would join Dell (DELL), Hewlett-Packard (HPQ), Acer, and Toshiba as computer makers that use the chips in some of their machines. Fusion may fit at the lower end of Apple's lineup. It could be used in one version of the Mac Mini, the stripped-down consumer desktop that's about the size of a box of tissues and has a starting price of $599. A mobile version of the Fusion chip might also work in lower-priced models of Apple's MacBook notebooks. If Fusion's performance is good enough, it could also be an option for the ultrathin MacBook Air. "Fusion is a chip that is designed for low cost, low power consumption, and full-fledged graphics performance," says Nathan Brookwood, head of research firm Insight64. "Intel won't have a product that will counter it." To be clear, Apple already has a relationship with AMD: It uses the company's ATI Radeon graphics chips in certain models of the iMac and Mac Pro. Apple and AMD both declined to comment for the record. But analysts who have studied the companies' product plans say it's only a matter of time before Apple does what other top-tier PC makers do and taps AMD as a second source of CPUs alongside Intel. AMD could use the lift that supplying Apple would bring. It claims just 18 percent of the worldwide market for PC and server chips, compared with Intel's 82 percent share, according to analyst Dean McCarron, president of Mercury Research. The ATI graphics unit AMD bought in 2006 may be about to pay dividends by helping the company land Apple's business.

Graphics Performance Plus Battery Life
Graphics chips, typically supplied by Nvidia (NVDA) or AMD, are usually separate components inside a computer. Independent graphics processing units, known in the industry as "discrete" graphics chips, are priced separately from CPUs. In notebooks, the extra chip consumes more battery power and takes up valuable space. The advantage of discrete graphics chips is that they provide more oomph for processing photos and video clips, or for playing games. The new 17- and 15-in. MacBook Pros introduced on Apr. 13 use Nvidia's GeForce mobile graphics chips in combination with Intel's Core i5 or Core i7 processors. The 13-in. MacBook Pro still uses an older Intel Core 2 Duo in combination with Nvidia's 320M. Rather than give the 13-in. MacBook Pro a slightly faster CPU and a discrete GPU that would sap battery life, Apple chose to stick with an older, slower CPU and a substantially faster GPU that can sit right alongside it. The choice sends a message: Graphics performance matters a lot to Apple. AMD's Fusion promises a different solution: a powerful CPU combined on the same piece of silicon with a powerful graphics chip that won't drain a computer's battery too quickly. Fusion will be an interesting product for makers of Windows-based PCs as well, since it turns out GPUs aren't just good at rendering graphics; they're also good at handling complex general computing jobs like financial modeling and analyzing scientific data.

Open Computing Language's Priority
Apple has spearheaded an industry effort to make it easier for programmers to take advantage of GPUs for general computing tasks. It developed a technology for developers called the Open Computing Language, or OpenCL, that lets programs harness GPUs' power to give computers a speed boost. All the evidence suggests that Apple considers OpenCL a priority. AMD's Fusion will support OpenCL, but it's not clear that Intel's chips will. Although Intel sits on the committee that sets OpenCL standards, spokesman Tom Beerman says the chipmaker hasn't "disclosed specific product plans at this point." Why would Intel be cool to OpenCL? In part because it sees general computing as the province of its CPUs. "Intel hasn't said anything about supporting OpenCL, in part because Intel wants people to buy bigger, faster CPUs and not off-load those computing tasks to GPUs," Brookwood says. "If software developers don't embrace OpenCL, Intel will dodge a bullet." Winning Apple's business won't be a slam dunk for AMD. In 2011, Intel will launch a next-generation processor code-named Sandy Bridge, which, like Fusion, will have its own on-chip GPU. There are also unknowns that haven't yet played out: AMD may not be able to deliver Fusion on time, and Intel may change its mind and embrace OpenCL, or simply offer Apple a really good deal. Still, selling CPUs to Apple would be an important marketing victory for AMD, and couldn't help but be seen as a setback for Intel. For now, this looks like AMD's opportunity to lose.