The concept is one of the most compelling in the history of computing: Spare corporations a befuddling labyrinth of computers, software programs, data storage devices, and networks. Instead, make information technology as easy to use as plugging into an electrical outlet. This idea is commonly called utility computing, and many experts believe it's going to sweep the infotech world like a digital tidal wave. IBM, for one, is spending $800 million this year on marketing its vision of utility computing, which it calls e-business on demand.
Unlike past Next Big Things in computing, this wave of innovation doesn't require corporations to rip out technology already installed and replace it with expensive new hardware and software. Instead, they can gradually add technologies or services that make their computing systems more automated. As a result, much of the cost and complexity is being wrung out. "We think this is the third major computing revolution -- after mainframes and the Internet," says analyst Frank Gillett of Forrester Research (FORR).
The idea is that the power plant-like computing systems of the future will operate both at remote data centers and within a company's offices -- under a variety of novel payment schemes. Whatever the setup, the systems can be managed by the company's own tech staff or by outsiders. And rather than requiring customers to buy computer servers outright for use inside their own walls, hardware makers, including IBM, Sun Microsystems, and Hewlett-Packard, each offer computing-as-used payment options. American Express Co. (AXP), for instance, today pays IBM a monthly fee based on the number of computers it uses. It hopes someday to pay based on the actual computing power it consumes.
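The difference between the two payment models American Express is weighing can be illustrated with a toy metering calculation (every rate and usage figure here is invented for illustration):

```python
# Toy sketch of usage-based billing, contrasting a flat per-machine fee
# with metering the compute actually consumed. All numbers are hypothetical.

def flat_fee(servers: int, fee_per_server: float) -> float:
    """Old model: pay a fixed monthly fee for every machine, busy or idle."""
    return servers * fee_per_server

def metered_fee(cpu_hours_used: float, rate_per_cpu_hour: float) -> float:
    """Utility model: pay only for the computing power consumed."""
    return cpu_hours_used * rate_per_cpu_hour

# 100 servers at a hypothetical $300/month, but only 20% utilized
# over a 720-hour month:
servers, utilization, hours = 100, 0.20, 720
print(flat_fee(servers, 300.0))                          # 30000.0
print(metered_fee(servers * utilization * hours, 0.50))  # 7200.0
```

At low utilization, the metered bill is a fraction of the flat one -- which is exactly why underused server fleets make pay-per-use attractive to buyers.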
Problem is, there's still a yawning gap between the hype being generated by the tech giants and the reality of what utility computing is today. It turns out that purging complexity is a mighty complex process in itself. It could take a decade or even longer for the bulk of computing to become as easy to tap as electric current. Even corporate buyers that are interested are moving cautiously. "It's a great concept, but it's not quite mature enough," says David Bergen, chief information officer at Levi Strauss & Co.
There's danger in all the hype. The world is just emerging from the dot-com bubble, when some tech companies promised customers the sun, moon, and stars but delivered only meteor showers. The last thing once-burned tech purchasing executives want to hear is that there's yet another tech cure-all on the horizon.
It's no wonder tech purchasers are wary. The utility term was one of several trotted out in the late 1990s to describe a new form of computing -- where independent companies ran corporations' whole range of computing tasks in remote data centers and delivered computing power over the Net, charging a monthly fee. That didn't take off. Most corporations weren't comfortable farming out crucial operations, and the technologies for managing such computing systems efficiently and dependably weren't quite ready.
Companies say their new ideas for utility computing are different. And they are becoming increasingly aggressive marketers. IBM's vision, called e-business on demand, includes a role for its huge consulting arm to help customers set up computer utilities. HP has a strategy called the adaptive enterprise, which some of its customers are tinkering with in-house. Sun's approach, called N1, also aims to make computing more manageable within a corporation's own facilities. Microsoft Corp., meanwhile, has a dynamic systems initiative aimed at automating computer systems, whether they're handled in-house or by a utility computing service supplier -- such as IBM or Electronic Data Systems.
Everybody agrees there's plenty to be done to make computing less complex and more efficient. Because of complexity, 75% of the costs of operating a corporate computing system come from staffing, consulting, and maintenance. Adding to the problem is that many server computers are dedicated to a single application -- accounting, supply-chain management, or human resources. That's wasteful. Market researchers estimate that most servers are used at only 20% of capacity. New technologies let software programs and the data related to them be routed to different servers, storage devices, or sections of a network -- depending on where excess capacity exists at any moment. There's a minimum of human intervention. Tech suppliers believe they can raise utilization to 50% to 80%, thus requiring less gear. "We're using technology instead of people to manage computing," says Nora M. Denzel, senior vice-president for software at Hewlett-Packard.
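The route-work-to-spare-capacity idea the suppliers describe can be sketched as a simple least-loaded scheduler (the server names and load figures are made up):

```python
# Minimal sketch of capacity-aware placement: each incoming job goes to
# the server with the most spare capacity, with little human intervention.
# Fleet names and utilization numbers are hypothetical.

def place_job(servers: dict, job_load: float) -> str:
    """Pick the least-utilized server that can absorb job_load; update its load."""
    candidates = {name: load for name, load in servers.items()
                  if load + job_load <= 1.0}
    if not candidates:
        raise RuntimeError("no server has spare capacity")
    target = min(candidates, key=candidates.get)
    servers[target] += job_load
    return target

fleet = {"web-1": 0.70, "web-2": 0.20, "web-3": 0.45}
print(place_job(fleet, 0.25))  # web-2 -- it had the most headroom
print(fleet["web-2"])          # 0.45
```

Real products make the same decision continuously across servers, storage, and network segments; the point of the sketch is only that software, not staff, picks where the work runs.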
Some of the technology is real today. HP sells a product called the Utility Data Center that is essentially a switching device for doling out computing jobs. A customer's entire network is wired to the machine, so an application can be switched to any computer server or storage device on the network. Other tech suppliers are pushing hard in niches. Veritas Software Corp. (VRTS) sells software that anticipates when additional storage capacity is going to be needed and alerts operators to provide it.
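A capacity-anticipation tool of the sort described can be approximated with a linear extrapolation of recent usage (the sample data is invented, and real products use far more sophisticated models than this straight-line fit):

```python
# Sketch of storage-growth forecasting: fit a straight line to recent
# daily usage samples and estimate how many days remain until the
# capacity limit is hit. All figures are hypothetical.

def days_until_full(samples_gb: list, capacity_gb: float) -> float:
    """Estimate days until capacity is exhausted from a daily usage series."""
    n = len(samples_gb)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples_gb) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples_gb))
             / sum((x - mean_x) ** 2 for x in xs))
    if slope <= 0:
        return float("inf")  # usage is flat or shrinking; no alert needed
    return (capacity_gb - samples_gb[-1]) / slope

usage = [400, 410, 420, 430, 440]  # GB used on each of the last 5 days
print(days_until_full(usage, 500))  # 6.0 -> time to alert the operators
```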
Not all elements of the utility-computing concept are ready for mainstream corporate use. One that still faces major challenges is so-called grid computing. Unlike the Utility Data Center, which moves whole programs between machines, grid computing usually chops computation-heavy tasks into small pieces that are farmed out to dozens or even hundreds of computers, taking advantage of underutilized machines. IBM, Sun, and other computer companies sell grid technology and services that companies use for scientific or financial number-crunching jobs. Many corporate software programs don't benefit from this kind of slicing and dicing. But for those that do, such as financial calculations and pharmaceutical research, the tech companies are trying to make it easier to parcel the tasks out to grids.
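The slicing and dicing a grid performs on a number-crunching job can be sketched with Python's standard worker pool standing in for a fleet of machines (the workload here is an arbitrary example; a real grid dispatches chunks to separate computers rather than local threads):

```python
# Toy grid-computing sketch: a big numeric job is chopped into chunks
# and farmed out to a pool of workers, which stand in for the dozens of
# underutilized machines a real grid would use.
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(bounds):
    """One grid task: sum the squares over a sub-range of the job."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def grid_sum(n: int, workers: int = 4) -> int:
    """Split [0, n) into equal slices and hand each slice to a worker."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, chunks))

# The parallel answer matches the single-machine loop:
print(grid_sum(10_000) == sum(i * i for i in range(10_000)))  # True
```

Only tasks that decompose this cleanly see the benefit -- which is why, as the article notes, many corporate applications don't gain from grids while simulation-heavy financial and pharmaceutical work does.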
The most difficult part of utility computing might turn out to be getting the big shots to agree on standards. Typically, companies prefer to use their own proprietary technologies. On July 21, for instance, storage leader EMC Corp. released what it claimed was the world's first automated billing system for storage devices -- allowing precise, utility-type payments to be made for storage services rendered. The drawback for customers: It works only on machines made by EMC. Utility computing won't deliver on its promise unless devices made by different manufacturers work together as one.
Because of these unresolved issues, cynics say this new utility is doomed to produce more heat than light. But the quip misses the mark. If the industry stays focused, bits and bytes could someday be as simple to deal with as watts.
By Steve Hamm
Contributing: Peter Burrows in San Mateo, Calif.