Companies & Industries

The Case Against Digital Sprawl

Imagine a real estate market where it is profitable to build new skyscrapers, even if they sit at 40 percent occupancy. Eventually you run out of land—not to mention the staggering maintenance and energy costs you rack up.

Certainly no chief executive officer would sign off on such a business model, right? Well, that is the current state of corporate information systems. Over the decades, the arrival of ever-cheaper, more powerful computers every few years has made it easy and inexpensive for large companies to simply add new systems to handle extra capacity, creating acres of slap-dash, poorly designed data centers.

This ad-hoc approach to the design of corporate computer systems has left the majority of the world’s enterprises with highly inefficient, brittle digital foundations that siphon off money that would be better applied to creating new revenue and business opportunities. Today, more than 70 percent of the average corporate IT budget goes to basic operations and maintenance—just keeping the lights on, according to a new study from IDG commissioned by IBM (IBM).

The traditional IT business model has rested largely on the promise of ever-denser microprocessors, as described by Moore’s Law. That promise is failing us. Overbuilt and outdated data centers have brought the world’s corporations and governments face-to-face with the physics of density in silicon-based semiconductors: heat. Denser, more powerful chips produce servers that run hotter, so power and cooling costs climb in step with chip density, eroding the very savings Moore’s Law is supposed to deliver.

With the total cost of owning and operating a new data center approaching $1 billion in some cases, cost-per-watt has become a guiding metric for many C-suite executives. This has left some companies flirting with such seemingly absurd plans as locating new data centers near the Arctic Circle to leverage the earth’s cooling power.
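To see why cost-per-watt concentrates executives’ minds, a back-of-the-envelope calculation helps. The short Python sketch below uses purely illustrative assumptions (a 10-megawatt IT load, a power-usage-effectiveness ratio of 1.8, and electricity at 10 cents per kilowatt-hour, none of which come from the article) to estimate a facility’s annual power bill and its cost per IT watt.

# Back-of-the-envelope data center energy cost. All figures are
# illustrative assumptions, not numbers from the article.
IT_LOAD_WATTS = 10_000_000      # assumed 10 MW of IT equipment
PUE = 1.8                       # assumed power usage effectiveness (total power / IT power)
PRICE_PER_KWH = 0.10            # assumed electricity price, USD per kilowatt-hour
HOURS_PER_YEAR = 24 * 365

total_watts = IT_LOAD_WATTS * PUE                    # IT load plus cooling, lighting, losses
annual_kwh = total_watts / 1000 * HOURS_PER_YEAR     # watts to kilowatt-hours over a year
annual_cost = annual_kwh * PRICE_PER_KWH
cost_per_it_watt = annual_cost / IT_LOAD_WATTS       # the cost-per-watt yardstick

print(f"Annual electricity cost: ${annual_cost:,.0f}")
print(f"Annual cost per IT watt: ${cost_per_it_watt:.2f}")

Under those assumptions the electricity bill alone runs to roughly $16 million a year, and every point shaved off the cooling overhead or the chips’ power draw shows up directly in the cost-per-watt figure, which is why siting a facility in a cold climate starts to look less absurd.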

This digital sprawl and inefficiency would probably have continued unchecked, were it not for the explosion of “Big Data” and mobile computing brought on by omnipresent Web and cell-phone connectivity. Today’s data-creating entities—with socially networked customers, tweeting employees, YouTube (GOOG)-uploading marketers, and an Internet of data-savvy and data-spewing objects—are challenging the limits of the traditional IT model, clogging networks and overflowing storage systems.

Despite the challenges, the virtues of density will continue to shape the industry, though in different ways. Three-dimensional chip packaging, for example, is the latest technique for creating super-dense microprocessors and memory chips; as chips extend “upward,” their footprint on the data center floor shrinks. In the next three years or so, new nanomaterials made from organic and man-made compounds will allow bricks of silicon to dissipate heat more easily. The thousands of cables you see in a data center will be packed into a silicon cube an inch on a side. These and other technologies, coupled with new models such as cloud computing, will offer companies an opportunity to remodel their IT infrastructure.

Another radical shift will be to bring computing to the data. Today’s corporate systems routinely move mountains of data to different applications and servers for various processing tasks. With corporate data piles now measured in exabytes (five exabytes is the equivalent of all the words ever spoken by human beings), it is no longer feasible to bring the data to the server. IBM estimates that the fixed costs of shuttling an exabyte of medical data for processing can easily approach $10 million. That figure includes the hardware, software, energy, and manpower needed to store and move the files. In addition, data that is constantly on the move stands a higher risk of encountering errors or getting lost.
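A rough sense of scale shows why shipping data at that volume breaks down. The sketch below assumes a dedicated 10-gigabit-per-second link (a figure chosen for illustration, not taken from IBM’s estimate) and computes how long one exabyte would take to cross it.

# How long would it take to move one exabyte over a network link?
# The link speed is an illustrative assumption.
EXABYTE_BYTES = 10**18
LINK_BITS_PER_SEC = 10 * 10**9          # assumed 10 Gb/s connection
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

transfer_seconds = EXABYTE_BYTES * 8 / LINK_BITS_PER_SEC
transfer_years = transfer_seconds / SECONDS_PER_YEAR

print(f"Transfer time: {transfer_seconds:,.0f} seconds (about {transfer_years:.0f} years)")

At roughly 25 years per exabyte on such a link, the conclusion is hard to avoid: it is the computation, not the data, that has to travel.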

These are important technological developments, but they won’t go far enough unless senior management pushes for new data centers and corporate systems that make real-time processing of data far more common than it is today, when it is largely relegated to factory-floor operations or a few niche applications. The ability to cull and process data wherever it’s created, vastly reducing the amount of information that warrants storage space or transit across networks, will become a top corporate priority—and an important competitive advantage.
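In practice, culling data where it is created usually means filtering or aggregating a stream at its source and forwarding only what matters. The sketch below is a generic illustration of that idea, not any particular product: hypothetical sensor readings are summarized on the spot, and only the out-of-range values would ever cross the network.

# Minimal sketch of processing data at the source: keep a running
# summary and forward only anomalous readings. All names and values
# are hypothetical.
import random

def sensor_readings(n):
    """Simulate n temperature readings produced at the edge."""
    for _ in range(n):
        yield random.gauss(70.0, 5.0)

def cull_at_source(readings, threshold=85.0):
    """Stream through the readings once, keeping a count, a mean,
    and only the values that cross the threshold."""
    count, total, anomalies = 0, 0.0, []
    for value in readings:
        count += 1
        total += value
        if value > threshold:
            anomalies.append(value)     # only these need to travel
    return {"count": count, "mean": total / count, "anomalies": anomalies}

summary = cull_at_source(sensor_readings(100_000))
print(f"{summary['count']:,} readings reduced to {len(summary['anomalies'])} "
      f"anomalies plus a two-number summary")

The design choice is the point: the raw stream never leaves the place where it was generated, so the storage and network burden scales with the handful of interesting events rather than with everything measured.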

Smart IT managers and executives will use this kind of rapid analysis to create new business models. Productivity also stands to gain as engineering talent is freed from low-value work such as maintenance and redirected toward IT projects that grow the business.

From the beginning of recorded time until 2003, humans created five billion gigabytes of information (five exabytes). Last year, the same amount of information was created every two days. By next year, IBM and others expect that interval to shrink to every 10 minutes. It is clear that to deal with all that data, we need new computing designs that solve the density dilemma. If you don’t believe that, I have some land in the Arctic Circle I’d like to sell you.

Dave Turek is the executive in charge of supercomputer development for IBM. He was a key designer of the IBM supercomputer line that included Deep Blue, the now-retired machine that defeated the world chess champion. Turek is a member of the U.S. Council on Competitiveness High Performance Computing Advisory Committee.
