While Grandma can flip through photo albums on a state-of-the-art laptop or, before long, an Apple (AAPL) iPad, many government agencies and corporations are still entrusting critical tasks to antiquated computer systems that cost a fortune to operate and maintain.
The problem is particularly acute at the state level. Each U.S. state has its own unique computer systems to process the same types of information and provide the same services as every other state. Worse, even within states, each division or agency has its own IT department and maintains its own computer systems. We're talking about hundreds of billions of dollars of IT spending every year—on clunky old infrastructure.
Consider California. The most populous U.S. state is more advanced than most, though it faces big IT challenges. It has roughly 130 agencies and departments, each with its own IT staff and computer systems. Each collects its own information and maintains its own databases. The systems of one department are not usually integrated with the systems of another. When they do share data, it is usually through the computer equivalent of Excel spreadsheets. The state has more than 40 separate computer applications to collect the same personal and demographic information about citizens. So, for example, when a business has to update an address, it typically has to inform multiple agencies.
Keeping up with regulatory changes is also a huge burden for the state's IT staff. Simple changes cost tens of millions of dollars and can take years. When President Obama signed legislation extending benefits for unemployed workers in November, out-of-work Californians had to wait as long as two months because the systems couldn't be updated.
Today's PCs and the Web are often more robust, secure, and fast than the massive enterprise systems used by governments and some businesses. A modern laptop has greater processing power than the mainframes for which many enterprise systems were designed.
A social networking site like Facebook or Twitter processes more transactions (in the form of messages) in a day than many financial companies and states process in a month. Sophisticated computer applications that used to take years to build can be built in months. And the newer applications are usually far easier to use and much more scalable.
So why isn't there a massive move to new technologies? If anything, the chasm between the old and new has grown wider. This was made plain to me after I published a blog post on the tech Web site TechCrunch. I wrote about California's IT challenges and encouraged Silicon Valley entrepreneurs to come to the rescue. I cited an example of one system for which the state has budgeted $50 million over several years, which I believed could be rebuilt for less than $5 million in less than a year. I received several credible offers.
Yet the idea appears to be very threatening to a handful of large systems integrators that have built enormous businesses on antiquated state systems. I was challenged by a senior vice-president at CA Inc. (CA) who called me naive and chided me for knocking something I "know absolutely squat about." Her argument was that standards, processes, and people had to change first. Otherwise, disaster would ensue and "set California back even further." Others wrote to argue that government procurement processes can't be changed and that big government contractors with political connections would always hold back progress.
I believe these critics are simply out of touch with the new reality. Security breaches of large government IT systems are common due to the ongoing cyberwars taking place between nations. Amazon.com (AMZN) is just as juicy a target, but has a far better history of IT security than most big government agencies. As for my lack of understanding of the problem: In my tech days, I developed several large enterprise systems, and I started two companies that marketed systems development and legacy systems reengineering software.
So I know there is a problem and a relatively simple solution. I also realize the challenges governments face in allocating contracts in a fair and equitable manner. But the real disaster would be continuing on our current path. And the failure goes beyond bloated costs, entrenched mainframe systems integrators, and dated computer languages. The greater danger is that, by trapping public-sector IT architecture in this tired old wrapper, we miss out on huge chances not only to improve system performance but also to reinvent government. The public sector is effectively walled off from the innovation that has made Web 2.0 a rich, contextually relevant environment. In this context, there will be no Netflix (NFLX) Prize, no Google (GOOG) voice-automated transcription engine, and virtually no other true technology innovation.
Many people realize this. Web inventor Sir Tim Berners-Lee has just launched a venture for the U.K. government to make public data available online so that entrepreneurs can build technologies to harness this information. Federal Chief Technology Officer Aneesh Chopra also launched an initiative to make federal government data available. And in the wake of my earlier posting, California Chief Technology Officer P.K. Agarwal has launched a crowd-sourcing Web site where he's asking the public to weigh in on how the state might improve its tech strategy.
So there is progress. More states need to make similar moves. We need to open the bidding to new players, loosen opaque requirements written under the false guise of security and compatibility, and retool our way of thinking about IT for the public sector. In the cloud computing era, big government IT doesn't have to be so big. The rest of America has done more with fewer IT dollars for over a decade now. It's time for Uncle Sam and his state and local brothers and sisters to join the parade.