With his article in the May issue of the Harvard Business Review, entitled "IT Doesn't Matter," Nicholas G. Carr might as well have painted a target on his chest. The former editor-at-large of the magazine contends that the strategic value that information technology can provide corporate buyers is falling as it becomes more pervasive. That view has elicited howls of protest from many tech execs and buyers, who believe that continuing technology advances provide plenty of potential to achieve a competitive advantage. But Carr hasn't backed down one bit. Now a consultant in business strategy and information technology, he's working on a book that expands upon his article. In an interview in July, Silicon Valley Bureau Chief Robert D. Hof asked Carr to explain himself.
What led you to the view that information technology doesn't matter?
In the wake of the dot-com bust, there was a lot of copying of functionality across companies. A lot of what has been written about IT has focused on what it can do, which is an enormous amount. But it hasn't focused on whether a company can do something genuinely different with it, or on how quickly it gets rolled out throughout entire industries. Rapid rollout can lead to good things, such as increased productivity at the industry level. But it doesn't necessarily have much of an impact on individual companies and their ability to distinguish themselves and thus achieve higher profitability than their rivals.
Do you think the bust contributed to the lack of competitive advantage from IT that you write about?
That's a different issue. But I think there's a related aspect. Everybody said: "The Internet is great because it removes the friction from commerce." And that is good if you're a customer or an economist looking at business productivity. But it's often the friction in commerce that gives companies their profits. If you remove the friction, you make it easier for customers to see through your cost structure, and you make it easier for competitors to replicate one another's strategies. In pursuing that business model, a lot of companies were dooming themselves to fail.
Does the pervasiveness of IT mean there will be less innovation now?
I'm not arguing that there won't continue to be innovations in technology. I'm arguing that a lot of those innovations are going to go beyond the needs of a lot of companies. The history of IT has been one of different technologies -- whether we're talking about PCs or servers or networking gear or business-software applications -- evolving quickly to the point where they meet most companies' needs. Every time vendors push the technology further, they lose more and more of their customers, who say: "Gee, I don't need that," and switch to commodity gear or software.
Is there now less incentive to use innovative products?
Almost any innovation can be replicated by competitors. So it's not enough to be the first company to use a new, innovative technology and thus gain an advantage over competitors. The question is: How long is that advantage going to last? As the first company investing in it, you're going to pay a lot more, so you have to be assured that the advantage -- the ability to charge more or produce at a lower cost -- will last long enough to recoup that investment. Increasingly, the advantage simply isn't lasting long enough to make the investment economically worthwhile.
Hasn't competitive advantage come from unique use of the technology, not just from the technology itself?
It's always difficult to draw a bright line between where the technology ends and the use of it starts. But I would disagree with the argument that the technology itself, the systems themselves, has never been the source of advantage. In fact, it often was. When a company like American Airlines (AMR) spent almost 10 years working with IBM to create its Sabre reservation system, it took a long time for competitors to copy that system. That system itself provided the advantage.
Was it the system itself, or was it how they applied it to their business?
It was the system, because all their competitors had to build that same system. Eventually they did, but it took too long, and by then Sabre's advantage was locked in. [Now] all the developments in technology -- toward standardization, toward openness, toward greater power, toward ever more affordable hardware and software -- have made it much more difficult for the systems themselves to provide you with advantage.
If all this is now widely available, why are Wal-Mart and Dell able to use the same technologies yet still get a competitive advantage?
I don't think information technology lies at the source of their strategic advantage. Wal-Mart's advantage comes from the complex system of things that it does better, from where it builds stores to its merchandising policies.
Dell's basic business model, and the source of its economic advantage, is its direct sales. That was in place back when it was taking orders over the telephone and by fax. Hewlett-Packard could replicate Dell's systems if it wanted to or needed to. It hasn't because it has a completely different business model.
It's very revealing that, almost without exception, when people take issue with my argument, they say: "Look at Dell and Wal-Mart." But they built their advantage a number of years ago, when the ability of IT to provide advantage was greater. Where are all the other companies that have more recently gained advantage?
You compare IT to previous tech revolutions such as railroads and electricity. But some people suggest that IT is different -- it's still evolving.
When you look across the earlier examples of technologies that were widely adopted and had a transformative effect on business, I don't think IT is entirely different from them. Electricity is a good example. Back when it first became available, it was electric motors that could be adapted to all sorts of uses, just as software can be. There was a long period when companies found creative ways to adapt their machinery and manufacturing to electricity. Today, those adaptations are completely hidden from us, because we take electricity for granted. It's not beyond imagining that over time the same thing will happen with most basic IT functions.
Then where are we in the cycle of the IT revolution? History suggests the biggest buildout of a technology comes after the crash.
The buildout goes on for quite a long time, but whether it's the buildout of a commodity infrastructure or whether it's something companies use to get an advantage is a different question. It's always dangerous to say: "That's it. It's over." But it's not as if there's this infinite, unknown quantity of things that businesses need to do that we haven't realized yet. A lot of the core things that businesses do have already been automated with information technologies.
A lot of new consumer technologies seem to be bubbling up that show little sign of maturing.
I think the real innovation in IT now is going to happen on the consumer side. You really do have this merging of IT and the whole entertainment and media infrastructure. When I look at the average home users of technology, their needs for innovative new software and for more processing power are greater than those of your average person working at a desk.
Since IT companies will be supplying consumers, directly or indirectly, will that prop up overall technology spending?
It would be dangerous for an IT company to assume the recent slowdown is purely a cyclical event. Of course, when business capital spending goes up, it's going to bring IT spending up with it. But I'm very doubtful that IT spending will grow faster than general increases in capital spending.
What should tech companies do to contend with the trends you outline?
The big challenge will be for companies that have built their businesses on proprietary technologies that they could sell at premium prices. There are a number of companies like that, whether it's EMC in storage, Cisco in networking gear, Sun in servers, or on the software side, even Microsoft with its operating system. If you're in that position, there are generic competitive products, whether it's open-source software or Dell boxes, that more and more customers are going to buy because they're cheaper.
Now, that doesn't mean I'd count out any of those companies. But it means they will have to figure out some new ways to make money. They're all trying. Microsoft is trying to extend its proprietary control over to Web services. EMC and Cisco are expanding heavily into services and trying to reduce their dependence on the basic equipment. How that's going to play out, I don't know.
Will we see a lot of disruption among companies that have been around for a couple of decades?
Yeah. When industries mature, there's a lot of consolidation at the top, and we're seeing that: HP and Compaq, Oracle trying to buy PeopleSoft. For the big companies, it becomes a scale and scope game -- getting as many customers as possible and selling them as much as you can.
How will buyers use technology differently as these trends play out?
When a resource becomes a competitive necessity but doesn't provide strategic advantage anymore, then companies have to shift their focus from the opportunities it provides to the risks it presents. How companies use technologies is going to be focused on reducing risks. That includes things like security and reliability. But it also has to do with costs. If you manage your IT poorly and wastefully, you can definitely put yourself at a cost disadvantage.
What's ahead for tech startups?
I don't think it's going to be as good as it was in the '90s. There can be new technologies that are quickly adopted throughout entire industries, improving productivity without giving any particular company an advantage. It's not inconceivable that a startup could pioneer a broadly adopted technology and grow quite handsomely. But it's going to have to prove its value to business. Startups aren't going to be funded with wishful thinking anymore. Those days are gone.