Anemic job creation plagues the U.S. economy. Here's how information technology could play a key role in addressing it, says Chris Farrell
The risk of economic stagnation should dominate discussions this week among the voting members of the Federal Reserve Board at its two-day meeting. Without dramatic action, the U.S. confronts an uncertain future—one in which "Americans on average would experience slower gains in living standards than did their parents and grandparents," according to consultancy McKinsey & Co.

The catastrophic danger is not the federal debt-and-deficit issue that dominates the current debate in Washington and in hyperventilated news reports on cable TV. It is employment. The once-great American job machine badly needs repair. Even before the Great Recession struck, total employment growth from 2000 to 2007 amounted to less than half the increase reached in preceding decades—the worst performance since the Great Depression. Three years after the recession officially ended in June 2009, some 24.6 million people are unemployed, underemployed, or marginally employed, according to the latest figures from the U.S. Bureau of Labor Statistics. It will take at least 21 million net new jobs over the coming decade to return to a 5 percent unemployment rate, according to McKinsey.

Sad to say, it appears that America's monetary mandarins are set on avoiding bold action, content to cross their fingers and hope that the current slowdown is nothing more than a soft patch. With so many policymakers complicit in the sorry state of the nation's job market, the U.S. risks a long-term, Japanese-style stagnation of dashed dreams and pinched opportunities. It doesn't have to happen. It shouldn't happen. But it could.

Remember the '90s—and Get Moving
Maybe that's why it was briefly heartening when Republican presidential candidate Tim Pawlenty called for 5 percent gross domestic product growth over a decade. The problem is that his plan has been widely and justifiably panned for relying on tax cuts that would roughly double the size of America's already disastrous long-term fiscal gap. Although no one really knows the economy's speed limit, University of California, Berkeley economist Brad DeLong makes a realistic case for an "optimistic-aspirational 3.8 percent growth rate," while Stanford University economist John Taylor manages to hit 4.7 percent.

Here's the thing: Despite growing gloom about the economy's growth prospects in the Wall Street-to-Washington power corridor, it wouldn't take much to improve the odds of a much brighter future. Policymakers should remember the lessons of the '90s.

In the early 1990s, most forecasters assumed the economy could expand at a 2 percent to 2.5 percent rate before igniting inflationary pressures. That outlook rested on consensus projections of 1 percent average annual productivity growth and labor force growth of 1.3 percent or less. Yet, powered by an emerging web of technological and commercial innovations, productivity growth ran at more than double expectations starting in the mid-'90s. What's more, putting out the welcome mat for immigrants energized everything from high-tech innovation to urban renewal. By decade's end, the unemployment rate had dropped to 4.2 percent and inflation was averaging 2.2 percent—figures not seen since the 1960s.

The information technology universe is stirring after a long pause that followed the dot-com bust. The gains reflect the rise of social media and the spread of the mobile Internet, from Facebook to the Apple (AAPL) iPad. Unfortunately, much of the discussion seems dominated by worries about a second digital-age bubble rather than by the spread of the underlying innovations.
Focus on Transformation, not Bubbles
In many sectors of the economy, very smart people are working with extremely powerful computers to mine enormous troves of data. They aim to discover patterns that might suss out inefficiencies in a corporate supply chain or persuade a consumer to click on a product. Prospects are ripe for using information technologies to boost productivity in health care, education, media, government, and other services.

It takes time for major technological innovations to spread throughout an economy. Computers are commonplace in the workplace, for example, yet the PC is only three decades old. As recently as the 1980s, many users at work needed to do some of their own software programming to take advantage of the technology. The average worker today doesn't need to know programming and can still perform very sophisticated operations with the tap, tap, tap of a finger.

What's true for workers holds for companies and industries. "And as the laggards catch up, the leaders will have moved on, using the large and ever-growing tool kit of digital technology to make improvements elsewhere," MIT information technology scholars Andrew McAfee and Erik Brynjolfsson write in "The Digital Revolution Will Transform the Economy—Again and Again," an essay published on McKinsey's website.

What can be done to accelerate the process? In the short run, forget tax cuts. The Fed should continue to support the economy with monetary ease. Legislators could embrace emergency infrastructure projects that bring long-term economic value and take advantage of cheap labor and cheap capital—especially investments in 21st century information technology and energy initiatives. Regulators should quickly get out of the rule-making weeds of the information technology ecosystem. The welcome mat could go out to legal immigrants. Over the long term, there's much more to do to boost economic fundamentals, particularly when it comes to education.
But for now, avidly embracing the digital frontier is the best prospect the U.S. has for generating the kind of growth that creates jobs.