Fed Scours Data for Signs of a Robot Takeover
(Bloomberg) -- Productivity growth has been in a rut for decades despite the advent of iPhones and artificial intelligence. It’s an economic enigma, and one that’s top of mind at the Federal Reserve.
Fed Governor Randal Quarles in a speech this month expressed optimism that America is on the brink of a productivity breakout, labeling himself a “techno-enthusiast.” His colleague Richard Clarida, in his first public speech since becoming vice chairman, last week called a pickup “a possibility that deserves close monitoring.”
A boom would be great news, boosting potential growth and theoretically allowing the economy to expand faster and longer without stoking inflation. In fact, Chairman Jerome Powell recently praised Alan Greenspan for raising rates patiently in the 1990s, when his predecessor “had a hunch that the United States was experiencing the wonders of a ‘new economy.’”
But it’s hard to say whether a new Greenspan moment is nigh. Productivity is tough to predict, and even optimistic studies, like a recent one from Erik Brynjolfsson, warn official data might miss breakthroughs. We sum up top research on the topic below. Check this column each Tuesday for a review of studies relevant to the economic headlines.
In the footnotes of his speech last week, Clarida cited “Artificial Intelligence and the Modern Productivity Paradox: A Clash of Expectations and Statistics,” a study by Massachusetts Institute of Technology economist Brynjolfsson and co-authors Daniel Rock and Chad Syverson. In it, the trio suggest that the “most impressive” capabilities of AI haven’t yet diffused widely.
“Their full effects won’t be realized until waves of complementary innovations are developed and implemented,” the authors write. It took 25 years for computers to achieve widespread adoption, and electricity took more than 30 years to really catch on in factories, the authors note. If AI is taking a while to show up in the data, that’s only natural.
That said, AI might be tricky to find in the numbers, even if it proliferates. The technology and its complements are intangible, so their development and end-products might be hard to capture in traditional data -- both exist on servers, not on factory floors or retail shelves.
In fact, Fed researchers have already been digging into mis-measurement, and not just as it pertains to AI. In a new NBER paper, the Fed Board’s David Byrne and his co-authors say that investment in cloud computing is probably being under-counted in data on gross domestic product. They say this new way of accessing computing services “likely will have important consequences for the structure of the economy, productivity growth, and economic measurement.”
Byrne’s earlier work with San Francisco Fed economist John Fernald and the International Monetary Fund’s Marshall Reinsdorf argued that while Internet access, e-commerce, globalization and fracking are mis-measured today, most of their effects were even more poorly measured in the past.
Best Behind Us
If the Fed crew combines caution and optimism, Northwestern University economist Robert Gordon is the consummate pessimist: he believes that productivity has truly slowed and projects that the tepid growth is here to stay. In a recent article, he “does not argue that everything useful has already been invented” but just that “the transformative change in business methods made possible by the digital revolution was largely over by 2006.”
He says that “prospective innovations” -- think 3-D printing, autonomous vehicles, robots and AI -- will probably evolve gradually, rather than causing a sudden and sharp jump up in productivity. And he sees a slowdown in business investment over the past decade or so as both a cause and a result of slower GDP growth: with declining population gains and a muted boost from innovation, there are fewer profitable investment opportunities.