Universities have boomed in recent decades. Higher-education institutions across the world now employ on the order of 15m researchers, up from 4m in 1980, and these workers produce five times as many papers each year as their predecessors did. Governments have ramped up spending on the sector. The justification for this rapid expansion has, in part, followed sound economic principles. Universities are supposed to produce intellectual and scientific breakthroughs that can be employed by businesses, the government and regular folk. Such ideas are placed in the public domain, available to all. In theory, therefore, universities should be an excellent source of productivity growth.
In practice, however, the great expansion of higher education has coincided with a productivity slowdown. Whereas in the 1950s and 1960s workers’ output per hour across the rich world rose by 4% a year, in the decade before the covid-19 pandemic 1% a year was the norm. Even with the wave of innovation in artificial intelligence (AI), productivity growth remains weak—less than 1% a year, on a rough estimate—which is bad news for economic growth. A new paper by Ashish Arora, Sharon Belenzon, Larisa C. Cioaca, Lia Sheer and Hansen Zhang, five economists, suggests that universities’ blistering growth and the rich world’s stagnant productivity could be two sides of the same coin.
To see why, turn to history. In the post-war period higher education played a modest role in innovation. Businesses had more responsibility for achieving scientific breakthroughs: in America during the 1950s they spent four times as much on research as universities did. Companies like AT&T, a telecoms firm, and General Electric, an energy firm, were as scholarly as they were profitable. In the 1960s the research and development (R&D) unit of DuPont, a chemicals company, published more articles in the Journal of the American Chemical Society than the Massachusetts Institute of Technology and Caltech combined. Ten or so researchers at Bell Labs, once part of AT&T, won Nobel prizes for work done there.
Giant corporate labs emerged in part because of tough anti-monopoly laws. These often made it difficult for a firm simply to buy another firm’s inventions, so businesses had little choice but to develop ideas themselves. The golden age of the corporate lab then came to an end when competition policy loosened in the 1970s and 1980s. At the same time, growth in university research convinced many bosses that they no longer needed to spend money on research of their own. Today only a few firms, in big tech and pharma, offer anything comparable to the DuPonts of the past.
The new paper by Mr Arora and his colleagues, as well as one from 2019 with a slightly different group of authors, makes a subtle but devastating suggestion: that when it came to delivering productivity gains, the old, big-business model of science worked better than the new, university-led one. The authors draw on an immense range of data, covering everything from counts of PhDs to analysis of citations. To identify a causal link between public science and corporate R&D, they employ a complex methodology that involves analysing changes to federal budgets. Broadly, they find that scientific breakthroughs from public institutions “elicit little or no response from established corporations” over a number of years. A boffin in a university lab might publish brilliant paper after brilliant paper, pushing the frontier of a discipline. Often, however, this has no impact on corporations’ own publications, their patents or the number of scientists they employ; the life sciences are the exception. This, in turn, points to a small impact on economy-wide productivity.
Why do companies struggle to use ideas produced by universities? The loss of the corporate lab is one part of the answer. Such institutions were home to a lively mixture of thinkers and doers. In the 1940s Bell Labs had the interdisciplinary team of chemists, metallurgists and physicists necessary to solve the overlapping theoretical and practical problems involved in developing the transistor. That cross-cutting expertise is now largely gone. Another part of the answer concerns universities themselves. Free from the demands of corporate overlords, researchers focus more on satisfying geeks’ curiosity or boosting citation counts than on finding breakthroughs that will change the world or make money. In moderation, research for research’s sake is no bad thing; some breakthrough technologies, such as penicillin, were discovered almost by accident. But if everyone is arguing over how many angels can dance on the head of a pin, the economy suffers.
When higher-education institutions do produce work that is more relevant to the real world, the consequences are troubling. As universities turn out more freshly minted PhD graduates, companies seem to find it easier to invent new stuff, the authors report. Yet universities’ patents have an offsetting effect, provoking corporations to produce fewer patents of their own. It is possible that incumbent businesses, worried about competition from university spinoffs, cut back on R&D in the affected fields. Although no one knows for sure how these opposing effects balance out, the authors point to a net decline in corporate patenting of about 1.5% a year. The vast fiscal resources devoted to public science, in other words, probably make businesses across the rich world less innovative.
If you’re so smart, why aren’t you rich?
Perhaps, with time, universities and the corporate sector will work together more profitably. Tighter competition policy could force businesses to behave a little more as they did in the post-war period, and to beef up their internal research. And corporate researchers, rather than universities, are driving the current boom in generative AI: in a few cases, the corporate lab has already risen from the ashes. At some point, though, governments will need to ask themselves hard questions. In a world of weak economic growth, lavish public support for universities may come to seem an unjustifiable luxury.