Human Capital

Marshall Olson, third year student, from a farm in Minnesota. Working in a soils laboratory in Agricultural Hall. Iowa State College. Ames, Iowa. Jack Delano, May 1942. Courtesy: Photogrammar Project

The phrase “human capital” has an ominous sound to many people. It sounds like disposability, abstraction, homogeneous anonymity. It sounds like the language used to justify “downsizing” or to demand longer hours from employees who are already overworked. “We are not getting a sufficient return on our human capital,” runs the boss’s recurrent lament.

Analysts have attacked the term from the left, where it is seen as a shibboleth of capitalism, and particularly of the stage of capitalism in which the term arose—late capitalism or neoliberalism. The very combination of these two words, “human” and “capital,” is taken as a sacrilege, a form of dehumanization intrinsic to the extension of market society and to economistic thinking taken to its absolute extreme. “Human capital,” for these critics, is merely a euphemism for the reduction of the individual to their economic value, to the price that their accumulated skills and knowledge will fetch on an open labor market. Even more, it is an insidious form of self-discipline, a technique of control whereby an individual learns to appraise themselves constantly, forgoing habits that have no market value and perpetually seeking out opportunities to increase their marketability by whatever meager increment they can. “Human capital” seems to render people into curricula vitae: no more, no less.

Economists, on the other hand, customarily take a different view—one that is both more sanguine and more bloodless. Human capital is, in practice, merely the name for the product of the ordinary habits of rational people who want to succeed economically: they acquire skills and knowledge and they make life decisions about their health, their social habits, and their residence with an eye to gaining economic advantages. As a theoretical construct, human capital is just as innocuous. It is simply a conceptual tool: it translates certain aspects of human experience—such as education or family life—into a code that is legible to the economist’s tools and techniques. It has permitted research programs to expand and ramify; it has set economists to work at new problems that strain the familiar bounds of the discipline. It has enabled economists to keep pace with an economy that has shifted monumentally toward intangible value creation—toward capital that resides not in the firm physical assets of looms or locomotives, but in patents, algorithms, and creativity.

There has been little direct contestation between the economists and the critics of neoliberalism over the “real” meaning of the phrase, but that is just as well. For as I will argue in this book, the significance of human capital goes far beyond nomenclature: it is not the aptness or the malevolence of the phrase itself that generates its importance as a keyword for our times, but the set of premises on which it rests. And if both economists and neoliberalism’s critics sense instinctively that human capital both names and embodies a decisive transition in the economy of the United States and other “advanced” nations after World War II, they are absolutely right—they just have an enormous recency bias.

The argument of my second book, Human Capital: The Career of an Idea, is that the idea of human capital has been with us for a long time, and not just as a corporate buzzword or a concept employed by economists. Where previous scholarship has tended to assume that the precursor ideas of “human capital” can be found in the world of post-industrial society, this study instead locates the concept’s origin in mid-19th-century agriculture, and particularly in the state-building project of agricultural improvement. Closely connected to higher education through both federal and state legislation, agricultural improvement presumed that better crops required not just better seeds and better fertilizer, but more knowledgeable, healthier, and—when necessary—more mobile farmers.

This commitment to agricultural improvement was continuously renewed over the next century even as the nation’s economic structure changed immensely. Programs to assist and, in a sense, “upgrade” farmers were, in fact, so successful that they became models for other publicly funded and administered programs dedicated to building intellectual infrastructure, and eventually intertwined with private charitable foundations and corporate enterprises in the crucible of two world wars.

In the wake of the Second World War, with the discipline firmly committed to the dogma of “economic growthmanship,” economists first began asking the questions that would cause “human capital” to show up—at first as a “residual” factor—in their research, but they were trying to capture theoretically a concept that had already been a tacit practice for over a century. It is that practice—and the ideas that fueled its operations—that this book endeavors to explain.