Posted in Capitalism, Higher Education, Meritocracy, Politics

Sandel: The Tyranny of Merit

This post is a reflection on Michael Sandel’s new book, The Tyranny of Merit: What’s Become of the Common Good?  He’s a philosopher at Harvard and this is his analysis of the dangers posed by the American meritocracy.  The issue is one I’ve been exploring here for the last two years in a variety of posts (here, here, here, here, here, here, and here.)

I find Sandel’s analysis compelling, both in the ways it resonates with other takes on the subject and also in his distinctive contributions to the discussion.  My only complaint is that the whole discussion could have been carried out more effectively in a single magazine article.  The book tends to be repetitive, and it also gets into the weeds on some philosophical issues that blur its focus and undercut its impact.  Here I present what I think are the key points.  I hope you find it useful.

Sandel Cover

The good news and the bad news about meritocracy are one and the same: its promise of opportunity for all based on individual merit rather than the luck of birth.  It’s hard to hate a principle that frees us from the tyranny of inheritance. 

The meritocratic ideal places great weight on the notion of personal responsibility. Holding people responsible for what they do is a good thing, up to a point. It respects their capacity to think and act for themselves, as moral agents and as citizens. But it is one thing to hold people responsible for acting morally; it is something else to assume that we are, each of us, wholly responsible for our lot in life.

The problem is that simply calling the new model of status attainment “achievement” rather than “ascription” doesn’t mean that your ability to get ahead is truly free of circumstances beyond your control.  

But the rhetoric of rising now rings hollow. In today’s economy, it is not easy to rise. Americans born to poor parents tend to stay poor as adults. Of those born in the bottom fifth of the income scale, only about one in twenty will make it to the top fifth; most will not even rise to the middle class. It is easier to rise from poverty in Canada, Germany, Denmark, and other countries than it is in the United States.

The meritocratic faith argues that the social structure of inequality provides a powerful incentive for individuals to work hard to get ahead in order to escape from a bad situation and move on to something better.  The more inequality, such as in the US, the more incentive to move up.  The reality, however, is quite different.

But today, the countries with the highest mobility tend to be those with the greatest equality. The ability to rise, it seems, depends less on the spur of poverty than on access to education, health care, and other resources that equip people to succeed in the world of work.

Sandel goes on to point out additional problems with meritocracy beyond the difficulties in trying to get ahead all on your own: 1) demoralizing the losers in the race; 2) denigrating those without a college degree; and 3) turning politics into the realm of the expert rather than the citizen.

The tyranny of merit arises from more than the rhetoric of rising. It consists in a cluster of attitudes and circumstances that, taken together, have made meritocracy toxic. First, under conditions of rampant inequality and stalled mobility, reiterating the message that we are responsible for our fate and deserve what we get erodes solidarity and demoralizes those left behind by globalization. Second, insisting that a college degree is the primary route to a respectable job and a decent life creates a credentialist prejudice that undermines the dignity of work and demeans those who have not been to college; and third, insisting that social and political problems are best solved by highly educated, value-neutral experts is a technocratic conceit that corrupts democracy and disempowers ordinary citizens.

Consider the first point. Meritocracy fosters triumphalism for the winners and despair for the losers.  If you succeed or fail, you alone get the credit or the blame.  This was not the case in the bad old days of aristocrats and peasants.

If, in a feudal society, you were born into serfdom, your life would be hard, but you would not be burdened by the thought that you were responsible for your subordinate position. Nor would you labor under the belief that the landlord for whom you toiled had achieved his position by being more capable and resourceful than you. You would know he was not more deserving than you, only luckier.

If, by contrast, you found yourself on the bottom rung of a meritocratic society, it would be difficult to resist the thought that your disadvantage was at least partly your own doing, a reflection of your failure to display sufficient talent and ambition to get ahead. A society that enables people to rise, and that celebrates rising, pronounces a harsh verdict on those who fail to do so.

This triumphalist aspect of meritocracy is a kind of providentialism without God, at least without a God who intervenes in human affairs. The successful make it on their own, but their success attests to their virtue. This way of thinking heightens the moral stakes of economic competition. It sanctifies the winners and denigrates the losers.

One key issue that makes meritocracy potentially toxic is its assumption that we deserve the talents that earn us such great rewards.

There are two reasons to question this assumption. First, my having this or that talent is not my doing but a matter of good luck, and I do not merit or deserve the benefits (or burdens) that derive from luck. Meritocrats acknowledge that I do not deserve the benefits that arise from being born into a wealthy family. So why should other forms of luck—such as having a particular talent—be any different? 

Second, that I live in a society that prizes the talents I happen to have is also not something for which I can claim credit. This too is a matter of good fortune. LeBron James makes tens of millions of dollars playing basketball, a hugely popular game. Beyond being blessed with prodigious athletic gifts, LeBron is lucky to live in a society that values and rewards them. It is not his doing that he lives today, when people love the game at which he excels, rather than in Renaissance Florence, when fresco painters, not basketball players, were in high demand.

The same can be said of those who excel in pursuits our society values less highly. The world champion arm wrestler may be as good at arm wrestling as LeBron is at basketball. It is not his fault that, except for a few pub patrons, no one is willing to pay to watch him pin an opponent’s arm to the table.

He then moves on to the second point, about the central role of college in determining who’s got merit. 

Should colleges and universities take on the role of sorting people based on talent to determine who gets ahead in life?

There are at least two reasons to doubt that they should. The first concerns the invidious judgments such sorting implies for those who get sorted out, and the damaging consequences for a shared civic life. The second concerns the injury the meritocratic struggle inflicts on those who get sorted in and the risk that the sorting mission becomes so all-consuming that it diverts colleges and universities from their educational mission. In short, turning higher education into a hyper-competitive sorting contest is unhealthy for democracy and education alike.

The difficulty of predicting which talents are most socially beneficial is particularly true for the complex array of skills that people pick up in college.  Which ones matter most for determining a person’s ability to make an important contribution to society and which don’t?  How do we know if an elite college provides more of those skills than an open-access college?  This matters because a graduate from the former gets a much higher reward than one from the latter.  Pretending that a prestigious college degree is the best way to measure future performance is particularly difficult to sustain because success and degree are conflated.  Graduates of top colleges get the best jobs and thus seem to have the greatest impact, whereas non-grads never get the chance to show what they can do.

Another sports analogy helps to make this point.

Consider how difficult it is to assess even more narrowly defined talents and skills. Nolan Ryan, one of the greatest pitchers in the history of baseball, holds the all-time record for most strikeouts and was elected on the first ballot to baseball’s Hall of Fame. When he was eighteen years old, he was not signed until the twelfth round of the baseball draft; teams chose 294 other, seemingly more promising players before he was chosen. Tom Brady, one of the greatest quarterbacks in the history of football, was the 199th draft pick. If even so circumscribed a talent as the ability to throw a baseball or a football is hard to predict with much certainty, it is folly to think that the ability to have a broad and significant impact on society, or on some future field of endeavor, can be predicted well enough to justify fine-grained rankings of promising high school seniors.

And then there’s the third point, the damage that meritocracy does to democratic politics.  One element of this is that it turns politics into an arena for credentialed experts, consigning ordinary citizens to the back seat.  How many political leaders today are without a college degree?  Vanishingly few.  Another is that meritocracy not only bars non-grads from power but also bars them from social respect.  

Grievances arising from disrespect are at the heart of the populist movement that has swept across Europe and the US.  Sandel calls this a “politics of humiliation.”

The politics of humiliation differs in this respect from the politics of injustice. Protest against injustice looks outward; it complains that the system is rigged, that the winners have cheated or manipulated their way to the top. Protest against humiliation is psychologically more freighted. It combines resentment of the winners with nagging self-doubt: perhaps the rich are rich because they are more deserving than the poor; maybe the losers are complicit in their misfortune after all.

This feature of the politics of humiliation makes it more combustible than other political sentiments. It is a potent ingredient in the volatile brew of anger and resentment that fuels populist protest.

Sandel draws on a wonderful book by Arlie Hochschild, Strangers in Their Own Land, in which she interviews Trump supporters in Louisiana.

Hochschild offered this sympathetic account of the predicament confronting her beleaguered working-class hosts:

You are a stranger in your own land. You do not recognize yourself in how others see you. It is a struggle to feel seen and honored. And to feel honored you have to feel—and feel seen as—moving forward. But through no fault of your own, and in ways that are hidden, you are slipping backward.

One consequence of this for those left behind is a rise in “deaths of despair.”

The overall death rate for white men and women in middle age (ages 45–54) has not changed much over the past two decades. But mortality varies greatly by education. Since the 1990s, death rates for college graduates declined by 40 percent. For those without a college degree, they rose by 25 percent. Here then is another advantage of the well-credentialed. If you have a bachelor’s degree, your risk of dying in middle age is only one quarter of the risk facing those without a college diploma. 

Deaths of despair account for much of this difference. People with less education have long been at greater risk than those with college degrees of dying from alcohol, drugs, or suicide. But the diploma divide in death has become increasingly stark. By 2017, men without a bachelor’s degree were three times more likely than college graduates to die deaths of despair.

Sandel offers two relatively modest reforms that might help mitigate the tyranny of meritocracy.  One focuses on elite college admissions.  

Of the 40,000-plus applicants, winnow out those who are unlikely to flourish at Harvard or Stanford, those who are not qualified to perform well and to contribute to the education of their fellow students. This would leave the admissions committee with, say, 30,000 qualified contenders, or 25,000, or 20,000. Rather than engage in the exceedingly difficult and uncertain task of trying to predict who among them are the most surpassingly meritorious, choose the entering class by lottery. In other words, toss the folders of the qualified applicants down the stairs, pick up 2,000 of them, and leave it at that.

This helps get around two problems:  the difficulty in trying to predict merit; and the outsize rewards of a winner-take-all admissions system.  But good luck trying to get this put in place over the howls of outrage from upper-middle-class parents, who have learned how to game the system to their advantage.  Consider this one small example of the reaction when an elite Alexandria high school proposed random admission from a pool of the most qualified.

Another reform is more radical and even harder to imagine putting into practice.  It begins with reconsideration of what we mean by the “common good.”

The contrast between consumer and producer identities points to two different ways of understanding the common good. One approach, familiar among economic policy makers, defines the common good as the sum of everyone’s preferences and interests. According to this account, we achieve the common good by maximizing consumer welfare, typically by maximizing economic growth. If the common good is simply a matter of satisfying consumer preferences, then market wages are a good measure of who has contributed what. Those who make the most money have presumably made the most valuable contribution to the common good, by producing the goods and services that consumers want.

A second approach rejects this consumerist notion of the common good in favor of what might be called a civic conception. According to the civic ideal, the common good is not simply about adding up preferences or maximizing consumer welfare. It is about reflecting critically on our preferences—ideally, elevating and improving them—so that we can live worthwhile and flourishing lives. This cannot be achieved through economic activity alone. It requires deliberating with our fellow citizens about how to bring about a just and good society, one that cultivates civic virtue and enables us to reason together about the purposes worthy of our political community.

If we can carry out this deliberation — a big if indeed — then we can proceed to implement a system for shifting the basis for individual compensation from what the market is willing to pay to what we collectively feel is most valuable to society.  

Thinking about pay, most would agree that what people make for this or that job often overstates or understates the true social value of the work they do. Only an ardent libertarian would insist that the wealthy casino magnate’s contribution to society is a thousand times more valuable than that of a pediatrician. The pandemic of 2020 prompted many to reflect, at least fleetingly, on the importance of the work performed by grocery store clerks, delivery workers, home care providers, and other essential but modestly paid workers. In a market society, however, it is hard to resist the tendency to confuse the money we make with the value of our contribution to the common good.

To implement a system based on public benefit rather than marketability would require completely revamping our structure of determining salaries and taxes. 

The idea is that the government would provide a supplementary payment for each hour worked by a low-wage employee, based on a target hourly-wage rate. The wage subsidy is, in a way, the opposite of a payroll tax. Rather than deduct a certain amount of each worker’s earnings, the government would contribute a certain amount, in hopes of enabling low-income workers to make a decent living even if they lack the skills to command a substantial market wage.

Generally speaking, this would mean shifting the tax burden from work to consumption and speculation. A radical way of doing so would be to lower or even eliminate payroll taxes and to raise revenue instead by taxing consumption, wealth, and financial transactions. A modest step in this direction would be to reduce the payroll tax (which makes work expensive for employers and employees alike) and make up the lost revenue with a financial transactions tax on high-frequency trading, which contributes little to the real economy.

This is how Sandel ends his book:

The meritocratic conviction that people deserve whatever riches the market bestows on their talents makes solidarity an almost impossible project. For why do the successful owe anything to the less-advantaged members of society? The answer to this question depends on recognizing that, for all our striving, we are not self-made and self-sufficient; finding ourselves in a society that prizes our talents is our good fortune, not our due. A lively sense of the contingency of our lot can inspire a certain humility: “There, but for the grace of God, or the accident of birth, or the mystery of fate, go I.” Such humility is the beginning of the way back from the harsh ethic of success that drives us apart. It points beyond the tyranny of merit toward a less rancorous, more generous public life.

Posted in Capitalism, Culture, Meritocracy, Uncategorized

Clare Coffey — Closing Time: We’re All Counting Bodies

This is a lovely essay by Clare Coffey from the summer issue of Hedgehog Review.  In it she explores the extremes in contemporary American life: those who have been shunted aside in the knowledge economy and destined for deaths of despair, and those who occupy the flashiest reaches of the new uber class.  She does this through an adept analysis of two recent books:  Deaths of Despair and the Future of Capitalism, by Anne Case and Angus Deaton; and Very Important People: Status and Beauty in the Global Party Circuit, by Ashley Mears.  In combination, the books tell a powerful story.

Closing Time

We’re All Counting Bodies

Clare Coffey

Lenin’s maxim that “there are decades when nothing happens, and there are weeks when decades happen” can be tough on writers. You spend years carefully marshaling an argument, anticipating objections, tightening your focus, sacrificing claims that might interfere with the suasion of your central point, and then—bam, the gun goes off. Something happens that makes the point toward which you were gently cajoling the reader not only obvious but insufficient. Your thoroughbred stands ready, but the rest of the field has already left the gate.

So it is with Deaths of Despair and the Future of Capitalism. In 2014, Princeton economists Anne Case and Angus Deaton, the latter a Nobel Prize winner, noted that for the first time, the mortality rate among white Americans without a college degree was climbing rather than dropping; further, while members of this group remained relatively advantaged compared to their black peers, the two cohorts’ mortality rates were moving in opposite directions. Case and Deaton found that a significant portion of this hike in mortality was due to deaths from alcoholism, drug use, and suicide—phenomena which, bundled together, they labeled “deaths of despair.”

Deaths of Despair Cover

Six years later, in this new book, the two economists attempt to turn these observations into a thesis: What can this horrifying data tell us about American society at large? Instead of linking the deaths to any single deprivation, the authors place them in a context of wholesale loss of social status and coherent identity for those without purchase in the knowledge professions—a loss that encompasses wage stagnation, the decline of union power, and the transition from a manufacturing to a service economy.

For Case and Deaton, the closing of a factory involves all three, and cannot be understood strictly in terms of lost earnings or job numbers. Even in a “success” story, in which workers get new jobs at a staffing agency or an Amazon fulfillment center, a qualitative catastrophe occurs: to the prestige of difficult, directly productive work; to a measure of democratic control over the conditions of work; to the sense of valued belonging to socially important organizations; to the norms governing work, marriage, and sociality that developed in a particular material context, and which cannot simply transfer over or remake themselves overnight. At least some of these losses are downstream of sectoral transition only insofar as firm structure and historic labor organization is concerned. There is no purely sectoral reason for companies to outsource all non-knowledge jobs to staffing companies, or for Amazon to fire whistleblowers. The differences between NYC taxis and Uber lie in the fact that one has a union and the other classifies its workers as independent contractors, not in NAICS codes. But however carefully you parse the causes, deaths of despair are the final result of a long, slow social death.

Who are the culprits? Case and Deaton are careful not to absolve capitalism, but they insist that the problem is not really capitalism itself but its abuses: “We are not against capitalism. We believe in the power of competition and free markets. Capitalism has brought an end to misery and death for millions in now rich countries over the past 250 years and, much more rapidly, in countries like India and China, over the past 50 years.” This qualification is not unique to them; it takes different forms, from the regulatory reformism of political liberals such as Elizabeth Warren to the attacks on “crony capitalism” of doctrinaire libertarians, for whom the true free market has not yet been tried. For Case and Deaton, the big-picture problem is unchecked economic trends that encourage “upward redistribution”; their more specific and more representative target is a rent-seeking health-care industry.

Their complaint is not only that companies like Purdue Pharma arguably jump-started the opioid epidemic by hard-selling their pain medications and concealing these drugs’ addictive potential. Case and Deaton also argue that the health-care sector has eaten up American wage gains with insurance costs, funneling more and more money to health-care spending while delivering less and less in terms of health outcomes. The numbers the authors have assembled are convincing. But who at this juncture needs to be convinced? A teenager recently died of COVID-19 after being turned away from an urgent care clinic for lack of insurance. Hospital personnel are getting laid off in the midst of a pandemic to stanch balance sheet losses resulting from delayed elective care. Hospitals that have been operated on the basis of years of business school orthodoxy lack the extra capacity to deal with anything more momentous than a worse-than-usual flu season. Who is in any serious doubt that the American health-care system is cobbled together out of rusty tin cans and profit margins? The more pertinent question is what in America isn’t.

The release of Case and Deaton’s book just as an often fatal communicable disease was going pandemic was not, of course, the fault of the authors. But it makes for oddly frustrating reading. Positing a link between deindustrialization and health-care rent seeking and deaths of despair is an abductive argument about historical and present actors rather than a purely statistical inference. As Case and Deaton freely admit, you cannot prove by means of regression analysis that any of their targets are the unmistakable causes of these deaths. For that matter, there’s too much bundling among both the phenomena (alcoholic diseases, overdoses, suicides) and the proposed causes (deindustrialization, the decline of organized labor, wage stagnation, corporate restructuring) to conduct even a controlled test.

While it may not be possible to demonstrate airtight causality, Deaths of Despair nonetheless provides valuable documentation of the humiliations, losses, and unmoorings of those on the wrong end of a widening economic divide. The book is less a technocratic prescription than a grim body count.

In Very Important People: Status and Beauty in the Global Party Circuit, Ashley Mears is counting bodies too, albeit very different ones. From New York to Miami, from Ibiza to Saint-Tropez, all over the elite global party scene in which Mears, a sociologist and former fashion model, did eighteen months of research, everyone is counting bodies. The bodies are those of models, ruthlessly quantified and highly valuable to the owners of elite nightclubs. Very Important People hinges on one insight: The image of a rooftop party filled with glamorous models drinking champagne isn’t just a pop-culture cliché. It is a lucrative business model.

VIP Cover

According to Mears, up through the nineties the business model for nightclubs was simple. There was a bar and a dance floor. You paid to get in and you paid to drink. Ideally, you’d want a certain ratio of women to men, but the pleasures on offer were fairly straightforward. But in the early 2000s, a new model emerged, ironically enough, in the repurposed industrial buildings of New York’s Meatpacking District. Rather than rely on the dance floor and bar, clubs encouraged (usually male) customers to put down serious cash for immediately available and strategically placed tables and VIP sections, where bottles of liquor at marked-up prices could be brought to them. Clubs that could successfully brand themselves as elite might make enormous sums off out-of-town dentists on a spree, young financiers looking to woo or compete with business associates by demonstrating access to the city’s most exclusive pleasures, and the mega-rich “whales” proclaiming their status by over-the-top performances of generosity and waste.

The table is crucial for this strategy to succeed. It allows maximum visibility for both the whale’s endless parade of bottles of Dom Perignon (much of it left undrunk by virtue of sheer volume) and the groups of models that signal that this is the kind of club where a whale might be found. The good that is being advertised is indistinguishable from the advertising process.

A whole secondary ecosystem has grown up around this glitzy “potlatch,” as Mears calls it—this elaborately choreographed wasting of wealth. There are the elite club promoters, who might make thousands a night if they show up with enough models, and whose transactional relationships with the models are defined in useful, fragile terms of mutual care. There are the models, young and broke in expensive cities, who get free meals, free champagne, and sometimes free housing as long as they show up and play nice. There are the bouncers, who police the height and looks of entrants, and the whales, who both command the scene and function as an advertisement for its desirability. Being adjacent to real wealth is a powerful incentive, especially for promoters, who dream of rubbing shoulders and making deals of their own through connections forged in the club.

The owners make money, and everyone else gets a little something and a little scammed. Perhaps among those who are scammed the least are the models, the majority of whom seem to be in it for a good party rather than upward mobility. When you are very young and very beautiful, the world tends to see those traits as the most important things about you. One way to register dissent is to trade them only for things equally ephemeral, inconsequential, delightful: a glass of champagne, moonlight over the Riviera, a night spent dancing till dawn. Reaping the benefits of belonging to an intrinsically exclusive club is not heroic. But it seems no worse than the trade made by the wives of the superwealthy, who in one scene appear, disapproving and hostile, at a table adjacent to their husbands’ at an Upper East Side restaurant. They have made a more thoroughgoing negotiation of their value to wealthy men—one resting on the ability to reproduce the upper class as well as attest to its presence.

Demarcating status is the limit of the model’s power. It is what she is at the club to do. The model is not there primarily to be sexually alluring—that is the role of the lower-class-coded bottle waitress. One of Mears’s subjects even confesses that models aren’t his type: They are too tall and skinny, too stereotyped, and after all, desire is so highly personal—less an estimation that a face has been arranged in the single best way than delight that it has been arranged in such a way. But models are necessary precisely because their bodies and faces have transcended the whims of any personally desiring subject, to the objectivity of market value. Their beauty can be quantified in inches, and dollars.

To contemplate and cultivate beauty is perhaps noble. To desire and consume it is at least human. To desire not any object in itself, but an image of desirability, is ghastly. There are many scenes in Very Important People, from the physical dissipation to the moments bordering on human trafficking, that are morally horrifying. What lingers, though, is this spectral quality: huge amounts of money, time, and flesh in service to a recursive and finally imaginary value. If anyone has gained from the losses of Case and Deaton’s subjects, it is the patrons of the global party circuit. But their gains seem less hoarded than unmade, in a kind of reverse alchemy—transmuted into the allurements of a phantom world, elusive, seductive, and all too soluble in the light of day.

Posted in Capitalism, History, Modernity, Religion, Theory

Blaustein: Searching for Consolation in Max Weber’s Work Ethic


Last summer I posted a classic lecture by the great German sociologist, Max Weber, “Science as a Vocation.” Recently I ran across a terrific essay by George Blaustein about Weber’s vision of the modern world, drawing on this lecture and two other seminal works: the lecture “Politics as a Vocation” (delivered a year after the science lecture) and the seminal book The Protestant Ethic and the Spirit of Capitalism.  Here’s a link to the original Blaustein essay on the New Republic website.

Like so many great theorists (Marx, Durkheim, Foucault, etc.), Weber was intensely interested in understanding the formation of modernity.  How did the shift from premodern to modern come about?  What prompted it?  What are the central characteristics of modernity?  What are the main forces that drive it?  As Blaustein shows so adeptly, Weber’s take is a remarkably gloomy one.  He sees the change as one of disenchantment, in which we lost the certitudes of faith and tradition and are left with a regime of soulless rationalism and relentless industry.  Here’s how he put it in his science lecture:

The fate of our times is characterized by rationalization and intellectualization and, above all, by the ‘disenchantment of the world.’ Precisely the ultimate and most sublime values have retreated from public life either into the transcendental realm of mystic life or into the brotherliness of direct and personal human relations….

In his view, there is no turning back, no matter how much you feel you have lost, unless you are willing to surrender reason to faith.  This he is not willing to do, but he understands why others might choose differently.

To the person who cannot bear the fate of the times like a man, one must say: may he rather return silently, without the usual publicity build-up of renegades, but simply and plainly. The arms of the old churches are opened widely and compassionately for him. After all, they do not make it hard for him. One way or another he has to bring his ‘intellectual sacrifice‘ — that is inevitable. If he can really do it, we shall not rebuke him.

In The Protestant Ethic, he explores the Calvinist roots of the capitalist work ethic, in which the living saints worked hard in this world to demonstrate (especially to themselves) that they had been elected to eternal life in the next world.  Instead of earning to spend on themselves, they reinvested their earnings in economic capital on earth and spiritual capital in heaven.  But the ironic legacy of this noble quest is our own situation, in which we work in order to work, without purpose or hope.  Here’s how he puts it in the famous words that close his book.

The Puritan wanted to work in a calling; we are forced to do so. For when asceticism was carried out of monastic cells into everyday life, and began to dominate worldly morality, it did its part in building the tremendous cosmos of the modern economic order. This order is now bound to the technical and economic conditions of machine production which to-day determine the lives of all the individuals who are born into this mechanism, not only those directly concerned with economic acquisition, with irresistible force.  Perhaps it will so determine them until the last ton of fossilized coal is burnt.  In Baxter’s view the care for external goods should only lie on the shoulders of the “saint like a light cloak, which can be thrown aside at any moment.” But fate decreed that the cloak should become an iron cage.

I hope you gain as much insight from this essay as I did.

Protestant Ethic

Searching for Consolation in Max Weber’s Work Ethic

People worked hard long before there was a thing called the “work ethic,” much less a “Protestant work ethic.” The phrase itself emerged early in the twentieth century and has since congealed into a cliché. It is less a real thing than a story that people, and nations, tell themselves about themselves. I am from the United States but now live in Amsterdam; the Dutch often claim the mantle of an industrious, Apollonian Northern Europe, as distinct from a dissolute, Dionysian, imaginary South. Or the Dutch invoke the Protestant ethic with self-deprecating smugness: Alas, we are so productive. Both invocations are absurd. The modern Dutch, bless them, are at least as lazy as everyone else, and their enjoyments are vulgar and plentiful.

In the U.S., meanwhile, celebrations of the “work ethic” add insult to the injury of overwhelming precarity. As the pandemic loomed, it should have been obvious that the U.S. would particularly suffer. People go to work because they have no choice. Those who did not face immediate economic peril could experience quarantine as a kind of relief and then immediately feel a peculiar guilt for that very feeling of relief. Others, hooray, could sustain and perform their work ethic from home.

The German sociologist Max Weber was the first great theorist of the Protestant ethic. If all scholarship is autobiography, it brings an odd comfort to learn that he had himself suffered a nervous breakdown. Travel was his main strategy of recuperation, and it brought him to the Netherlands and to the U.S., among other places. The Hague was “bright and shiny,” he wrote in 1903. “Everyone is well-to-do, exceedingly ungraceful, and rather untastefully dressed.” He had dinner in a vegetarian restaurant. (“No drinks, no tips.”) Dutch architecture made him feel “like Gulliver when he returned from Brobdingnag.” America, by contrast, was Brobdingnag. Weber visited the U.S. for three months in 1904 and faced the lurid enormity of capitalism. Chicago, with its strikes, slaughterhouses, and multi-ethnic working class, seemed to him “like a man whose skin has been peeled off and whose intestines are seen at work.”

Weber theorized the rise of capitalism, the state and its relationship to violence, the role of “charisma” in politics. Again and again he returned, as we still do, to the vocation—the calling—as both a crushing predicament and a noble aspiration. He died 100 years ago, in a later wave of the Spanish flu. It is poignant to read him now, in our own era of pandemic and cataclysm. It might offer consolation. Or it might fail to console.

The Protestant Ethic and the Spirit of Capitalism emerged, in part, from that American journey. It first appeared in two parts, in 1904 and 1905, in a journal, the Archiv für Sozialwissenschaft und Sozialpolitik. A revised version appeared in 1920, shortly before his death. Race did not figure into his account of capitalism’s rise, though the American color line had confronted him vividly. In 1906 he would publish W.E.B. Du Bois’s “The Negro Question in the United States” in the same journal, which he edited.

Modern invocations of the work ethic are usually misreadings: The Protestant Ethic was more lament than celebration. Weber sought to narrate the arrival of what had become a no-longer-questioned assumption: that our duty was to labor in a calling, even to labor for labor’s sake. He sought the origins of this attitude toward work and the meaning of life, of an ethic that saved money but somehow never enjoyed it, of a joyless and irrational rationality. He found the origins in Calvinism, specifically in what he called Calvinism’s “this-worldly asceticism.”

Weber’s argument was not that Calvinism caused capitalism; rather, The Protestant Ethic was a speculative psycho-historical excavation of capitalism’s emergence. The interpretation, like most of his interpretations, had twists that are not easy to summarize. It was, after all, really the failure of Calvinism—in the sense of the unmeetableness of Calvinism’s demands on an individual psyche and soul—that generated a proto-capitalist orientation to the world. The centerpiece of Calvin’s theology—the absolute, opaque sovereignty of God and our utter noncontrol over our own salvation—was, in Weber’s account, impossibly severe, unsustainable for the average person. The strictures of that dogma ended up creating a new kind of individual and a new kind of community: a community bound paradoxically together by their desperate anxiety about their individual salvation. Together and alone.

The germ of the capitalist “spirit” lay in the way Calvinists dealt with that predicament. They labored in their calling, for what else was there to do? To work for work’s sake was Calvinism’s solution to the problem of itself. Having foreclosed all other Christian comforts—a rosary, an indulgence, a ritual, a communion—Weber’s original Calvinists needed always to perform their own salvation, to themselves and to others, precisely because they could never be sure of it. No wonder they would come to see their material blessings as a sign that they were in fact blessed. And no wonder their unlucky descendants would internalize our economic miseries as somehow just.

Calvinism, in other words, was less capitalism’s cause than its ironic precondition. The things people did for desperate religious reasons gave way to a secular psychology. That secular psychology was no “freer” than the religious one; we had been emancipated into jobs. “The Puritans wanted to be men of the calling,” Weber wrote; “we, on the other hand, must be.” As a historical process—i.e., something happening over time—this process was gradual enough that the people participating in it did not really apprehend it as it happened. In Milton’s Paradise Lost, when Adam and Eve are expelled from Eden and into the world, the archangel Michael offers faith as a consolation within the worldliness that is humanity’s lot: The faithful, Michael promises Adam, “shal[l] possess / A Paradise within thee, happier by far.” Those lines appeared in 1674, more than a century after John Calvin’s death; for Weber, they were an inadvertent expression of the capitalist spirit’s historical unfolding. Only later still could the gloomy sociologist see, mirrored in that Puritan epic, our own dismal tendency to approach life itself as a task.

For historians of capitalism, the book is inspiring but soon turns frustrating. Weberian interpretations tend to stand back from history’s contingencies and exploitations in order to find some churning and ultimately unstoppable process: “rationalization,” for instance, by which tradition gives way ironically but inexorably to modernity. Humans wanted things like wholeness, community, or salvation; but our efforts, systematized in ways our feeble consciousness can’t ever fully grasp, end up ushering in anomie, bureaucracy, or profit. The Weberian analysis then offers no relief from that process, only a fatalism without a teleology. The moral of the story, if there is a moral, is to reconcile yourself to the modernity that has been narrated and to find in the narrative itself something like an intellectual consolation, which is the only consolation that matters.

Still, the book’s melancholy resonates, if only aesthetically. At moments, it even stabs with a sharpness that Weber could not have foreseen: The “monstrous cosmos” of capitalism now “determines, with overwhelming coercion, the style of life not only of those directly involved in business but of every individual who is born into this mechanism,” he wrote in the book’s final pages, “and may well continue to do so until the day that the last ton of fossil fuel has been consumed.” Gothic images—ghosts and shadowy monsters—abound in what is, at times, a remarkably literary portrait. “The idea of the ‘duty in a calling’ haunts our lives like the ghost of once-held religious beliefs.”

The book’s most famous image is the “iron cage.” For Puritans, material concerns were supposed to lie lightly on one’s shoulders, “like a thin cloak which can be thrown off at any time” (Weber was quoting the English poet Richard Baxter), but for us moderns, “fate decreed that the cloak should become an iron cage.” That morsel of sociological poetry was not in fact Weber’s but that of the American sociologist Talcott Parsons, whose English translation in 1930 became the definitive version outside of Germany. Weber’s phrase was “stahlhartes Gehäuse”—a shell hard as steel. It describes not a room we can’t leave but a suit we can’t take off.

One wonders what Weber would make of our era’s quarantines. What is a Zoom meeting but another communal experience of intense loneliness? Weber’s portrait of Calvinist isolation might ring a bell. Working from home traps us ever more firmly in the ideology or mystique of a calling. We might then take refuge in a secondary ethic, what we might call the iron cage of “fulfillment.” It is built on the ruins of the work ethic or, just as plausibly, it is the work ethic’s ironic apotheosis: secular salvation through sourdough.

It brings a sardonic pleasure to puncture the mental and emotional habits of a service economy in Weberian terms. But it doesn’t last. The so-called work ethic is no longer a spiritual contagion but a medical one, especially in America. Weber’s interpretation now offers little illumination and even less consolation. It is not some inner ethic that brings, say, Amazon’s workers to the hideously named “fulfillment centers”; it is a balder cruelty.

The breakdown happened in 1898, when Weber was 34. “When he was overloaded with work,” his wife, Marianne, wrote in her biography of him, after his death, “an evil thing from the unconscious underground of life stretched out its claws toward him.” His father, a politician in the National Liberal Party, had died half a year earlier, mere weeks after a family standoff that remained unresolved. In the dispute, Max had defended his devoutly religious mother against his autocratic father. The guilt was severe. (The Protestant Ethic would lend itself too easily to a Freudian reading.) A psychiatrist diagnosed him with neurasthenia, then the modern medical label for depression, anxiety, panic, fatigue. The neurasthenic brain, befitting an industrial age, was figured as an exhausted steam engine. Marianne, elsewhere in her biography, described the condition as an uprising to be squashed: “Weber’s mind laboriously maintained its dominion over its rebellious vassals.”

As an undergraduate at the University of Heidelberg, Weber had studied law. His doctoral dissertation was on medieval trading companies. By his early thirties he was a full professor in economics and finance, in Freiburg and then back in Heidelberg. After his breakdown, he was released from teaching and eventually given a long leave of absence. He resigned his professorship in 1903, keeping only an honorary title for more than a decade. Weberian neurasthenia meant a life of travel; medical sojourns in Alpine clinics; and convalescent trips to France, Italy, and Austria-Hungary—extravagant settings for insomnia and a genuine inner turmoil. Money was not the problem. Marianne, a prolific scholar and a key but complex figure in the history of German feminism, would inherit money from the family’s linen factory.

Though only an honorary professor, with periods of profound study alternating with periods of depression, Weber loomed large in German academic life. In 1917, students in Munich invited the “myth of Heidelberg,” as he was known, to lecture about “the vocation of scholarship.” He did not mention his peculiar psychological and institutional trajectory in that lecture, now a classic, though one can glimpse it between the lines. “Wissenschaft als Beruf” (“Science as a Vocation”) and another lecture from a year and a half later, “Politik als Beruf” (“Politics as a Vocation”) are Weber’s best-known texts outside The Protestant Ethic. A new English translation by Damion Searls rescues them from the formal German (as translations sometimes must) and from the viscous English into which they’re usually rendered. It restores their vividness and eloquence as lectures.

Of course, now they would be Zoom lectures, which would entirely break the spell. Picture him: bearded and severe, a facial scar still visible from his own college days in a dueling fraternity. He would see not a room full of students but rather his own face in a screen, looking back at him yet unable to make true eye contact. Neurasthenia would claw at him again.

Some lines from “Wissenschaft als Beruf,” even today, would have worked well in the graduation speeches that have been canceled because of the pandemic. Notably: “Nothing is humanly worth doing except what someone can do with passion.” Sounds nice! “Wissenschaft als Beruf” approached the confines of the calling in a more affirmative mode. Other parts of the speech, though—and even that inspirational line, in context—boast a bleak and bracing existentialism. My favorite moment is when Weber channeled Tolstoy on the meaninglessness of death (and life!) in a rationalized, disenchanted modernity. Since modern scholarship is now predicated on the nonfinality of truth, Weber said, and since any would-be scholar will absorb “merely a tiny fraction of all the new ideas that intellectual life continually produces,” and since “even those ideas are merely provisional, never definitive,” death can no longer mark a life’s harmonious conclusion. Death is now “simply pointless.” And the kicker: “And so too is life as such in our culture, which in its meaningless ‘progression’ stamps death with its own meaninglessness.” If only I had heard that from a graduation speaker.

Weber’s subject was the meaning of scholarship in a “disenchanted” world. “Disenchantment” is another one of Weber’s processes—twisted, born of unintended consequences, but nevertheless unstoppable. It meant a scholar could no longer find “proof of God’s providence in the anatomy of a louse.” Worse, the modern scholar was doomed to work in so dismal an institution as a university. “There are a lot of mediocrities in leading university positions,” said Weber about the bureaucratized university of his day, “safe mediocrities or partisan careerists” serving the powers that funded them. Still true.

So why do it? To be a scholar meant caring, as if it mattered, about a thing that objectively does not matter and caring as if “the fate of his very soul depends on whether he gets this specific conjecture exactly right about this particular point in this particular manuscript.” Scholarship was the good kind of calling, insofar as one could make one’s way to some kind of meaning, however provisional that meaning was, and however fleeting and inscrutable the spark of “inspiration.”

That part of the sermon is no longer quite so moving. Weber styled himself a tough-minded realist when it came to institutions, but our era’s exploitation of adjunct academic labor punctures the romance that Weber could nevertheless still inflate. Universities in an age of austerity do not support or reward scholarly inquiry as a self-justifying vocation. Scholars must act more and more like entrepreneurs, manufacturing and marketing our own “relevance.” For some university managers (as for many corporate CEOs), the coronavirus is as much an opportunity as a crisis, to further strip and “streamline” the university—to conjoin, cheaply, the incompatible ethics of efficiency and intellect. And we teachers are stuck in the gears: The digital technologies by which we persist in our Beruf will only further erode our professional stability. “Who but a blessed, tenured few,” the translation’s editors, Paul Reitter and Chad Wellmon, ask, “could continue to believe that scholarship is a vocation?”

And yet as a sermon on teaching, Weber’s lecture still stirs me. Having given up on absolute claims about truth or beauty, and having given up on academic inquiry revealing the workings of God, he arrived at a religious truth about pedagogy that you can still hang a hat on:

If we understand our own subject (I am necessarily assuming we do), we can force, or at least help, an individual to reckon with the ultimate meaning of his own actions. This strikes me as no small matter, in terms of a student’s inner life too.

I want this to be true. On good days, teaching delivers what Weber called that “strange intoxication,” even on Zoom.

An enormous historical gulf divides the two vocation lectures, though they were delivered only 14 months apart. In November 1917, Weber didn’t even mention the war. When it broke out in 1914, he served for a year as a medical director in the reserve forces; he did not see combat but supported German aspiration to the status of Machtstaat and its claim to empire. The war dragged miserably on, but in late 1917 it was far from clear that Germany would lose. Tsarist Russia had collapsed, and the American entry into the war had not proved decisive. The defeat that Germany would experience in the coming months was then unimaginable.

Weber was a progressive nationalist, moving between social democracy and the political center. During the war, besides his essays on the sociology of religion, he wrote about German political futures and criticized military management, all while angling for some role in the affairs of state himself. As the tide turned, he argued for military retrenchment as the honorable course. A month after Germany’s surrender on November 11, 1918, he stood unsuccessfully for election to parliament with the new German Democratic Party, of which he was a founder.

In January 1919 he returned to a Munich gripped by socialist revolution. It was now the capital of the People’s State of Bavaria, which would be short-lived. Weber, for years, had dismissed both pacifism and revolution as naïve. Many in the room where he spoke supported the revolution that he so disdained, and many of them had seen industrial slaughter in the state’s trenches. Part of the lecture’s mystique is its timing: He stood at a podium in the eye of the storm.

“Politik als Beruf” would seem to speak to our times, from one era of calamity and revolution to another. It is about the modern state and its vast machineries. It is about statesmen and epigones, bureaucracy and its discontents, “leadership” and political breakdown. To that moment’s overwhelming historical flux, Weber brought, or tried to bring, the intellectual sturdiness of sociological categories, “essential” vocabularies that could in theory apply at any time.

He offered a now-famous definition of the state in general: “the state is the only human community that (successfully) claims a monopoly on legitimate physical violence for itself, within a certain geographical territory.… All other groups and individuals are granted the right to use physical violence only insofar as the state allows it.” This definition, powerfully tautological, was the sociological floor on which stood all of the battles over what we might want the state to be. Philosophically, it operated beneath all ideological or moral debates over rights, democracy, welfare. It countered liberalism’s fantasy of a social contract, because Weber’s state, both foundationally and when push came to shove, was not contractual but coercive.

It was a bracing demystification. Legitimacy had nothing to do with justice; it meant only that the people acquiesced to the state’s authority. Some regimes “legitimately” protected “rights,” while others “legitimately” trampled them. Why did we acquiesce? Weber identified three “pure” categories of acquiescence: We’re conditioned to it, by custom or tradition; or we’re devoted to a leader’s charisma; or we’ve been convinced that the state’s legitimacy is in fact just, that its laws are valid and its ministers competent. Real political life, Weber wryly said, was always a cocktail of these three categories of acquiescence, never mind what stories we might tell ourselves about why we go along with anything.

With that floor of a definition laid, varieties of statehood could now emerge. Every state was a configuration of power and bureaucratic machinery, and the many massive apparatuses that made it up had their own deep sociological genealogies, each with their own Weberian twists. So did the apparatuses that produced those people who felt called to politics. Weber’s sweep encompassed parliaments, monarchs, political parties, corporations, newspapers, universities (law schools especially), a professional civil service, militaries.

Any reader now will be tempted to decode our politicians in Weber’s terms. Trump: ostensibly from the world of business, which, in Weber’s scheme, would usually keep such a figure out of electoral politics (although Weber did note that “plutocratic leaders certainly can try to live ‘from’ politics,” to “exploit their political dominance for private economic gain”). Maybe we’d say that Trump hijacked the apparatus of the administrative state, already in a state of erosion, and that he grifts from that apparatus while wrecking it further. Or maybe Trump is returning American politics to the pre-professional, “amateur” spoils system of the nineteenth century. Or he is himself a grotesque amateur, brought to the fore by an already odious political party that somehow collapsed to victory. Or maybe Trump is an ersatz aristocrat, from inherited wealth, who only played a businessman on television. (Weber’s writings do not anticipate our hideous celebrity politics.) Or Trump is a would-be warlord, postmodern or atavistically neo-feudal, committed to stamping a personal brand on the formerly “professional” military. Or, or, or. All are true, in their way. Maybe Weber would see in Trump a moron on the order of Kaiser Wilhelm—an equally cogent analysis.

Do these decodings clarify the matter or complicate it? Do they help us at all? They deliver a rhetorical satisfaction, certainly, and maybe an intellectual consolation. Then what? “Politik als Beruf” leaves sociology behind and becomes a secular sermon about “leadership,” and here the spell begins to break. Weber sought political salvation, of a kind, in charisma. The word is now a cliché, but for him it had a specific charge. Politics, he told his listeners in so many words, was a postlapsarian business. It cannot save any souls, because violence and coercion are conceptually essential to politics. A disenchanted universe is still a fallen universe. What had emerged from the fall was the monstrous apparatus of the modern nation-state. It was there, with its attendant armies of professionals and hangers-on, it fed you or it starved you. It was a mountain that no one really built but that we all had to live on.

Politics for Weber was brutally Darwinian in the end: Some states succeeded, and others failed. His Germany did not deserve defeat any more than the Allies deserved victory. That same moral arbitrariness made him look with a kind of grudging respect at Britain and the U.S.—made him even congratulate America for graduating from political amateurism into professional power. Meanwhile, he belittled revolutionaries. Anyone who imagined they could escape power’s realities or usher in some fundamentally new arrangement of power, he mocked. “Let’s be honest with ourselves here,” he said to the revolutionists in Munich. A belief in a revolutionary cause, “as subjectively sincere as it may be, is almost always merely a moral ‘legitimation’ for the desire for power, revenge, booty, and benefits.” (He was recycling a straw-man argument he had made for several years.)

To be enchanted by this argument is to end up thinking in a particular way about history with a capital h and politics with a capital p. History was always a kind of test of the state: wars, economic calamities, pandemics. Such things arrived, like natural disasters. For all the twists and complexities of Weber’s sociology, this conception of History is superficial, and its prescription for Politics thin. He demystified the state only to remystify the statesman. It is an insider’s sermon, because politics was an insider’s game, and it is the state’s insiders who, nowadays, will thrill to it. Very well.

“The relationship between violence and the state is particularly close at present,” Weber said, early in his lecture. At present could mean this week, this decade, this century, this modernity. The lecture retains, no doubt, a curious power in times of calamity. I am inclined to call it a literary power. Weber held two things in profound narrative tension: We feel both the state’s glacial inevitability and the terror of its collapse. Without a bureaucrat’s “discipline and self-restraint, which is in the deepest sense ethical,” Weber said in passing, “the whole system would fall apart.” So too would it fall apart without a leader’s charisma. If this horror vacui was powerful, for Weber and his listeners, it was because in 1919 things would fall apart, or were falling apart, or had already fallen apart. The lecture contemplates that layered historical collapse with both dread and wonder.

A century on, Weber’s definition of the state is still, sometimes, a good tool to think with. The coronavirus lockdowns, for instance, laid bare the state’s essentially coercive function. In Europe, on balance, lockdowns have been accepted—acquiesced to—as a benevolent coercion, an expression of a trusted bureaucracy and a responsible leadership. In some American states, too. The lockdowns even generated their own (in Weberian terms) legitimating civic rituals. Fifteen months after “Politik als Beruf,” Weber himself would die of the flu that his lecture did not mention.

In the Netherlands, where I live and teach, the drama of that lecture, even in a pandemic, might fall on deaf ears. The peril and fragility that Weber channeled can be hard to imagine in the Low Countries, which boasted an “intelligent lockdown” that needed no spectacular show of coercion. History, here, tends not to feel like an onrushing avalanche, or a panorama of sin and suffering, or a test we might fail, but rather a march of manageable problems, all of which seem—seem—solvable. This conception is a luxury.

As for the study of the U.S., which I suppose is my own meaningful or meaningless calling, Weber said, in 1917, that “it is often possible to see things in their purest form there.” In the century since his death, the transatlantic tables have turned, and American Studies often becomes the study of political breakdown. The vocabulary of failed statehood abounds in commentaries on America, from within and without, while American liberals look often to Germany’s Angela Merkel as the paragon of Weberian statesmanship. Step back from such commentaries, though, and American history will overwhelm even Weber’s bleak definition. America sits atop other kinds of violence, it accommodates a privatized violence, it outsources violence, it brings its wars back home.

I started this essay before the murder of George Floyd, and I am finishing it during the uprising that has followed in its wake. Weber’s definition of the state, ironically, can now fit with a political temperament far more radical than Weber’s own. The uprising has as its premise that the social contract, if it ever held, has long since been broken: The state’s veil is thus drawn back. The uprising then looks Weber’s definition in the eye: The monstrous state’s violence is unjust, therefore we do not accept it as legitimate.

I was looking in Weber for illumination, or consolation, or something. I haven’t found a rudder for the present, and I don’t know how to end. But the desire for consolation brought to my mind, of all things, the unconsoling diary of Franz Kafka. I read it years ago, and every once in a while its last lines will suddenly haunt me, like the opposite of a mantra, for reasons I don’t entirely understand. Kafka died in 1924, more an outsider than an insider; his diary’s last entry reflects, in an elliptical or inscrutable way, on another disease—tuberculosis—and on another calling. “More and more fearful as I write. Every word,” he felt, was “twisted in the hands of the spirit” and became “a spear turned against the speaker.” He also looked for consolation. “The only consolation would be: it happens whether you like or no. And what you like is of infinitesimally little help.” He then looked beyond it. “More than consolation is: You too have weapons.”


Posted in Capitalism, Global History, Higher Education, History, State Formation, Theory

Escape from Rome: How the Loss of Empire Spurred the Rise of Modernity — and What this Suggests about US Higher Ed

This post is a brief commentary on historian Walter Scheidel’s latest book, Escape from Rome.  It’s a stunningly original analysis of a topic that has long fascinated scholars like me:  How did Europe come to create the modern world?  His answer is this:  Europe became the cauldron of modernity and the dominant power in the world because of the collapse of the Roman empire — coupled with the fact that no other power was able to replace it for the next millennium.  The secret of European success was the absence of central control.  This is what led to the extraordinary inventions that characterized modernity — in technology, energy, war, finance, governance, science, and economy.

Below I lay out central elements of his argument, providing a series of salient quotes from the text to flesh out the story.  In the last few years I’ve come to read books exclusively on Kindle and Epub, which allows me to copy passages that catch my interest into Evernote for future reference.  So that’s where these quotes come from and why they don’t include page numbers.

At the end, I connect Scheidel’s analysis with my own take on the peculiar history of US higher education, as spelled out in my book A Perfect Mess.  My argument parallels his, showing how the US system arose in the absence of a strong state and dominant church, which fostered creative competition among colleges for students and money.  Out of this unpromising mess of institutions emerged a system of higher ed that came to dominate the academic world.

Escape from Rome

Here’s how Scheidel describes the consequences for Europe that arose from the fall of Rome and the long-time failure of efforts to impose a new empire there.

I argue that a single condition was essential in making the initial breakthroughs possible: competitive fragmentation of power. The nursery of modernity was riven by numerous fractures, not only by those between the warring states of medieval and early modern Europe but also by others within society: between state and church, rulers and lords, cities and magnates, knights and merchants, and, most recently, Catholics and Protestants. This often violent history of conflict and compromise was long but had a clear beginning: the fall of the Roman empire that had lorded it over most of Europe, much as successive Chinese dynasties lorded it over most of East Asia. Yet in contrast to China, nothing like the Roman empire ever returned to Europe.

Recurrent empire on European soil would have interfered with the creation and flourishing of a stable state system that sustained productive competition and diversity in design and outcome. This made the fall and lasting disappearance of hegemonic empire an indispensable precondition for later European exceptionalism and thus, ultimately, for the making of the modern world we now inhabit.

From this developmental perspective, the death of the Roman empire had a much greater impact than its prior existence and the legacy it bequeathed to later European civilization.

Contrast this with China, where dynasties rose and fell but where empire was a constant until the start of the 20th century.  It's an extension of an argument that others, such as David Landes, have developed about the creative possibilities unleashed by a competitive state system in comparison to the stability and stasis of an imperial power.  Think about the relative stagnation of the Ottoman, Austro-Hungarian, and Russian empires in the 17th, 18th, and 19th centuries compared with the dynamic emerging nation states of Western Europe.  Think also of the paradox within Western Europe, in which the drivers of modernization came not from the richest and strongest imperial powers — Spain, France, and Austria — but from the marginal kingdom of England and the tiny Dutch republic.

The comparison between Europe and China during the second half of the first millennium is telling:

Two things matter most. One is the unidirectional character of European developments compared to the back and forth in China. The other is the level of state capacity and scale from and to which these shifts occurred. If we look at the notional endpoints of around 500 and 1000 CE, the dominant trends moved toward imperial restoration in China and toward inter- and intrastate fragmentation in Europe.

Scheidel shows how social power fragmented after the fall of Rome in a way that made it impossible for a new hegemonic power to emerge.

After Rome’s collapse, the four principal sources of social power became increasingly unbundled. Political power was claimed by monarchs who gradually lost their grip on material resources and thence on their subordinates. Military power devolved upon lords and knights. Ideological power resided in the Catholic Church, which fiercely guarded its long-standing autonomy even as its leadership was deeply immersed in secular governance and the management of capital and labor. Economic power was contested between feudal lords and urban merchants and entrepreneurs, with the latter slowly gaining the upper hand. In the heyday of these fractures, in the High Middle Ages, weak kings, powerful lords, belligerent knights, the pope and his bishops and abbots, and autonomous capitalists all controlled different levers of social power. Locked in unceasing struggle, they were compelled to cooperate and compromise to make collective action possible.

He points out that “The Christian church was the most powerful and enduring legacy of the Roman empire,” becoming “Europe’s only functioning international organization.”  But in the realms of politics, war, and economy the local element was critical, which produced a situation where local innovation could emerge without interference from higher authority.

The rise of estates and the communal movement shared one crucial characteristic: they produced bodies such as citizen communes, scholarly establishments, merchant guilds, and councils of nobles and commoners that were, by necessity, relatively democratic in the sense that they involved formalized deliberative and consensus-building interactions. Over the long run, these bodies gave Latin Europe an edge in the development of institutions for impersonal exchange that operated under the rule of law and could be scaled up in response to technological change.

Under these circumstances, the states that started to emerge in Europe in the Middle Ages built on the base of distributed power and local initiative that developed in the vacuum left by the Roman Empire.

As state power recoalesced in Latin Europe, it did so restrained by the peculiar institutional evolution and attendant entitlements and liberties that this acutely fractured environment had engendered and that—not for want of rulers’ trying—could not be fully undone. These powerful medieval legacies nurtured the growth of a more “organic” version of the state—as opposed to the traditional imperial “capstone” state—in close engagement with organized representatives of civil society.

Two features were thus critical: strong local government and its routinized integration into polity-wide institutions, which constrained both despotic power and aristocratic autonomy, and sustained interstate conflict. Both were direct consequences of the fading of late Roman institutions and the competitive polycentrism born of the failure of hegemonic empire. And both were particularly prominent in medieval England: the least Roman of Western Europe’s former Roman provinces, it experienced what with the benefit of hindsight turned out to be the most propitious initial conditions for future transformative development.

The Pax Romana was replaced by a nearly constant state of war, with the proliferation of castle building and the dispersion of military capacity at the local level.  These wars were devastating for the participants but became a primary spur for technological, political, and economic innovation.  Everyone needed to develop an edge to help with the inevitable coming conflict.

After the Reformation, the small, marginal Protestant states on the North Sea enjoyed a paradoxical advantage in the early modern period, when Catholic Spain, France, and Austria were developing increasingly strong centralized states.  Their marginality allowed them to build most effectively on the inherited medieval model.

…it made a difference that the North Sea region was alone in preserving medieval decentralized political structures and communitarian legacies and building on them during the Reformation while more authoritarian monarchies rose across much of the continent—what Jan Luiten van Zanden deems “an unbroken democratic tradition” from the communal movement of the High Middle Ages to the Dutch Revolt and England’s Glorious Revolution.

England in particular benefited from the differential process of development in Europe.

Yet even as a comprehensive balance sheet remains beyond our reach, there is a case to be made that the British economy expanded and modernized in part because of rather than in spite of the tremendous burdens of war, taxation, and protectionism. By focusing on trade and manufacture as a means of strengthening the state, Britain’s elites came to pursue developmental policies geared toward the production of “goods with high(er) added value, that were (more) knowledge and capital intensive and that were better than those of foreign competitors so they could be sold abroad for a good price.”

Thanks to a combination of historical legacies and geography, England and then Britain happened to make the most of their pricey membership in the European state system. Economic growth had set in early; medieval integrative institutions and bargaining mechanisms were preserved and adapted to govern a more cohesive state; elite commitments facilitated high levels of taxation and public debt; and the wars that mattered most were won.

Reduced to its essentials, the story of institutional development followed a clear arc. In the Middle Ages, the dispersion of power within polities constrained the intensity of interstate competition by depriving rulers of the means to engage in sustained conflict. In the early modern period, these conditions were reversed. Interstate conflict escalated as diversity within states diminished and state capacity increased. Enduring differences between rival polities shaped and were in turn shaped by the ways in which elements of earlier domestic heterogeneity, bargaining and balancing survived and influenced centralization to varying degrees. The key to success was to capitalize on these medieval legacies in maximizing internal cohesion and state capacity later. This alone made it possible to prevail in interstate conflict without adopting authoritarian governance that stifled innovation. The closest approximations of this “Goldilocks scenario” could be found in the North Sea region, first in the Netherlands and then in England.

As maritime European states (England, Spain, Portugal, and the Dutch Republic) spread out across the globe, the competition increased exponentially — which then provided even stronger incentives for innovation at all levels of state and society.

Polycentrism was key. Interstate conflict did not merely foster technological innovation in areas such as ship design and weaponry that proved vital for global expansion, it also raised the stakes by amplifying both the benefits of overseas conquest and its inverse, the costs of opportunities forgone: successful ventures deprived rivals from rewards they might otherwise have reaped, and vice versa. States played a zero-sum game: their involvements overseas have been aptly described as “a competitive process driven as much by anxiety over loss as by hope of gain.”

In conclusion, Scheidel argues that bloody and costly conflict among competing states was the source of rapid modernization and the rise of European domination of the globe.

I am advocating a perspective that steers clear of old-fashioned triumphalist narratives of “Western” exceptionalism and opposing denunciations of colonialist victimization. The question is not who did what to whom: precisely because competitive fragmentation proved so persistent, Europeans inflicted horrors on each other just as liberally as they meted them out to others around the globe. Humanity paid a staggering price for modernity. In the end, although this may seem perverse to those of us who would prefer to think that progress can be attained in peace and harmony, it was ceaseless struggle that ushered in the most dramatic and exhilaratingly open-ended transformation in the history of our species: the “Great Escape.” Long may it last.

I strongly recommend that you read this book.  There’s insight and provocation on every page.

The Parallel with the History of US Higher Education

As I mentioned at the beginning, my own analysis of the emergence of American higher ed tracks nicely with Scheidel's analysis of Europe after the fall of Rome.  US colleges arose in the early 19th century under conditions where the state was weak, the church divided, and the market strong.  In the absence of a strong central power and a reliable source of financial support, these colleges came into existence as corporations with state charters but not state funding.  (State colleges came later but followed the model of their private predecessors.)  Their creation had less to do with advancing knowledge than with serving more immediately practical aims.

One was to advance the faith in a highly competitive religious environment.  This provided a strong incentive to plant the denominational flag across the countryside, especially on the steadily moving western frontier.  A college was a way for Lutherans and Methodists and Presbyterians and others to announce their presence, educate congregants, attract newcomers, and train clergy.  Thus the huge number of colleges in Ohio, the old Northwest Territory.

Another spur for college formation was the crass pursuit of money.  The early US was a huge, underpopulated territory which had too much land and not enough buyers.  This turned nearly everyone on the frontier into a land speculator (ministers included), feverishly coming up with schemes to make the land in their town more valuable for future residents than the land in other towns in the area.  One way to do this was to set up a school, telegraphing to prospects that this was a place to settle down and raise a family.  When other towns followed suit, you could up the ante by establishing a college, usually bearing the town name, which told the world that yours was not some dusty agricultural village but a vibrant center of culture.

The result was a vast number of tiny and unimpressive colleges scattered across the less populated parts of a growing country.  Without strong funding from church or state, they struggled to survive in a highly competitive setting.  This they managed by creating lean institutions that were adept at attracting and retaining student consumers and eliciting donations from alumni and from the wealthier people in town.  The result was the most overbuilt system of higher education the world has ever seen, with five times as many colleges in 1880 as in the entire continent of Europe.

All the system lacked was academic credibility and a strong incentive for student enrollment.  Both conditions were met at the end of the century, with the arrival of the German research university to crown the system and give it legitimacy and with the rise of the corporation and its need for white collar workers.

At this point, the chaotic fragmentation and chronic competition that characterized the American system of higher education turned out to be enormously functional.  Free from the constraints that European nation states and national churches imposed on universities, American institutions could develop programs, lure students, hustle for dollars, and promote innovations in knowledge production and technology.  They knew how to make themselves useful to their communities and their states, developing a broad base of political and financial support and demonstrating their social and economic value.

Competing colleges, like competing states, promoted a bottom-up vitality in the American higher ed system that was generally lacking in the older institutions of Europe that were under the control of a strong state or church.  Early institutional chaos led to later institutional strength, a system that was not created by design but emerged from an organic process of evolutionary competition.  In the absence of Rome (read: a hegemonic national university), the US higher education system became Rome.

Posted in Academic writing, Capitalism, History

E.P. Thompson: Time, Work-Discipline, and Industrial Capitalism

This post is a tribute to a wonderful essay by the great British historian of working-class history, E. P. Thompson.  His classic work is The Making of the English Working Class, published in 1963.  The paper I'm touting here provides a lovely window into the heart of his craft, which is an unlikely combination of Oxbridge erudition and Marxist analysis.

It’s the story of the rise of a new sense of time in the world that emerged with the arrival of capitalism, at which point suddenly time became money.  If you’re making shoes to order in a precapitalist workshop, you work until the order is completed and then you take it easy.  But if your labor is being hired by the hour, then your employer has an enormous incentive to squeeze as much productivity as possible out of every minute you are on the clock. The old model is more natural for humans: work until you’ve accomplished what you need and then stop.  Binge and break.  Think about the way college students spend their time when they’re not being supervised — a mix of all-nighters and partying.

Thompson captures the essence of the change between natural time and the time clock with this beautiful epigraph from Thomas Hardy’s Tess of the D’Urbervilles.

Tess … started on her way up the dark and crooked lane or street not made for hasty progress; a street laid out before inches of land had value, and when one-handed clocks sufficiently subdivided the day.

This quote and his analysis have had a huge impact on the way I came to see the world as a scholar of history.

Here’s a link to the paper, which was published in the journal Past and Present in 1967.  Enjoy.

Front page of "Time, Work-Discipline, and Industrial Capitalism," Past and Present (1967)