
What the Old Establishment Can Teach the New Tech Elite

It is unlikely that Mark Zuckerberg, Jeff Bezos and the other lords and ladies of Silicon Valley spend any time in English churchyards. But if they were to visit these delightfully melancholic places, the first things that they would encounter would be monuments to the fallen of the Great War. Their initial emotion, like anybody else’s looking at these morbid plinths, would rightly be one of relief. It is good that the West’s young men are no longer herded into uniform and marched toward machine guns.

If they looked harder, however, today’s elite would spot something else in these cemeteries. The whole of society is commemorated in stone: The baronet’s heir was shot to pieces in Flanders alongside the gamekeeper’s son. Recall that in the controversial D.H. Lawrence novel “Lady Chatterley’s Lover,” Lady Chatterley is driven into the arms of the local gamekeeper in part because her husband, Sir Clifford, was paralyzed from the waist down in the Great War.

Such monuments to the dead, which can be found across Europe, are a reminder that a century ago the elite, whatever its other sins, believed in public service. The rich shared common experiences with the poor, rooted in a common love of their country and a common willingness to sacrifice life and limb for something bigger.

That bond survived until the 1960s. Most young men in Europe did a version of what was called “national service”: They had to serve in the armed forces for a couple of years and learned the rudiments of warfare in case total war struck again. The U.S. called on people of all classes to fight in World War II—including John F. Kennedy and George H.W. Bush, who were both nearly killed serving their country—and the Korean War.

The economic elites and the political elites were intertwined. In Britain, a “magic circle” of Old Etonians helped choose the leader of the Conservative Party, convening over lunch at the Beefsteak Club or dinner at Pratt’s to discuss the fate of the nation, as well as the quality of that year’s hunting. What became the European Union was constructed behind closed doors by the continent’s ruling class, while Charles de Gaulle set up the Ecole Nationale d’Administration for the purpose of training a new ruling elite for a new age. American presidents turned to “wise men” of the East Coast Establishment, such as Averell Harriman, the son of a railroad tycoon, or one of the Rockefellers. The “best and the brightest” were supposed to do a stint in Washington.

A memorial to soldiers who died in the two world wars, Oxfordshire, U.K.

PHOTO: TIM GRAHAM/GETTY IMAGES

The Establishment on both sides of the Atlantic was convinced that good government mattered more than anything else. Mess up government and you end up with the Depression and Hitler.

That sense has gone. The New Establishment of Wall Street and the City of London and the New New Establishment of Silicon Valley have precious little to do with Washington or Whitehall. The public sector is for losers. As today’s elite see it, the best thing that government can do is to get out of the way of the really talented people and let them exercise their wealth-creating magic. Pester them too much or tax them too heavily and they will pick up their sticks and take their game elsewhere.

As for common experiences, the smart young people who go from the Ivy League or Oxbridge to work at Google or Goldman Sachs are often as distant from the laboring masses as the class that H.G. Wells, in “The Time Machine,” called the Eloi—pampered, ethereal, childlike creatures that the time traveler discovers at the end of his long journey into the future. Separated from the masses by elite education and pricey lifestyles in fashionable enclaves, today’s elite often have few ties to the country they work in. One former British spy points out that his children are immensely better educated than he was and far more tolerant, but the only time they meet the working class is when their internet shopping arrives; they haven’t shared a barracks with them.

Does this matter? Again, many will point to progress. The old elite was overwhelmingly male and white (with a few exceptions, such as Lady Violet Bonham Carter and Katharine Graham, who often wielded power through dinner parties). It often made a hash of things. Britain’s “magic circle” didn’t cope well with the swinging ’60s—most catastrophically with the Profumo sex scandal, which greatly damaged the Conservative Party—while America’s whiz kids hardly excelled in Vietnam. By the 1960s, the very term “The Establishment” had become an insult.

Modern money is also far cleaner than old money. The officers who were mowed down at the Somme often came from grand homes, but those homes were built with the grubby proceeds of coal, slavery and slaughter. (Clifford Chatterley, in his wife’s view, treated miners “as objects rather than men.”) Say what you like against monopolistic tech barons, greedy hedge-fund managers or tax-dodging real estate tycoons, they aren’t sinners in the same league. Men like Mr. Bezos and Mr. Zuckerberg build great businesses and often give away their money to worthy causes. What more should they do?

Quite a lot, actually.

Lieutenant John F. Kennedy, right, and his PT 109 crew in the South Pacific, July 1943.

PHOTO: ASSOCIATED PRESS

The idea that the elite has a responsibility to tend to the state was brilliantly set out by Plato more than 2,000 years ago. In “The Republic” he likened the state to a ship that can easily founder on the rocks or head in the wrong direction. He argued that for a voyage to succeed, you need a captain who has spent his life studying “the seasons of the years, the skies, the stars and other professional subjects.” He wanted to trust his state to a group of Guardians, selected for their wisdom and character and trained, through an austere and demanding education, in the arts of government.

Covid-19 is a wake-up call for the West, especially for its elite. This year could mark a reversal in history. Five hundred years ago, Europe was a bloody backwater while China was the most advanced country in the world, with the world’s most sophisticated civil service, selected by rigorous examination from across the whole country. The West overtook the East because its leaders mastered the art of government, producing a succession of powerful innovations—the nation-state, the liberal state, the welfare state—while the Chinese state ossified, its Mandarin elite unaware that it was even in competition with anyone else. By the 1960s, America was putting a man on the moon while millions of Chinese were dying of starvation.

Since the 1960s, however, this process has been reversed. Led by Singapore, Asia has been improving its state machinery while the West has ossified. Covid-19 shows just how far this change in the balance of competence has gone. Countries like South Korea, Singapore and even China have done far better at protecting their citizens than either the U.S. or Britain, where governments have conspicuously failed to work.

The elite bears much of the responsibility for this sorry state of affairs. The 1960s was the last time that it had a marked sense of public duty. What followed might be called the great abandonment. The Vietnam War discredited “wise men” such as McGeorge Bundy, a self-styled Platonic Guardian who served as national security adviser to both JFK and LBJ. The Establishment split into warring tribes of progressives and conservatives who were so divided by the culture wars that they seldom came together to fix anything. The explosion of pay in the private sector drew talent away from government. The constant refrain from the Right that the state is a parasite on the productive economy eroded what remained of the public ethic, while the Left, drugged by its ties to public sector unions, lost its appetite for reform. Government became a zombie, preserved and indeed inflated by its staff and clients, but robbed of ideas and talent.

National Service recruits in the U.K. line up to be issued caps, 1953.

PHOTO: POPPERFOTO/GETTY IMAGES

The difference with the East is marked. Singapore has put a Platonic premium on public service. It recruits the brightest young people for the government, makes sure they move frequently between the public and private sectors, and pays them well: Its top civil servants can earn more than a million dollars a year. (It stops short of forbidding its Guardians to marry and laying on orgies for them, as Plato advised, but it does force them to live in public housing.) Other Asian dragons have recruited a cadre of elite civil servants. China’s attempt to follow suit is complicated by the corruption and secrecy that surround the regime, but at its best it is learning from Singapore, creating a new class of mandarins, this time trained in technical fields and science rather than the classics.

What could the West do to rebind the elite to the state? Better pay for civil servants is one answer, especially if it comes with a keenness to shed poor performers in the public sector, as Singapore does. The idea of giving students generous university scholarships in exchange for working for the civil service for a number of years was pioneered by Thomas Jefferson. An even more ambitious idea would be to reintroduce nonmilitary national service, an idea that Emmanuel Macron has raised for France.

But the biggest change that is needed is a change of mind-set. Unlike the dead aristocrats in the churchyards, the geeks who run Google and Facebook have no sense of guilt to give them pause and few ties of blood and soil to connect them to a particular patch of land. They believe that their fortunes are the product of nothing but their own innate genius. They owe the rest of us nothing.

This needs to change. Over the past decade both the Democratic Party and the Republican Party have been shaken by the forces of populism. The shaking will only get worse if the elites don’t play a more active role in politics. Since the Covid-19 outbreak, we have been reminded that good government can make the difference between life and death. Look at the two cities where the Western elite feel most at home: New York has lost more than 20,000 people, London 6,000 (at times the mortality rate was higher than during the Blitz). By contrast, in Seoul, a bigger city with subways, nightclubs and everything else, only around 30 have died.

We live in a knowledge economy. For elites, exercising social responsibility should mean more than giving away money, though that is an admirable thing. It should mean sharing your brain—serving, not just giving. Michael Bloomberg did that as mayor of New York during the difficult decade after 9/11 (disclosure: Mr. Bloomberg employs one of us), and Bill Gates is the greatest philanthropist of his time not just because of the amount of money he has spent but because he devotes so much time to designing and driving his philanthropic work.

The habit must be set from early adulthood. More bright young things need to remember John F. Kennedy’s call to duty and think not of what their country can do for them but of what they can do for their country. If more of the young flowing out of the Ivy League and Oxbridge worked in the public sector, its technology wouldn’t be so shoddy and its ethos so sluggish.

There is a twist in the dystopian tale that H.G. Wells told in “The Time Machine” more than a century ago. The Eloi seem to live wonderful lives. They frolic above the ground, subsisting on a diet of fruit and living in futuristic (if deteriorating) buildings, while the Morlocks, brutish, apelike creatures, lurk underground, tending machinery and occasionally surfacing to feed and clothe the Eloi. But this is an illusion. The Morlocks are in fact farming the Eloi as a food source, just as we farm cattle, sheep and pigs.

Unless the ethic of public service is reignited, the American world order will ossify, just as other empires did before it. That is the message today’s Eloi should take from English churchyards.

Mr. Micklethwait is the editor in chief of Bloomberg and Mr. Wooldridge is the political editor of The Economist. This essay is adapted from their new book, “The Wake Up Call: Why the Pandemic Has Exposed the Weakness of the West, and How to Fix It,” published by Harper Via (which, like The Wall Street Journal, is owned by News Corp).


Public Schooling as Social Welfare

This post is a follow-up to a piece I posted three weeks ago, which was Michael Katz’s 2010 essay, Public Education as Welfare.  Below is my own take on this subject, which I wrote for a book that will be published in recognition of the hundredth anniversary of the Horace Mann League.  The tentative title of the book is Public Education: The Cornerstone of American Democracy and the editors are David Berliner and Carl Hermanns.  All of the contributions focus on the role that public schools play in American life.  Here’s a link to a pdf of my piece.

Public Schooling as Social Welfare

David F. Labaree

            In the mid-nineteenth century, Horace Mann made a forceful case for a distinctly political vision of public schooling, as a mechanism for creating citizens for the American republic. In the twentieth century, policymakers put forth an alternative economic vision for this institution, as a mechanism for turning out productive workers to promote growth of the American economy. In this essay, I explore a third view of public schooling, which is less readily recognizable than the other two but no less important.  This is a social vision, in which public schooling serves as a mechanism for promoting social welfare, by working to ameliorate the inequalities of American society.  

All three of these visions construe public schooling as a public good.  As a public good, its benefits flow to the entire community, including those who never attended school, by enriching the broad spectrum of political, economic, and social life.  But public schooling is also a private good.  As such, its benefits accrue only to its graduates, who use their diplomas to gain selective access to jobs at the expense of those who lack these credentials. 

Consider the relative costs and benefits of these two types of goods.  Investing in public goods is highly inclusive, in that every dollar invested goes to support the common weal.  But at the same time this investment is also highly contingent, since individuals will gain the benefits even if they don’t contribute, getting a free ride on the contributions of others.  The usual way around the free rider problem is to make such investment mandatory for everyone through the mechanism of taxation.  By contrast, investment in private goods is self-sustaining, with no state action needed.  Individuals have a strong incentive to invest because only they gain the benefit.  In addition, as a private good its effects are highly exclusive, benefiting some people at the expense of others and thus tending to increase social inequality. 

Like the political and economic visions of schooling, the welfare vision carries the traits of its condition as a public good.  Its scope is inclusive, its impact is egalitarian, and its sustainability depends heavily on state mandate.  But it lacks a key advantage shared by the other two, whose benefits clearly flow to the population as a whole.  Everyone benefits by being part of a polity in which citizens are capable, law abiding, and informed.  Everyone benefits by being part of an economy in which workers contribute productively to the general prosperity. 

In contrast, however, it’s less obvious that everyone benefits from transferring public resources to disadvantaged citizens in order to improve their quality of life.  The word welfare carries a foul odor in American politics, redolent of laziness, bad behavior, and criminality.  It’s so bad that in 1980 the federal government changed the name of the Department of Health, Education, and Welfare to Health and Human Services just to get rid of the stigmatized term.

So one reason that the welfare function doesn’t jump to mind when you think of schools is that we really don’t want to associate the two.  Don’t besmirch schooling by calling it welfare.  Michael Katz caught this feeling in the opening sentences of his 2010 essay, “Public Education as Welfare,” which serves as a reference point for my own essay:  “Welfare is the most despised public institution in America. Public education is the most iconic. To associate them with each other will strike most Americans as bizarre, even offensive.”  But let’s give it a try anyway.

My own essay arises from the time when I’m writing it – the summer of 2020, during the early phases of the Covid-19 pandemic.  Like everyone else in the US, I watched in amazement this spring when schools suddenly shut down across the country and students started a new regime of online learning from home.  It started me thinking about what schools mean to us, what they do for us. 

Often it’s only when an institution goes missing that we come to recognize its value.  After the Covid shutdown, parents, children, officials, and citizens discovered just what they lost when the kids came home to stay.  You could hear voices around the country and around the globe pleading, “When are schools going to open again?”

I didn’t hear people talking much about the other two public goods views of schooling.  There wasn’t a groundswell of opinion complaining about the absence of citizenship formation or the falloff of human capital production.  Instead, there was a growing awareness of the various social welfare functions of schooling that were now suddenly gone.  Here are a few, in no particular order.

Schools are the main source of child care for working parents.  When schools close, someone needs to stay home to take care of the younger children.  For parents with the kind of white collar jobs that allow them to work from home, this causes a major inconvenience as they try to juggle work and child care and online schooling.  But for parents who can’t phone in their work, having to stay home with the kids is a huge financial sacrifice, and it’s even bigger for single parents in this category.

Schools are a key place for children to get healthy meals.  In the U.S., about 30 million students receive free or discounted lunch (and often breakfast) at school every day.  It’s so common that researchers use the proportion of “students on free or reduced lunch” as a measure of the poverty rate in individual schools.  When schools close, these children go hungry.  In response to this problem, a number of closed school systems have continued to prepare these meals for parents to pick up and take home with them.

Schools are crucial for the health of children.  In the absence of universal health care in the U.S., schools have served as a frail substitute.  They require all students to have vaccinations.  They provide health education.  And they have school nurses who can check for student ailments and make referrals.

Schools are especially important for dealing with the mental health of young people.  Teachers and school psychologists can identify mental illness and serve as prompts for getting students treatment.  Special education programs identify developmental disabilities in students and devise individualized plans for treating them.

Schools serve as oases for children who are abused at home.  Educators are required by law to look out for signs of mental or physical abuse and to report these cases to authorities.  When schools close, these children are trapped in abusive settings at home, which gives the lie to the idea of sheltering in place.  For many students, the true shelter is the school itself.  In the absence of teacher referrals, agencies reported a sharp drop-off in reports of child abuse.

Schools are domains of relative safety for students who live in dangerous neighborhoods.  For many kids who live in settings with gangs and drugs and crime, getting to and from school is the most treacherous part of the day.  Once inside the walls of the school, they are relatively free of physical threats.  Closing school doors to students puts them at risk.

Schools are environments that are often healthier than their own homes.  Students in wealthy neighborhoods may look on schools in poor neighborhoods as relatively shabby and depressing, but for many children the buildings have a degree of heat, light, cleanliness, and safety that they can’t find at home.  These schools may not have swimming pools and tennis courts, but they also don’t have rats and refuse.

Schools may be the only institutional setting for many kids in which the professional norm is to serve the best interests of the child.  We know that students can be harmed by schools.  All it takes is a bully or a disparaging judgment.  But the core of the educator’s job is to foster growth, spur interest, increase knowledge, enhance skill, and promote development.  Being cut off from such an environment for a long period of time is a major loss for any student, rich or poor.

Schools are one of the few places in American life where young people undergo a shared experience.  This is especially true at the elementary level, where most children in a neighborhood attend the same school and undergo a relatively homogeneous curriculum.  It’s less true in high school, where the tracked curriculum provides more divergent experiences.  A key component of the shared experience is that it places you face-to-face with students who may be different from you.  As we have found, when you turn schooling into online learning, you tend to exacerbate social differences, because students are isolated in disparate family contexts where there is a sharp divide in internet access. 

Schools are where children socialize with each other.  A key reason kids want to go to school is because that’s where their friends are.  It’s where they make friends they otherwise would never have met, learn to maintain these friendships, and learn how to manage conflicts.  Humans are thoroughly social animals, who need interaction with others in order to grow and thrive.  So being cooped up at home leaves everyone, but especially children, without a central component of human existence.

Schools are the primary public institution for overseeing the development of young children into healthy and capable adults.  Families are the core private institution engaged in this process, but schools serve as the critical intermediary between family and the larger society.  They’re the way our children learn how to live and engage with other people’s children, and they’re a key way that society seeks to ameliorate social differences that might impede children’s development, serving as what Mann called “the great equalizer of the conditions of men – the balance wheel of the social machinery.”

These are some aspects of schooling that we take for granted but don’t think about very much.  For policymakers, these may be considered side effects of the school’s academic mission, but for many (maybe most) families they are a main effect.  And the various social support roles that schools play are particularly critical in a country like the United States, where the absence of a robust social welfare system means that schools stand as the primary alternative.  The schools’ absence made the heart grow fonder.  We all became aware of just how much schools do for us.

Systems of universal public schooling did not arise in order to promote social welfare.  During the last 200 years, in countries around the world, the impetus came from the kind of political rationale that Horace Mann so eloquently put forward.  Public schools emerged as part of the process of creating nation states.  Their function was to turn subjects of the crown into citizens of the nation, or, as Eugen Weber put it in the title of his wonderful book, to turn Peasants into Frenchmen.  Schools took localized populations with regional dialects and traditional authority relations and helped affiliate these populations with an imagined community called France or the United States.  They created a common language (in the case of France, it was Parisian French), a shared sense of national membership, and a shared educational experience. 

This is the origin story of public schooling.  But once schools became institutionalized and the state’s existence grew relatively secure, they began to accumulate other functions, both private (gaining an edge in the competition for social position) and public (promoting economic growth and supporting social welfare).  In different countries these functions took different forms, and the load the state placed on schooling varied considerably.  The American case, as is so often true, was extreme.

The U.S. bet the farm on the public school.  It was relatively early in establishing a system of publicly funded and governed schools across the country in the second quarter of the nineteenth century.  But it was way ahead of European countries in its rapid upward expansion of the system.  Universal enrollment moved quickly from primary school to grammar school to high school.  By 1900, the average American teenager had completed eight years of schooling.  This led to a massive surge in high school enrollments, which doubled every decade between 1890 and 1940.  By 1951, 75 percent of 16-year-olds were enrolled in high school compared to only 14 percent in the United Kingdom.  In the three decades after the Second World War, the surge spilled over into colleges, with the rate of enrollment between 1950 and 1980 rising from 9 to 40 percent of the eligible population.

The U.S. system had an indirect connection to welfare even before it started acting as a kind of social service agency.  The short version of the story is this.  In the second half of the nineteenth century, European countries like Disraeli’s United Kingdom and Bismarck’s Germany set up the framework for a welfare state, with pensions and other elements of a safety net for the working class.  The U.S. chose not to take this route, which it largely deferred until the 1930s.  Instead it put its money on schooling.  The vision was to provide individuals with educational opportunities to get ahead on their own rather than to give them direct aid to improve their current quality of life.  The idea was to focus on developing a promising future rather than on meeting current needs.  People were supposed to educate their way out of poverty, climbing up the ladder with the help of state schooling.  The fear was that providing direct relief for food, clothing, and shelter – the dreaded dole – would only stifle their incentive to get ahead.  Better to stimulate the pursuit of future betterment than to run the risk that people might get used to subsisting comfortably in the present. 

By nature, schooling is a forward-looking enterprise.  Its focus is on preparing students for their future roles as citizens, workers, and members of society rather than on helping them deal with their current living conditions.  By setting up an educational state rather than a welfare state, the U.S. in effect chose to write off the parents, seen as a lost cause, and concentrate instead on providing opportunities to the children, seen as still salvageable. 

In the twentieth century, spurred by the New Deal’s response to the Great Depression, the U.S. developed the rudiments of a welfare state, with pensions and then health care for the elderly, temporary cash support and health care for the poor, and unemployment insurance for the worker.  At the same time, schools began to deal with the problems arising from poverty that students brought with them to the classroom.  This was propelled by a growing understanding that hungry, sick, and abused children are not going to be able to take advantage of educational opportunities in order to attain a better life in the future.  Schooling alone couldn’t provide the chance for schooling to succeed.  Thus the introduction of free meals, the school nurse, de facto day care, and other social-work activities in the school. 

The tale of the rise of the social welfare function of the American public school, therefore, is anything but a success story.  Rather, it’s a story of one failure on top of another.  First is the failure to deal directly with social inequality in American life, when instead we chose to defer the intervention to the future by focusing on educating children while ignoring their parents.  Second, when poverty kept interfering with the schooling process, we introduced rudimentary welfare programs into the school in order to give students a better chance, while still leaving poor parents to their own devices. 

As with the American welfare system in general, school welfare is not much but it’s better than nothing.  Carrying on the pattern set in the nineteenth century, we are still shirking responsibility for dealing directly with poverty through the political system by opposing universal health care and a strong safety net.  Instead, we continue to put our money on schooling as the answer when the real solution lies elsewhere.  Until we decide to implement that solution, however, schooling is all we’ve got. 

In the meantime, schools serve as the wobbly but indispensable balance wheel of American social life.  Too bad it took a global pandemic to get us to realize what we lose when schools close down.


How NOT to Defend the Private Research University

This post is a piece I published today in the Chronicle Review.  It’s about an issue that has been gnawing at me for years.  How can you justify the existence of institutions of the sort I taught at for the last two decades — rich private research universities?  These institutions obviously benefit their students and faculty, but what about the public as a whole?  Is there a public good they serve, and if so, what is it? 

Here’s the answer I came up with.  These are elite institutions to the core.  Exclusivity is baked in.  By admitting only a small number of elite students, they serve to promote social inequality by providing grads with an exclusive private good, a credential with high exchange value.  But, in part because of this, they also produce valuable public goods — through the high-quality research and the advanced graduate training that only they can provide. 

Open access institutions can promote the social mobility that private research universities don’t, but they can’t provide the same degree of research and advanced training.  The paradox is this:  It’s in the public’s interest to preserve the elitism of these institutions.  See what you think.

Hoover Tower

How Not to Defend the Private Research University

David F. Labaree

In this populist era, private research universities are easy targets that reek of privilege and entitlement. It was no surprise, then, when the White House pressured Harvard to decline $8.6 million in Covid-19-relief funds, while Stanford, Yale, and Princeton all judiciously decided not to seek such aid. With tens of billions of endowment dollars each, they hardly seemed to deserve the money.

And yet these institutions have long received outsized public subsidies. The economist Richard Vedder estimated that in 2010, Princeton got the equivalent of $50,000 per student in federal and state benefits, while its similar-size public neighbor, the College of New Jersey, got just $2,000 per student. Federal subsidies to private colleges include research grants, which go disproportionately to elite institutions, as well as student loan and scholarship funds. As recipients of such largess, how can presidents of private research universities justify their institutions to the public?

Here’s an example of how not to do so. Not long after he assumed the presidency of Stanford in 2016, Marc Tessier-Lavigne made the rounds of faculty meetings on campus in order to introduce himself and talk about future plans for the university. When he came to a Graduate School of Education meeting that I attended, he told us his top priority was to increase access. Asked how he might accomplish this, he said that one proposal he was considering was to increase the size of the entering undergraduate class by 100 to 200 students.

The problem is this: Stanford admits about 4.3 percent of the candidates who apply to join its class of 1,700. Admitting a couple hundred additional students might raise the admit rate to 5 percent. Now that’s access. The issue is that, for a private research university like Stanford, the essence of its institutional brand is its elitism. The inaccessibility is baked in.

Raj Chetty’s social mobility data for Stanford show that 66 percent of its undergrads come from the top 20 percent by income, 52 percent from the top 10 percent, 17 percent from the top 1 percent, and just 4 percent from the bottom 20 percent. Only 12 percent of Stanford grads move up by two quintiles or more — it’s hard for a university to promote social mobility when the large majority of its students start at the top.

Compare that with the data for California State University at Los Angeles, where 12 percent of students are from the top quintile and 22 percent from the bottom quintile. Forty-seven percent of its graduates rise two or more income quintiles. Ten percent make it all the way from the bottom to the top quintile.

My point is that private research universities are elite institutions, and they shouldn’t pretend otherwise. Instead of preaching access and making a mountain out of the molehill of benefits they provide for the few poor students they enroll, they need to demonstrate how they benefit the public in other ways. This is a hard sell in our populist-minded democracy, and it requires acknowledging that the very exclusivity of these institutions serves the public good.

For starters, in making this case, we should embrace the emphasis on research production and graduate education and accept that providing instruction for undergraduates is only a small part of the overall mission. Typically these institutions have a much higher proportion of graduate students than large public universities oriented toward teaching (graduate students are 57 percent of the total at Stanford and just 8.5 percent in the California State University system).

Undergraduates may be able to get a high-quality education at private research universities, but there are plenty of other places where they could get the same or better, especially at liberal-arts colleges. Undergraduate education is not what makes these institutions distinctive. What does make them stand out are their professional schools and doctoral programs.


Private research universities are souped-up versions of their public counterparts, and in combination they exert an enormous impact on American life.

As of 2017, the Association of American Universities, a club consisting of the top 65 research universities, represented just 2 percent of all four-year colleges and 12 percent of all undergrads. And yet the group accounted for over 20 percent of all U.S. graduate students; 43 percent of all research doctorates; 68 percent of all postdocs; and 38 percent of all Nobel Prize winners. In addition, its graduates occupy the centers of power, including, by 2019, 64 of the Fortune 100 CEOs; 24 governors; and 268 members of Congress.

From 2014 to 2018, AAU institutions collectively produced 2.4 million publications, and their collective scholarship received 21.4 million citations. That research has an economic impact — these same institutions have established 22 research parks and, in 2018 alone, they produced over 4,800 patents, over 5,000 technology license agreements, and over 600 start-up companies.

Put all this together and it’s clear that research universities provide society with a stunning array of benefits. Some of these benefits accrue to individual entrepreneurs and investors, but the benefits for society as a whole are extraordinary. These universities drive widespread employment, technological advances that benefit consumers worldwide, and the improvement of public health (think of all the university researchers and medical schools advancing Covid-19-research efforts right now).

Besides their higher proportion of graduate students and lower student-faculty ratio, private research universities have other major advantages over publics. One is greater institutional autonomy. A private research university is governed by a board of laypersons who own the university, control its finances, and appoint its officers. Government can dictate how it uses the public subsidies it gets (except tax subsidies), but otherwise it is free to operate as an independent actor in the academic market. This autonomy allows these institutions to pivot quickly to take advantage of opportunities for new programs of study, research areas, and sources of funding, largely free of political influence, even as they face a fierce academic market full of other private colleges.

A 2010 study of universities in Europe and the U.S. by Caroline Hoxby and associates shows that this mix of institutional autonomy and competition is strongly associated with higher rankings in the world hierarchy of higher education. They find that every 1-percent increase in the share of the university budget that comes from government appropriations corresponds with a decrease in international ranking of 3.2 ranks. At the same time, each 1-percent increase in the university budget from competitive grants corresponds with an increase of 6.5 ranks. They also found that universities high in autonomy and competition produced more patents.

Another advantage the private research universities enjoy over their public counterparts, of course, is wealth. Stanford’s endowment is around $28 billion, and Berkeley’s is just under $5 billion; because Stanford is so much smaller (16,000 versus 42,000 total students), the gap is even wider than it appears. Stanford’s endowment per student dwarfs Berkeley’s. The result is that private universities have more research resources: better labs, libraries, and physical plant; higher faculty pay (e.g., $254,000 for full professors at Stanford, compared to $200,000 at Berkeley); more funding for grad students; and more staff support.

A central asset of private research universities is their small group of academically and socially elite undergraduate students. The academic skill of these students is an important draw for faculty, but their current and future wealth is particularly important for the institution. From a democratic perspective, this wealth is a negative. The student body’s heavy skew toward the top of the income scale is a sign of how these universities are not only failing to provide much social mobility but are in fact actively engaged in preserving social advantage. We need to be honest about this issue.

But there is a major upside. Undergraduates pay their own way (as do students in professional schools), but the advanced graduate students don’t — they get free tuition plus a stipend to cover living expenses, all of which is subsidized, both directly and indirectly, by undergrads. The direct subsidy comes from the high sticker price undergrads pay for tuition. Part of this goes to help out upper-middle-class families who still can’t afford the tuition, but the rest goes to subsidize grad students.

The key financial benefits from undergrads come after they graduate, when the donations start rolling in. The university generously admits these students (at the expense of many of their peers), provides them with an education and a credential that jump-starts their careers and papers over their privilege, and then harvests their gratitude over a lifetime. Look around any college campus — particularly at a private research university — and you will find that almost every building, bench, and professor bears the name of a grateful donor. And nearly all of the money comes from former undergrads or professional school students, since it is they, not the doctoral students, who go on to earn the big bucks.

There is, of course, a paradox. Perhaps the gross preservation of privilege these schools traffic in serves a broader public purpose. Perhaps providing a valuable private good for the few enables the institution to provide an even more valuable public good for the many. And yet students who are denied admission to elite institutions are not being denied a college education and a chance to get ahead; they’re just being redirected. Instead of going to a private research university like Stanford or a public research university like Berkeley, many will attend a comprehensive university like San José State. Only the narrow metric of value employed at the pinnacle of the American academic meritocracy could construe this as a tragedy. San José State is a great institution, which accepts the majority of the students who apply and which sends a huge number of graduates to work in the nearby tech sector.

The economist Miguel Urquiola elaborates on this paradox in his book, Markets, Minds, and Money: Why America Leads the World in University Research (Harvard University Press, 2020), which describes how American universities came to dominate the academic world in the 20th century. The 2019 Shanghai Academic Ranking of World Universities shows that eight of the top 10 universities in the world are American, and seven of these are private.

Urquiola argues that the roots of American academe’s success can be found in its competitive marketplace. In most countries, universities are subsidiaries of the state, which controls their funding, defines their scope, and sets their policy. By contrast, American higher education has three defining characteristics: self-rule (institutions have autonomy to govern themselves); free entry (institutions can be started up by federal, state, or local governments or by individuals who acquire a corporate charter); and free scope (institutions can develop programs of research and study on their own initiative without undue governmental constraint).

The result is a radically unequal system of higher education, with extraordinary resources and capabilities concentrated in a few research universities at the top. Caroline Hoxby estimates that the most selective American research universities spend an average of $150,000 per student, 15 times as much as some poorer institutions.

As Urquiola explains, the competitive market structure puts a priority on identifying top research talent, concentrating this talent and the resources needed to support it in a small number of institutions, and motivating these researchers to ramp up their productivity. This concentration then makes it easy for major research-funding agencies, such as the National Institutes of Health, to identify the institutions that are best able to manage the research projects they want to support. And the nature of the research enterprise is such that, when markets concentrate minds and money, the social payoff is much greater than if they were dispersed more evenly.

Radical inequality in the higher-education system therefore produces outsized benefits for the public good. This, paradoxical as it may seem, is how we can truly justify the public investment in private research universities.

David Labaree is a professor emeritus at the Stanford Graduate School of Education.