
Rampell — It Takes a B.A. to Find a Job as a File Clerk

This post reproduces a still-salient 2013 article from the New York Times about credential inflation in the American job market. It turns out that if you want to be a file clerk or runner at a law firm these days, you’re going to need a four-year college degree. Here’s a link to the original.


February 19, 2013

It Takes a B.A. to Find a Job as a File Clerk

By CATHERINE RAMPELL

ATLANTA — The college degree is becoming the new high school diploma: the new minimum requirement, albeit an expensive one, for getting even the lowest-level job.

Consider the 45-person law firm of Busch, Slipakoff & Schuh here in Atlanta, a place that has seen tremendous growth in the college-educated population. Like other employers across the country, the firm hires only people with a bachelor’s degree, even for jobs that do not require college-level skills.

This prerequisite applies to everyone, including the receptionist, paralegals, administrative assistants and file clerks. Even the office “runner” — the in-house courier who, for $10 an hour, ferries documents back and forth between the courthouse and the office — went to a four-year school.

“College graduates are just more career-oriented,” said Adam Slipakoff, the firm’s managing partner. “Going to college means they are making a real commitment to their futures. They’re not just looking for a paycheck.”

Economists have referred to this phenomenon as “degree inflation,” and it has been steadily infiltrating America’s job market. Across industries and geographic areas, many other jobs that didn’t use to require a diploma — positions like dental hygienists, cargo agents, clerks and claims adjusters — are increasingly requiring one, according to Burning Glass, a company that analyzes job ads from more than 20,000 online sources, including major job boards and small- to midsize-employer sites.

This up-credentialing is pushing the less educated even further down the food chain, and it helps explain why the unemployment rate for workers with no more than a high school diploma is more than twice that for workers with a bachelor’s degree: 8.1 percent versus 3.7 percent.

Some jobs, like those in supply chain management and logistics, have become more technical, and so require more advanced skills today than they did in the past. But more broadly, because so many people are going to college now, those who do not graduate are often assumed to be unambitious or less capable.

Plus, it’s a buyer’s market for employers.

“When you get 800 résumés for every job ad, you need to weed them out somehow,” said Suzanne Manzagol, executive recruiter at Cardinal Recruiting Group, which does headhunting for administrative positions at Busch, Slipakoff & Schuh and other firms in the Atlanta area.

Of all the metropolitan areas in the United States, Atlanta has had one of the largest inflows of college graduates in the last five years, according to an analysis of census data by William Frey, a demographer at the Brookings Institution. In 2012, 39 percent of job postings for secretaries and administrative assistants in the Atlanta metro area requested a bachelor’s degree, up from 28 percent in 2007, according to Burning Glass.

“When I started recruiting in ’06, you didn’t need a college degree, but there weren’t that many candidates,” Ms. Manzagol said.

Even if they are not exactly applying the knowledge they gained in their political science, finance and fashion marketing classes, the young graduates employed by Busch, Slipakoff & Schuh say they are grateful for even the rotest of rote office work they have been given.

“It sure beats washing cars,” said Landon Crider, 24, the firm’s soft-spoken runner.

He would know: he spent several years, while at Georgia State and in the months after graduation, scrubbing sedans at Enterprise Rent-a-Car. Before joining the law firm, he was turned down for a promotion to rental agent at Enterprise — a position that also required a bachelor’s degree — because the company said he didn’t have enough sales experience.

His college-educated colleagues had similarly limited opportunities, working at Ruby Tuesday or behind a retail counter while waiting for a better job to open up.

“I am over $100,000 in student loan debt right now,” said Megan Parker, who earns $37,000 as the firm’s receptionist. She graduated from the Art Institute of Atlanta in 2011 with a degree in fashion and retail management, and spent months waiting on “bridezillas” at a couture boutique, among other stores, while churning out office-job applications.

“I will probably never see the end of that bill, but I’m not really thinking about it right now,” she said. “You know, this is a really great place to work.”

The risk with hiring college graduates for jobs they are supremely overqualified for is, of course, that they will leave as soon as they find something better, particularly as the economy improves.

Mr. Slipakoff said his firm had little turnover, though, largely because of its rapid expansion. The company has grown to more than 30 lawyers from five in 2008, plus a support staff of about 15, and promotions have abounded.

“They expect you to grow, and they want you to grow,” said Ashley Atkinson, who graduated from Georgia Southern University in 2009 with a general studies degree. “You’re not stuck here under some glass ceiling.”

Within a year of being hired as a file clerk, around Halloween 2011, Ms. Atkinson was promoted twice to positions in marketing and office management. Mr. Crider, the runner, was given additional work last month, helping with copying and billing claims. He said he was taking the opportunity to learn more about the legal industry, since he plans to apply to law school next year.

The firm’s greatest success story is Laura Burnett, who in less than a year went from being a file clerk to being the firm’s paralegal for the litigation group. The partners were so impressed with her filing wizardry that they figured she could handle it.

“They gave me a raise, too,” said Ms. Burnett, a 2011 graduate of the University of West Georgia.

The typical paralegal position, which has traditionally offered a path to a well-paying job for less educated workers, requires no more than an associate degree, according to the Labor Department’s occupational handbook, but the job is still a step up from filing. Of the three daughters in her family, Ms. Burnett reckons that she has the best job. One sister, a fellow West Georgia graduate, is processing insurance claims; another, who dropped out of college, is one of the many degree-less young people who still cannot find work.

Besides the promotional pipelines it creates, setting a floor of college attainment also creates more office camaraderie, said Mr. Slipakoff, who handles most of the firm’s hiring and is especially partial to his fellow University of Florida graduates. There is a lot of trash-talking of each other’s college football teams, for example. And this year the office’s Christmas tree ornaments were a colorful menagerie of college mascots — Gators, Blue Devils, Yellow Jackets, Wolves, Eagles, Tigers, Panthers — in which just about every staffer’s school was represented.

“You know, if we had someone here with just a G.E.D. or something, I can see how they might feel slighted by the social atmosphere here,” he says. “There really is something sort of cohesive or binding about the fact that all of us went to college.”


The Central Link between Liberty and Slavery in American History

In this post, I explore insights from two important books about the peculiar way in which liberty and slavery jointly emerged from the context of colonial America. One is a new book by David Stasavage, The Decline and Rise of Democracy. The other is a 1992 book by Toni Morrison, Playing in the Dark: Whiteness and the Literary Imagination. The core point I draw from Stasavage is that the same factors that nurtured the development of political liberty in the American context also led to the development of slavery. The related point I draw from Morrison is that the existence of slavery was fundamental in energizing the colonists’ push for self-rule.


The Stasavage book explores the history of democracy in the world, starting with early forms that emerged in premodern North America, Europe, and Africa and then fell into decline, followed by the rise of modern parliamentary democracy.  He contrasts this with an alternative form of governance, autocracy, which arose at many times and in many places but appeared earliest and most enduringly in China.

He argues that three conditions were necessary for the emergence of early democracy. One is small scale, which allows people to confer as a group instead of relying on a distant leader.  Another is that rulers lacked knowledge about what people were producing (the kind of knowledge an administrative bureaucracy could provide), which meant they needed to share power in order to levy taxes effectively.  But I want to focus on the third factor — the existence of an exit option — which is most salient to the colonial American case.  Here’s how he describes it:

The third factor that led to early democracy involved the balance between how much rulers needed their people and how much people could do without their rulers. When rulers had a greater need for revenue, they were more likely to accept governing in a collaborative fashion, and this was even more likely if they needed people to fight wars. With inadequate means of simply compelling people to fight, rulers offered them political rights. The flip side of all this was that whenever the populace found it easier to do without a particular ruler—say by moving to a new location—then rulers felt compelled to govern more consensually. The idea that exit options influence hierarchy is, in fact, so general it also applies to species other than humans. Among species as diverse as ants, birds, and wasps, social organization tends to be less hierarchical when the costs of what biologists call “dispersal” are low.

The central factor that supported the development of democracy in the British colonies was the scarcity of labor:

A broad manhood suffrage took hold in the British part of colonial North America not because of distinctive ideas but for the simple reason that in an environment where land was abundant and labor was scarce, ordinary people had good exit options. This was the same fundamental factor that had favored democracy in other societies.

And this was also the factor that promoted slavery:  “Political rights for whites and slavery for Africans derived from the same underlying environmental condition of labor scarcity.”  Because of this scarcity, North American agricultural enterprises in the colonies needed a way to ensure a flow of laborers and a way to keep them on the job once they arrived.  The central mechanisms for doing that were indentured servitude and slavery.  Some indentured servants were recruited in Britain with the promise of free passage to the New World in return for a contract to work for a certain number of years.  Others were simply kidnapped, shipped, and then forced to work off their passage.  At the same time, Africans initially came to the colonies in a variety of statuses, but this increasingly shifted toward full slavery.  Here’s how he describes the situation in the Tidewater colonies.

The early days of forced shipment of English to Virginia sounds like it would have been an environment ripe for servitude once they got there. In fact, it did not always work that way. Once they finished their period of indenture, many English migrants established farms of their own. This exit option must have been facilitated by the fact that they looked like Virginia’s existing British colonists, and they also sounded like them. They would have also shared a host of other cultural commonalities. In other words, they had a good outside option.

Now consider the case of Africans in Virginia, Maryland, and the other British colonies in North America who began arriving in 1619. The earliest African arrivals to Virginia and Maryland came in a variety of situations. Some were free and remained so, some were indentured under term contracts analogous to those of many white migrants, and some came entirely unfree. Outside options also mattered for Africans, and for several obvious reasons they were much worse than those for white migrants. Africans looked different than English people, they most often would not have arrived speaking English, or being aware of English cultural practices, and there is plenty of evidence that people in Elizabethan and Jacobean England associated dark skin with inferiority or other negative qualities. Outside options for Africans were remote to nonexistent. The sustainability of slavery in colonies like Virginia and Maryland depended on Africans not being able to escape and find labor elsewhere. For slave owners it of course helped that they had the law on their side. This law evolved quickly to define exactly what a “slave” was, there having been no prior juridical definition of the term. Africans were now to be slaves whereas kidnapped British boys were bound by “the custom of the country,” meaning that eventual release could be expected.

So labor scarcity and the existence of an attractive exit option provided the formative conditions for developing both white self-rule and Black enslavement.


Toni Morrison’s book is a reflection on the enduring impact of whiteness and blackness in shaping American literature.  In the passage below, from the chapter titled “Romancing the Shadow,” she is talking about the romantic literary tradition in the U.S.

There is no romance free of what Herman Melville called “the power of blackness,” especially not in a country in which there was a resident population, already black, upon which the imagination could play; through which historical, moral, metaphysical, and social fears, problems, and dichotomies could be articulated. The slave population, it could be and was assumed, offered itself up as surrogate selves for meditation on problems of human freedom, its lure and its elusiveness. This black population was available for meditations on terror — the terror of European outcasts, their dread of failure, powerlessness, Nature without limits, natal loneliness, internal aggression, evil, sin, greed. In other words, this slave population was understood to have offered itself up for reflections on human freedom in terms other than the abstractions of human potential and the rights of man.

The ways in which artists — and the society that bred them — transferred internal conflicts to a “blank darkness,” to conveniently bound and violently silenced black bodies, is a major theme in American literature. The rights of man, for example, an organizing principle upon which the nation was founded, was inevitably yoked to Africanism. Its history, its origin is permanently allied with another seductive concept: the hierarchy of race…. The concept of freedom did not emerge in a vacuum. Nothing highlighted freedom — if it did not in fact create it — like slavery.

Black slavery enriched the country’s creative possibilities. For in that construction of blackness and enslavement could be found not only the not-free but also, with the dramatic polarity created by skin color, the projection of the not-me. The result was a playground for the imagination. What rose up out of collective needs to allay internal fears and to rationalize external exploitation was an American Africanism — a fabricated brew of darkness, otherness, alarm, and desire that is uniquely American.

Such a lovely passage describing such an ugly distinction.  She’s saying that for Caucasian plantation owners in the Tidewater colonies, the presence of Black slaves was a vivid and visceral reminder of what it means to be not-free and thus decidedly not-me.  For people like Jefferson and Washington and Madison, the most terrifying form of unfreedom was in their faces every day.  More than their pale brethren in the Northern colonies, they had a compelling desire to never be treated by the king even remotely like the way they treated their own slaves.  

“The concept of freedom did not emerge in a vacuum. Nothing highlighted freedom — if it did not in fact create it — like slavery.”


Public Schooling as Social Welfare

This post is a follow-up to a piece I posted three weeks ago, which was Michael Katz’s 2010 essay, Public Education as Welfare.  Below is my own take on this subject, which I wrote for a book that will be published in recognition of the hundredth anniversary of the Horace Mann League.  The tentative title of the book is Public Education: The Cornerstone of American Democracy and the editors are David Berliner and Carl Hermanns.  All of the contributions focus on the role that public schools play in American life.  Here’s a link to a pdf of my piece.

Public Schooling as Social Welfare

David F. Labaree

In the mid-nineteenth century, Horace Mann made a forceful case for a distinctly political vision of public schooling, as a mechanism for creating citizens for the American republic. In the twentieth century, policymakers put forth an alternative economic vision for this institution, as a mechanism for turning out productive workers to promote growth of the American economy. In this essay, I explore a third view of public schooling, which is less readily recognizable than the other two but no less important.  This is a social vision, in which public schooling serves as a mechanism for promoting social welfare, by working to ameliorate the inequalities of American society.

All three of these visions construe public schooling as a public good.  As a public good, its benefits flow to the entire community, including those who never attended school, by enriching the broad spectrum of political, economic, and social life.  But public schooling is also a private good.  As such, its benefits accrue only to its graduates, who use their diplomas to gain selective access to jobs at the expense of those who lack these credentials. 

Consider the relative costs and benefits of these two types of goods.  Investing in public goods is highly inclusive, in that every dollar invested goes to support the common weal.  But at the same time this investment is also highly contingent, since individuals will gain the benefits even if they don’t contribute, getting a free ride on the contributions of others.  The usual way around the free rider problem is to make such investment mandatory for everyone through the mechanism of taxation.  By contrast, investment in private goods is self-sustaining, with no state action needed.  Individuals have a strong incentive to invest because only they gain the benefit.  In addition, as a private good its effects are highly exclusive, benefiting some people at the expense of others and thus tending to increase social inequality. 

Like the political and economic visions of schooling, the welfare vision carries the traits of its condition as a public good.  Its scope is inclusive, its impact is egalitarian, and its sustainability depends heavily on state mandate.  But it lacks a key advantage shared by the other two, whose benefits clearly flow to the population as a whole.  Everyone benefits by being part of a polity in which citizens are capable, law-abiding, and informed.  Everyone benefits by being part of an economy in which workers contribute productively to the general prosperity. 

In contrast, however, it’s less obvious that everyone benefits from transferring public resources to disadvantaged citizens in order to improve their quality of life.  The word welfare carries a foul odor in American politics, redolent of laziness, bad behavior, and criminality.  It’s so bad that in 1980 the federal government changed the name of the Department of Health, Education, and Welfare to Health and Human Services just to get rid of the stigmatized term.

So one reason that the welfare function doesn’t jump to mind when you think of schools is that we really don’t want to associate the two.  Don’t besmirch schooling by calling it welfare.  Michael Katz caught this feeling in the opening sentences of his 2010 essay, “Public Education as Welfare,” which serves as a reference point for my own essay:  “Welfare is the most despised public institution in America. Public education is the most iconic. To associate them with each other will strike most Americans as bizarre, even offensive.”  But let’s give it a try anyway.

My own essay arises from the time when I’m writing it – the summer of 2020, during the early phases of the Covid-19 pandemic.  Like everyone else in the US, I watched in amazement this spring when schools suddenly shut down across the country and students started a new regime of online learning from home.  It started me thinking about what schools mean to us, what they do for us. 

Often it’s only when an institution goes missing that we come to recognize its value.  After the Covid shutdown, parents, children, officials, and citizens discovered just what they lost when the kids came home to stay.  You could hear voices around the country and around the globe pleading, “When are schools going to open again?”

I didn’t hear people talking much about the other two public goods views of schooling.  There wasn’t a groundswell of opinion complaining about the absence of citizenship formation or the falloff of human capital production.  Instead, there was a growing awareness of the various social welfare functions of schooling that were now suddenly gone.  Here are a few, in no particular order.

Schools are the main source of child care for working parents.  When schools close, someone needs to stay home to take care of the younger children.  For parents with the kind of white collar jobs that allow them to work from home, this causes a major inconvenience as they try to juggle work and child care and online schooling.  But for parents who can’t phone in their work, having to stay home with the kids is a huge financial sacrifice, and it’s even bigger for single parents in this category.

Schools are a key place for children to get healthy meals.  In the U.S., about 30 million students receive free or discounted lunch (and often breakfast) at school every day.  It’s so common that researchers use the proportion of “students on free or reduced lunch” as a measure of the poverty rate in individual schools.  When schools close, these children go hungry.  In response to this problem, a number of closed school systems have continued to prepare these meals for parents to pick up and take home with them.

Schools are crucial for the health of children.  In the absence of universal health care in the U.S., schools have served as a frail substitute.  They require all students to have vaccinations.  They provide health education.  And they have school nurses who can check for student ailments and make referrals.

Schools are especially important for dealing with the mental health of young people.  Teachers and school psychologists can identify mental illness and serve as prompts for getting students treatment.  Special education programs identify developmental disabilities in students and devise individualized plans for treating them.

Schools serve as oases for children who are abused at home.  Educators are required by law to look out for signs of mental or physical abuse and to report these cases to authorities.  When schools close, these children are trapped in abusive settings at home, which gives the lie to the idea of sheltering in place.  For many students, the true shelter is the school itself.  In the absence of teacher referrals, agencies reported a sharp drop-off in the reports of child abuse.

Schools are domains of relative safety for students who live in dangerous neighborhoods.  For many kids who live in settings with gangs, drugs, and crime, getting to and from school is the most treacherous part of the day.  Once inside the walls of the school, they are relatively free of physical threats.  Closing school doors to students puts them at risk.

Schools are environments that are often healthier than their own homes.  Students in wealthy neighborhoods may look on schools in poor neighborhoods as relatively shabby and depressing, but for many children the buildings have a degree of heat, light, cleanliness, and safety that they can’t find at home.  These schools may not have swimming pools and tennis courts, but they also don’t have rats and refuse.

Schools may be the only institutional setting for many kids in which the professional norm is to serve the best interests of the child.  We know that students can be harmed by schools.  All it takes is a bully or a disparaging judgment.  But the core of the educator’s job is to foster growth, spur interest, increase knowledge, enhance skill, and promote development.  Being cut off from such an environment for a long period of time is a major loss for any student, rich or poor.

Schools are one of the few places in American life where young people undergo a shared experience.  This is especially true at the elementary level, where most children in a neighborhood attend the same school and undergo a relatively homogeneous curriculum.  It’s less true in high school, where the tracked curriculum provides more divergent experiences.  A key component of the shared experience is that it places you face-to-face with students who may be different from you.  As we have found, when you turn schooling into online learning, you tend to exacerbate social differences, because students are isolated in disparate family contexts where there is a sharp divide in internet access. 

Schools are where children socialize with each other.  A key reason kids want to go to school is that their friends are there.  It’s where they make friends they otherwise would never have met, learn to maintain these friendships, and learn how to manage conflicts.  Humans are thoroughly social animals, who need interaction with others in order to grow and thrive.  So being cooped up at home leaves everyone, but especially children, without a central component of human existence.

Schools are the primary public institution for overseeing the development of young children into healthy and capable adults.  Families are the core private institution engaged in this process, but schools serve as the critical intermediary between family and the larger society.  They’re the way our children learn how to live and engage with other people’s children, and they’re a key way that society seeks to ameliorate social differences that might impede children’s development, serving as what Mann called “the great equalizer of the conditions of men – the balance wheel of the social machinery.”

These are some aspects of schooling that we take for granted but don’t think about very much.  For policymakers, they may be considered side effects of the school’s academic mission, but for many (maybe most) families they are a main effect.  And the various social support roles that schools play are particularly critical in a country like the United States, where the absence of a robust social welfare system means that schools stand as the primary alternative.  The schools’ absence made the heart grow fonder.  We all became aware of just how much schools do for us.

Systems of universal public schooling did not arise in order to promote social welfare.  During the last 200 years, in countries around the world, the impetus came from the kind of political rationale that Horace Mann so eloquently put forward.  Public schools emerged as part of the process of creating nation states.  Their function was to turn subjects of the crown into citizens of the nation, or, as Eugen Weber put it in the title of his wonderful book, to turn Peasants into Frenchmen.  Schools took localized populations with regional dialects and traditional authority relations and helped affiliate these populations with an imagined community called France or the United States.  They created a common language (in the case of France, Parisian French), a shared sense of national membership, and a shared educational experience. 

This is the origin story of public schooling.  But once schools became institutionalized and the state’s existence grew relatively secure, they began to accumulate other functions, both private (gaining an edge in the competition for social position) and public (promoting economic growth and supporting social welfare).  In different countries these functions took different forms, and the load the state placed on schooling varied considerably.  The American case, as is so often true, was extreme.

The U.S. bet the farm on the public school.  It was relatively early in establishing a system of publicly funded and governed schools across the country in the second quarter of the nineteenth century.  But it was way ahead of European countries in its rapid upward expansion of the system.  Universal enrollment moved quickly from primary school to grammar school to high school.  By 1900, the average American teenager had completed eight years of schooling.  This led to a massive surge in high school enrollments, which doubled every decade between 1890 and 1940.  By 1951, 75 percent of 16-year-olds were enrolled in high school, compared to only 14 percent in the United Kingdom.  In the three decades after the Second World War, the surge spilled over into colleges, with the rate of enrollment between 1950 and 1980 rising from 9 to 40 percent of the eligible population.

The US system had an indirect connection to welfare even before it started acting as a kind of social service agency.  The short version of the story is this.  In the second half of the nineteenth century, European countries like Disraeli’s United Kingdom and Bismarck’s Germany set up the framework for a welfare state, with pensions and other elements of a safety net for the working class.  The U.S. chose not to take this route, which it largely deferred until the 1930s.  Instead it put its money on schooling.  The vision was to provide individuals with educational opportunities to get ahead on their own rather than to give them direct aid to improve their current quality of life.  The idea was to focus on developing a promising future rather than on meeting current needs.  People were supposed to educate their way out of poverty, climbing up the ladder with the help of state schooling.  The fear was that providing direct relief for food, clothing, and shelter – the dreaded dole – would only stifle their incentive to get ahead.  Better to stimulate the pursuit of future betterment than to run the risk that people might get used to subsisting comfortably in the present. 

By nature, schooling is a forward-looking enterprise.  Its focus is on preparing students for their future roles as citizens, workers, and members of society rather than on helping them deal with their current living conditions.  By setting up an educational state rather than a welfare state, the U.S. in effect chose to write off the parents, seen as a lost cause, and concentrate instead on providing opportunities to the children, seen as still salvageable. 

In the twentieth century, spurred by the New Deal’s response to the Great Depression, the U.S. developed the rudiments of a welfare state, with pensions and then health care for the elderly, temporary cash support and health care for the poor, and unemployment insurance for the worker.  At the same time, schools began to deal with the problems arising from poverty that students brought with them to the classroom.  This was propelled by a growing understanding that hungry, sick, and abused children are not going to be able to take advantage of educational opportunities in order to attain a better life in the future.  Schooling alone couldn’t provide the chance for schooling to succeed.  Thus the introduction of free meals, the school nurse, de facto day care, and other social-work activities in the school. 

The tale of the rise of the social welfare function of the American public school, therefore, is anything but a success story.  Rather, it’s a story of one failure on top of another.  First is the failure to deal directly with social inequality in American life, when instead we chose to defer the intervention to the future by focusing on educating children while ignoring their parents.  Second, when poverty kept interfering with the schooling process, we introduced rudimentary welfare programs into the school in order to give students a better chance, while still leaving poor parents to their own devices. 

As with the American welfare system in general, school welfare is not much but it’s better than nothing.  Carrying on the pattern set in the nineteenth century, we are still shirking responsibility for dealing directly with poverty through the political system by opposing universal health care and a strong safety net.  Instead, we continue to put our money on schooling as the answer when the real solution lies elsewhere.  Until we decide to implement that solution, however, schooling is all we’ve got. 

In the meantime, schools serve as the wobbly but indispensable balance wheel of American social life.  Too bad it took a global pandemic to get us to realize what we lose when schools close down.


Farnsworth on Balancing Saxon and Latinate Words in Your Writing

This post focuses on the value of using an apt mix of Saxon and Latinate words in your writing.  It draws on a book by Ward Farnsworth called Farnsworth’s Classical English Style.  English has a wonderfully polyglot heritage to draw upon — starting with an ancient form of German brought by early Saxon invaders, then Danish brought by Vikings, and finally French brought by the last set of conquerors.

The primary poles of the language remain the Saxon and the Latinate, and this polarity provides a rich array of possibilities for authors seeking to enhance the effectiveness of their writing.  The two forms of words have strikingly different characteristics, which the skilled writer can use to considerable effect.


You don’t need the Oxford English Dictionary to distinguish the two kinds of words from each other.  Here’s how Farnsworth puts it:

Saxon words are shorter, and in their simplest forms they usually consist of just one syllable. Latinate words usually have a root of two or three syllables, and then can be lengthened further and turned into other parts of speech.

The simplest guide, useful often but not always, is this: if a word ends with -tion, or if it could be made into a similar word that does, then it almost always is derived from Latin. Same if it easily takes other suffixes that turn it into a longish word.

A key difference is the sound:

Saxon words tend to sound different from Latinate words in ways distantly related to the sounds of the modern German and French languages. Many Saxon words have hard sounds like ck or the hard g. Latinate words are usually softer and more mellifluous.

Another central difference is between high and low speech, formal and informal speech.

When French arrived in England it was the language of the conqueror and the new nobility. A thousand years later, words from French still connote a certain fanciness and distance from the gritty, and Saxon words still seem plainer, less formal, and closer to the earth. If you want to talk clinically about something distasteful, you use the Latinate word for it – the one derived from old French: terminate or execute (Latinate) instead of kill (Saxon).

As the conceptual life of English speakers became more sophisticated, they needed new words to talk about what they were thinking. They usually made them out of French or more directly from Latin or Greek. That is part of why people who teach at universities find it hard to prefer Saxon words when they have their conversations. Most of them would probably write better if they did use more Saxon words, but there are lots of tempting Latinate words that seem designed for academic purposes, because they were. They allow a kind of precision (or facilitate a kind of jargon) that Saxon words cannot match.

As Farnsworth notes, this high-low difference offers both an opportunity and a challenge for academics.  We need Latinate words in order to achieve the desired precision and complexity of argument and to deal with abstraction — all central components of academic discourse.  But we also may lean toward the Latinate simply because it makes us sound and feel more professional, unsullied by common speech.  This not only puts off the nonacademic reader but also clouds clarity and reduces impact for all readers.

Yet another distinction in these types of words is between the visual and the conceptual, the felt and the thought.

Saxon words tend to be easier to picture than the Latinate kind, most of which need a minor moment of translation before they appear in the mind’s eye. Compare light (Saxon) and illumination (Latinate), bodily (Saxon) and corporeal (Latinate), burn (Saxon) and incinerate (Latinate). The difference between visual and conceptual is related to the ways that these kinds of words can speak to the different capacities of an audience. Latinate words tend to create distance from what they describe. They invite thought but not feeling. Saxon words are more visceral. They take a shorter path to the heart.

Here he lays out a central principle of good writing in English.

For most people most of the time, attractive English isn’t the art of choosing beautiful words. It is the art of arranging humble words beautifully.

Here’s an example from Lincoln’s Gettysburg Address:

The world will little note, nor long remember what we say here, but it can never forget what they did here.

He provides some other vivid examples from that wellspring of good writing, the King James Bible.

And God said, Let there be light: and there was light. (Gen. 1:3)

Ask, and it shall be given you; seek, and ye shall find; knock, and it shall be opened unto you. (Matt. 7:7)

And ye shall know the truth, and the truth shall make you free. (John 8:32)

Greater love hath no man than this, that a man lay down his life for his friends. (John 15:13)

Every word of those passages is Saxon. The gravity of their meaning matches the simplicity of their wording.

Perhaps more precisely, the sense of weight is increased by the contrast between the size of the meanings and the size of the words. A big thing has been pressed into a small container. The result is a type of tension. It gets released in the mind of the reader.

I love the way he captures the dynamics of powerful writing — creating a “tension” that “gets released in the mind of the reader.”

One way the writer uses the difference in character of the two kinds of words is by deploying them at different places within the same sentence:  starting Latinate and ending Saxon, or the other way around.  Consider what happens when you use the first approach.

Starting with Latinate words creates a sense of height and abstraction. Ending with plain language brings the sentence onto land. The simplicity of the finish can also lend it a conclusive ring. And the longer words give the shorter ones a power, by force of contrast, that the shorter ones would not have had alone.

Lincoln is well-known for his love of simple language, but he was also at home with Latinate words and mixed the two types to strong effect. He especially liked to circle with larger words early in a sentence and then finish it simply. The pattern allowed him to offer intellectual or idealistic substance and then tie it to a stake in the dirt.

Here’s an example, from a letter he wrote about the Emancipation Proclamation:

But the proclamation, as law, either is valid or is not valid. If it is not valid it needs no retraction. If it is valid it cannot be retracted, any more than the dead can be brought to life.

An example from Winston Churchill:

They will never understand how it was that a victorious nation, with everything in hand, suffered themselves to be brought low, and to cast away all that they had gained by measureless sacrifice and absolute victory – gone with the wind!

And another from an opinion by Oliver Wendell Holmes.  The setup is Latinate, the punchline is Saxon.

If there is any principle of the Constitution that more imperatively calls for attachment than any other it is the principle of free thought – not free thought for those who agree with us but freedom for the thought that we hate.

Frederick Douglass:

Inaction is followed by stagnation. Stagnation is followed by pestilence and pestilence is followed by death.

But you can also reverse the direction to good effect, starting Saxon and ending Latinate.

A frequent product of this pattern is a sense of compression released. Moving from Saxon to Latinate words makes the first part of a sentence feel compact, the rest expansive. The last part thus gains a kind of push.

The way of the Lord is strength to the upright: but destruction shall be to the workers of iniquity. (Prov. 10:29)

In that last case the good and the strong are described in simple words. The long words are reserved for the villains.

How great are his signs! and how mighty are his wonders! his kingdom is an everlasting kingdom, and his dominion is from generation to generation. (Dan. 4:3)

These sentences go from a tight start to a finish that flows freely and gains in height. The result can be a feeling of increasing grandeur, like passing from a low ceiling into a room with a higher one.

Lincoln again:

I insist that if there is any thing which it is the duty of the whole people to never entrust to any hands but their own, that thing is the preservation and perpetuity of their own liberties and institutions.

And Churchill:

You ask, what is our policy? I can say: It is to wage war, by sea, land, and air, with all our might and with all the strength that God can give us; to wage war against a monstrous tyranny, never surpassed in the dark and lamentable catalogue of human crime. That is our policy.

Try creating and releasing tension in your own sentences through a judicious mix of Saxon and Latinate words.

 


Are Students Consumers?

This post is a piece I published in Education Week way back in 1997.  It’s a much shorter and more accessible version of the most cited paper I ever published, “Public Goods, Private Goods: The American Struggle over Educational Goals.”  Drawing on the latter, it lays out the case that three competing educational goals have shaped the history of American schooling: democratic equality, social efficiency, and social mobility. 

In reading it over, I find it holds up rather well, except for a tendency to demonize social mobility.  Since then I’ve come to think that, while this goal does a lot of harm, it’s also an essential component of schooling.  We can’t help but be concerned about the selective benefits that schooling provides us and our children, even as we also want to support the broader benefits that schooling provides the public as a whole.

See what you think.  Here’s a link to the original and also to a PDF in case you can’t get past the paywall.  

 

Are Students “Consumers”?

David F. Labaree

Observers of American education have frequently noted that the general direction of educational reform over the years has not been forward but back and forth. Reform, it seems, is less an engine of progress than a pendulum, swinging monotonously between familiar policy alternatives. Progress is hard to come by.

However, a closer reading of the history of educational change in this country reveals a pattern that is both more complex and in a way more troubling than this. Yes, the back-and-forth movement is real, but it turns out that this pattern is for the most part good news. It simply represents a periodic shift in emphasis between two goals for education — democratic equality and social efficiency — that represent competing but equally indispensable visions of education.

The bad news is that in the 20th century, and especially in the past several decades, the pendulum swings increasingly have given way to a steady movement in the direction of a third goal, social mobility. This shift from fluctuation to forward motion may look like progress, but it’s not. The problem is that it represents a fundamental change in the way we think about education, by threatening to transform this most public of institutions from a public good into a private good. The consequences for both school and society, I suggest, are potentially devastating.

Let me explain why. First we’ll consider the role that these three goals have played in American education, and then we can explore the implications of the movement from equality and efficiency to mobility.

The first goal is democratic equality, which is the oldest of the three. From this point of view, the purpose of schooling is to produce competent citizens. This goal provided the primary impetus for the common school movement, which established the foundation for universal public education in this country during the middle of the 19th century. The idea was and is that all citizens need to be able to think, understand the world around them, behave sociably, and act according to shared political values — and that public schools are the best places to accomplish these ends. The corollary of this goal is that all these capabilities need to be equally distributed, and that public schools can serve as what Horace Mann called the great “balance wheel,” by providing a common educational competence that helps reduce differences.

Some of the most enduring and familiar characteristics of our current system of education were formed historically in response to this goal. There are the neighborhood elementary school and the comprehensive high school, which draw together students from the whole community under one roof. There is the distinctively American emphasis on general education at all levels of the educational system. There is the long-standing practice of socially promoting students from grade to grade. And there is the strong emphasis on inclusion, which over the years has led to such innovations as racial integration and the mainstreaming of special education students.

The second goal is social efficiency, which first became prominent in the Progressive era at the turn of the century. From this perspective, the purpose of education is not to produce citizens but to train productive workers. The idea is that our society’s health depends on a growing economy, and the economy needs workers with skills that will allow them to carry out their occupational roles effectively. Schools, therefore, should place less emphasis on general education and more on the skills needed for particular jobs. And because skill requirements differ greatly from job to job, schools need to tailor curricula to the job and then sort students into the different curricula.

Consider some of the enduring effects that this goal has had on education over the years. There is the presence of explicitly vocational programs of study within the high school and college curriculum. There is the persistent practice of tracking and ability grouping. And there is the prominence of social efficiency arguments in the public rhetoric about education, echoing through every millage election and every race for public office in the past half-century. We are all familiar with the argument that pops up on these occasions — that education is the keystone of the community’s economic future, that spending money on education is really an investment in human capital that will pay big dividends.

Notice that the first two goals are in some ways quite different in the effects they have had on schools. One emphasizes a political role for schools while the other stresses an economic role. One pushes for general education, the other for specialized education. One homogenizes, the other differentiates.

But from another angle, the two take a similar approach, because they both treat education as a public good. A public good is one that benefits all members of a community, which means that you cannot avoid being affected by it. For example, police protection and road maintenance have an impact directly or indirectly on the life of everyone. Likewise, everyone stands to gain from a public school system that produces competent citizens and productive workers, even those members of the community who don’t have children in public schools.

This leads us to something that is quite distinctive about the third educational goal, the one I call social mobility. From the perspective of this goal, education is not a public good but a private good. If the first goal for education takes the viewpoint of the citizen and the second takes that of the taxpayer, the third takes the viewpoint of the individual educational consumer.

The purpose of education from this angle is not what it can do for democracy or the economy but what it can do for me. Historically, education has paid off handsomely for individuals who stayed in school and came away with diplomas. Educational credentials have made it possible for people to distinguish themselves from their competitors, giving them a big advantage in the race for good jobs and a comfortable life. As a result, education has served as a springboard to upward mobility for the working class and a buttress against downward mobility for the middle class.

Note that if education is going to serve the social-mobility goal effectively, it has to provide some people with benefits that others don’t get. Education in this sense is a private good that only benefits the owner, an investment in my future, not yours, in my children, not other people’s children. For such an educational system to work effectively, it needs to focus a lot of attention on grading, sorting, and selecting students. It needs to provide a variety of ways for individuals to distinguish themselves from others — such as by placing themselves in a more prestigious college, a higher curriculum track, the top reading group, or the gifted program. In this sense the social-mobility goal reinforces the same sorting and selecting tendency in education that is promoted by the social-efficiency goal, but without the same concern for providing socially useful skills.

Now that I’ve spelled out some of the main characteristics of these three goals for education, let me show how they can help us understand the major swings of the pendulum in educational reform over the last 200 years.

During the common school era in the mid-19th century, the dominant goal for American education was democratic equality. The connection between school and work at this point was weak. People earned job skills on the job rather than in school, and educational credentials offered social distinction but not necessarily preference in hiring.

By the end of the 19th century, however, both social efficiency and social mobility emerged as major factors in shaping education, while the influence of democratic equality declined. High school enrollments began to take off in the 1890s, which posed two big problems for education — a social-efficiency problem (how to provide education for the new wave of students), and a social-mobility problem (how to protect the value of high school credentials for middle-class consumers). The result was a series of reforms that defined the Progressive era in American education during the first half of the 20th century. These included such innovations as tracking, ability testing, ability grouping, vocationalism, special education, social promotion, and life adjustment.

Then in the 1960s and 1970s we saw a swing back from social efficiency to democratic equality (reinforced by the social-mobility goal). The national movement for racial equality brought pressure to integrate schools, and these arguments for political equality and individual opportunity led to a variety of related reforms aimed at reducing educational discrimination based on class, gender, and handicapping condition.

But in the 1980s and 1990s, the momentum shifted back from democratic equality to social efficiency — again reinforced by social mobility. The emerging movement for educational standards responded both to concerns about declining economic competitiveness (seen as a deficiency of human capital) and to concerns about a glut of high school and college credentials (seen as a threat to social mobility).

However, another way to think about these historical trends in educational reform is to turn attention away from the pendulum swings between the first two goals and to focus instead on the steady growth in the influence of the third goal throughout the last 100 years. Since its emergence as a factor in the late 19th century, social mobility has gradually grown to become the dominant goal in American education. Increasingly, neither of the other two goals can make strong headway except in alliance with the third. Only social mobility, it seems, can afford to go it alone any longer. A prime example is the recent push for educational choice, charters, and vouchers. This is the strongest educational reform movement of the 1990s, and it is grounded entirely within the consumer-is-king perspective of the social-mobility goal.

So, you may ask, what are the implications of all this? I want to mention two problems that arise from the history of conflicting goals in American education — one deriving from the conflict itself and the other from the emerging dominance of social mobility. The second problem is more serious than the first.

On the issue of conflict: Contradictory goals have shaped the basic structure of American schools, and the result is a system that is unable to accomplish any one of these goals very effectively — which has been a common complaint about schools. Also, much of what passes for educational reform may be little more than ritual swings back and forth between alternative goals — another common complaint. But I don’t think this problem is really resolvable in any simple way. Americans seem to want and need an education system that serves political equality and economic productivity and personal opportunity, so we might as well learn how to live with it.

The bigger problem is not conflict over goals but the possible victory of social mobility over the other two. The long-term trend is in the direction of this goal, and the educational reform initiatives of the last decade suggest that this trend is accelerating. At the center of the current talk about education is a series of reforms designed to empower the educational consumer, and if they win out, this would resolve the tension between public and private conceptions of education decisively in favor of the private view. Such a resolution to the conflict over goals would hurt education in at least two ways.

First, in an educational system where the consumer is king, who will look after the public’s interest in education? As supporters of the two public goals have long pointed out, we all have a stake in the outcomes of public education, since this is the institution that shapes our fellow citizens and fellow workers. In this sense, the true consumers of education are all of the members of the community — and not just the parents of school children. But these parents are the only ones whose interests matter for the school choice movement, and their consumer preferences will dictate the shape of the system.

A second problem is this: In an educational system where the opportunity for individual advancement is the primary focus, it becomes more important to get ahead than to get an education. When the whole point of education is not to ensure that I learn valuable skills but instead to give me a competitive social advantage, then it is only natural for me to focus my ingenuity as a student toward acquiring the most desirable grades, credits, and degrees rather than toward learning the curriculum.

We have already seen this taking place in American education in the past few decades. Increasingly, students have been acting more like smart consumers than eager learners. Their most pointed question to the teacher is “Will this be on the test?” They see no point in studying anything that doesn’t really count. If the student is the consumer and the goal is to get ahead rather than to get an education, then it is only rational for students to look for the best deal. And that means getting the highest grades and the most valuable credentials for the lowest investment of effort. As cagey consumers, children in school have come to be like the rest of us when we’re in the shopping mall: They hate to pay full price when they can get the same product on sale.

That’s the bad news from this little excursion into educational history, but don’t forget the good news as well. For 200 years, Americans have seen education as a central pillar of public life. The contradictory structure of American education today has embedded within it an array of social expectations and instructional practices that clearly express these public purposes. There is reason to think that Americans will not be willing to let educational consumerism drive this public-ness out of the public schools.

Posted in Academic writing, Uncategorized

Agnes Callard — Publish and Perish

This post is a recent essay by philosopher Agnes Callard about the problems with academic writing.  It was published in The Point.  

In this essay, she explores the way that professionalization has ruined scholarly writing.  The need to sound professional and publish in the kinds of serious journals that are the route to tenure and academic recognition has diverted us from using writing to communicate ideas to a broad audience in favor of writing to signal scholarly cred. 

Early on she makes a confession that many of us could make: “Although I love to read, and read a lot, little of my reading comes from recent philosophy journals.”  Sound familiar?  It sure resonates with me.  We tend to read journals in our field out of duty — reviewing for journals, checking on a possible citation for our own work — rather than out of any hope that this will provide us with enlightenment.

Much of her discussion is familiar territory, part of the literature on the failures of academic writing that I have highlighted in this blog (e.g., here and here).  But she adds a qualifier that I find intriguing:

The sad thing about being stuck reading narrow, boring, abstruse papers is not how bad they are, but how good they are. When I am enough of an insider to be in a position to engage the writer in back-and-forth questioning, either in speech or in writing, that process of objection and pushback tends to expose a real and powerful line of thought driving the piece. Philosophers haven’t stopped loving knowledge, despite the increasingly narrow confines within which we must, if we are to survive, pursue it.

It’s not that academics are failing to come up with interesting ideas.  It’s that they feel compelled to bury these ideas under heaps of jargon, turgid prose, and professional posturing.  In conversation, these scholars can explain what’s cool about their work.  But it’s obviously counterproductive to make the reader do all the work of unpacking your argument into a discernible and compelling form.

Recall Stephen Toulmin’s point, which I use as the epigraph for my syllabus on academic writing: “The effort the writer does not put into writing, the reader has to put into reading.”

Enjoy.

Publish and Perish

Agnes Callard

These words exist for you to read them. I wrote them to try to convey some ideas to you. These are not the first words I wrote for you—those were worse. I wrote and rewrote, with a view to clarifying my meaning. I want to make sure that what you take away is exactly what I have in mind, and I want to be concise and engaging, because I am mindful of competing demands on your time and attention.

You might think that everything I am saying is trivial and obvious, because of course all writing is like this. Writing is a form of communication; it exists to be read. But that is, in fact, not how all writing works. In particular, it is not how academic writing works. Academic writing does not exist in order to communicate with a reader. In academia, or at least the part of it that I inhabit, we write, most of the time, not so much for the sake of being read as for the sake of publication.

Let me illustrate by way of a confession regarding my own academic reading habits. Although I love to read, and read a lot, little of my reading comes from recent philosophy journals. The main occasions on which I read new articles in my areas of specialization are when I am asked to referee or otherwise assess them, when I am helping someone prepare them for publication and when I will need to cite them in my own paper: the occasions, in other words, when reading them “counts.”

“Counts” being the operative word. What can be counted is what will get done. In the humanities, no one counts whether anyone reads our papers. Only whether they are published, and where. I have observed these pressures escalate over time: nowadays it is unsurprising when those merely applying to graduate schools have already published a paper or two.

Writing for the sake of publication—instead of for the sake of being read—is academia’s version of “teaching to the test.” The result is papers few actually want to read. First, the writing is hypercomplex. Yes, the thinking is also complex, but the writing in professional journals regularly contains a layer of complexity beyond what is needed to make the point. It is not edited for style and readability. Most significantly of all, academic writing is obsessed with other academic writing—with finding a “gap in the literature” as opposed to answering a straightforwardly interesting or important question.

Of course publication is a necessary step along the way to readership, but the academic who sets their sights on it is like the golfer or baseball player who stops their swing when they make contact with the ball. Without follow-through, what you get are short, jerky movements; we academics have become purveyors of small, awkwardly phrased ideas.

In making these claims about academic writing, I am thinking in the first instance of my own corner of academia—philosophy—though I suspect that my points generalize, at least over the academic humanities. To offer up one anecdote: in spring 2019 I was teaching Joyce’s Portrait of the Artist as a Young Man; since I don’t usually teach literature, I thought I should check out recent secondary literature on Joyce. What I found was abstruse and hypercomplex, laden with terminology and indirect. I didn’t feel I was learning anything I could use to make the meaning of the novel more accessible to myself or to my students. I am willing to take some of the blame here: I am sure I could have gotten something out of those pieces if I had been willing to put more effort into reading them. Still, I do not lack the intellectual competence required to understand analyses of Joyce; I feel all of those writers could have done more to write for me.

But whether my points generalize across the humanities or not, I will confess that I feel the urgency of the problem for philosophy much more than for some abstract entity called “the humanities.” I love Joyce, I love Homer, but I am not invested in the quality of current scholarship on either. It’s philosophy that I worry about.

When I am asked for sources of “big ideas” in philosophy—the kind that would get the extra-philosophical world to stand up and take notice—I struggle to list anyone born after 1950. It is sobering to consider that the previous decade produced: Daniel Dennett, Saul Kripke, David Lewis, Derek Parfit, John McDowell, Peter Singer, G. A. Cohen and Martha Nussbaum. In my view, each of these people towers over everyone who comes after them in at least one of the categories by which we might judge a philosopher: breadth, depth, originality or degree of public influence. Or consider this group, born in roughly the two decades prior (1919-1938), remarkable in its intellectual fertility: Elizabeth Anscombe, Philippa Foot, Stanley Cavell, Harry Frankfurt, Bernard Williams, Thomas Nagel, Robert Nozick, Richard Rorty, Hilary Putnam, John Rawls. These are the philosophers about whom one routinely asks, “Why don’t people write philosophy like this anymore?” And this isn’t only a point about writing style. Their work is inviting—it asks new questions, it sells the reader on why those questions matter and it presents itself as a point of entry into philosophy. This is why all of us keep assigning their work over and over again, a striking fact given how much the number of philosophers has ballooned since their time.

And it’s not just a matter of a few exceptional figures. A few years ago, I happened to browse through back issues of a top journal (Ethics) from 1940-1950—not an easy decade for the world, or academia. I went in assuming those papers would be of much lower quality than what is being put out now. Keep in mind, this is a time when not only was publication not required for getting a job, even a Ph.D. was not required; there were far fewer philosophers, and getting a paper accepted at a journal was a vastly less competitive process.

In general, I would describe the papers from that decade as lacking something in terms of precision, clarity and “scholarliness,” but also as being more engaging and ambitious, more heterogeneous in tone and writing style, and better written. Perhaps some amount of academic competition is salutary, but the all-consuming competition of recent years, it appears, has been less productive of excellence than of homogeneity and stagnation. Because the most reliable mark of “quality” is familiarity, the machine incentivizes keeping innovation to a minimum—only at the margin, just enough to get published. It constricts the space of thought. Over time, we end up with less and less to show for all the effort, talent and philosophical training we are throwing into philosophical research. If I wanted to make progress on one of my own papers, I’d certainly be better served with a paper from Ethics in 2020—I’m much more likely to want to cite it. But if I were just curiously browsing for some philosophical reading, I’d go for one of those back issues. We might be hitting more balls today, but none of them is going far.

Some see a way out: they call it “public philosophy.” But it is a mistake to think that this represents an escape from the problem I am describing. We do not have two systems for doing philosophy, “academic philosophy” and “public philosophy.” “Public philosophy,” including the piece of it you are currently reading, is written mostly by academic philosophers—which is to say, people who studied at, received Ph.D.s from, and in the vast majority of cases make a living by working within, the academic philosophy system.

I have no objection to applying the title “philosopher” broadly, including to those public intellectuals who have had so much more success in speaking to a general audience than I or any of my colleagues who operate more strictly within the confines of academic philosophy: from Judith Butler and Bruno Latour to Slavoj Žižek, Camille Paglia and Steven Pinker. But it is one thing to be a “philosopher” in the sense of being a source of intellectual inspiration to the public, or a subset thereof, and another to be a member of a philosophical community. The latter designation not only requires a person to be beholden to such a community argumentatively; it also calls for participation in the maintenance and self-reproduction of that community through education, training and management. Academic philosophy is the system we have. You can’t jump ship, because there’s nowhere to jump.

The sad thing about being stuck reading narrow, boring, abstruse papers is not how bad they are, but how good they are. When I am enough of an insider to be in a position to engage the writer in back-and-forth questioning, either in speech or in writing, that process of objection and pushback tends to expose a real and powerful line of thought driving the piece. Philosophers haven’t stopped loving knowledge, despite the increasingly narrow confines within which we must, if we are to survive, pursue it.

Some in the philosophical community will defend this “narrowing” as a sign of the increasingly scientific character of philosophy. But no matter how scientific some parts of philosophy become, the following difference will always remain: unlike science, philosophy cannot benefit those who don’t engage in it. Philosophical technology—ideas, arguments, distinctions, questions—cannot live outside the human mind.

One doesn’t need to idolize Socrates, as I happen to, to think that philosophy is an especially dialogical discipline. All academic work invites response in the weak sense of “there is always more to be said,” or “corrections welcome,” but philosophical talks, papers and books specifically aim to provoke, to incite, to court pushback and counterexample. Our task is not to take some questions off humanity’s plate, but to infect others with our need to find answers.

The philosopher is an especially needy kind of truth-seeker. Like vampires, zombies and werewolves, we are creatures who need company, and who will do whatever it takes to create it.

No one thinks that Plato, Descartes, Kant and the rest were right about everything; nonetheless, centuries and millennia later, we cannot stop talking not just about them, but to them, with them. They made us into one of them, and we need to keep paying that forward.

Posted in Higher Education, History of education, Inequality, Meritocracy, Public Good, Uncategorized

How NOT to Defend the Private Research University

This post is a piece I published today in the Chronicle Review.  It’s about an issue that has been gnawing at me for years.  How can you justify the existence of institutions of the sort I taught at for the last two decades — rich private research universities?  These institutions obviously benefit their students and faculty, but what about the public as a whole?  Is there a public good they serve, and if so, what is it?

Here’s the answer I came up with.  These are elite institutions to the core.  Exclusivity is baked in.  By admitting only a small number of elite students, they serve to promote social inequality by providing grads with an exclusive private good, a credential with high exchange value. But, in part because of this, they also produce valuable public goods — through the high-quality research and the advanced graduate training that only they can provide.

Open-access institutions can promote the social mobility that private research universities don’t, but they can’t provide research and advanced training of the same caliber.  The paradox is this:  It’s in the public’s interest to preserve the elitism of these institutions.  See what you think.

Hoover Tower

How Not to Defend the Private Research University

David F. Labaree

In this populist era, private research universities are easy targets that reek of privilege and entitlement. It was no surprise, then, when the White House pressured Harvard to decline $8.6 million in Covid-19-relief funds, while Stanford, Yale, and Princeton all judiciously decided not to seek such aid. With tens of billions of endowment dollars each, they hardly seemed to deserve the money.

And yet these institutions have long received outsized public subsidies. The economist Richard Vedder estimated that in 2010, Princeton got the equivalent of $50,000 per student in federal and state benefits, while its similar-size public neighbor, the College of New Jersey, got just $2,000 per student. Federal subsidies to private colleges include research grants, which go disproportionately to elite institutions, as well as student loan and scholarship funds. As recipients of such largess, how can presidents of private research universities justify their institutions to the public?

Here’s an example of how not to do so. Not long after he assumed the presidency of Stanford in 2016, Marc Tessier-Lavigne made the rounds of faculty meetings on campus in order to introduce himself and talk about future plans for the university. When he came to a Graduate School of Education meeting that I attended, he told us his top priority was to increase access. Asked how he might accomplish this, he said that one proposal he was considering was to increase the size of the entering undergraduate class by 100 to 200 students.

The problem is this: Stanford admits about 4.3 percent of the candidates who apply to join its class of 1,700. Admitting a couple hundred additional students might raise the admit rate to 5 percent. Now that’s access. The issue is that, for a private research university like Stanford, the essence of its institutional brand is its elitism. The inaccessibility is baked in.

Raj Chetty’s social mobility data for Stanford show that 66 percent of its undergrads come from the top 20 percent by income, 52 percent from the top 10 percent, 17 percent from the top 1 percent, and just 4 percent from the bottom 20 percent. Only 12 percent of Stanford grads move up by two quintiles or more — it’s hard for a university to promote social mobility when the large majority of its students starts at the top.

Compare that with the data for California State University at Los Angeles, where 12 percent of students are from the top quintile and 22 percent from the bottom quintile. Forty-seven percent of its graduates rise two or more income quintiles. Ten percent make it all the way from the bottom to the top quintile.

My point is that private research universities are elite institutions, and they shouldn’t pretend otherwise. Instead of preaching access and making a mountain out of the molehill of benefits they provide for the few poor students they enroll, they need to demonstrate how they benefit the public in other ways. This is a hard sell in our populist-minded democracy, and it requires acknowledging that the very exclusivity of these institutions serves the public good.

For starters, in making this case, we should embrace the emphasis on research production and graduate education and accept that providing instruction for undergraduates is only a small part of the overall mission. Typically these institutions have a much higher proportion of graduate students than large public universities oriented toward teaching (graduate students are 57 percent of the total at Stanford and just 8.5 percent in the California State University system).

Undergraduates may be able to get a high-quality education at private research universities, but there are plenty of other places where they could get the same or better, especially at liberal-arts colleges. Undergraduate education is not what makes these institutions distinctive. What does make them stand out are their professional schools and doctoral programs.

Private research universities are souped up versions of their public counterparts, and in combination they exert an enormous impact on American life.

As of 2017, the Association of American Universities, a club consisting of the top 65 research universities, represented just 2 percent of all four-year colleges and 12 percent of all undergrads. And yet the group accounted for over 20 percent of all U.S. graduate students; 43 percent of all research doctorates; 68 percent of all postdocs; and 38 percent of all Nobel Prize winners. In addition, its graduates occupy the centers of power, including, by 2019, 64 of the Fortune 100 CEOs; 24 governors; and 268 members of Congress.

From 2014 to 2018, AAU institutions collectively produced 2.4 million publications, and their collective scholarship received 21.4 million citations. That research has an economic impact — these same institutions have established 22 research parks and, in 2018 alone, they produced over 4,800 patents, over 5,000 technology license agreements, and over 600 start-up companies.

Put all this together and it’s clear that research universities provide society with a stunning array of benefits. Some of these benefits accrue to individual entrepreneurs and investors, but the benefits for society as a whole are extraordinary. These universities drive widespread employment, technological advances that benefit consumers worldwide, and the improvement of public health (think of all the university researchers and medical schools advancing Covid-19-research efforts right now).

Besides their higher proportion of graduate students and lower student-faculty ratio, private research universities have other major advantages over publics. One is greater institutional autonomy. A private research university is governed by a board of laypersons who own the university, control its finances, and appoint its officers. Government can dictate how the university uses the public subsidies it gets (except tax subsidies), but otherwise it is free to operate as an independent actor in the academic market. This allows these colleges to pivot quickly to take advantage of opportunities for new programs of study, research areas, and sources of funding, largely independent of political influence, though they do face a fierce academic market full of other private colleges.

A 2010 study of universities in Europe and the U.S. by Caroline Hoxby and associates shows that this mix of institutional autonomy and competition is strongly associated with higher rankings in the world hierarchy of higher education. They find that every 1-percent increase in the share of the university budget that comes from government appropriations corresponds with a decrease in international ranking of 3.2 ranks. At the same time, each 1-percent increase in the university budget from competitive grants corresponds with an increase of 6.5 ranks. They also found that universities high in autonomy and competition produced more patents.
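To get a feel for the size of these estimates, here is a rough extrapolation of my own (it treats the study’s estimates as linear and additive, which is an assumption of this illustration, not a claim of the study):

\[
\text{predicted rise in world rank} \;\approx\; 3.2 \times (\text{points of budget share moved out of appropriations}) \;+\; 6.5 \times (\text{points moved into competitive grants})
\]

On this naive reading, a university that shifted 10 points of its budget share from government appropriations to competitive grants would be predicted to climb roughly \(3.2 \times 10 + 6.5 \times 10 \approx 97\) places.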

Another advantage the private research universities enjoy over their public counterparts, of course, is wealth. Stanford’s endowment is around $28 billion, and Berkeley’s is just under $5 billion, and because Stanford is so much smaller (16,000 versus 42,000 total students), the advantage is multiplied: Stanford’s endowment per student dwarfs Berkeley’s. The result is that private universities have more research resources: better labs, libraries, and physical plant; higher faculty pay (e.g., $254,000 for full professors at Stanford, compared to $200,000 at Berkeley); more funding for grad students; and more staff support.

A central asset of private research universities is their small group of academically and socially elite undergraduate students. The academic skill of these students is an important draw for faculty, but their current and future wealth is particularly important for the institution. From a democratic perspective, this wealth is a negative. The student body’s heavy skew toward the top of the income scale is a sign of how these universities are not only failing to provide much social mobility but are in fact actively engaged in preserving social advantage. We need to be honest about this issue.

But there is a major upside. Undergraduates pay their own way (as do students in professional schools), but the advanced graduate students don’t — they get free tuition plus a stipend to pay living expenses, which is subsidized, both directly and indirectly, by undergrads. The direct subsidy comes from the high sticker price undergrads pay for tuition. Part of this goes to help out upper-middle-class families who still can’t afford the tuition, but the rest goes to subsidize grad students.

The key financial benefits from undergrads come after they graduate, when the donations start rolling in. The university generously admits these students (at the expense of many of their peers), provides them with an education and a credential that jump-starts their careers and papers over their privilege, and then harvests their gratitude over a lifetime. Look around any college campus — particularly at a private research university — and you will find that almost every building, bench, and professor bears the name of a grateful donor. And nearly all of the money comes from former undergrads or professional school students, since it is they, not the doctoral students, who go on to earn the big bucks.

There is, of course, a paradox. Perhaps the gross preservation of privilege these schools traffic in serves a broader public purpose. Perhaps providing a valuable private good for the few enables the institution to provide an even more valuable public good for the many. And yet students who are denied admission to elite institutions are not being denied a college education and a chance to get ahead; they’re just being redirected. Instead of going to a private research university like Stanford or a public research university like Berkeley, many will attend a comprehensive university like San José State. Only the narrow metric of value employed at the pinnacle of the American academic meritocracy could construe this as a tragedy. San José State is a great institution, which accepts the majority of the students who apply and which sends a huge number of graduates to work in the nearby tech sector.

The economist Miguel Urquiola elaborates on this paradox in his book, Markets, Minds, and Money: Why America Leads the World in University Research (Harvard University Press, 2020), which describes how American universities came to dominate the academic world in the 20th century. The 2019 Shanghai Academic Ranking of World Universities shows that eight of the top 10 universities in the world are American, and seven of these are private.

Urquiola argues that the roots of American academe’s success can be found in its competitive marketplace. In most countries, universities are subsidiaries of the state, which controls their funding, defines their scope, and sets their policy. By contrast, American higher education has three defining characteristics: self-rule (institutions have autonomy to govern themselves); free entry (institutions can be started up by federal, state, or local governments or by individuals who acquire a corporate charter); and free scope (institutions can develop programs of research and study on their own initiative without undue governmental constraint).

The result is a radically unequal system of higher education, with extraordinary resources and capabilities concentrated in a few research universities at the top. Caroline Hoxby estimates that the most selective American research universities spend an average of $150,000 per student, 15 times as much as some poorer institutions.

As Urquiola explains, the competitive market structure puts a priority on identifying top research talent, concentrating this talent and the resources needed to support it in a small number of institutions, and motivating these researchers to ramp up their productivity. This concentration then makes it easy for major research-funding agencies, such as the National Institutes of Health, to identify the institutions that are best able to manage the research projects they want to support. And the nature of the research enterprise is such that, when markets concentrate minds and money, the social payoff is much greater than if they were dispersed more evenly.

Radical inequality in the higher-education system therefore produces outsized benefits for the public good. This, paradoxical as it may seem, is how we can truly justify the public investment in private research universities.

David Labaree is a professor emeritus at the Stanford Graduate School of Education.

Posted in Capitalism, Culture, Meritocracy, Uncategorized

Clare Coffey — Closing Time: We’re All Counting Bodies

This is a lovely essay by Clare Coffey from the summer issue of the Hedgehog Review.  In it she explores the extremes of contemporary American life: those who have been shunted aside in the knowledge economy and consigned to deaths of despair, and those who occupy the flashiest reaches of the new uber class.  She does this through an adept analysis of two recent books:  Deaths of Despair and the Future of Capitalism, by Anne Case and Angus Deaton; and Very Important People: Status and Beauty in the Global Party Circuit, by Ashley Mears.  In combination, the books tell a powerful story.

Closing Time

We’re All Counting Bodies

Clare Coffey

Lenin’s maxim that “there are decades when nothing happens, and there are weeks when decades happen” can be tough on writers. You spend years carefully marshaling an argument, anticipating objections, tightening your focus, sacrificing claims that might interfere with the suasion of your central point, and then—bam, the gun goes off. Something happens that makes the point toward which you were gently cajoling the reader not only obvious but insufficient. Your thoroughbred stands ready, but the rest of the field has already left the gate.

So it is with Deaths of Despair and the Future of Capitalism. In 2014, Princeton economists Anne Case and Angus Deaton, the latter a Nobel Prize winner, noted that for the first time, the mortality rate among white Americans without a college degree was climbing rather than dropping; further, while members of this group remained relatively advantaged compared to their black peers, the two cohorts’ mortality rates were moving in opposite directions. Case and Deaton found that a significant portion of this hike in mortality was due to deaths from alcoholism, drug use, and suicide—phenomena which, bundled together, they labeled “deaths of despair.”

Deaths of Despair Cover

Six years later, in this new book, the two economists attempt to turn these observations into a thesis: What can this horrifying data tell us about American society at large? Instead of linking the deaths to any single deprivation, the authors place them in a context of wholesale loss of social status and coherent identity for those without purchase in the knowledge professions—a loss that encompasses wage stagnation, the decline of union power, and the transition from a manufacturing to a service economy.

For Case and Deaton, the closing of a factory involves all three, and cannot be understood strictly in terms of lost earnings or job numbers. Even in a “success” story, in which workers get new jobs at a staffing agency or an Amazon fulfillment center, a qualitative catastrophe occurs: to the prestige of difficult, directly productive work; to a measure of democratic control over the conditions of work; to the sense of valued belonging to socially important organizations; to the norms governing work, marriage, and sociality that developed in a particular material context, and which cannot simply transfer over or remake themselves overnight. At least some of these losses are downstream of sectoral transition only insofar as firm structure and historic labor organization are concerned. There is no purely sectoral reason for companies to outsource all non-knowledge jobs to staffing companies, or for Amazon to fire whistleblowers. The differences between NYC taxis and Uber lie in the fact that one has a union and the other classifies its workers as independent contractors, not in NAICS codes. But however carefully you parse the causes, deaths of despair are the final result of a long, slow social death.

Who are the culprits? Case and Deaton are careful not to absolve capitalism, but they insist that the problem is not really capitalism itself but its abuses: “We are not against capitalism. We believe in the power of competition and free markets. Capitalism has brought an end to misery and death for millions in now rich countries over the past 250 years and, much more rapidly, in countries like India and China, over the past 50 years.” This qualification is not unique to them; it takes different forms, from the regulatory reformism of political liberals such as Elizabeth Warren to the attacks on “crony capitalism” of doctrinaire libertarians, for whom the true free market has not yet been tried. For Case and Deaton, the big-picture problem is unchecked economic trends that encourage “upward redistribution”; their more specific and more representative target is a rent-seeking health-care industry.

Their complaint is not only that companies like Purdue Pharma arguably jump-started the opioid epidemic by hard-selling their pain medications and concealing these drugs’ addictive potential. Case and Deaton also argue that the health-care sector has eaten up American wage gains with insurance costs, funneling more and more money to health-care spending while delivering less and less in terms of health outcomes. The numbers the authors have assembled are convincing. But who at this juncture needs to be convinced? A teenager recently died of COVID-19 after being turned away from an urgent care clinic for lack of insurance. Hospital personnel are getting laid off in the midst of a pandemic to stanch balance sheet losses resulting from delayed elective care. Hospitals that have been operated on the basis of years of business school orthodoxy lack the extra capacity to deal with anything more momentous than a worse-than-usual flu season. Who is in any serious doubt that the American health-care system is cobbled together out of rusty tin cans and profit margins? The more pertinent question is what in America isn’t.

The release of Case and Deaton’s book just as an often fatal communicable disease was going pandemic was not, of course, the fault of the authors. But it makes for oddly frustrating reading. Positing a link between deindustrialization and health-care rent seeking and deaths of despair is an abductive argument about historical and present actors rather than a purely statistical inference. As Case and Deaton freely admit, you cannot prove by means of regression analysis that any of their targets are the unmistakable causes of these deaths. For that matter, there’s too much bundling among both the phenomena (alcoholic diseases, overdoses, suicides) and the proposed causes (deindustrialization, the decline of organized labor, wage stagnation, corporate restructuring) to conduct even a controlled test.

While it may not be possible to demonstrate airtight causality, Deaths of Despair nonetheless provides valuable documentation of the humiliations, losses, and unmoorings of those on the wrong end of a widening economic divide. The book is less a technocratic prescription than a grim body count.

In Very Important People: Status and Beauty in the Global Party Circuit, Ashley Mears is counting bodies too, albeit very different ones. From New York to Miami, from Ibiza to Saint-Tropez, all over the elite global party scene in which Mears, a sociologist and former fashion model, did eighteen months of research, everyone is counting bodies. The bodies are those of models, ruthlessly quantified and highly valuable to the owners of elite nightclubs. Very Important People hinges on one insight: The image of a rooftop party filled with glamorous models drinking champagne isn’t just a pop-culture cliché. It is a lucrative business model.

VIP Cover

According to Mears, up through the nineties the business model for nightclubs was simple. There was a bar and a dance floor. You paid to get in and you paid to drink. Ideally, you’d want a certain ratio of women to men, but the pleasures on offer were fairly straightforward. But in the early 2000s, a new model emerged, ironically enough, in the repurposed industrial buildings of New York’s Meatpacking District. Rather than rely on the dance floor and bar, clubs encouraged (usually male) customers to put down serious cash for immediately available and strategically placed tables and VIP sections, where bottles of liquor at marked-up prices could be brought to them. Clubs that could successfully brand themselves as elite might make enormous sums off out-of-town dentists on a spree, young financiers looking to woo or compete with business associates by demonstrating access to the city’s most exclusive pleasures, and the mega-rich “whales” proclaiming their status by over-the-top performances of generosity and waste.

The table is crucial for this strategy to succeed. It allows maximum visibility for both the whale’s endless parade of bottles of Dom Perignon (much of it left undrunk by virtue of sheer volume) and the groups of models that signal that this is the kind of club where a whale might be found. The good that is being advertised is indistinguishable from the advertising process.

A whole secondary ecosystem has grown up around this glitzy “potlatch,” as Mears calls it—this elaborately choreographed wasting of wealth. There are the elite club promoters, who might make thousands a night if they show up with enough models, and whose transactional relationships with the models are defined in useful, fragile terms of mutual care. There are the models, young and broke in expensive cities, who get free meals, free champagne, and sometimes free housing as long as they show up and play nice. There are the bouncers, who police the height and looks of entrants, and the whales, who both command the scene and function as an advertisement for its desirability. Being adjacent to real wealth is a powerful incentive, especially for promoters, who dream of rubbing shoulders and making deals of their own through connections forged in the club.

The owners make money, and everyone else gets a little something and a little scammed. Perhaps among those who are scammed the least are the models, the majority of whom seem to be in it for a good party rather than upward mobility. When you are very young and very beautiful, the world tends to see those traits as the most important things about you. One way to register dissent is to trade them only for things equally ephemeral, inconsequential, delightful: a glass of champagne, moonlight over the Riviera, a night spent dancing till dawn. Reaping the benefits of belonging to an intrinsically exclusive club is not heroic. But it seems no worse than the trade made by the wives of the superwealthy, who in one scene appear, disapproving and hostile, at a table adjacent to their husbands’ at an Upper East Side restaurant. They have made a more thoroughgoing negotiation of their value to wealthy men—one resting on the ability to reproduce the upper class as well as attest to its presence.

Demarcating status is the limit of the model’s power. It is what she is at the club to do. The model is not there primarily to be sexually alluring—that is the role of the lower-class-coded bottle waitress. One of Mears’s subjects even confesses that models aren’t his type: They are too tall and skinny, too stereotyped, and after all, desire is so highly personal—less an estimation that a face has been arranged in the single best way than delight that it has been arranged in such a way. But models are necessary precisely because their bodies and faces have transcended the whims of any personally desiring subject, to the objectivity of market value. Their beauty can be quantified in inches, and dollars.

To contemplate and cultivate beauty is perhaps noble. To desire and consume it is at least human. To desire not any object in itself, but an image of desirability, is ghastly. There are many scenes in Very Important People, from the physical dissipation to the moments bordering on human trafficking, that are morally horrifying. What lingers, though, is this spectral quality: huge amounts of money, time, and flesh in service to a recursive and finally imaginary value. If anyone has gained from the losses of Case and Deaton’s subjects, it is the patrons of the global party circuit. But their gains seem less hoarded than unmade, in a kind of reverse alchemy—transmuted into the allurements of a phantom world, elusive, seductive, and all too soluble in the light of day.

Posted in History, Schooling, Welfare

Michael Katz — Public Education as Welfare

In this post, I reproduce a seminal essay by Michael Katz called “Public Education as Welfare.” It was originally published in Dissent in 2010 (link to the original) and it draws on his book, The Price of Citizenship: Redefining the American Welfare State.  

I encountered this essay when I was working on a piece of my own about the role that US public schools play as social welfare agencies.  My interest emerged from an op-ed about what is lost when schools close that I published a couple weeks ago and then posted here.  Michael was my dissertation advisor back at Penn, and I remembered he had written about the connection between schooling and welfare.  As you’ll see when I publish my essay here in a week or so, my focus is on the welfare function of schooling in combination with its other functions: building political community, promoting economic growth, and providing advantage in the competition for social position.

Katz takes a much broader approach, seeking to locate schools as a central component of the peculiar form of the American welfare state.  He does a brilliant job of locating schooling in relation to the complex array of other public and private programs that constitute this rickety and fiendishly complex structure.  Enjoy.

Katz Cover

Public Education as Welfare

Michael B. Katz

Welfare is the most despised public institution in America. Public education is the most iconic. To associate them with each other will strike most Americans as bizarre, even offensive. The link would be less surprising to nineteenth-century reformers, for whom crime, poverty, and ignorance formed an unholy trinity against which they struggled. Nor would it raise British eyebrows. Ignorance was one of the “five giants” to be slain by the new welfare state proposed in the famous Beveridge Report. National health insurance, the cornerstone of the British welfare state, and the 1944 Education Act, which introduced the first national system of secondary education to Britain, were passed by Parliament only two years apart. Yet, in the United States, only a few students of welfare and education have even suggested that the two might stand together.

Why this mutual neglect? And how does public education fit into the architecture of the welfare state? It is important to answer these questions. Both the welfare state and the public school system are enormous and in one way or another touch every single American. Insight into the links between the two will illuminate the mechanisms through which American governments try to accomplish their goals; and it will show how institutions whose public purpose is egalitarian in fact reproduce inequality.

The definition and boundaries of the welfare state remain contentious topics. I believe that the term “welfare state” refers to a collection of programs designed to assure economic security to all citizens by guaranteeing the fundamental necessities of life: food, shelter, medical care, protection in childhood, and support in old age. In the United States, the term generally excludes private efforts to provide these goods. But the best way to understand a nation’s welfare state is not to apply a theoretically driven definition but, rather, to examine the mechanisms through which legislators, service providers, and employers, whether public, private, or a mix of the two, try to prevent or respond to poverty, illness, dependency, economic insecurity, and old age.

Where does public education fit within this account? First, most concretely, for more than a century schools have been used as agents of the welfare state to deliver social services, such as nutrition and health. Today, in poor neighborhoods, they often provide hot breakfasts among other services. More to the point, public school systems administer one of the nation’s largest programs of economic redistribution. Most accounts of the financing of public education stress the opposite point by highlighting inequities, “savage inequalities,” to borrow Jonathan Kozol’s phrase, that shortchange city youngsters and racial minorities. These result mostly from the much higher per-pupil spending in affluent suburbs than in poor inner cities, where yields from property taxes are much lower. All this is undeniable as well as unacceptable.

But tilt the angle and look at the question from another perspective. Consider how much the average family with children pays in property taxes, the principal support for schools. Then focus on per-pupil expenditure, even in poor districts. You will find that families, including poor city families, receive benefits worth much more than they have contributed. Wealthier families, childless and empty-nest couples, and businesses subsidize families with children in school.

There is nothing new about this. The mid-nineteenth-century founders of public school systems, like Horace Mann, and their opponents understood the redistributive character of public education. To build school systems, early school promoters needed to persuade the wealthy and childless that universal, free education would serve their interests by reducing the incidence of crime, lowering the cost of poor relief, improving the skills and attitudes of workers, assimilating immigrants—and therefore saving them money in the long run. So successful were early school promoters that taxation for public education lost its controversial quality. With just a few exceptions, debates focused on the amount of taxes, not on their legitimacy. The exceptions occurred primarily around the founding of high schools that working-class and other voters correctly observed would serve only a small fraction of families at a time when most youngsters in their early teens were sent out to work or kept at home to help their families. For the most part, however, the redistributive quality of public education sank further from public consciousness. This is what early school promoters wanted and had worked to make happen. When they began their work in the early nineteenth century, “public” usually referred to schools widely available and either free or cheap—in short, schools for the poor. School promoters worked tirelessly to break this link between public and pauper that inhibited the development of universal public education systems. So successful were they that today the linkage seems outrageous—though in cities where most of the remaining affluent families send their children to private schools, the association of public with pauper has reemerged with renewed ferocity.

As a concrete example, here is a back-of-the-envelope illustration. In 2003–2004, public elementary and secondary education in the United States cost $403 billion or, on average, $8,310 per student (or, taking the median, $7,860). Most families paid nothing like the full cost of this education in taxes. Property taxes, which account for a huge share of spending on public schools, average $935 per person or, for a family of four, something under $4,000, less than half the average per-pupil cost. As rough as these figures are, they do suggest that most families with school-age children receive much more from spending on public education than they contribute in taxes. (A similar point could be made about public higher education.)
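To spell out the arithmetic (my restatement of Katz’s rough figures, assuming a four-person household paying the average per-capita property tax and one child enrolled):

\[
\underbrace{4 \times \$935}_{\text{family's property taxes}} \approx \$3{,}740
\qquad \text{vs.} \qquad
\underbrace{\$8{,}310}_{\text{average per-pupil cost}}
\]

On these assumptions the family receives more than twice what it contributes (\(\$8{,}310 / \$3{,}740 \approx 2.2\)), and the ratio grows with each additional child in school.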

Taxpayers provide this subsidy because they view public education as a crucial public good. It prevents poverty, lowers the crime rate, prepares young people for the work force, and fosters social mobility—or so the story goes. The reality, as historians of education have shown, is a good deal more complex. Public education is the mechanism through which the United States solves problems and attempts to reach goals achieved more directly or through different mechanisms in other countries. International comparisons usually brand the United States a welfare laggard because it spends less of its national income on welfare-related benefits than do other advanced industrial democracies. But the comparisons leave out spending on public education, private social services, employer-provided health care and pensions, and benefits delivered through the tax code, a definitional weakness whose importance will become clearer when I describe the architecture of the welfare state.

***

Almost thirty-five years ago, in Social Control of the Welfare State, Morris Janowitz pointed out that “the most significant difference between the institutional bases of the welfare state in Great Britain and the United States was the emphasis placed on public education—especially for lower income groups—in the United States. Massive support for the expansion of public education . . . in the United States must be seen as a central component of the American notion of welfare . . .” In the late nineteenth and early twentieth centuries, while other nations were introducing unemployment, old age, and health insurance, the United States was building high schools for a huge surge in enrollment. “One would have to return to the 1910s to find levels of secondary school enrollment in the United States that match those in 1950s Western Europe,” point out economists Claudia Goldin and Lawrence F. Katz in The Race Between Education and Technology. European nations were about a generation behind the United States in expanding secondary education; the United States was about a generation behind Europe in instituting its welfare state.

If we think of education as a component, we can see that the U.S. welfare state focuses on enhancing equality of opportunity, in contrast to European welfare states, which have been more sympathetic to equality of condition. In the United States, equality has always been primarily about a level playing field where individuals can compete unhindered by obstacles that crimp the full expression of their native talents; education has served as the main mechanism for leveling the field. European concepts of equality more often focus on group inequality and the collective mitigation of handicaps and risks that, in the United States, have been left for individuals to deal with on their own.

***

Public education is part of the American welfare state. But which one? Each part is rooted in a different place in American history. Think of the welfare state as a loosely constructed, largely unplanned structure erected by many different people over centuries. This rickety structure, which no sane person would have designed, consists of two main divisions, the public and private welfare states, with subdivisions within each. The divisions of the public welfare state are public assistance, social insurance, and taxation. Public assistance (called outdoor relief through most of its history) originated with the Elizabethan poor laws brought over by the colonists. It consists of means-tested benefits. Before 1996, the primary example was Aid to Families with Dependent Children (AFDC), and since 1996, it has been Temporary Assistance to Needy Families (TANF)—the programs current-day Americans usually have in mind when they speak of “welfare.”

Social insurance originated in Europe in the late nineteenth century and made its way slowly to the United States. The first form of U.S. social insurance was workers’ compensation, instituted by several state governments in the early twentieth century. Social insurance benefits accrue to individuals on account of fixed criteria such as age. They are called insurance because they are allegedly based on prior contributions. The major programs—Social Security for the elderly and unemployment insurance—emerged in 1935 when Congress passed the Social Security Act. Social insurance benefits are much higher than benefits provided through public assistance, and they carry no stigma.

The third track in the public welfare state is taxation. U.S. governments, both federal and state, administer important benefits through the tax code rather than through direct grants. This is the most modern feature of the welfare state. The major example of a benefit aimed at poor people is the Earned Income Tax Credit, which expanded greatly during the Clinton presidency.

Within the private welfare state are two divisions: charities and social services, and employee benefits. Charities and social services have a long and diverse history. In the 1960s, governments started to fund an increasing number of services through private agencies. (In America, governments primarily write checks; they do not usually operate programs.) More and more dependent on public funding, private agencies increasingly became, in effect, government providers, a transformation with profound implications for their work. Employee benefits constitute the other division in the private welfare state. These date primarily from the period after the Second World War. They expanded as a result of the growth of unions, legitimated by the 1935 Wagner Act and 1949 decisions of the National Labor Relations Board, which held that employers were required to bargain over, though not required to provide, employee benefits.

Some economists object to including these benefits within the welfare state, but they are mistaken. Employee benefits represent the mechanism through which the United States has chosen to meet the health care needs of the majority of its population. About 60 percent of Americans receive their health insurance through their employer, and many receive pensions as well. If unions had bargained hard for a public rather than a private welfare state, the larger American welfare state would look very different. Moreover, the federal government encourages the delivery of health care and pensions through private employers by allowing them to deduct the cost from taxes, and it supervises them with massive regulations, notably the Employee Retirement Income Security Act of 1974.

The first thing to stress about this welfare state is that its divisions are not distinct. They overlap and blend in complicated ways, giving the American welfare state a mixed economy not usefully described as either public or private. At the same time, federalism constrains its options, with some benefits provided by the federal government and others offered through state and local governments. Throughout the twentieth century, one great problem facing would-be welfare state builders was designing benefits to pass constitutional muster.

How does public education fit into this odd, bifurcated structure? It shares characteristics with social insurance, public assistance, and social services. At first, it appears closest to social insurance. Its benefits are universal and not means tested, which makes them similar to Social Security (although Social Security benefits received by high income individuals are taxed). But education benefits are largely in kind, as are food stamps, housing, and Medicare. (In-kind benefits are “government provision of goods and services to those in need of them” rather than of “income sufficient to meet their needs via the market.”) Nor are the benefits earned by recipients through prior payroll contributions or employment. This separates them from Social Security, unemployment insurance, and workers’ compensation. Public education is also an enormous source of employment, second only to health care in the public welfare state.

Even more important, public education is primarily local. Great variation exists among states and, within states, among municipalities. In this regard, it differs completely from Social Security and Medicare, whose nationally set benefits are uniform across the nation. It is more like unemployment insurance, workers’ compensation, and TANF (and earlier AFDC), which vary by state, but not by municipality within states. The adequacy of educational benefits, by contrast, varies with municipal wealth. Education, in fact, is the only public benefit financed largely by property taxes. This confusing mix of administrative and financial patterns provides another example of how history shapes institutions and policy.

Because of its differences from both social insurance and public assistance, public education composes a separate division within the public welfare state. But it moves in the same directions as the rest. The forces redefining the American welfare state have buffeted public schools as well as public assistance, social insurance, and private welfare.

***

Since the 1980s, the pursuit of three objectives has driven change in the giant welfare state edifice. These objectives are, first, a war on dependence in all its forms—not only the dependence of young unmarried mothers on welfare but all forms of dependence on public and private support, including the dependence of workers on paternalistic employers for secure, long-term jobs and benefits. Second is the devolution of authority—the transfer of power from the federal government to the states, from states to localities, and from the public to the private sector. Last is the application of free market models to social policy. Everywhere the market triumphed as the template for a reengineered welfare state. This is not a partisan story. Broad consensus on these objectives crossed party lines. Within the reconfigured welfare state, work in the regular labor market emerged as the gold standard, the mark of first-class citizenship, carrying with it entitlement to the most generous benefits. The corollary, of course, was that failure or inability to join the regular labor force meant relegation to second-class citizenship, where benefits were mean, punitive, or just unavailable.

The war on dependence, the devolution of authority, and the application of market models also run through the history of public education in these decades. The attack on “social promotion,” the emphasis on high-stakes tests, the implementation of tougher high school graduation requirements, and the transmutation of “accountability” into the engine of school reform: all these developments are of a piece with the war on dependence. They call for students to stand on their own, with rewards distributed strictly according to personal (testable) merit. Other developments point to the practice of devolution in public education. A prime example is the turn toward site-based management—that is, the decentralization of significant administrative authority from central offices to individual schools. The most extreme case is Chicago’s 1989 school reform, which put local school councils in charge of each school, even giving them the authority to hire and fire principals.

At the same time, a countervailing trend, represented by the 2002 federal No Child Left Behind legislation and the imposition of standards, limited the autonomy of teachers and schools and imposed new forms of centralization. At least, that was the intent. In fact, left to develop their own standards, many states avoided penalties mandated in No Child Left Behind by lowering the bar and making it easier for students to pass the required tests. In 2010, the nation’s governors and state school superintendents convened a panel of experts to reverse this race to the bottom. The panel recommended combining a set of national standards—initially for English and math—with local autonomy in curriculum design and teaching methods. The Obama administration endorsed the recommendations and included them in its educational reform proposals.

In this slightly schizoid blend of local autonomy and central control, trends in public education paralleled developments in the administration of public assistance: the 1996 federal “welfare reform” legislation mandated a set of outcomes but left the states autonomy in reaching them. In both education and public assistance, the mechanism of reform became the centralization of acceptable outcomes and the decentralization of the means for achieving them.

***

As for the market as a template for reform, it was everywhere in education, as in the rest of the welfare state. Markets invaded schools through the compulsory viewing of advertising on Chris Whittle’s Channel One “free” television news for schools, and through the kickbacks to schools from Coke, Pepsi, and other companies whose products were sold in vending machines—money schools desperately needed as their budgets for sports, arts, and culture were cut. Some school districts turned over individual schools to for-profit corporations such as Edison Schools, while advocacy of vouchers and privately run charter schools reflected the belief that blending competition among providers with parental choice would expose poorly performing schools and teachers and motivate others to improve.

Unlike benefits in the rest of the welfare state, educational benefits cannot be tied to employment. But they are stratified nonetheless by location, wealth, and race. The forces eroding the fiscal capacities of cities and old suburbs—the withdrawal of federal aid and a shrinking tax base—have had a devastating impact on public education and on children and adolescents, relegating a great many youngsters in poor or near-poor families to second-class citizenship. In the educational division of the public welfare state, test results play the role taken on elsewhere by employment: they are the gatekeepers to the benefits of first-class citizenship. The danger is that high-stakes tests and stiffer graduation requirements will further stratify citizenship among the young, with kids who fail the tests joining stay-at-home mothers and out-of-work black men as the “undeserving poor.” In this way, public education complements the rest of the welfare state as a mechanism for reproducing, as well as mitigating, inequality in America.

***

Michael B. Katz is Walter H. Annenberg Professor of History at the University of Pennsylvania. His conception of the architecture of the American welfare state and of the forces driving change within it is elaborated in his book The Price of Citizenship: Redefining the American Welfare State, updated edition (University of Pennsylvania Press).