
Research Universities and the Public Good

This post is a review essay of a new book called Research Universities and the Public Good.  It appeared in the current issue of American Journal of Sociology.  Here’s a link to a PDF of the original.

Research Universities and the Public Good: Discovery for an Uncertain Future

By Jason Owen-Smith. Stanford, Calif.: Stanford University Press,
2018. Pp. xii + 213. $35.00.

David F. Labaree
Stanford University

American higher education has long been immune to the kind of criticism
levied against elementary and secondary education because it has been seen
as a great success story, in contrast to the popular narrative of failure that
has been applied to the lower levels of the system. And the rest of the world
seems to agree with this distinction. Families outside the United States have
not been eager to send their children to our schools, but they have been
clamoring for admission to the undergraduate and graduate programs at
our colleges and universities. In the last few years, however, this reputational
immunity has been quickly fading. The relentlessly rationalizing reformers
who have done so much harm to U.S. schools in the name of accountability
have now started to direct their attention to higher education. Watch out,
they’re coming for us.

One tiny sector of the huge and remarkably diverse structure of U.S.
higher education has been particularly vulnerable to this contagion, namely,
the research university. This group represents only 3% of the more than
5,000 degree-granting institutions in the country, and it educates only a
small percentage of college students while sucking up a massive amount of
public and private resources. Its highly paid faculty don’t teach very much,
instead focusing their time on producing research on obscure topics
published in journals for the perusal of their colleagues rather than the public.
No wonder state governments have been reducing their funding for public
research universities and the federal government has been cutting its support
for research. No wonder there are strong calls for disaggregating the
multiplicity of functions that make these institutions so complex, so that
the various services of the university can be delivered more cost-effectively
to consumers.

In his new book, Jason Owen-Smith, a sociology professor at the University
of Michigan, mounts a valiant and highly effective defense of the apparently
indefensible American research university. While acknowledging the
complexity of functions that run through these institutions, he focuses his
attention primarily on the public benefits that derive from their research
production. As he notes, although they represent less than 3% of the institutions
of higher education, they produce nearly 90% of the system’s research and development. In an era when education is increasingly portrayed as primarily a private good—providing degrees whose benefits only accrue to the degree holders—he deliberately zeroes in on the way that university research constitutes a public good whose benefits accrue to the community as a whole.

He argues that the core public functions of the research university are to
serve as “sources of knowledge and skilled people, anchors for communities,
industries, and regions, and hubs connecting all of the far-flung parts of society”
(p. 1; original emphasis). In chapter 1 he spells out the overall argument,
in chapter 2 he explores the usefulness of the peculiarly complex
organization of the research university, in chapters 3–5 he examines in more
detail each of the core functions, and at the end he suggests ways that university
administrators can help position their institutions to demonstrate
the value they provide the public.

The core function is to produce knowledge and skill. The most telling
point the author makes about this function is that it works best if allowed
to emerge organically from the complex incentive structure of the university
itself instead of being directed by government or industry toward solving
the most current problems. Trying to make research relevant may well
make it dysfunctional. Mie Augier and James March (“The Pursuit of Relevance
in Management Education,” California Management Review 49
[2007]: 129–46) argue that the pursuit of relevance is afflicted by both ambiguity
(we don’t know what’s going to be relevant until we encounter the
next problem) and myopia (by focusing too tightly on the current case we
miss what it is a case of). In short, as Owen-Smith notes, investing in research
universities is a kind of social insurance by which we develop answers
to problems that haven’t yet emerged. While the private sector focuses
on applied research that is likely to have immediate utility, public funds are
most needed to support the basic research whose timeline for utility is unknown
but whose breadth of benefit is much greater.

The second function of the research university is to serve as a regional anchor.
A creative tension that energizes this institution is that it’s both cosmopolitan
and local. It aspires to universal knowledge, but it’s deeply grounded
in place. Companies can move, but universities can’t. This isn’t just because
of physical plant, a constraint that also affects companies; it’s because universities
develop a complex web of relationships with the industries and governments
and citizens in their neighborhood. Think Stanford and Silicon
Valley. Owen-Smith makes the analogy to the anchor store in a shopping
mall.

The third function of the research university is to serve as a hub, which is
the cosmopolitan side of its relationship with the world. It’s located in place
but connected to the intellectual and economic world through a complex
web of networks. Like the university itself, these webs emerge organically
out of the actions of a vast array of actors pursuing their own research enterprises
and connecting with colleagues and funding sources and clients
and sites of application around the country and the globe. Research
universities are uniquely capable of convening people from all sectors
around issues of mutual interest. Such synergies benefit everyone.

The current discourse on universities, which narrowly conceives of them
as mechanisms for delivering degrees to students, desperately needs the
message that Owen-Smith delivers here. Students may be able to get a degree
through a cheap online program, but only the complex and costly system
of research universities can deliver the kinds of knowledge production,
community development, and network building that provide such invaluable
benefits for the public as a whole. One thing I would add to the author’s
analysis is that American research universities have been able to develop
such strong public support in the past in large part because they combine
top-flight scholarship with large programs of undergraduate education that
are relatively accessible to the public and rather undemanding intellectually.
Elite graduate programs and research projects rest on a firm populist base
that may help the university survive the current assaults, a base grounded
as much in football and fraternities as in the erudition of its faculty. This,
however, is but a footnote to this powerfully framed contribution to the literature
on U.S. higher education.

American Journal of Sociology, 125:2 (September 2019), pp. 610-12


US Higher Education and Inequality: How the Solution Became the Problem

This post is a paper I wrote last summer and presented at the University of Oslo in August.  It’s a patchwork quilt of three previously published pieces around a topic I’ve been focused on a lot lately:  the role of US higher education — for better and for worse — in creating the new American aristocracy of merit.

In it I explore the way that systems of formal schooling both opened up opportunity for people to get ahead by individual merit and created the most effective structure ever devised for reproducing social inequality.  By defining merit as the accumulation of academic credentials and by constructing a radically stratified and extraordinarily opaque hierarchy of educational institutions for granting these credentials, the system grants an enormous advantage to the children of those who have already negotiated the system most effectively.

The previous generation of academic winners learned its secrets and decoded its inner logic.  They found out that it’s the merit badges that matter, not the amount of useful learning you acquire along the way.  So they coach their children in the art of gaming the system.  The result is that these children not only gain a huge advantage at winning the rewards of the meritocracy but also acquire a degree of legitimacy for these rewards that no previous system of inherited privilege ever attained.  They triumphed in a meritocratic competition, so they fully earned the power, money, and position that they derived from it.  Gotta love a system that can pull that off.

Here’s a PDF of the paper.

 

U.S. Higher Education and Inequality:

How the Solution Became the Problem

by

David F. Labaree

Lee L. Jacks Professor of Education, Emeritus

Stanford University

Email: dlabaree@stanford.edu

Web: https://dlabaree.people.stanford.edu

Twitter: @Dlabaree

Blog: https://davidlabaree.com/


Lecture delivered at University of Oslo

August 14, 2019

 

One of the glories of the emergence of modernity is that it offered the possibility and even the ideal that social position could be earned rather than inherited.  Instead of being destined to become a king or a peasant by dictate of paternity, for the first time in history individuals had the opportunity to attain their roles in society on the basis of merit.  And in this new world, public education became both the avenue for opportunity and the arbiter of merit.  But one of the anomalies of modernity is that school-based meritocracy, while increasing the fluidity of status attainment, has had little effect on the degree of inequality in modern societies.

In this paper, I explore how the structure of schooling helped bring about this outcome in the United States, with special focus on the evolution of higher education in the twentieth century.  The core issue driving the evolution of this structure is that the possibility for social mobility works at both the top and the bottom of the social hierarchy, with one group seeing the chance of rising up and the other facing the threat of falling down.  As a result, the former sees school as the way for their children to gain access to higher position while the latter sees it as the way for their children to preserve the social position they were born with.  Under pressure from both sides, the structure of schooling needs to find a way to accommodate these two contradictory aims.  In practice the system can accomplish this by allowing children from families at the bottom and the top to both increase their educational attainment beyond the level of their parents.  In theory this means that both groups can gain academic credentials that allow them to qualify for higher level occupational roles than the previous generation.  They can therefore both move up in parallel, gaining upward mobility without reducing the social distance between them.  Thus you end up with more opportunity without more equality.

Theoretically, it would be possible for the system to reduce or eliminate the degree to which elites manage to preserve their advantage through education simply by imposing a ceiling on the educational attainment allowed for their children.  That way, when the bottom group rises they get closer to the top group.  As a matter of practice, that option is not available in the U.S.  As the most liberal of liberal democracies, the U.S. sees any such limits on the choices of the upper group as a gross violation of individual liberty.  The result is a peculiar dynamic that has governed the evolution of the structure of American education over the years.  The pattern is this.  The out-group exerts political pressure in order to gain greater educational credentials for their children while the in-group responds by increasing the credentials of their own children.  The result is that both groups move up in educational qualifications at the same time.  Schooling goes up but social gaps remain the same.  It’s an elevator effect.  Every time the floor rises, so does the ceiling.

In the last 200 years of the history of schooling in the United States, the dynamic has played out like this.  At the starting point, one group has access to a level of education that is denied to another group.  The outsiders exert pressure to gain access to this level, which democratic leaders eventually feel compelled to grant.  But the insiders feel threatened by the loss of social advantage that greater access would bring, so they press to preserve that advantage.  How does the system accomplish this?  Through two simple mechanisms.  First, at the level where access is expanding, it stratifies schooling into curricular tracks or streams.  This means that the newcomers fill the lower tracks while the old-timers occupy the upper tracks.  Second, for the previously advantaged group it expands access to schooling at the next higher level.  So the system expands access to one level of schooling while simultaneously stratifying that level and opening up the next level.

This process has gone through three cycles in the history of U.S. schooling.  When the common school movement created a system of universal elementary schooling in the second quarter of the nineteenth century, it also created a selective public high school at the top of the system.  The purpose of the latter was to draw upper-class children from private schools into the public system by offering access to the high school only to graduates of the public grammar schools.  Without the elite high school as inducement, public schooling would have been left as the domain of paupers.  Then at the end of the nineteenth century, elementary grades filled up and demand increased for wider access to high school, so the system opened the doors to this institution.  But at the same time it introduced curriculum tracks and set off a surge of college enrollments by the former high school students.  And when high schools themselves filled up by the middle of the twentieth century, the system opened access to higher education by creating a range of new nonselective colleges and universities to absorb the influx.  This preserved the exclusivity of the older institutions, whose graduates in large numbers then started pursuing postgraduate degrees.

Result: A Very Stratified System of Higher Education

By the middle of the twentieth century, higher education was the zone of advantage for any American trying to get ahead or stay ahead.  And as a result of the process by which the tertiary system managed to incorporate both functions, it became extraordinarily stratified.  This was a system that emerged without a plan, based not on government fiat but on the competing interests of educational consumers seeking to use it to their own advantage.  A market-oriented system of higher education such as this one has a special dynamic that leads to a high degree of stratification.  Each educational enterprise competes with the others to establish a position in the market that will allow it to draw students, generate a comfortable surplus, and maintain this situation over time.  The problem is that, given the lack of effective state limits on the establishment and expansion of colleges, these schools find themselves in a buyer’s market.  Individual buyers may want one kind of program over another, which gives colleges an incentive to differentiate the market horizontally to accommodate these various demands.  At the same time, however, buyers want a college diploma that will help them get ahead socially.  This means that consumers don’t just want a college education that is different; they want one that is better – better at providing access to good jobs.  In response to this consumer demand, the U.S. has developed a multi-tiered hierarchy of higher education, ranging from open-access institutions at the bottom to highly exclusive institutions at the top, with each of the upper-tier institutions offering graduates a degree that provides invidious distinction over graduates from schools in the lower tiers.

This stratified structure of higher education arose in the nineteenth century in a dynamic market system, where the institutional actors had to operate according to four basic rules.  Rule One:  Age trumps youth.  It’s no accident that the oldest American colleges are overrepresented in the top tier.  Of the top 20 U.S. universities,[1] 19 were founded before 1900 and 7 before 1776, even though more than half of all American universities were founded in the twentieth century.  Before competitors had entered the field, the oldest schools had already established a pattern of training the country’s leaders, locked up access to the wealthiest families, accumulated substantial endowments, and hired the most capable faculty.

Rule Two:  The strongest rewards go to those at the top of the system.  This means that every college below the top has a strong incentive to move up the ladder, and that top colleges have a strong incentive to preserve their advantage.  Even though it is very difficult for lower-level schools to move up, this doesn’t keep them from trying.  Despite long odds, the possible payoff is big enough that everyone stays focused on the tier above.  A few major success stories allow institutions to keep their hopes alive.  University presidents lie awake at night dreaming of replicating the route to the top followed by social climbers like Berkeley, Hopkins, Chicago, and Stanford.

Rule Three:  It pays to imitate your betters.  As the research university emerged as the model for the top tier in American higher education in the twentieth century, it became the ideal toward which all other schools sought to move.  To get ahead you needed to offer a full array of undergraduate, graduate, and professional programs, selective admissions and professors who publish, a football stadium and Gothic architecture.  (David Riesman called this structure of imitation “the academic procession.”)[2]  Of course, given the advantages enjoyed by the top tier, imitation has rarely produced the desired results.  But it’s the only game in town.  Even if you don’t move up in the rankings, you at least help reassure your school’s various constituencies that they are associated with something that looks like and feels like a real university.

Rule Four:  It’s best to expand the system by creating new colleges rather than increasing enrollments at existing colleges.  Periodically new waves of educational consumers push for access to higher education.  Initially, existing schools expanded to meet the demand, which meant that as late as 1900 Harvard was the largest U.S. university, public or private.[3]  But beyond this point in the growth process, it was not in the interest of existing institutions to provide wider access.  Concerned about protecting their institutional advantage, they had no desire to sully their hard-won distinction by admitting the unwashed.  Better to have this kind of thing done by additional colleges created for that purpose.  The new colleges emerged, then, as a clearly designated lower tier in the system, defined as such by both their newness and their accessibility.

Think about how these rules have shaped the historical process that produced the present stratified structure of higher education.  This structure has four tiers.  In line with Rule One, these tiers from top to bottom emerged in roughly chronological order.  The Ivy League colleges emerged in the colonial period, followed by a series of flagship state colleges in the early and mid-nineteenth century.  These institutions, along with a few social climbers that emerged later, grew to form the core of the elite research universities that make up the top tier of the system.  Schools in this tier are the most influential, prestigious, well-funded, exclusive, research-productive, and graduate-oriented – in the U.S. and in the world.

The second tier emerged from the land grant colleges that began appearing in the mid to late nineteenth century.  They were created to fill a need not met by existing institutions, expanding access for a broader array of students and offering programs with practical application in areas like agriculture and engineering.  They were often distinguished from the flagship research university by the word “state” in their title (as with University of Michigan vs. Michigan State University) or the label “A & M” (for Agricultural and Mechanical, as with University of Texas vs. Texas A & M).  But, in line with Rules Two and Three, they responded to consumer demand by quickly evolving into full service colleges and universities; and in the twentieth century they adopted the form and function of the research university, albeit in a more modest manner.

The third tier arose from the normal schools, established in the late nineteenth century to prepare teachers.  Like the land grant schools that preceded them, these narrowly vocational institutions evolved quickly under pressure from consumers, who wanted them to model themselves after the schools in the top tiers by offering a more valuable set of credentials that would provide access to a wider array of social opportunities.  Under these market pressures, normal schools evolved into teachers colleges, general-purpose state colleges, and finally, by the 1960s, comprehensive regional state universities.

The fourth tier emerged in part from the junior colleges that first arose in the early twentieth century and eventually evolved into an extensive system of community colleges.  Like the land grant college and normal school, these institutions offered access to a new set of students at a lower level of the system.  Unlike their predecessors, for the most part they have not been allowed by state governments to imitate the university model, remaining primarily as two-year schools.  But through the transfer option, many students use them as a more accessible route into institutions in the upper tiers.

What This Means for Educational Consumers

This highly stratified system is very difficult for consumers to navigate.  Instead of allocating access to the top level of the system using the mechanism employed by most of the rest of the world – a state-administered university matriculation exam – the highly decentralized American system allocates access by means of informal mechanisms that in comparison seem anarchic.  In the absence of one access route, there are many; and in the absence of clear rules for prospective students, there are multiple and conflicting rules of thumb.  Also, the rules of thumb vary radically according to which tier of the system you are seeking to enter.

First, let’s look at the admissions process for families (primarily the upper-middle class) who are trying to get their children entrée to the elite category of highly selective liberal arts colleges and research universities.  They have to take into account the wide array of factors that enter into the complex and opaque process that American colleges use to select students at this level:  quality of high school; quality of a student’s program of study; high school grades; test scores in the SAT or ACT college aptitude tests; interests and passions expressed in an application essay; parents’ alumni status; whether the student needs financial aid; athletic skills; service activities; diversity factors such as race, ethnicity, class, national origin, sex, and sexual orientation; and extracurricular contributions a student might make to the college community.  There is no centralized review process; instead every college carries out its own admissions review and employs its own criteria.

This open and indeterminate process provides a huge advantage for upper-middle-class families.  If you are a parent who is a college graduate and who works at a professional or managerial job, where the payoff of going to a good college is readily apparent, you have the cultural and social capital to negotiate this system effectively and read its coded messages.  For you, going to college is not the issue; it’s a matter of which college your children can get into that would provide them with the greatest competitive advantage in the workplace.  You want for them the college that might turn them down rather than the one that would welcome them with open arms.  So you enroll your children in test prep; hire a college advisor; map out a strategic plan for high school course-taking and extracurriculars; craft a service resume that makes them look appropriately public-spirited; take them on the obligatory college tour; and come up with just the right mix of applications to the stretch schools, the safety schools, and those in between.  And all this pays off handsomely: 77 percent of children from families in the top quintile by income gain a bachelor’s degree.[4]

If you are a parent farther down the class scale, who didn’t attend college and whose own work environment is not well stocked with college graduates, you have a lot more trouble negotiating the system.  The odds are not good:  for students from the fourth income quintile, only 17 percent earn a BA, and for the lowest quintile the rate is only 9 percent.[5]  Under these circumstances, having your child go to a college, any college, is a big deal; and one college is hard to distinguish from another.  But you are faced by a system that offers an extraordinary diversity of choices for prospective students:  public, not-for-profit, or for-profit; secular or religious; two-year or four-year; college or university; teaching or research oriented; massive or tiny student body; vocational or liberal; division 1, 2, or 3 intercollegiate athletics, or no sports at all; party school or nerd haven; high rank or low rank; full-time or part-time enrollment; urban or pastoral; gritty or serene; residential, commuter, or “suitcase college” (where students go home on weekends).  In this complex setting both consumers and providers somehow have to make choices that are in their own best interest.  Families from the upper-middle class are experts at negotiating this system, trimming the complexity down to a few essentials:  a four-year institution that is highly selective and preferably private (not-for-profit).  Everything else is optional.

If you’re a working-class family, however – lacking deep knowledge of the system and without access to the wide array of support systems that money can buy – you are more likely to take the system at face value.  Having your children go to a community college is the most obvious and attractive option.  It’s close to home, inexpensive, and easy to get into.  It’s where your children’s friends will be going, it allows them to work and go to school part time, and it doesn’t seem as forbiddingly alien as the state university (much less the Ivies).  You don’t need anything to gain admission except a high school diploma or GED.  No tests, counselors, tours, or resume-burnishing is required.  Or you could try the next step up, the local comprehensive state university.  To apply for admission, all you need is a high school transcript.  You might get turned down, but the odds are in your favor.  The cost is higher but can usually be paid with federal grants and loans.  An alternative is a for-profit institution, which is extremely accessible, flexible, and often online.  It’s not cheap, but federal grants and loans can pay the cost.  What you don’t have any way of knowing is that the most accessible colleges at the bottom of the system are also the ones where students are least likely to graduate.  (Only 29 percent of students entering two-year colleges earn an associate degree in three years;[6] only 39 percent earn a degree from a two-year or four-year institution in six years.[7])  You also may not be aware that the economic payoff for these colleges is lower; or that the colleges higher up the system may not only provide stronger support toward graduation but might even be less expensive because of greater scholarship funding.

In this way, the complexity and opacity of this market-based and informally structured system helps reinforce the social advantages of those at the top of the social ladder and limit the opportunities for those at the bottom.  It’s a system that rewards the insider knowledge of old hands and punishes newcomers.  To work it effectively, you need to reject the fiction that a college is a college is a college and learn how to seek advantage in the system’s upper tiers.

On the other hand, the system’s fluidity is real.  The absence of state-sanctioned and formally structured tracks means that the barriers between the system’s tiers are permeable.  Your children’s future is not predetermined by their high school curriculum or their score on the matriculation exam.  They can apply to any college they want and see what happens.  Of course, if their grades and scores are not great, their chances of admission to upper level institutions are poor.  But their chances of getting into a teaching-oriented state university are pretty good, and their chances of getting into a community college are virtually assured.  And if they take the latter option, as is most often the case for children from socially disadvantaged families, there is a real (if modest) possibility that they might be able to prove their academic chops, earn an AA degree, and transfer to a university, even a research university.  The probabilities of moving up in the system are low:  most community college students never earn an AA degree; and transfers have a harder time succeeding in the university than students who enroll there as freshmen.  But the possibilities are nonetheless genuine.

American higher education offers something for everyone.  It helps those at the bottom to get ahead and those at the top to stay ahead.  It provides socially useful educational services for every ability level and every consumer preference.  This gives it an astonishingly broad base of political support across the entire population, since everyone needs it and everyone can potentially benefit from it.  And this kind of legitimacy is not possible if the opportunity the system offers to the lower classes is a simple fraud.  First generation college students, even if they struggled in high school, can attend community college, transfer to San Jose State, and end up working at Apple.  It’s not very likely, but it assuredly is possible.  True, the more advantages you bring to the system – cultural capital, connections, family wealth – the higher the probability that you will succeed in it.  But even if you are lacking in these attributes, there is still an outside chance that you just might make it through the system and emerge with a good middle class job.

This helps explain how the system gets away with preserving social advantage for those at the top without stirring a revolt from those at the bottom.  Students from working-class and lower-class families are much less likely to be admitted to the upper reaches of the higher education system that provides the greatest social rewards; but the opportunity to attend some form of college is high, and attending a college at the lower levels of the system may provide access to a good job.  The combination of high access to the lower levels of the system and high attrition on the way to attaining a bachelor’s degree creates a situation where the system gets credit for openness and the student bears the burden for failing to capitalize on it.  The system gave you a chance but you just couldn’t make the grade.  The ready-made explanations for personal failure accumulate quickly as students try to move through the system.  You didn’t study hard enough, you didn’t get good grades in high school, you didn’t get good test scores, so you couldn’t get into a selective college.  Instead you went to a community college, where you got distracted from your studies by work, family, and friends, and you didn’t have the necessary academic ability; so you failed to complete your AA degree.  Or maybe you did complete the degree and transferred to a university, but you had trouble competing with students who were more able and better prepared than you.  Along with the majority of students who don’t make it all the way to a BA, you bear the burden for your failure – a conclusion that is reinforced by the occasional but highly visible successes of a few of your peers.  The system is well defended against charges of unfairness.

So we can understand why people at the bottom don’t cry foul.  It gave you a chance.  And there is one more reason for keeping up your hope that education will pay off for you.  A degree from an institution in a lower tier may pay lower benefits, but for some purposes one degree really is as good as another.  Often the question in getting a job or a promotion is not whether you have a classy credential but whether you have whatever credential is listed as the minimum requirement in the job description.  Bureaucracies operate on a level where form often matters more than substance.  As long as you can check off the box confirming that you have a bachelor’s degree, the BA from University of Phoenix and the BA from University of Pennsylvania can serve the same function, by allowing you to be considered for the job.  And if, say, you’re a public school teacher, an MA from Capella University, under the district contract, is as effective as one from Stanford University, because either will qualify you for a $5,000 bump in pay.

At the same time, however, we can see why the system generates so much anxiety among students who are trying to use it to move up the social ladder toward the good life.  It’s really the only game in town for getting a good job in twenty-first century America.  Without higher education, you are closed off from the white collar jobs that provide the most security and pay.  Yes, you could try to start a business, or you could try to work your way up the ladder in an organization without a college degree; but the first approach is highly risky and the second is highly unlikely, since most jobs come with minimum education requirements regardless of experience.  So you have to put all of your hopes in the higher-ed basket while knowing – because of your own difficult experiences in high school and because of what you see happening with family and friends – that your chances for success are not good.  Either you choose to pursue higher ed against the odds or you simply give up.  It’s a situation fraught with anxiety.

What is less obvious, however, is why the American system of higher education – which is so clearly skewed in favor of people at the top of the social order – fosters so much anxiety in them.  Upper-middle-class families in the U.S. are obsessed with education and especially with getting their children into the right college.  Why?  They live in the communities that have the best public schools; their children have cultural and social skills that schools value and reward; and they can afford the direct cost and opportunity cost of sending their high school grads to a residential college, even one of the pricey privates.  So why are there only a few colleges that seem to matter to this group?  Why does it matter so much to have your child not only get into the University of California but into Berkeley or UCLA?  What’s wrong with having them attend Santa Cruz or even one of the Cal State campuses?  And why the overwhelming passion for pursuing admission to Harvard or Yale?

The urgency behind all such frantic concern about admission to the most elite level of the system is this:  As parents of privilege, you can pass on your wealth to your children, but you can’t give them a profession.  Education is built into the core of modern societies, where occupations are no longer inherited but more or less earned.  If you’re a successful doctor or lawyer, you can provide a lot of advantages for your children; but in order for them to gain a position such as yours, they must succeed in school, get into a good college, and then into a good graduate school.  Unless they own the company, even business executives can’t pass on position to their children, and even then it’s increasingly rare that they would actually do so.  (Like most shareholders, they would profit more by having the company led by a competent executive than by the boss’s son.)  Under these circumstances of modern life, providing social advantage to your children means providing them with educational advantage.  Parents who have been through the process of climbing the educational hierarchy in order to gain prominent position in the occupational hierarchy know full well what it takes to make the grade.

They also know something else:  When you’re at the top of the social system, there is little opportunity to rise higher but plenty of opportunity to fall farther down.  Consider data on intergenerational mobility in the U.S.  Of children whose parents are in the top quintile by household income, 60 percent end up at least one quintile lower than their parents and 37 percent fall at least two quintiles.[8]  That’s a substantial decline in social position.  So there’s good reason for these parents to fear downward mobility for their children and to use all their powers to marshal educational resources to head it off.  The problem is this:  Even though your own children have a wealth of advantages in negotiating the educational system, there are still enough bright and ambitious students from the lower classes who manage to make it through the educational gauntlet to pose them a serious threat.  So you need to make sure that your children attend the best schools, get into the high reading group and the program for the gifted, take plenty of advanced placement classes, and then get into a highly selective college and graduate school.  Leave nothing to chance, since some of your heirs are likely to be less talented and ambitious than those children who prove themselves against all odds by climbing the educational ladder.  When the higher education system opened up access after World War II, it made competition for the top tier of the system sharply higher, and the degree of competitiveness continued to increase as the proportion of students going to college grew to a sizeable majority.  As Jerome Karabel has noted in his study of elite college admissions, the American system of higher education does not equalize opportunity but it does equalize anxiety.[9]  It makes families at all levels of American society nervous about their ability to negotiate the system effectively, because it provides the only highway to the good life.

The American Meritocracy

The American system of education is formally meritocratic, but one of its social effects is to naturalize privilege.  This starts when a student’s academic merit is so central and so pervasive in schooling that it embeds itself within the individual person.  You start saying things like:  I’m smart.  I’m dumb.  I’m a good student.  I’m a bad student.  I’m good at reading but bad at math.  I’m lousy at sports.  The construction of merit is coextensive with the entire experience of growing up, and therefore it comes to constitute the emergent you.  It no longer seems to be something imposed by a teacher or a school but instead comes to be an essential part of your identity.  It’s now less what you do and increasingly who you are.  In this way, the systemic construction of merit begins to disappear and what’s left is a permanent trait of the individual.  You are your grade and your grade is your destiny.

The problem, however – as an enormous amount of research shows – is that the formal measures of merit that schools use are subject to powerful influence from a student’s social origins.  No matter how you measure merit, social origin affects your score.  It shapes your educational attainment.  It also shows up in measures that rank educational institutions by quality and selectivity.  Across the board, your parents’ social class has an enormous impact on the level of merit you are likely to acquire in school.  Students with higher social position end up accumulating a disproportionately large number of academic merit badges.

The correlations between socioeconomic status and school measures of merit are strong and consistent, and the causation is easy to determine.  Being born well has an enormously positive impact on the educational merit you acquire across your life.  Let us count the ways.  Economic capital is one obvious factor.  Wealthy communities can support better schools.  Social capital is another factor.  Families from the upper middle classes have a much broader network of relationships with the larger society than those from the working class, which provides a big advantage for their schooling prospects.  For them, the educational system is not foreign territory but feels like home.

Cultural capital is a third factor, and the most important of all.  School is a place that teaches students the cognitive skills, cultural norms, and forms of knowledge that are required for competent performance in positions of power.  Schools demonstrate a strong disposition toward these capacities over others:  mental over manual skills, theoretical over practical knowledge, decontextualized over contextualized perspectives, mind over body, Gesellschaft over Gemeinschaft.  Parents in the upper middle class are already highly skilled in these cultural capacities, which they deploy in their professional and managerial work on a daily basis.  Their children have grown up in the world of cultural capital.  It’s a language they learn to speak at home.  For working-class children, school is an introduction to a foreign culture and a new language, which unaccountably other students seem to already know.  They’re playing catchup from day one.  Also, it turns out that schools are better at rewarding cultural capital than they are at teaching it.  So kids from the upper middle class can glide through school with little effort while others continually struggle to keep up.  The longer they remain in school, the larger the achievement gap between the two groups.

In the wonderful world of academic merit, therefore, the fix is in.  Upper income students have a built-in advantage in acquiring the grades, credits, and degrees that constitute the primary prizes of the school meritocracy.  But – and this is the true magic of the educational process – the merits that these students accumulate at school come in a purified academic form that is independent of their social origins.  They may have entered schooling as people of privilege, but they leave it as people of merit.  They’re good students.  They’re smart.  They’re well educated.  As a result, they’re totally deserving of special access to the best jobs.  They arrived with inherited privilege but they leave with earned privilege.  So now they fully deserve what they get with their new educational credentials.

In this way, the merit structure of schooling performs a kind of alchemy.  It turns class position into academic merit.  It turns ascribed status into achieved status. You may have gotten into Harvard by growing up in a rich neighborhood with great schools and by being a legacy.  But when you graduate, you bear the label of a person of merit, whose future accomplishments arise alone from your superior abilities.  You’ve been given a second nature.

Consequences of Naturalized Privilege: The New Aristocracy

The process by which schools naturalize academic merit brings major consequences to the larger society.  The most important of these is that it legitimizes social inequality.  People who were born on third base get credit for hitting a triple, and people who have to start in the batter’s box face the real possibility of striking out.  According to the educational system, divergent social outcomes are the result of differences in individual merit, so, one way or the other, people get what they deserve.  The fact that a fraction of students from the lower classes manage against the odds to prove themselves in school and move up the social scale only adds further credibility to the existence of a real meritocracy.

In the United States in the last 40 years, we have come to see the broader implications of this system of status attainment through institutional merit.  It has created a new kind of aristocracy.  This is not Jefferson’s natural aristocracy, grounded in public accomplishments, but a caste of meritocratic privilege, grounded in the formalized and naturalized merit signaled by educational credentials.  As with aristocracies of old, the new meritocracy is a system of rule by your betters – no longer defined as those who are better born or more accomplished but now as those who are better educated.  Michael Young saw this coming back in 1958, when he predicted it in his fable, The Rise of the Meritocracy.[10]  But now we can see that it has truly taken hold.

The core expertise of this new aristocracy is skill in working the system.  You have to know how to play the game of educational merit-getting and pass this on to your children.  The secret is in knowing that the achievements that get awarded merit points through the process of schooling are not substantive but formal.  Schooling is not about learning the subject matter; it’s about getting good grades, accumulating course credits, and collecting the diploma on the way out the door.  Degrees pay off, not what you learned in school or even the number of years of schooling you have acquired.  What you need to know is what’s going to be on the test and nothing else.  So you need to study strategically and spend a lot of effort working the refs.  Give teacher what she wants and be sure to get on her good side.  Give the college admissions officers the things they are looking for in your application.  Pump up your test scores with coaching and by learning how to game the questions.

Members of the new aristocracy are particularly aggressive about carrying out a strategy known as opportunity hoarding.  There is no academic advantage too trivial to pursue, and the number of advantages you accumulate can never be enough.  In order to get your children into the right selective college you need to send them to the right school, get them into the gifted program in elementary school and the right track in high school, hire a tutor, carry out test prep, do the college tour, pursue prizes, develop a well-rounded resume for the student (sports, student leadership, musical instrument, service), pull strings as a legacy and a donor, and on and on and on.

As we saw earlier, such behavior by upper-middle-class parents is not as crazy as it seems.  The problem with being at the top is that there’s nowhere to go but down.  The system is just meritocratic enough to keep the most privileged families on edge, worried about having their child bested by a smart poor kid.  Again, as Karabel put it, the only thing U.S. education equalizes is anxiety.

As with earlier aristocracies, the new aristocrats of merit cluster together in the same communities, where the schools are like no other.  Their children attend the same elite colleges, where they meet their future mates and then transmit their combined cultural, social, and economic capital in concentrated form to their children, a process sociologists call assortative mating.  And one consequence of this increased concentration of educational resources is that the achievement gap between low-income and high-income students has been rising; Sean Reardon’s study shows the gap growing by 40 percent in the last quarter of the twentieth century.  This is how educational and social inequality grows larger over time.

By assuming the form of meritocracy, schools have come to play a central role in defining the character of modern society.  In the process they have served to increase social opportunity while also increasing social inequality.  At the same time, they have established a solid educational basis for the legitimacy of this new inequality, and they have fostered the development of a new aristocracy of educational merit whose economic power, social privilege, and cultural cohesion would be the envy of the high nobility in early modern England or France.  Now, as then, the aristocracy assumes its outsized social role as a matter of natural right.

 

References

Community College Research Center. (2015). Community College FAQs. Teachers College, Columbia University. http://ccrc.tc.columbia.edu/Community-College-FAQs.html (accessed 8-3-15).

Geiger, Roger L. (2004). To Advance Knowledge: The Growth of American Research Universities, 1900-1940. New Brunswick: Transaction.

Karabel, Jerome. (2005). The Chosen: The Hidden History of Admission and Exclusion at Harvard, Yale, and Princeton. New York: Mariner Books.

National Center for Education Statistics. (2014). Digest of Education Statistics, 2013. Washington, DC: US Government Printing Office.

Pell Institute and PennAHEAD. (2015). Indicators of Higher Education Equity in the United States (2015 revised edition). Philadelphia: The Pell Institute for the Study of Opportunity in Higher Education and the University of Pennsylvania Alliance for Higher Education and Democracy (PennAHEAD). http://www.pellinstitute.org/publications-Indicators_of_Higher_Education_Equity_in_the_United_States_45_Year_Report.shtml (accessed 8-10-15).

Pew Charitable Trusts Economic Mobility Project. (2012). Pursuing the American Dream: Economic Mobility Across Generations. Washington, DC: Pew Charitable Trusts. http://www.pewtrusts.org/en/research-and-analysis/reports/0001/01/01/pursuing-the-american-dream (accessed 8-10-15).

Riesman, David.  (1958).  The Academic Procession.  In Constraint and variety in American education.  Garden City, NY:  Doubleday.

U.S. News and World Report. (2015). National Universities Rankings.  http://colleges.usnews.rankingsandreviews.com/best-colleges/rankings/national-universities (accessed 4-28-15).

Young, Michael D. (1958). The Rise of the Meritocracy, 1870-2023.  New York:  Random House.

 

[1] U.S. News (2015).

[2] Riesman (1958).

[3] Geiger (2004), 270.

[4] Pell (2015), p. 31.

[5] Pell (2015), p. 31.

[6] NCES (2014), table 326.20.

[7] CCRC (2015).

[8] Pew (2012), figure 3.

[9] Karabel (2005), p. 547.

[10] Young (1958).


Class on the History of Higher Education in the U.S.

This post contains all of the material for the class on the History of Higher Education in the US that I taught at the Stanford Graduate School of Education for the last 15 years.  In retirement I wanted to make the course available on the internet to anyone who is interested.  If you are a college teacher, feel free to use any of it in whole or in part.  If you are a student or a group of students, you can work your way through the class on your own at your own pace.  Any benefits that accrue are purely intrinsic, since no one will get college credits.  But that also means you’re free to pursue the parts of the class that you want and you don’t have any requirements or papers.  How great is that.

I’m posting the full syllabus below.  But it would be more useful to get it as a Word document through this link.  Feel free to share it with anyone you like.

All of the course materials except three required books are embedded in the syllabus through hyperlinks to a Google drive.  For each week, the syllabus includes a link to tips for approaching the readings, links to the PDFs of the readings, and a link to the slides for that week’s class.  Slides also include links to additional sources.  So the syllabus is all that is needed to gain access to the full class.

I hope you find this useful.

 

History of Higher Education in the U.S.

A 10-Week Class

David Labaree

Web: http://www.stanford.edu/~dlabaree/

Twitter: @Dlabaree

Blog: https://davidlabaree.com/

Course Description

This course provides an introductory overview of the history of higher education in the United States.  We will start with Perkin’s account of the world history of the university, and two chapters from my book about the role of the market in shaping the history of American higher education and the pressure from consumers to have college provide both social access and social advantage.  In week two, we examine an overview of the history of American college and university in the 18th and 19th centuries from John Thelin, and my chapter on the emerging nature of the college system.  In week three, we focus on the rise of the university in the latter part of the 19th century using two more chapters from Thelin, and my own chapter on the subject.  In week four, we read a series of papers around the issue of access to higher education, showing how colleges for many years sought to repel or redirect the college aspirations of women, blacks, and Jews.  In week five, we examine the history of professional education, with special attention to schools of business, education, and medicine.  In week six, we read several chapters from Donald Levine’s book about the rise of mass higher education after World War I, my piece about the rise of community colleges, and more from Thelin.  In week seven, we look at the surge of higher ed enrollments after World War II, drawing on pieces by Rebecca Lowen, Roger Geiger, Thelin, and Labaree.  In week eight, we look at the broadly accessible full-service regional state university, drawing on Alden Dunham, Thelin, Lohmann, and my chapter on the relationship between the public and private sector.  In week nine, we read a selection of chapters from Jerome Karabel’s book about the struggle by elite universities to stay on top of a dynamic and expanding system of higher education.  And in week 10, we step back and try to get a fix on the evolved nature of the American system of higher education, drawing on work by Mitchell Stevens and the concluding chapters of my book.

Like every course, this one is not a neutral survey of all possible perspectives on the domain identified by the course title; like every course, this one has a point of view.  This point of view comes through in my book manuscript that we’ll be reading in the course.  Let me give you an idea of the kind of approach I will be taking.

The American system of higher education is an anomaly.  In the twentieth century it surged past its European forebears to become the dominant system in the world – with more money, talent, scholarly esteem, and institutional influence than any of the systems that served as its models.  By all rights, this never should have happened.  Its origins were remarkably humble: a loose assortment of parochial nineteenth-century liberal-arts colleges, which emerged in the pursuit of sectarian expansion and civic boosterism more than scholarly distinction.  These colleges had no academic credibility, no reliable source of students, and no steady funding.  Yet these weaknesses of the American system in the nineteenth century turned out to be strengths in the twentieth.  In the absence of strong funding and central control, individual colleges had to learn how to survive and thrive in a highly competitive market, in which they needed to rely on student tuition and alumni donations and had to develop a mode of governance that would position them to pursue any opportunity and cultivate any source of patronage.  As a result, American colleges developed into an emergent system of higher education that was lean, adaptable, autonomous, consumer-sensitive, self-supporting, and radically decentralized.  This put the system in a strong position to expand and prosper when, before the turn of the twentieth century, it finally got what it was most grievously lacking:  a surge of academic credibility (when it assumed the mantle of scientific research) and a surge of student enrollments (when it became the pipeline to the middle class).  This course is an effort to understand how a system that started out so badly turned out so well – and how its apparently unworkable structure is precisely what makes the system work.

That’s an overview of the kind of argument I will be making about the history of higher education.  But you should feel free to construct your own, rejecting mine in part or in whole.  The point of this class, like any class, is to encourage you to try on a variety of perspectives as part of the process of developing your own working conceptual framework for understanding the world.  I hope you will enjoy the ride.

Readings

Books:  We will be reading the following books:

Thelin, John R. (2011). A history of American higher education, 2nd ed. Baltimore: Johns Hopkins University Press.

Labaree, David F. (2017). A perfect mess: The unlikely ascendancy of American higher education.  Chicago: University of Chicago Press.

Karabel, Jerome. (2005). The chosen: The hidden history of admission and exclusion at Harvard, Yale, and Princeton. New York: Houghton Mifflin Harcourt.

Supplementary Resources:  There is a terrific online archive of primary and secondary readings on higher education, which is a supplement to The History of Higher Education, 3rd ed., published by the Association for the Study of Higher Education (ASHE): http://www.pearsoncustom.com/mi/msu_ashe/.

Course Outline

Below are the topics we will cover, week by week, with the readings for each week.

Week 1

Introduction to course

Tips for week 1 readings

Labaree, David F. (2015). A system without a plan: Elements of the American model of higher education.  Chapter 1 in A perfect mess: The unlikely ascendancy of American higher education.

Labaree, David F. (2015). Balancing access and advantage.  Chapter 5 in A perfect mess: The unlikely ascendancy of American higher education.

Perkin, Harold. (1997). History of universities. In Lester F. Goodchild and Harold S. Wechsler (Eds.), ASHE reader on the history of higher education, 2nd ed. (pp. 3-32). Boston: Pearson Custom Publishing.

Class slides for week 1

Week 2

Overview of the Early History of Higher Education in the U.S.

Tips for week 2 readings

Thelin, John R. (2011). A history of American higher education, 2nd ed. Baltimore: Johns Hopkins University Press (introductory essay and chapters 1-3).

Labaree, David F. (2015). Unpromising roots:  The ragtag college system in the nineteenth century.  Chapter 2 in A perfect mess: The unlikely ascendancy of American higher education.

Class slides for week 2

Week 3

Roots of the Growth of the University in the Late 19th and Early 20th Century

Tips for week 3 readings

Thelin, John R. (2011). A history of American higher education, 2nd ed. Baltimore: Johns Hopkins University Press (chapters 4-5).

Labaree, David F. (2015). Adding the pinnacle and keeping the base: The graduate school crowns the system, 1880-1910.  Chapter 3 in A perfect mess: The unlikely ascendancy of American higher education.

Labaree, David F. (1995).  Foreword. In David K. Brown, Degrees of control: A sociology of educational expansion and occupational credentialism. New York: Teachers College Press.

Class slides for week 3

Week 4

Educating and Not Educating the Other:  Blacks, Women, and Jews

Tips for week 4 readings

Wechsler, Harold S. (1997).  An academic Gresham’s law: Group repulsion as a theme in American higher education. In Lester F. Goodchild and Harold S. Wechsler (Eds.), ASHE reader on the history of higher education, 2nd ed. (pp. 416-431). Boston: Pearson Custom Publishing.

Anderson, James D. (1997).  Training the apostles of liberal culture: Black higher education, 1900-1935. In Lester F. Goodchild and Harold S. Wechsler (Eds.), ASHE reader on the history of higher education, 2nd ed. (pp. 432-458). Boston: Pearson Custom Publishing.

Gordon, Lynn D. (1997).  From seminary to university: An overview of women’s higher education, 1870-1920. In Lester F. Goodchild and Harold S. Wechsler (Eds.), ASHE reader on the history of higher education, 2nd ed. (pp. 473-498). Boston: Pearson Custom Publishing.

Class slides for week 4

Week 5

History of Professional Education

Tips for week 5 readings

Brubacher, John S. and Rudy, Willis. (1997). Professional education. In Lester F. Goodchild and Harold S. Wechsler (Eds.), ASHE reader on the history of higher education, 2nd ed. (pp. 379-393). Boston: Pearson Custom Publishing.

Bledstein, Burton J. (1976). The culture of professionalism. In The culture of professionalism: The middle class and the development of higher education in America (pp. 80-128). New York:  W. W. Norton.

Labaree, David F. (2015). Mutual subversion: The liberal and the professional. Chapter 4 in A perfect mess: The unlikely ascendancy of American higher education.

Starr, Paul. (1984). Transformation of the medical school. In Social transformation of American medicine (pp. 112-127). New York: Basic.

Class slides for week 5

Week 6

Emergence of Mass Higher Education

Tips for week 6 readings

Levine, David O. (1986).  The American college and the culture of aspiration, 1915-1940. Ithaca: Cornell University Press.  Read introduction and chapters 3, 4, and 8.

Thelin, John R. (2011). A history of American higher education, 2nd ed. Baltimore: Johns Hopkins University Press (chapter 6).

Labaree, David F. (1997). The rise of the community college: Markets and the limits of educational opportunity.  In How to succeed in school without really learning:  The credentials race in American education (chapter 8, pp. 190-222). New Haven: Yale University Press.

Class slides for week 6

Week 7

The Huge Surge of Higher Education Expansion after World War II

Tips for week 7 readings

Thelin, John R. (2011). A history of American higher education, 2nd ed. Baltimore: Johns Hopkins University Press (chapter 7).

Geiger, Roger. (2004). University advancement from the postwar era to the 1960s. In Research and relevant knowledge: American research universities since World War II (chapter 5, pp. 117-156).  Read the first half of the chapter, which focuses on the rise of Stanford.

Lowen, Rebecca S. (1997). Creating the cold war university: The transformation of Stanford. Berkeley: University of California Press.  Introduction and Chapters 5 and 6.

Labaree, David F. (2015). Learning to love the bomb: America’s brief cold-war fling with the university as a public good. Chapter 7 in A perfect mess: The unlikely ascendancy of American higher education.

Class slides for week 7

Week 8

Populist, Practical, and Elite:  The Diversity and Evolved Institutional Character of the Full-Service American University

Tips for week 8 readings

Thelin, John R. (2011). A history of American higher education, 2nd ed. Baltimore: Johns Hopkins University Press (chapter 8).

Dunham, Edgar Alden. (1969). Colleges of the forgotten Americans: A profile of state colleges and universities. New York: McGraw Hill (introduction, chapters 1-2).

Lohmann, Suzanne. (2006). The public research university as a complex adaptive system. Unpublished paper, University of California, Los Angeles.

Labaree, David F. (2015). Private advantage, public impact. Chapter 6 in A perfect mess: The unlikely ascendancy of American higher education.

Class slides for week 8

Week 9

The Struggle by Elite Universities to Stay on Top

Tips for week 9 readings

Karabel, Jerome. (2005). The chosen: The hidden history of admission and exclusion at Harvard, Yale, and Princeton. New York: Houghton Mifflin Harcourt.  Read introduction and chapters 2, 4, 9, 12, 13, 17, and 18.

Class slides for week 9

Week 10

Conclusions about the American System of Higher Education

Tips for week 10 readings

Stevens, Mitchell L., Armstrong, Elizabeth A., & Arum, Richard. (2008). Sieve, incubator, temple, hub: Empirical and theoretical advances in the sociology of higher education. Annual Review of Sociology, 34, 127-151.

Labaree, David F. (2015). Upstairs, downstairs: Relations between the tiers of the system. Chapter 8 in A perfect mess: The unlikely ascendancy of American higher education.

Labaree, David F. (2015). A perfect mess. Chapter 9 in A perfect mess: The unlikely ascendancy of American higher education.

Class slides for week 10

 

Guidelines for Critical Reading

Whenever you set out to do a critical reading of a particular text (a book, article, speech, proposal, conference paper), you need to use the following questions as a framework to guide you as you read:

  1. What’s the point? This is the analysis/interpretation issue: what is the author’s angle?
  2. What’s new? This is the value-added issue: What does the author contribute that we don’t already know?
  3. Who says? This is the validity issue: On what (data, literature) are the claims based?
  4. Who cares? This is the significance issue, the most important issue of all, the one that subsumes all the others: Is this work worth doing?  Is the text worth reading?  Does it contribute something important?

Guidelines for Analytical Writing

In writing papers for this (or any) course, keep in mind the following points.  They apply in particular to the longer papers, but most of the same concerns apply to critical reaction papers as well.

  1. Pick an important issue: Make sure that your analysis meets the “so what” test. Why should anyone care about this topic, anyway?  Pick an issue or issues that matter and that you really care about.

 

  2. Keep focused: Don’t lose track of the point you are trying to make and make sure the reader knows where you are heading and why.

 

  3. Aim for clarity: Don’t assume that the reader knows what you’re talking about; it’s your job to make your points clearly.  In part this means keeping focused and avoiding distracting clutter.  But in part it means that you need to make more than elliptical references to concepts and sources or to professional experience.  When referring to readings (from the course or elsewhere), explain who said what and why this point is pertinent to the issue at hand.  When drawing on your own experiences or observations, set the context so the reader can understand what you mean.  Proceed as though you were writing for an educated person who is neither a member of this class nor a professional colleague, someone who has not read the material you are referring to.

 

  4. Provide analysis: A good paper is more than a catalogue of facts, concepts, experiences, or references; it is more than a description of the content of a set of readings; it is more than an expression of your educational values or an announcement of your prescription for what ails education.  A good paper is a logical and coherent analysis of the issues raised within your chosen area of focus.  This means that your paper should aim to explain rather than describe.  If you give examples, be sure to tell the reader what they mean in the context of your analysis.  Make sure the reader understands the connection between the various points in your paper.

 

  5. Provide depth, insight, and connections: The best papers are ones that go beyond making obvious points, superficial comparisons, and simplistic assertions.  They dig below the surface of the issue at hand, demonstrating a deeper level of understanding and an ability to make interesting connections.

 

  6. Support your analysis with evidence: You need to do more than simply state your ideas, however informed and useful these may be.  You also need to provide evidence that reassures the reader that you know what you are talking about, thus providing a foundation for your argument.  Evidence comes in part from the academic literature, whether encountered in this course or elsewhere.  Evidence can also come from your own experience.  Remember that you are trying to accomplish two things with the use of evidence.  First, you are saying that it is not just you making this assertion but that authoritative sources and solid evidence back you up.  Second, you are supplying a degree of specificity and detail, which helps to flesh out an otherwise skeletal argument.

 

  7. Draw on course materials (this applies primarily to reaction papers, not the final paper). Your paper should give evidence that you are taking this course.  You do not need to agree with any of the readings or presentations, but your paper should show you have considered the course materials thoughtfully.

 

  8. Recognize complexity and acknowledge multiple viewpoints. The issues in the history of American education are not simple, and your paper should not propose simple solutions to complex problems. It should not reduce issues to either/or, black/white, good/bad.  Your paper should give evidence that you understand and appreciate more than one perspective on an issue.  This does not mean you should be wishy-washy.  Instead, you should aim to make a clear point by showing that you have considered alternate views.

 

  9. Challenge assumptions. The paper should show that you have learned something by doing this paper. There should be evidence that you have been open to changing your mind.

 

  10. Do not overuse quotation: In a short paper, long quotations (more than a sentence or two in length) are generally not appropriate.  Even in longer papers, quotations should be used sparingly unless they constitute a primary form of data for your analysis.  In general, your paper is more effective if written primarily in your own words, using ideas from the literature but framing them in your own way in order to serve your own analytical purposes.  However, selective use of quotations can be very useful as a way of capturing the author’s tone or conveying a particularly aptly phrased point.

 

  11. Cite your sources: You need to identify for the reader where particular ideas or examples come from.  This can be done through in-text citation:  Give the author’s last name, publication year, and (in the case of quotations) page number in parentheses at the end of the sentence or paragraph where the idea is presented — e.g., (Kliebard, 1986, p. 22); provide the full citations in a list of references at the end of the paper.  You can also identify sources with footnotes or endnotes:  Give the full citation for the first reference to a text and a short citation for subsequent citations to the same text.  (For critical reaction papers, you only need to give the short cite for items from the course reading; other sources require full citations.)  Note that citing a source is not sufficient to fulfill the requirement to provide evidence for your argument.  As spelled out in #6 above, you need to transmit to the reader some of the substance of what appears in the source cited, so the reader can understand the connection with the point you are making and can have some meat to chew on.  The best analytical writing provides a real feel for the material and not just a list of assertions and citations.  Depth, insight, and connections count for more than a superficial collection of glancing references.  In other words, don’t just mention an array of sources without drawing substantive points and examples from these sources; and don’t draw on ideas from such sources without identifying the ones you used.

 

  12. Take care in the quality of your prose: A paper that is written in a clear and effective style makes a more convincing argument than one written in a murky manner, even when both writers start with the same basic understanding of the issues.  However, writing that is confusing usually signals confusion in a person’s thinking.  After all, one key purpose of writing is to put down your ideas in a way that permits you and others to reflect on them critically, to see if they stand up to analysis.  So you should take the time to reflect on your own ideas on paper and revise them as needed.  You may want to take advantage of the opportunity in this course to submit a draft of the final paper, revise it in light of comments, and then resubmit the revised version.  This, after all, is the way writers normally proceed.  Outside of the artificial world of the classroom, writers never turn in their first draft as their final statement on a subject.

  

Posted in Higher Education, Meritocracy, Uncategorized

Daniel Markovits on “The Meritocracy Trap”

In this post, which I just wrote, I look at the arguments in the new book by Daniel Markovits.  It crystallizes a lot of the issues in the current debate about meritocracy and advances the argument in ways I hadn’t considered before.  This is not a review of the book but a teaser to get you to read it for yourself.  In it I single out some of his key points and give you some of my favorite quotes from the book.  Enjoy.

Markovits Cover

Daniel Markovits on The Meritocracy Trap

In the last year or two, the media have been filled with critiques of the American meritocracy (e.g., here, here, and here).  It’s about time this issue got the critical attention it deserves, since the standard account has long been that the only problem with the meritocracy is that it’s not meritocratic enough.  Thus the Varsity Blues college admissions soap opera that has been playing in the press for months now, another case of rich people buying privileged access to credentials they haven’t earned the hard way.  That’s an old story of jumping the line and cutting in front of the truly worthy.

But in his new book, The Meritocracy Trap, Daniel Markovits makes a more complex, more interesting, and ultimately more damning critique.  The problem with meritocracy, he says, lies at its very core and not just in its slipshod implementation.  It’s a destructive force in modern society, which puts people in the lower 99 percent at a severe disadvantage in the pursuit of social mobility and a good life.  But – and this is the less familiar part – it is also damaging to the people in the top group who gain the most financial and social benefits from it.  It’s a trap for both groups, and both would be better off without it.

In this post, I want to walk through key parts of the book’s argument and present some of my favorite quotes.  Markovits, a professor at Yale Law School, is a very effective writer, and the story he tells is not only largely compelling but also compulsively quotable.  I hope this teaser will convince you to read the book, which is available in the usual places and also in pirated editions online.

Here’s how he sets up the argument:

Common usage often conflates meritocracy with equality of opportunity. But although meritocracy was embraced as the handmaiden of equality of opportunity, and did open up the elite in its early years, it now more nearly stifles than fosters social mobility. The avenues that once carried people from modest circumstances into the American elite are narrowing dramatically. Middle-class families cannot afford the elaborate schooling that rich families buy, and ordinary schools lag farther and farther behind elite ones, commanding fewer resources and delivering inferior educations. Even as top universities emphasize achievement rather than breeding, they run admissions competitions that students from middle-class backgrounds cannot win, and their student bodies skew dramatically toward wealth. Meritocratic education now predominantly serves an elite caste rather than the general public.

Meritocracy similarly transforms jobs to favor the super-educated graduates that elite universities produce, so that work extends and compounds inequalities produced in school. Competence and an honest work ethic no longer assure a good job. Middle-class workers, without elite degrees, face discrimination all across a labor market that increasingly privileges elaborate education and extravagant training.

The meritocracy thus works at two levels, a hyper-intense winner-take-all competition to get the very best education in an extremely stratified system of schooling coupled with a similarly intense competition in the elite sector of the workforce for the positions at the very top with the most extraordinary financial and social rewards.

This system obviously gives a huge advantage to students who bring the cultural, social, and economic capital that comes from being the children of those who are already in the elite sector.  That, as I said, is an old story; no surprise there.  But he also shows the price paid by the group at the top.  As he puts it,

the rich and the rest are entangled in a single, shared, and mutually destructive economic and social logic. Their seemingly opposite burdens are in fact two symptoms of a shared meritocratic disease. Meritocratic elites acquire their caste through processes that ruthlessly exclude most Americans and, at the same time, mercilessly assault those who do go through them. The powerfully felt but unexplained frustrations that mar both classes—unprecedented resentment among the middle class and inscrutable anxiety among the elite—are eddies in a shared stream, drawing their energies from a single current.

Markovits notes that “For virtually all of human history, income and industry have charted opposite courses.”  The rich were idle, living off the land and off the labor of others.  The poor were the workhorses of the economy.  But today,

High society has reversed course. Now it valorizes industry and despises leisure. As every rich person knows, when an acquaintance asks “How are you?” the correct answer is “So busy.” The old leisure class would have thought this a humiliating admission. The working rich boast that they are in demand.

The result is that, in a dramatic historical reversal, meritocrats at the top of the workforce now work longer hours than the middle or working classes.

In 1940, a typical worker in the bottom 60 percent worked nearly four (or 10 percent) more weekly hours than a typical worker in the top 1 percent. By 2010, the low-income worker devoted roughly twelve (or 30 percent) fewer hours to work than the high-income worker. Taken together, these trends shift the balance of ordinary to elite labor by nearly sixteen hours—or two regulation workdays—per week.

What’s going on here is that in the new meritocracy, top positions go to people who prove their worth not only by accumulating the most highly credentialed skills in school but by demonstrating the greatest dedication to the job.  The days of bankers’ hours and white-shoe law firms, with genteel professionals working at a relaxed aristocratic rate, are gone.  Take the case of lawyers, which Markovits knows best:

In 1962 (when elite lawyers earned a third of what they do today), the American Bar Association could confidently declare that “there are . . . approximately 1300 fee-earning hours per year” available to the normal lawyer. Today, by contrast, a major law firm pronounces with equal confidence that a quota of 2,400 billable hours “if properly managed” is “not unreasonable,” which is a euphemism for “necessary for having a hope of making partner.” Billing 2,400 hours requires working from 8 a.m. until 8 p.m., six days a week, without vacation or sick days, every week of the year. Graduates of elite law schools join law firms that commonly require associates and even partners to work sixty-, eighty-, and even hundred-hour weeks.

The issue is that the meritocrats are claiming the top rewards not as owners of property but as workers using their own human capital.  “Unlike land or factories, human capital can produce income—at least using current technologies—only by being mixed with its owners’ own contemporaneous labor.”  In order to win the competition, they need to exploit their own labor.

People who are required to measure up from preschool through retirement become submerged in the effort. They become constituted by their achievements, so that eliteness goes from being something that a person enjoys to being everything that he is. In a mature meritocracy, schools and jobs dominate elite life so immersively that they leave no self over apart from status. An investment banker, enrolled as a two-year-old in the Episcopal School and then passed on to Dalton, Princeton, Morgan Stanley, Harvard Business School, and finally to Goldman Sachs (where he spends his income on sending his children to the schools that he once attended), becomes this résumé, in the minds of others and even in his own imagination.

As a result,

Meritocratic inequality might free the rich in consumption, but it enslaves them in production….  A person who lives like this places himself, quite literally, at the disposal of others—he uses himself up….  The elite, acting now as rentiers of their own human capital, exploit themselves, becoming not just victims but also agents of their own alienation.

Of course, it’s hard to feel sorry for the people who win this competition, since their rewards are so over the top.

David Rockefeller received a salary of about $1.6 million (in 2015 dollars) when he became chairman of Chase Manhattan Bank in 1969, which amounted to roughly fifty times a typical bank teller’s income. Last year Jamie Dimon, who runs JPMorgan Chase today, received a total compensation of $29.5 million, which is over a thousand times as much as today’s banks pay typical tellers.

So no one says, “Poor Jamie Dimon.”  But one fundamental consequence of the long work hours of the new elite is that they help justify their high rewards.  Not only are they better educated than you are, they also work harder than you do.  So how are you supposed to cry foul about where you ended up in life?  Not simply by cashing in on their credentials but also by exploiting their own human capital, they provide the meritocracy with iron-clad legitimacy.

To make matters worse, meritocracy—precisely because it justifies economic inequalities and disguises class—denies ordinary Americans any high-minded language through which to explain and articulate the harms and wrongs of their increasing…. They become “victims without a language of victimhood.”

Markovits also connects the rise of meritocracy and the anxieties it foments to the politics of the Trump era.

Meritocracy is therefore far from innocent in the recent rise of nativism and populism. Instead, nativism and populism represent a backlash against meritocratic inequality brought on by advanced meritocracy. Nativism and populism express the same ideological and psychological forces behind the epidemic of addiction, overdose, and suicide that has lowered life expectancy in the white working and middle class.

The contrast with Obama is instructive: “Obama—a superordinate product of elite education—embodied meritocracy’s triumph. Trump—‘a blue-collar billionaire’ who announces ‘I love the poorly educated’ and openly opposes the meritocratic elite—exploits meritocracy’s enduring discontents.”  As he observes, “False prophets gain a foothold…because deeply discontented people care—often most and always first—about being heard and not just being helped. They will cling to the only ship that acknowledges the storm.”

Posted in Higher Education, History, History of education

Q and A about A Perfect Mess

This is a Q and A I did with Scott Jaschik about my book, A Perfect Mess, shortly after it came out.  It was published in Inside Higher Ed in 2017.

‘A Perfect Mess’

Author discusses his new book about American higher education, which suggests it may be better off today than people realize … because it has always faced so many problems and has always been a “hustler’s paradise.”

By Scott Jaschik
May 3, 2017

 

David F. Labaree’s new book makes a somewhat unusual argument to reassure those worried about the future of American higher education. Yes, it has many serious problems, he writes. But it always has and always will. And that is in fact a strength of American higher education, he argues.

Labaree, a professor of education at Stanford University, answered questions via email about his new book, A Perfect Mess: The Unlikely Ascendancy of American Higher Education (University of Chicago Press).

Q: What do you consider the uniquely American qualities in the development of higher education in the U.S.?

A: The American system of higher education emerged in a unique historical setting in the early 19th century, when the state was weak, the market strong and the church divided. Whereas the European university was the creature of the medieval Roman Catholic church and then grew strong under the rising nation-state in the early modern period, the American system lacked the steady support of church or state and had to rely on the market in order to survive. This posed a terrible problem in the 19th century, as colleges had to scrabble around looking for consumers who would pay tuition and for private sponsors who would provide donations. But at the same time, it planted the seeds of institutional autonomy that came to serve the system so well in the next two centuries. Free from the control of church and state, individual colleges learned to survive on their own resources by meeting the needs of their students and their immediate communities.

By the 20th century, this left the system with the proven ability to adapt to circumstances, take advantage of opportunities, build its own sources of political and economic support, and expand to meet demand. Today the highest-rated universities in global rankings of higher education institutions are the ones with the greatest autonomy, in particular as measured by being less dependent on state funds. And American institutions dominate these rankings; according to the Shanghai rankings, they account for 16 of the top 20 universities in the world.

Q: How significant is the decentralization of American higher education, with public and private systems, and publics reflecting very different traditions in different states?

A: Decentralization has been a critically important element in the American system of higher education. The federal government never established a national university, and state governments were slow in setting up their own colleges because of lack of funds. As a result, unlike anywhere else in the world, private colleges in the U.S. emerged before the publics. They were born as not-for-profit corporations with state charters but with little public funding and no public control.

The impulse for founding these colleges had little to do with advancing higher learning. Instead founders established these institutions primarily to pursue two other goals — to promote the interests of one religious denomination over others and to make land in one town more attractive to buy than land in a neighboring town. Typically, these two aims came together. Developers would donate land and set up a college, seek affiliation with a church, and then use this college as a way to promote their town as a cultural center rather than a dusty agricultural village. Remember that early America had too much land and not enough buyers. The federal government was giving it away. This helps explain why American colleges arose in the largest numbers in the sparsely populated frontier rather than in the established cities in the East — why Ohio had so many more colleges than Massachusetts or Virginia.

Q: What eras strike you as those in which American higher education was most threatened?

A: American higher education was in the greatest jeopardy in the period after the Civil War. The system was drastically overbuilt. In 1880 the U.S. had more than 800 colleges, five times the number in the entire continent of Europe. Overall, however, they were poor excuses for institutions of higher education. On average they had only 130 students and 10 faculty, which made them barely able to survive from year to year, forced to defer faculty salaries and beg for donations. European visitors loved to write home about how intellectually and socially undistinguished they were.

The system as a whole had only one great asset — a huge amount of capacity. What it lacked, however, was sufficient students and academic credibility. Fortunately, both of these elements arose in the last quarter of the century to save the day. The rise of white-collar employment in the new corporations and government agencies created demand for people with strong cognitive, verbal and social skills, the kinds of things that students learn in school. And with the public high schools filling up with working-class students, the college became the primary way for middle-class families to provide their children with advantaged access to managerial and professional work. At the same time, the import of the German model of the university, with a faculty of specialized researchers sporting the new badge of merit, the Ph.D., offered American colleges and universities the possibility for academic stature that had so long eluded them. This steady flow of students and newfound academic distinction allowed the system to realize the potential embedded in its expansive capacity and autonomous structure.

Q: You seem to be suggesting not to worry too much about today’s problems, because higher education has always been a “perfect mess.” But are there issues that are notably worse today than in the past?

A: First, let me say a little about the advantages of the system’s messiness. In the next answer, I’ll address the problems facing the system today. The relative autonomy and decentralization of American higher education allow individual colleges and universities to find their own ways of meeting needs, finding supporters and making themselves useful. They can choose to specialize, focusing on particular parts of the market — by level of degree, primary consumer base, religious orientation or vocational function. Or, like the big public and private universities, they can choose to provide something for everyone. This makes for individual institutions that don’t have a clean organization chart, looking instead like what some researchers have called “organized anarchies.”

The typical university is in constant tension between autonomous academic departments, which control curriculum and faculty hiring and promotion, and a strong president, who controls funding and is responsible only to the lay board of directors who own the place. Also thrown into the mix are a jumble of independent institutes, research centers and academic programs that have emerged in response to a variety of funding opportunities and faculty initiatives. The resulting institution is a hustler’s paradise, driven by a wide array of entrepreneurial actors: faculty trying to pursue intellectual interests and forge a career; administrators trying to protect and enrich the larger enterprise; and donors and students who want to draw on the university’s rich resources and capitalize on association with its stellar brand. These actors are feverishly pursuing their own interests within the framework of the university, which lures them with incentives, draws strength from their complex interactions and then passes these benefits on to society.

Q: What do you see as the major challenges facing academic leaders today?

A: The biggest problem facing the American system of higher education today is how to deal with its own success. In the 19th century, very few people attended college, so the system was not much in the public spotlight. Burgeoning enrollments in the 20th century put the system center stage, especially when it became the expectation that most people should graduate from some sort of college. As higher education moved from being an option to becoming a necessity, it increasingly found itself under the kind of intense scrutiny that has long been directed at American schools.

Accountability pressure in the last three decades has reshaped elementary and secondary schooling, and now the accountability police are headed to the college campus. As with earlier iterations, this reform effort demands that colleges demonstrate the value that students and the public are getting for their investment in higher education. This is particularly the case because higher education is so much more expensive per student than schooling at lower levels. So how much of this cost should the public pay from tax revenues and how much debt should individual students take on?

The danger posed by this accountability pressure is that colleges, like the K-12 schools before them, will come under pressure to narrow their mission to a small number of easily measurable outcomes. Most often the purpose boils down to the efficient delivery of instructional services to students, which will provide them with good jobs and provide society with an expanding economy. This ignores the wide array of social functions that the university serves. It’s a laboratory for working on pressing social problems; a playpen for intellectuals to pursue whatever questions seem interesting; a repository for the knowledge needed to address problems that haven’t yet emerged; a zone of creativity and exploration partially buffered from the realm of necessity; and, yes, a classroom for training future workers. The system’s organizational messiness is central to its social value.

Posted in Credentialing, Higher Education, History, Meritocracy

Schooling the Meritocracy: How Schools Came to Democratize Merit, Formalize Achievement, and Naturalize Privilege

 

This is a new piece I recently wrote, based on a paper I presented last fall at the ISCHE conference in Berlin.  It’s part of a larger project that focuses on the construction of the American meritocracy, which is to say the new American aristocracy of credentials.

Schooling the Meritocracy:

How Schools Came to Democratize Merit, Formalize Achievement, and Naturalize Privilege

David F. Labaree

 

Merit is much in the news these days.  Controversy swirls around the central role that education plays in establishing who has the most merit and thus who gets the best job.  Parents are suing Harvard for purportedly admitting students based on ethnicity rather than academic achievement.  Federal prosecutors are indicting wealthy parents for trying to bribe their children’s way into the most selective colleges.  At the core of these debates is a concern about fairness.  To what extent does the social structure allow people to get what they deserve, based on individual merit rather than social power and privilege?  There’s nothing new about our obsession with establishing merit.  The ancient Greeks and Romans were as concerned with this issue as we are.  What is new, however, is that all the attention now is focused on schools as the great merit dispensers.

Modern systems of public schooling have transformed the concept of merit.  The premodern form of this quality was what Joseph Kett calls essential merit.  This represented a person’s public accomplishments, which were seen as a measure of character.  Such merit was hard won through good works and had to be defended vigorously, even if that meant engaging in duels.  The new kind of merit, which arose in the mid nineteenth century after the emergence of universal public schooling in the U.S., was what Kett calls institutional merit.  This you earned by attending school and advancing through the levels of academic attainment.  It became your personal property, which could not be challenged by others and which granted you privileges in accordance with the level of merit you acquired.

Here I examine three consequences of this shift from essential to institutional merit in the American setting.  First, this change democratized merit by making it, at least theoretically, accessible to anyone and not just the gentry, who in the premodern period had prime access to this reputational good.  Second, it formalized the idea of merit by turning it from a series of publicly visible and substantive accomplishments into the accumulation of the forms that schooling had to offer – grades, credits, and degrees.  Third, following from the first two, it served the social function of naturalizing the privileges of birth by transposing them into academic accomplishments.  The well born, through the medium of schooling, acquired a second nature that transformed ascribed status into achieved status.

 

Essential Merit

 

From the very start, the country’s Founding Fathers were obsessed with essential merit.  To twenty-first-century ears, the way they used the term sounds like what we might call character or honor or reputation.  Individuals enacted this kind of merit through public performances, and it referred not just to achievements in general but especially those that were considered most admirable for public figures.  This put a premium on taking on roles of public service more than private accomplishment and on contributing to the public good.  Such merit might come from demonstrating courage on the battlefield, sacrificing for the community in a position of public leadership, or achieving scientific or literary eminence.  Think Washington, Jefferson, Madison, Hamilton, and Franklin.  It extended well beyond simple self-aggrandizement, although it often spurred that among its most ardent suitors.  It was grounded in depth of achievement, but it also relied heavily on symbolism to underscore its virtue.

Merit was both an enactment and a display.  The most accomplished practitioner of essential merit in the revolutionary period was George Washington.  From his earliest days he sought to craft the iconic persona that has persisted to the present day.  His copybook in school was filled with 110 rules of civility that should govern public behavior.  He constructed a resume of public service that led inevitably from an officer in the colonial militia, to a representative to the Continental Congress, to commander in chief of the revolutionary army, and then to president.  A tall man in an era of short men, he would tower over a room of ordinary people, and he liked to demonstrate his physical strength and his prowess as an accomplished horseman.  This was a man with a strong sense of his reputation and of how to maintain it.  And he scored the ultimate triumph of essential merit in his last performance in public life, when he chose to step down from the presidency after two terms and return to Mount Vernon – Cincinnatus laying down his awesome powers and going back to the farm.

This kind of merit is what Jefferson meant when he referred to a “natural aristocracy,” arising in the fertile fields of the new world that were uncorrupted by the inheritance of office.  It represents the kinds of traits that made aristocracy a viable form of governance for so many years:  educating men of privilege to take on positions of public leadership, imbued with noblesse oblige, and armed with the skills to be effective in the role.  Merit was a powerful motivator for the Founding Fathers, a spur to emulation for the benefit of the community, a self-generating dynamic for hyper-accomplishment.  And it was a key source of their broad legitimacy as public leaders.

But essential merit also had its problems.  Although it left room for self-made men to demonstrate their merit – like Franklin and Hamilton – it was largely the preserve of men of leisure, born into the gentry, supported by a plantation full of slaves, and free to serve the public without having to worry about making a living; think Washington, Jefferson, and Madison.  When politics began the transition in the 1820s from the Federalist to the Jacksonian era, the air of aristocracy fit uncomfortably into the emerging democratic ethos that Tocqueville so expertly captured in Democracy in America.

Another problem was that essential merit brought with it unruly competition.   How much essential merit can crowd into a room before a fight breaks out?  How can everyone be a leader?  What happens if you don’t get the respect you think you earned?  One response, quite common at the time, was to engage in a duel.  If your reputation was maligned by someone and that person refused to retract the slur, then your honor compelled you to defend your reputation with your life.  Alexander Hamilton was but one casualty of this lethal side effect of essential merit.  Benedict Arnold is another case in point.  An accomplished military officer and Washington protégé, Arnold was doing everything right on the battlefield to demonstrate his merit.  But when he sought appointment as a major general, politics blocked his path.  This was a slight too much for him to bear.  Instead of a duel (who would he challenge, his mentor Washington?), he opted for treason, plotting to pass along to the British his command of the fort at West Point.  So the dynamic behind essential merit was a powerful driver for behavior that was both socially functional and socially destructive.

 

The Rise of Institutional Merit

 

By the second quarter of the nineteenth century, a new form of merit was arising in the new republic.  In contrast to the high-flown notion of essential merit, grounded in high accomplishment in public life and defended with your life, the new merit was singularly pedestrian.  It meant grades on a report card at school.  Hardly the stuff of stirring biographies.  These grades were originally labeled as measures of merit and demerit in academic work, recording what you did well and what you did badly.  Ironically, ground zero for this new system was Benedict Arnold’s old fort at West Point, which was now the location of the U.S. Military Academy.  The sum of your merits and demerits constituted your academic worth.  Soon the emerging common school system adopted the same mode of evaluation.

The sheer banality of the new merit offered real advantages.  Unlike its predecessor, it did not signal membership in an exclusive club accessible primarily to the well-born but instead arose from a system that governed an entire population within a school.  As a result, it was well suited to a more democratic political culture.  Also, it provided a stimulus sufficiently strong to promote productive competition among students for academic standing, but these marks on a report card were not really worth fighting over.

So institutional merit emerged as a highly functional construct for meeting the organizational needs of the new systems of public schooling that arose in the middle of nineteenth-century America.  What started out as a mechanism for motivating students in a classroom grew into a model for structuring an entire system of education.  Once the principle of ranking by individual achievement was established, it developed into a way of ranking groups of students within schools and then groups of schools within school systems.  The first innovation, as schools became larger and more heterogeneous in both age and ability, was to organize groups of students into homogeneous classrooms with others of the same age and ability.  If you performed with sufficient merit in one grade, you would be promoted with your peers at the end of the year into the next grade up the ladder.  If your merit was not up to standard, you would be held back to repeat the grade.  This allowed teachers to pitch instruction toward a group of students who were at roughly the same level of achievement and development.  It also created a more level playing field that allowed teachers to compare and rank the relative performance of students within the class, which they couldn’t do in a one-room schoolhouse with a wide array of ages and abilities.  So the invention of the grade also led to the invention of the metric that defines some students as above grade level and others as below.  Graded schooling was thus the foundation of the modern meritocracy.

The next step in the development of institutional merit was the erection of a graded system of schooling.  Students would start out in an elementary school for the lower grades, then gain promotion to a grammar school, and (by the end of the nineteenth century) move up to a high school for the final grades.  Entry at one level was dependent on successful completion of the level below.  A clear hierarchy of schooling emerged based on the new merit metric.  And it didn’t stop there.  High school graduation became the criterion for entry into college, and the completion of college became the requirement for entry into graduate school.  A single graded structure guided student progress through each individual school and across the entire hierarchy of schooling, serving as a rationalized and incremental ladder of meritocratic attainment leading from first grade through the most elevated levels of the system.

Consider some of the consequences of the emergence of this finely tuned machinery for arranging students by institutional merit.  When you have a measure of what average progress should look like – annual promotion to the next grade, and periodic promotion to the school at the next level – then you also have a clear measure of failure.  There were three ways for students to demonstrate failure within the system:  to be held back from promotion to the next grade; to be denied the diploma that demonstrated completion of a particular school level; to leave the system altogether at a particular point in the graded hierarchy.  Thus emerged the chronic problems of the new system – retardation and elimination.

A parallel challenge to the legitimacy of the merit structure occurred at the level of the school.  By the early twentieth century, the level of school became increasingly important in determining access to the best jobs.  As a particular level of schooling began to fill up, as happened to the high school in the first half of the twentieth century, that level of diploma became less able to provide invidious distinction.  For a high school graduate, this meant that the perceived quality of the school became an important factor in determining the relative merit of your degree compared with other high school graduates.  When college enrollments took off in the mid twentieth century and this level of the system emerged as the new zone of universal education, the value of a college degree likewise became dependent on the imputed merit of the institution granting it.  The result was a two-layered hierarchy of merit in the American educational system.  One layer was the formal graded system from first grade to graduate school.  The other was the informal ranking of institutions at the same formal level.  Both became critical in determining graduates’ level of institutional merit and their position in the queue for the best jobs.  Consider some of the consequences of the dominance of this new form of merit.

Democratizing Merit

As we saw, essential merit had a bias toward privilege.  The founding fathers who displayed the most merit were to the manor born.  They were free to exercise public service because of birth and wealth.  Yes, it was possible as well for an outsider to demonstrate essential merit, but it wasn’t easy.  Benjamin Franklin was sui generis, and even he acted less as a leader and more as a sage and diplomat.  Alexander Hamilton fought his way to the top, but he never lost his outsider status and ended up dying to defend his honor, which was hard-won but never fully secure.

What gives essential merit face validity is that it is based on what you have actually accomplished.  Your merit is your accomplishments.  That’s hard to beat as a basis for respect, but it’s also hard to attain.  Washington could prove himself as a military officer because his gentry status automatically qualified him to become an officer in the first place.  Jefferson became a political figure because that’s what men of his status did with themselves, and his election was assured.  As a result, what made this kind of merit so compelling is what also made it so difficult for anyone but the gentry to demonstrate.

So the move toward institutional merit radically opened up the possibility of attaining it.  It’s a system that applied to everyone – not just the people with special access but everyone in the common school classroom.  All students in the class could demonstrate their worth and earn the appropriate merits that measured that worth.  And everyone was measured on the same scale.  If essential merit was the measure of the member of the natural aristocracy, institutional merit was the measure of the citizen in a democracy.  You’ve got to love that part about it.

Another characteristic of institutional merit also made it distinctly democratic.  What it measured was neither intrinsically important nor deeply admirable.  It didn’t measure your valor in battle or your willingness to sacrifice for the public good; instead it reflected how many right answers you got on a weekly spelling test.  No big deal.

But what made this measure of merit so powerful for the average person was its implication.  It measured a trivial accomplishment in the confined academic world of the classroom, but it implied a bright future.  If essential merit measured your real accomplishment in the world, institutional merit offered a prediction of your future accomplishment.  It said, look out for this guy – he’s going to be somebody.  This is a major benefit that derives from the new measure.  Measuring how well you did a job is relatively easy, but predicting in advance how well you will do that job is a very big deal.

Does institutional merit really predict future accomplishment?  Do academic grades, credits, and degrees tell us how people will perform on the job?  Human capital theorists say yes: the skills acquired in school translate into greater productivity in the workforce.  Credentialing theorists say no:  the workforce rewards degrees by demanding them as prerequisites for getting a job, but this doesn’t demonstrate that what is learned in school helps a person in doing the job.  I lean toward the latter group, but for our purposes this debate doesn’t really matter.  As long as the job market treats academic merit as a predictor of job performance, then this form of merit serves as such.  Whether academic learning is useful on the job is irrelevant as long as the measures of academic merit are used to allocate people to jobs.  And a system that offers everyone in a community access to schools that will award them tokens of institutional merit gives everyone a chance to gain any social position.  That’s a very democratic measure indeed.

Formalizing Merit

Part of what makes institutional merit so democratic is that the measure itself is so abstract.  What it’s measuring is not concrete accomplishment – winning a battle or passing a law – but generic accomplishment on a standardized and decontextualized scale.  It’s a score from A to F or 1 to 100 or 0 to 4.  All of these scales are in use in American schools, but which you use doesn’t matter.  They’re all interchangeable.  All they tell us is how high or low an individual was rated on some academic task.  Then these individual scores are averaged together across a heterogeneous array of such tasks to compute a composite score that tells us – what?  The score says that overall, at the end of the class, you met academic expectations (for that class in that grade) at a high, medium, or low level, or that you failed to meet the minimum expectation at all.  And, if compared to the grades that fellow students received in the same class, it shows where your performance ranked with that of your peers.

It’s the sheer abstraction of this measure of merit that gives it so much power.  A verbal description of a student’s performance in the class would be a much richer way of understanding what she learned there:  In her biology class, Joanie demonstrated a strong understanding of heredity and photosynthesis but she had some trouble with the vascular system.  The problem is that this doesn’t tell you how she compares with her classmates or whether she will qualify to become a banker.  What helps with the latter is that she received a grade of B+ (3.3 on a 4.0 scale) and the class average was B.  The grade tells you much less but it means a lot more for her and her future.  Especially when it is combined with all of her other grades in classes across her whole career in high school, culminating in her final grade point average and a diploma.  It says, she’ll get into college, but it won’t be a very selective one.  She’ll end up in a middle-class job, but she won’t be a top manager.  In terms of her future, this is what really matters, not her mastery of photosynthesis.

In this way, institutional merit is part of the broad process of rationalization that arose with modernity.  It filters out all of the noise that comes from context and content and qualitative judgments and comes up with a quantitative measure that locates the individual as a point on a normal curve representing everyone in the cohort.  It shows where you rank and predicts where you’re headed.  It becomes a central part of the machinery of disciplinary power.

Naturalizing Privilege

Once merit became democratized and formalized, it also became naturalized.  The process of naturalization works like this.  Your merit is so central and so pervasive in a system of universal schooling that it embeds itself within the individual person.  You start saying things like:  I’m smart.  I’m dumb.  I’m a good student.  I’m a bad student.  I’m good at reading but bad at math.  I’m lousy at sports.  The construction of merit is coextensive with the entire experience of growing up, and therefore it comes to constitute the emergent you.  It no longer seems to be something imposed by a teacher or a school but instead comes to be an essential part of your identity.  It’s now less what you do and increasingly who you are.  In this way, the systemic construction of merit begins to disappear and what’s left is a permanent trait of the individual.  You are your grade and your grade is your destiny.

The problem, however – as an enormous amount of research shows – is that the formal measures of merit that schools use are subject to powerful influence from a student’s social origins.  No matter how you measure merit, social origin affects your score.  It shapes your educational attainment.  It also shows up in measures that rank educational institutions by quality and selectivity.  Across the board, your parents’ social class has an enormous impact on the level of merit you are likely to acquire in school.  Students from higher social positions end up accumulating a disproportionately large number of academic merit badges.

The correlations between socioeconomic status and school measures of merit are strong and consistent, and the causation is easy to determine.  Being born well has an enormously positive impact on the educational merit you acquire across your life.  Let us count the ways.  Economic capital is one obvious factor.  Wealthy communities can support better schools.  Social capital is another factor.  Families from the upper middle classes have a much broader network of relationships with the larger society than those from the working class, which provides a big advantage for their schooling prospects.  For them, the educational system is not foreign territory but feels like home.

Cultural capital is a third factor, and the most important of all.  School is a place that teaches students the cognitive skills, cultural norms, and forms of knowledge that are required for competent performance in positions of power.  Schools demonstrate a strong disposition toward these capacities over others:  mental over manual skills, theoretical over practical knowledge, decontextualized over contextualized perspectives, mind over body, Gesellschaft over Gemeinschaft.  Parents in the upper middle class are already highly skilled in these cultural capacities, which they deploy in their professional and managerial work on a daily basis.  Their children have grown up in the world of cultural capital.  It’s a language they learn to speak at home.  For working-class children, school is an introduction to a foreign culture and a new language, which, unaccountably, other students already seem to know.  They’re playing catch-up from day one.  Also, it turns out that schools are better at rewarding cultural capital than they are at teaching it.  So kids from the upper middle class can glide through school with little effort while others continually struggle to keep up.  The longer they remain in school, the larger the achievement gap between the two groups.

So, in the wonderful world of academic merit, the fix is in.  Upper-income students have a built-in advantage in acquiring the grades, credits, and degrees that constitute the primary prizes of the school meritocracy.  But – and this is the true magic of the educational process – the merits that these students accumulate at school come in a purified academic form that is independent of their social origins.  They may have entered schooling as people of privilege, but they leave it as people of merit.  They’re good students.  They’re smart.  They’re well educated.  As a result, they’re totally deserving of special access to the best jobs.  They arrive with inherited privilege but leave with earned privilege.  So now they fully deserve what they get with their new educational credentials.

In this way, the merit structure of schooling performs a kind of alchemy.  It turns class position into academic merit.  It turns ascribed status into achieved status.  You may have gotten into Harvard by growing up in a rich neighborhood with great schools and by being a legacy.  But when you graduate, you bear the label of a person of merit, whose future accomplishments arise solely from your superior abilities.  You’ve been given a second nature.

Consequences of Naturalized Privilege: The New Aristocracy

The process by which schools naturalize academic merit brings major consequences to the larger society.  The most important of these is that it legitimizes social inequality.  People who were born on third base get credit for hitting a triple, and people who have to start in the batter’s box face the real possibility of striking out.  According to the educational system, divergent social outcomes are the result of differences in individual merit, so, one way or the other, people get what they deserve.  The fact that a fraction of students from the lower classes manage against the odds to prove themselves in school and move up the social scale only adds further credibility to the existence of a real meritocracy.

In the United States in the last 40 years, we have come to see the broader implications of this system of status attainment through institutional merit.  It has created a new kind of aristocracy.  This is not Jefferson’s natural aristocracy, grounded in public accomplishments, but a caste of meritocratic privilege, grounded in the formalized and naturalized merit signaled by educational credentials.  As with aristocracies of old, the new meritocracy is a system of rule by your betters – no longer defined as those who are better born or more accomplished but now as those who are better educated.  Michael Young saw this coming back in 1958, when he predicted it in his fable, The Rise of the Meritocracy.  But now we can see that it has truly taken hold.

The core expertise of this new aristocracy is skill in working the system.  You have to know how to play the game of educational merit-getting and pass this on to your children.  The secret is in knowing that the achievements that get awarded merit points through the process of schooling are not substantive but formal.  Schooling is not about learning the subject matter; it’s about getting good grades, accumulating course credits, and collecting the diploma on the way out the door.  Degrees pay off, not what you learned in school or even the number of years of schooling you have acquired.  What you need to know is what’s going to be on the test and nothing else.  So you need to study strategically and spend a lot of effort working the refs.  Give teacher what she wants and be sure to get on her good side.  Give the college admissions officers the things they are looking for in your application.  Pump up your test scores with coaching and by learning how to game the questions.

Members of the new aristocracy are particularly aggressive about carrying out a strategy known as opportunity hoarding.  There is no academic advantage too trivial to pursue, and the number of advantages you accumulate can never be enough.  In order to get your children into the right selective college you need to send them to the right school, get them into the gifted program in elementary school and the right track in high school, hire a tutor, carry out test prep, do the college tour, pursue prizes, develop a well-rounded resume for the student (sports, student leadership, a musical instrument, service), pull strings as a legacy and a donor, and on and on and on.

Such behavior by upper-middle-class parents is not as crazy as it seems.  The problem with being at the top is that there’s nowhere to go but down.  If you look at studies of intergenerational mobility in the US, families in the top quintile have a big advantage, with more than 40 percent of children ending up in the same quintile as their parents, twice the rate that would occur by chance.  But that still means that nearly 60 percent are going to be downwardly mobile.  The system is just meritocratic enough to keep the most privileged families on edge, worried about having their child bested by a smart poor kid.  As Jerry Karabel puts it in The Chosen, the only thing U.S. education equalizes is anxiety.

As with earlier aristocracies, the new aristocrats of merit cluster together in the same communities, where the schools are like no other.  Their children attend the same elite colleges, where they meet their future mates (a pattern sociologists call assortative mating) and then transmit their combined cultural, social, and economic capital in concentrated form to their own children.  And one consequence of this increased concentration of educational resources is that the achievement gap between low- and high-income students has been rising; Sean Reardon’s study shows the gap growing 40 percent in the last quarter of the twentieth century.  This is how educational and social inequality grows larger over time.

 

By democratizing, formalizing, and naturalizing merit, schools have played a central role in defining the character of modern society.  In the process they have served to increase social opportunity while also increasing social inequality.  At the same time, they have established a solid educational basis for the legitimacy of this new inequality, and they have fostered the development of a new aristocracy of educational merit whose economic power, social privilege, and cultural cohesion would be the envy of the high nobility in early modern England or France.  Now, as then, the aristocracy assumes its outsized social role as a matter of natural right.

 

Posted in Educational Research, Higher Education, Scholarship

We’re Producing Academic Technicians and Justice Warriors: Sermon on Educational Research, Pt. 2

This is a followup to the “Sermon on Educational Research” that I posted last week.  It’s a reflection on two dysfunctional orientations toward scholarship that students often pick up in the course of doctoral study.

We’re Producing Academic Technicians and Justice Warriors:

A Sermon on Educational Research, Part 2

David F. Labaree

Published in International Journal of the Historiography of Education, 1-2019

Download here

            In 2012, I wrote a paper for this journal titled “A Sermon on Educational Research.”  It offered advice to doctoral students in education about how to approach their work as emergent scholars in the field.  The key bits of advice were:  be wrong; be lazy; be irrelevant.  The idea was to immunize scholars against some of the chronic syndromes in educational scholarship – trying to be right instead of interesting, trying to be diligent instead of strategic, and trying to focus on issues arising from professional utility instead of from intellectual interest.  Needless to say, the advice failed to take hold.  The engine of educational research production has continued to plow ahead in pursuit of validity, diligence, and relevance.

So here I am, giving it another try.  This time I take aim at two kinds of practices among doctoral students in education that are particularly prominent right now and also particularly problematic for the future health of the field.  Most students don’t fit in either category, but the very existence of these practices threatens to pollute the pool.  One practice is the effort to become a hardcore academic technician; the other is the effort to become a hardcore justice warrior.  Though at one level they represent opposite orientations toward research, at another level they have in common the urge to serve as social engineers intent on fixing social problems.  The antidotes to these two tendencies, I suggest, are to take a dose of humility every day and to approach educational research as a form of play.  Let’s play with ideas instead of being hell-bent on tinkering with the social machinery.

The Academic Technician

One role that education doctoral students adopt is academic technician.  In practice, this means concentrating on learning the craft of a particular domain of educational research.  Not that there’s anything wrong with craft.  Without it, we wouldn’t be a profession at all but just a bunch of amateurs.  The problem comes from learning the craft too well.  That means apprenticing yourself to an expert in your subfield and adopting all the practices and perspectives that this expert represents.

One flaw with this approach is that it treats educational research as a field whose primary problems are technical.  It’s all about immersing yourself in cutting-edge research methodologies and diligently applying these to whatever data meets their assumptions.  Often the result is scholarship that is technically expert and substantively deficient.  Your aim is to be able to defend the validity of your findings more than their significance, since at colloquia where this kind of work is presented the arguments are mostly focused on whether the methodology warrants the modest claims made by the author.

The focus on acquiring technical skills diverts students from engaging with the big issues in the field of education, which are primarily normative.  Education is an effort to form children into the kinds of adults we want them to be.  So the central issues in education revolve around the ends we want to accomplish and the values we hold dear.  The key conflicts are about purpose rather than practice.  Technical skills are not sufficient to explore these issues, and by concentrating too much on acquiring these purportedly hard skills we turn our attention away from the normative concerns that by comparison seem awfully squishy.

Another problem that arises from the effort to become academic technicians is that it turns students into terrible writers.  You populate your text with jargon and other forms of academic shorthand because you are speaking to an audience so small it could fit in a single seminar room.  You’re trying to do science, so you model your writing after the lifeless language of the journeyman scientific journal article.  This means using passive voice, abandoning the first person (“data were gathered” – who did that?), avoiding action verbs, loading the text with nominalizations (never use a verb when you can turn it into a noun), and at all costs refusing to tell an engaging story.  If you make an effort to draw the reader’s interest, it’s considered unprofessional.  In this style of writing, papers are built by the numbers.  Using the IMRaD formula, papers need to consist of Introduction, Methods, Results, and Discussion.  You can read them and write them in any order.  Every paper is just an exercise in filling each of these categories with new content.  It’s plug and play all the way.

The Justice Warrior

Another scholarly role that doctoral students adopt is justice warrior.  If the first role ranks means over ends, this one canonizes ends and dispenses with means altogether.  All that matters is your frequently expressed commitment to particular values of social justice.  You can’t express these values too often or too vehemently, since the mission is all important and the enemies resisting the mission are legion.  As a result, your position is perpetually atop the high horse of righteous indignation.  The primary targets of your scholarship are sexism, racism, and colonialism, with social class coming in a distant fourth.

If people seek to question your position because of a putative failure to construct a compelling argument or to validate your claims with clear evidence and rigorous methods, they are only demonstrating that they are on the wrong side.  It’s OK to dismiss any text or argument whose author might be accused of betraying a tinge of sexism, racism, or colonialism.  Everything that follows is fruit of the poisonous tree.  You can say something like, “Once I saw he was using the male pronoun, I couldn’t continue reading.”  Nothing worthwhile comes from someone you deem a bad person.  This simplifies your life as an emergent scholar, since you can ignore most of the literature.  It also means you seek out like-minded souls to serve on your committee and like-minded journals in which to place your papers.

This approach incorporates a distinctive stance toward intellectual life.  The academic technician restricts intellectual interest to the methods within a small subfield at the expense of engaging with interesting ideas.  But the justice warrior on principle adopts a position that is whole-heartedly anti-intellectual.  You need to shun most ideas because they bear the taint of their sinful origins.  Maintaining ideological purity is the key focus of your academic life.  The world is black and white and only sinners see shades of gray.

For justice warriors, every class, colloquium, meeting, and paper is an opportunity to signal your virtue.  This has the effect of stifling the conversation, since it’s hard for anyone to come back with a critical comment without looking sexist, racist, or colonialist.  Once you establish the high ideological ground, it’s easy to defend your position without having to draw on data, methods, or logic.  Being right brings reliable rewards.

Some Common Ground: Becoming Social Engineers

These two tendencies within the educational research community appear to be opposites, but in one way they are quite similar.  Both call for scholars in the field to become engineers whose job is to fix social problems.  For the academic technicians, this means a focus on creating data-driven policies for school reform, where the aim is to bring school and society in line with the findings of rigorous research.  Research says do this, so let it be so.  For justice warriors, this means a focus on bringing school and society in line with your own personal sense of what is righteous.  Both say:  I know best, so get out of my way.

The problem with this, of course, is that exercises in social engineering so often go very badly, no matter how much they are validated by science or confirmed by belief.  Think communism, fascism, the inquisition, eugenics, the penitentiary, and the school.  Rationalized scientific knowledge can be a destructive tool for tinkering with the emergent, organic ecology of social life.  And moral absolutism can easily poison the soil.

A Couple Antidotes

One antidote to the dual diseases of academic technicalism and justice fundamentalism is a dose of humility.  What most social engineers have in common is a failure to consider the possibility that they might be wrong.  Maybe I don’t know enough.  Maybe my methods don’t apply in this setting.  Maybe my theories are flawed.  Maybe my values are not universal.  Maybe my beliefs are mistaken.  Maybe my morals are themselves tainted by inconsistency.  Maybe it’s not just a case of technique; maybe ends matter.  Maybe it’s not just a case of values; maybe means matter.

Adopting humility doesn’t mean that you need to be tentative in your assertions or diffident in your willingness to enter the conversation.  Often you need to overstate your point in order to get attention and push people to engage with you.  But it does mean that you need to be willing to reconsider your argument based on evidence and arguments that you encounter through engagement with other scholars – or with your own data.  This reminds me of a colleague who used to ask faculty candidates the same question:  “Tell me about a time when your research forced you to give up an idea that you really wanted to hold on to.”  If your own research isn’t capable of changing your mind, then it seems you’re not really examining data but simply confirming belief.

Another way to counter these two baleful tendencies in the field is to approach research as a form of intellectual play.  This means playing with ideas instead of engineering improvement, instead of pursuing methodological perfection, instead of pursuing ideological perfection.  Play lets you try things out without fear of being technically or ideologically wrong.  It keeps you from taking yourself too seriously, always a risk for academics at all levels.  Play will keep you from adopting the social engineering stance that assumes you know better than everyone else.  This doesn’t mean abandoning your commitments to rigor and values.  Your values will continue to shape what you play with, serving to make the stories you tell with your research meaningful and worthwhile.  Your technique will continue to be needed to make the stories you tell credible.

Playing with ideas is fundamental to the ways that universities work.  Ideally, universities provide a zone of autonomy for faculty that allows them to explore the intellectual terrain, unfettered by concerns about what’s politically correct, socially useful, or potentially ridiculous.  This freedom is more than a license to be frivolous, though it’s tolerant of such behavior.  Its value comes from the way it opens up possibilities that more planful programs of research might miss.  It allows you to think the unthinkable and pursue the longshot.  Maybe most such efforts come to naught, but that’s an acceptable cost if a few fall on fertile soil and grow into insights of great significance.  Play is messy but it’s highly functional.

In closing, let me note for the record that most doctoral students in education don’t fall into either of the two categories of scholarly malpractice that I identify here.  Most are neither academic technicians nor justice warriors.  Most manage to negotiate a position that avoids either of these polar tendencies.  That’s the good news, which bodes well for the future of our field.  The bad news, however, is that this often leaves them feeling as though they have fallen between two stools.  Compared to the academic technicians they seem unprofessional, and compared to the justice warriors they seem immoral.  That’s a position that threatens their ability to function as the kind of educational scholars we need.

Posted in Higher Education, History, Systems of Schooling

An Unlikely Triumph: How US Higher Education Went from Rags in the 19th Century to Riches in the 20th

This is a piece I published in Aeon in October 2017.  It provides an overview of my book that came out that year, “A Perfect Mess: The Unlikely Ascendancy of American Higher Education” (University of Chicago Press).

From the perspective of 19th-century visitors to the United States, the country’s system of higher education was a joke. It wasn’t even a system, just a random assortment of institutions claiming to be colleges that were scattered around the countryside. Underfunded, academically underwhelming, located in small towns along the frontier, and lacking in compelling social function, the system seemed destined for obscurity. But by the second half of the 20th century, it had assumed a dominant position in the world market in higher education. Compared with peer institutions in other countries, it came to accumulate greater wealth, produce more scholarship, win more Nobel prizes, and attract a larger proportion of talented students and faculty. US universities dominate global rankings.

How did this remarkable transformation come about? The characteristics of the system that seemed to be disadvantages in the 19th century turned out to be advantages in the 20th. Its modest state funding, dependence on students, populist aura, and obsession with football gave it a degree of autonomy that has allowed it to stand astride the academic world.

The system emerged under trying circumstances early in US history, when the state was weak, the market strong, and the church divided. Lacking the strong support of church and state, which had fostered the growth of the first universities in medieval Europe, the first US colleges had to rely largely on support from local elites and tuition-paying student consumers. They came into being with the grant of a corporate charter from state government, but this only authorised these institutions. It didn’t fund them.

The rationale for starting a college in the 19th century usually had less to do with promoting higher learning than with pursuing profit. For most of US history, the primary source of wealth was land, but in a country with a lot more land than buyers, the challenge for speculators was how to convince people to buy their land rather than one of the many other available options. (George Washington, for instance, accumulated some 50,000 acres in the western territories, and spent much of his life unsuccessfully trying to monetise his holdings.) The situation became even more desperate in the mid-19th century, when the federal government started giving away land to homesteaders. One answer to this problem was to show that the land was not just another plot in a dusty agricultural village but prime real estate in an emerging cultural centre. And nothing said culture like a college. Speculators would ‘donate’ land for a college, gain a state charter, and then sell the land around it at a premium, much like developers today who build a golf course and then charge a high price for the houses that front on to it.

Of course, chartering a college is not the same as actually creating a functioning institution. So speculators typically sought to affiliate their emergent college with a religious denomination, which offered several advantages. One was that it segmented the market. A Presbyterian college would be more attractive to Presbyterian consumers than the Methodist college in the next town. Another was staffing. Until the late-19th century, nearly all presidents and most faculty at US colleges were clergymen, who were particularly attractive to college founders for two reasons. They were reasonably well-educated, and they were willing to work cheap. A third advantage was that the church just might be induced to contribute a little money from time to time to support its struggling offspring.

Often the motives of profit and faith converged in the same person, producing a distinctive American character – the clergyman-speculator. J B Grinnell was a Congregational minister who left the church he founded in Washington, DC, to establish a town out west as a speculative investment. In 1854 he settled on a location in Iowa, named the town Grinnell, gained a charter for a college, and started selling land for $1.62 an acre. Instead of organising a college from scratch, he convinced Iowa College to move from Davenport and assume the name Grinnell College.

This process of college development helps to explain a lot of things about the emergent form of the US higher-education system in the 19th century. Less than a quarter of the colleges were in the strip of land along the eastern seaboard where most Americans lived. More than half were in the Midwest and Southwest: the sparsely populated frontier. If your aim was to attract a lot of students, this was not a great business plan, but it was useful in attracting settlers. The frontier location also helps to explain the nominal church support for the colleges. In the competitive US setting where no church was dominant, it was each denomination for itself, so everyone wanted to plant the denominational flag in the new territories for fear of ceding the terrain to the opposition. Together, land speculation and sectarian competition help to explain why, by 1880, Ohio had 37 colleges – and France just 16.

The sheer number of such college foundings was remarkable. In 1790, at the start of the first decade of the new republic, the US already had 19 institutions called colleges or universities. The numbers grew gradually in the first three decades, rising to 50 by 1830, and then started accelerating. By the 1850s they had reached 250, doubling again in the following decade (563), and in 1880 totalled 811. The growth in colleges vastly exceeded the growth in population, with a total of five colleges per million people in 1790, rising to 16 per million in 1880. In that year, the US had five times as many colleges as the entire continent of Europe. This was the most overbuilt system of higher education the world had ever seen.

Of course, as European visitors liked to point out, it was a stretch to call most of these colleges institutions of higher learning. For starters, they were small. In 1880, the average college boasted 131 students and 10 faculty members, granting only 17 degrees a year. Most were located far from centres of culture and refinement. Faculty were preachers rather than scholars, and students were whoever was willing to pay tuition for a degree whose market value was questionable. Most graduates joined the clergy or other professions that were readily accessible without a college degree.

On the east coast, a small number of colleges – Harvard, Yale, Princeton, William and Mary – drew students from families of wealth and power, and served as training grounds for future leaders. But closer to the frontier, there were no established elites for colleges to bond with, and they offered little in the way of social distinction. The fact that every other town had its own college led to intense competition for students, which meant that tuition charges remained low. This left colleges to operate on a shoestring, making do with poor facilities, low pay, struggles to attract and retain students and faculty, and continual rounds of fundraising. And it meant that students were more middle- than upper-class, there for the experience rather than the learning; the most serious students were those on scholarship.

Another sign of the lowly status of these 19th-century colleges is that they were difficult to distinguish from the variety of high schools and academies that were also in abundance across the US landscape. For students, it was often a choice of going to high school or to college, rather than seeing one as the feeder institution for the other. As a result, the age range of students attending high schools and colleges was substantially the same.

By the middle of the century, a variety of new forms of public colleges arose in addition to the independent institutions that today we call private. States started establishing their own colleges and universities, for much the same reasons as churches and towns did: competition (if the state next door had a college, you needed one too) and land speculation (local boosters pushed legislatures to grant them this plum). In addition, there were the colleges that arose from federal land grants and came to focus on more practical rather than classical education, such as engineering and agriculture. Finally came the normal schools, which focused on preparing teachers for the growing public school system. Unlike the privates, these newer institutions operated under public control, but that did not mean they had a steady flow of public funding. They didn’t start getting annual appropriations until the start of the 20th century. As a result, like the privates, they had to rely on student tuition and donations in order to survive, and they had to compete for students and faculty in the larger market already established by their private predecessors.

By 1880, the US system of higher education was extraordinarily large and spatially dispersed, with decentralised governance and a remarkable degree of institutional complexity. This system had established a distinctive structure early in the century, and then elaborated on it over the succeeding decades. It might seem strange to call the motley collection of some 800 colleges and universities a system at all. ‘System’ implies a plan and a form of governance that keeps things working according to the plan, and that indeed is the formal structure of higher-education systems in most other countries, where a government ministry oversees the system and tinkers with it over time. But not in the US.

The system of higher education in the US did not arise from a plan, and no agency governs it. It just happened. But it is nonetheless a system, which has a well-defined structure and a clear set of rules that guides the actions of the individuals and institutions within it. In this sense, it is less like a political system guided by a constitution than a market-based economic system arising from an accumulation of individual choices. Think urban sprawl rather than planned community. Its history is not a deliberate construction but an evolutionary process. Market systems just happen, but that doesn’t keep us from understanding how this one came about and how it works.

People did try to impose some kind of logical form and function on to the system. All US presidents until Andrew Jackson argued for the need to establish a national university, which would have set a high standard for the system, but this effort failed because of the widespread fear of a strong central government. And a number of actors tried to impose their own vision of what the purpose of the system should be. In 1828, the Yale faculty issued a report strongly supporting the traditional classical curriculum (focused on Latin, Greek and religion); in the 1850s, Francis Wayland at Brown argued for a focus on science; and the Morrill Land-Grant Act of 1862 called for colleges that would ‘teach such branches of learning as are related to agriculture and the mechanic arts … in order to promote the liberal and practical education of the industrial classes in the several pursuits and professions in life’. These visions provided support for a wide array of alternative college missions within a diversified system that was wed to none of them.

The weaknesses of the college system were glaringly obvious. Most of the colleges were not created to promote higher learning, and the level of learning they did foster was modest indeed. They had a rudimentary infrastructure and no reliable stream of funding. They were too many in number for any of them to gain distinction, and there was no central mechanism for elevating some of them above others. Unlike Europe, the US had no universities with the imprimatur of the national government or the established church, just a collection of marginal public and private institutions located on the periphery of civilisation. What a mess.

Take Middlebury College, a Congregational institution founded in 1800, which has now become one of the premier liberal arts colleges in the country, considered one of the ‘little Ivies’. But in 1840, when its new president arrived on campus (a Presbyterian minister named Benjamin Labaree, my grandfather’s grandfather), he found an institution that was struggling to survive, and in his 25-year tenure as president this situation did not change much for the better. In letters to the board of trustees, he detailed a list of woes that afflicted the small college president of his era. Hired for a salary of $1,200 a year (roughly $32,000 today), he found that the trustees could not afford to pay it. So he immediately set out to raise money for the college, the first of eight fundraising campaigns that he engaged in, making a $1,000 contribution of his own and soliciting gifts from the small faculty.

Money worries are the biggest theme in Labaree Snr’s letters (struggling to recruit and pay faculty, mortgaging his house to make up for his own unpaid salary, and perpetually seeking donations), but he also complained about the inevitable problems that come from trying to offer a full college curriculum with a small number of underqualified professors:

I accepted the Presidency of Middlebury College, Gentlemen, with a full understanding that your Faculty was small and that in consequence a large amount of instruction would devolve upon the President – that I should be desired to promote the financial interests of the Institution, as convenience and the duties of instruction would permit, was naturally to be expected, but I could not have anticipated that the task of relieving the College from pecuniary embarrassment, and the labor and responsibility of procuring funds for endowment for books, for buildings etc, etc would devolve on me. Could I have foreseen what you would demand of me, I should never have engaged in your service.

At one place in the correspondence, Labaree Snr listed the courses he had to teach as president: ‘Intellectual and Moral Philosophy, Political Economy, International Law, Evidences of Christianity, History of Civilization, and Butler’s Analogy’. US college professors could not afford to have narrow expertise.

In short, the US college system in the mid-19th century was all promise and no product. Nonetheless, it turns out that the promise was extraordinary. One hidden strength was that the system contained nearly all the elements needed to respond to a future rapid expansion of student demand and burgeoning enrolments. It had the necessary physical infrastructure: land, classrooms, libraries, faculty offices, administration buildings, and the rest. And this physical presence was not concentrated in a few population centres but scattered across the landmass of a continental country. It had faculty and administration already in place, with programmes of study, course offerings, and charters granting colleges the ability to award degrees. It had an established governance structure and a process for maintaining multiple streams of revenue to support the enterprise, as well as an established base of support in the local community and in the broader religious denomination. The main thing the system lacked was students.

Another source of strength was that this disparate collection of largely undistinguished colleges and universities had succeeded in surviving a Darwinian process of natural selection in a fiercely competitive environment. As market-based institutions that had never enjoyed the luxury of guaranteed appropriations (this was true for public as well as private colleges), colleges survived by hustling for dollars from prospective donors and marketing themselves to prospective students who could pay tuition. They had to be adept at meeting the demands of the key constituencies in their individual markets. In particular, they had to be sensitive to what prospective students were seeking in a college experience, since they were paying a major part of the bills. And colleges also had a strong incentive to build longstanding ties with their graduates, who would become a prime source for new students and for donations.

In addition, the structure of the college – with a lay board, strong president, geographical isolation, and stand-alone finances – made it a remarkably adaptable institution. These colleges could make changes without seeking permission from the education minister or the bishop. Presidents were the CEOs of the enterprise, and their clear mission was to maintain the viability of the college and expand its prospects. They had to make the most of the advantages offered to them by geography and religious affiliation, and to adapt quickly to shifts in position relative to competitors concerning such key institutional matters as programme, price and prestige. The alternative was to go out of business. Between 1800 and 1850, 40 liberal arts colleges closed, 17 per cent of the total.

Successful colleges were also deeply rooted in isolated towns across the country. They represented themselves as institutions that educated local leaders and served as cultural centres for their communities. The college name was usually the town’s name. The colleges that survived the mid-19th century were well-poised to take advantage of the coming surge of student interest, new sources of funding, and new rationales for attending college.

US colleges retained a populist aura. Because they were located in small towns all across the country and forced to compete with peers in the same situation, they became more concerned about survival than academic standards. As a result, the US system took on a character that was middle-class rather than upper-class. Poor families did not send their children to college, but ordinary middle-class families could. Admission was easy, the academic challenge moderate, the tuition manageable. This created a broad popular foundation for the college that saved it, for the most part, from Oxbridge-style elitism. The college was an extension of the community and denomination, a familiar local presence, a source of civic pride, and a cultural avatar representing the town to the world. Citizens did not have to have a family member connected with the school to feel that the college was theirs. This kind of populist base of support came to be enormously important when higher education enrolments started to skyrocket.

One final characteristic of the US model of higher education was its practicality. As it developed in the mid-19th century, the higher-education system incorporated this practical orientation into the structure and function of the standard-model college. The land-grant college was both an effect and a cause of the cultural preference for usefulness. The focus on the useful arts was written into the DNA of these institutions, as an expression of the US effort to turn a college for gentlemen or intellectuals into a school for practical pursuits, with an emphasis on making things and making a living, rather than on gaining social polish or exploring the cultural heights. And this model spread widely to the other parts of the system. The result was not just the inclusion of subjects such as engineering and applied science into the curriculum but also the orientation of the college itself as a problem-solver for the businessmen and policymakers. The message was: ‘This is your college, working for you.’

All of this was quite popular with consumers, but it didn’t make US colleges centres of intellectual achievement and renown. That, however, began to change in the 1880s, when the German research university burst on to the US educational scene. In this emerging model, the university was a place that produced cutting-edge scientific research, and provided graduate-level training for the intellectual elite. The new research model gave the institutionally overbuilt and academically undistinguished US system of higher education an infusion of scholarly credibility, which had been so clearly lacking. For the first time, the system could begin to make the claim of being the locus of learning at the highest level. At the same time, colleges received a large influx of enrolments, which remedied another problem with the old model – the chronic shortage of students.

But the US did not adopt the German model wholesale. Instead, the model was adapted to US needs. The research university was an add-on, not a transformation. The German university was an elitist institution, focused primarily on graduate instruction and high-level research, which were possible only with a strong and steady flow of state support. Since such funding was not forthcoming in the US, graduate education and scholarly research could exist only at a modest level and only if grafted on to the hardy stock of the US undergraduate college. It needed the financial support that comes from a large number of undergraduate students, who paid tuition and drew per-capita appropriations for state institutions. It also needed the political support and social legitimacy that came from the populism and practicality of the existing US college. High-level graduate learning depended on an undergraduate experience that was broadly accessible and not too demanding intellectually. In short, it needed students. And in the 20th century, the students arrived.

By then, the US higher-education system was in a strong position to capitalise on the capacities it had built during its competitive struggle for survival in the preceding years. Compared with the much older and more distinguished European institutions, it enjoyed a broad base of public support as a populist enterprise that offered a lot of practical benefits. It felt like our institution rather than theirs. To survive, the system had to go out of its way to make students happy, which meant providing a rich array of social entertainments – including fraternities, sororities and, of course, football – and an academic programme that was not overly challenging. The idea was to get students so enmeshed in the institution that they came to identify with it – which helped to ensure that later in life they would continue to wear the school colours, return for reunions, enrol their own children, and make generous donations.

One way you see this populist quality today is in the language people use. Americans tend to employ the labels college and university interchangeably. Elsewhere in the world, however, ‘university’ refers to the highest levels of postsecondary education, which offer bachelor’s and graduate degrees, while ‘college’ refers to something more like what Americans would call a community college, offering associate degrees and vocational training. So when Brits or Canadians say: ‘I’m going to university,’ it carries an elitist edge. But for Americans, the term university is considered a bit prissy and pretentious. They tend to prefer saying: ‘I’m going to college,’ whether that institution is Harvard or the local trade school. This is quite misleading, since US higher education is extraordinarily stratified, with the benefits varying radically according to the status of the institution. But it is also characteristically populist, an assertion that college is accessible to nearly anyone.

Coming into the 20th century, another advantage enjoyed by the system was that US colleges and universities tended to enjoy a relatively high degree of autonomy. This was most obvious in the case of the private not-for-profit institutions that still account for the majority of US higher-education institutions. A lay board owns the institution and appoints the president, who serves as CEO, sets the budget, and administers faculty and staff. Private universities now receive a lot of government money, especially for research grants and student loans and scholarships, but they have broad discretion over tuition, pay, curriculum and organisation. This allows the university to adapt quickly to changing market conditions, respond to funding opportunities, develop new programmes, and open research centres.

Public universities are subject to governance from the state, which provides appropriations in support of core functions and also shapes policy. This limits flexibility about issues such as budget, tuition and pay. But state funding covers only a portion of total expenses, with the share declining as you go up the institutional status ladder. Flagship public research universities in the US often receive less than 20 per cent of their budget from the state; for the University of Virginia, the portion is below 5 per cent. Regional state universities receive around half of their funds from the state. So public institutions need to supplement their funds using the same methods as private institutions – with student tuition, research grants, fees for services, and donations. And this gives them considerable latitude in following the lead of the privates in adapting to the market and pursuing opportunities. Public research universities have the greatest autonomy from state control. And the public universities that have long topped the rankings – the University of California and the University of Michigan – have their autonomy guaranteed in the state constitution.

It turns out that autonomy is enormously important for a healthy and dynamic system of higher education. Universities operate best as emergent institutions, in which initiative bubbles up from below – as faculty pursue research opportunities, departments develop programmes, and administrators start institutes and centres to take advantage of possibilities in the environment. Central planning by state ministries of higher education seeks to move universities toward government goals, but this kind of top-down policymaking tends to stifle the entrepreneurial activities of the faculty and administrators who are most knowledgeable about the field and most in tune with market demand. You can quantify the impact that autonomy from the state has on university quality. The economist Caroline Hoxby at Stanford and colleagues did a study that compared the global rankings of universities with the proportion of university funding that comes from the state (using the ranks computed by Shanghai Jiao Tong University). They found that when the proportion of the budget from state funds rises by one percentage point, the university falls three ranks. Conversely, when the proportion of the budget from competitive grants rises by one percentage point, the university goes up six ranks.

In the 19th century, weak support from church and state forced US colleges to develop into an emergent system of higher education that was lean, adaptable, autonomous, consumer-sensitive, partially self-supporting, and radically decentralised. These humble beginnings provided the system with the core characteristics that helped it to become the leading system in the world. This undistinguished group of colleges came to top world rankings. By the 21st century, US universities accounted for 52 of the top 100 universities in the world, and 16 of the top 20. Half of the Nobel laureates in the 21st century were scholars at US institutions. At the same time, the system’s hand-to-mouth finances turned into extraordinary wealth. The university in the US with the largest endowment is Harvard, at $35 billion; the largest in Europe is Cambridge, at $8 billion. The largest endowment on the continent is held by a brand-new institution, Central European University in Budapest with $900 million, thanks to a donation from George Soros. This would place CEU in the 103rd position in the US, behind Brandeis University.

Rags to riches indeed. No longer a joke, the US system of higher education has become the envy of the world. Unfortunately, however, since it’s a system that emerged without a plan, there’s no model for others to imitate. It’s an accident that arose under unique circumstances: when the state was weak, the market strong, and the church divided; when there was too much land and not enough buyers; and when academic standards were low. Good luck trying to replicate that pattern anywhere in the 21st century.

Posted in Credentialing, Higher Education, Meritocracy

Michael Lewis: Don’t Eat Fortune’s Cookie

In the last year or so, I’ve been reading and writing about the American meritocracy, and I’m going to be posting some of these pieces here from time to time.  But today I want to post a wonderful statement on the subject by Michael Lewis, which I somehow had missed when it first came out.  It’s his baccalaureate address at Princeton in 2012, “Don’t Eat Fortune’s Cookie.”  The theme for the new Princeton grads is simple and powerful:  You shouldn’t assume you deserve to be where you are today.

Princeton University’s 2012 Baccalaureate Remarks

Don’t Eat Fortune’s Cookie
Michael Lewis
June 3, 2012 — As Prepared

(NOTE: The video of Lewis’ speech as delivered is available on the Princeton YouTube channel.)

Thank you. President Tilghman. Trustees and Friends. Parents of the Class of 2012. Above all, Members of the Princeton Class of 2012. Give yourself a round of applause. The next time you look around a church and see everyone dressed in black it’ll be awkward to cheer. Enjoy the moment.

Thirty years ago I sat where you sat. I must have listened to some older person share his life experience. But I don’t remember a word of it. I can’t even tell you who spoke. What I do remember, vividly, is graduation. I’m told you’re meant to be excited, perhaps even relieved, and maybe all of you are. I wasn’t. I was totally outraged. Here I’d gone and given them four of the best years of my life and this is how they thanked me for it. By kicking me out.

At that moment I was sure of only one thing: I was of no possible economic value to the outside world. I’d majored in art history, for a start. Even then this was regarded as an act of insanity. I was almost certainly less prepared for the marketplace than most of you. Yet somehow I have wound up rich and famous. Well, sort of. I’m going to explain, briefly, how that happened. I want you to understand just how mysterious careers can be, before you go out and have one yourself.

I graduated from Princeton without ever having published a word of anything, anywhere. I didn’t write for the Prince, or for anyone else. But at Princeton, studying art history, I felt the first twinge of literary ambition. It happened while working on my senior thesis. My adviser was a truly gifted professor, an archaeologist named William Childs. The thesis tried to explain how the Italian sculptor Donatello used Greek and Roman sculpture — which is actually totally beside the point, but I’ve always wanted to tell someone. God knows what Professor Childs actually thought of it, but he helped me to become engrossed. More than engrossed: obsessed. When I handed it in I knew what I wanted to do for the rest of my life: to write senior theses. Or, to put it differently: to write books.

Then I went to my thesis defense. It was just a few yards from here, in McCormick Hall. I listened and waited for Professor Childs to say how well written my thesis was. He didn’t. And so after about 45 minutes I finally said, “So. What did you think of the writing?”

“Put it this way” he said. “Never try to make a living at it.”

And I didn’t — not really. I did what everyone does who has no idea what to do with themselves: I went to graduate school. I wrote at nights, without much effect, mainly because I hadn’t the first clue what I should write about. One night I was invited to a dinner, where I sat next to the wife of a big shot at a giant Wall Street investment bank, called Salomon Brothers. She more or less forced her husband to give me a job. I knew next to nothing about Salomon Brothers. But Salomon Brothers happened to be where Wall Street was being reinvented—into the place we have all come to know and love. When I got there I was assigned, almost arbitrarily, to the very best job in which to observe the growing madness: they turned me into the house expert on derivatives. A year and a half later Salomon Brothers was handing me a check for hundreds of thousands of dollars to give advice about derivatives to professional investors.

Now I had something to write about: Salomon Brothers. Wall Street had become so unhinged that it was paying recent Princeton graduates who knew nothing about money small fortunes to pretend to be experts about money. I’d stumbled into my next senior thesis.

I called up my father. I told him I was going to quit this job that now promised me millions of dollars to write a book for an advance of 40 grand. There was a long pause on the other end of the line. “You might just want to think about that,” he said.

“Why?”

“Stay at Salomon Brothers 10 years, make your fortune, and then write your books,” he said.

I didn’t need to think about it. I knew what intellectual passion felt like — because I’d felt it here, at Princeton — and I wanted to feel it again. I was 26 years old. Had I waited until I was 36, I would never have done it. I would have forgotten the feeling.

The book I wrote was called “Liar’s Poker.”  It sold a million copies. I was 28 years old. I had a career, a little fame, a small fortune and a new life narrative. All of a sudden people were telling me I was born to be a writer. This was absurd. Even I could see there was another, truer narrative, with luck as its theme. What were the odds of being seated at that dinner next to that Salomon Brothers lady? Of landing inside the best Wall Street firm from which to write the story of an age? Of landing in the seat with the best view of the business? Of having parents who didn’t disinherit me but instead sighed and said “do it if you must?” Of having had that sense of must kindled inside me by a professor of art history at Princeton? Of having been let into Princeton in the first place?

This isn’t just false humility. It’s false humility with a point. My case illustrates how success is always rationalized. People really don’t like to hear success explained away as luck — especially successful people. As they age, and succeed, people feel their success was somehow inevitable. They don’t want to acknowledge the role played by accident in their lives. There is a reason for this: the world does not want to acknowledge it either.

I wrote a book about this, called “Moneyball.” It was ostensibly about baseball but was in fact about something else. There are poor teams and rich teams in professional baseball, and they spend radically different sums of money on their players. When I wrote my book the richest team in professional baseball, the New York Yankees, was then spending about $120 million on its 25 players. The poorest team, the Oakland A’s, was spending about $30 million. And yet the Oakland team was winning as many games as the Yankees — and more than all the other richer teams.

This isn’t supposed to happen. In theory, the rich teams should buy the best players and win all the time. But the Oakland team had figured something out: the rich teams didn’t really understand who the best baseball players were. The players were misvalued. And the biggest single reason they were misvalued was that the experts did not pay sufficient attention to the role of luck in baseball success. Players were given credit for things they did that depended on the performance of others: pitchers got paid for winning games, hitters got paid for knocking in runners on base. Players were blamed and credited for events beyond their control — where the balls they hit happened to land on the field, for example.

Forget baseball, forget sports. Here you had these corporate employees, paid millions of dollars a year. They were doing exactly the same job that people in their business had been doing forever, in front of millions of people who evaluated their every move. They had statistics attached to everything they did. And yet they were misvalued — because the wider world was blind to their luck.

This had been going on for a century. Right under all of our noses. And no one noticed — until it paid a poor team so well to notice that they could not afford not to notice. And you have to ask: if a professional athlete paid millions of dollars can be misvalued, who can’t be? If the supposedly pure meritocracy of professional sports can’t distinguish between lucky and good, who can?

The “Moneyball” story has practical implications. If you use better data, you can find better values; there are always market inefficiencies to exploit, and so on. But it has a broader and less practical message: don’t be deceived by life’s outcomes. Life’s outcomes, while not entirely random, have a huge amount of luck baked into them. Above all, recognize that if you have had success, you have also had luck — and with luck comes obligation. You owe a debt, and not just to your Gods. You owe a debt to the unlucky.

I make this point because — along with this speech — it is something that will be easy for you to forget.

I now live in Berkeley, California. A few years ago, just a few blocks from my home, a pair of researchers in the Cal psychology department staged an experiment. They began by grabbing students as lab rats. Then they broke the students into teams, segregated by sex: three men, or three women, per team. Then they put these teams of three into a room, and arbitrarily assigned one of the three to act as leader. Then they gave them some complicated moral problem to solve: say what should be done about academic cheating, or how to regulate drinking on campus.

Exactly 30 minutes into the problem-solving, the researchers interrupted each group. They entered the room bearing a plate of cookies. Four cookies. The team consisted of three people, but there were these four cookies. Every team member obviously got one cookie, but that left a fourth cookie, just sitting there. It should have been awkward. But it wasn’t. With incredible consistency the person arbitrarily appointed leader of the group grabbed the fourth cookie, and ate it. Not only ate it, but ate it with gusto: lips smacking, mouth open, drool at the corners of their mouth. In the end all that was left of the extra cookie were crumbs on the leader’s shirt.

This leader had performed no special task. He had no special virtue. He’d been chosen at random, 30 minutes earlier. His status was nothing but luck. But it still left him with the sense that the cookie should be his.

This experiment helps to explain Wall Street bonuses and CEO pay, and I’m sure lots of other human behavior. But it also is relevant to new graduates of Princeton University. In a general sort of way, you have been appointed the leader of the group. Your appointment may not be entirely arbitrary. But you must sense its arbitrary aspect: you are the lucky few. Lucky in your parents, lucky in your country, lucky that a place like Princeton exists that can take in lucky people, introduce them to other lucky people, and increase their chances of becoming even luckier. Lucky that you live in the richest society the world has ever seen, in a time when no one actually expects you to sacrifice your interests to anything.

All of you have been faced with the extra cookie. All of you will be faced with many more of them. In time you will find it easy to assume that you deserve the extra cookie. For all I know, you may. But you’ll be happier, and the world will be better off, if you at least pretend that you don’t.

Never forget: In the nation’s service. In the service of all nations.

Thank you.

And good luck.