
How Credentialing Theory Explains the Extraordinary Growth in US Higher Ed in the 19th Century

Today I am posting a piece I wrote in 1995. It was the foreword to a book by David K. Brown, Degrees of Control: A Sociology of Educational Expansion and Occupational Credentialism.  

I have long been interested in credentialing theory, but this is the only place where I ever tried to spell out in detail how the theory works.  For this purpose, I draw on the case of the rapid expansion of the US system of higher education in the 19th century and its transformation at the end of the century, which is the focus of Brown’s book.  Here’s a link to a pdf of the original. 

The case is particularly fruitful for demonstrating the value of credentialing theory, because the most prominent theory of educational development simply can’t make sense of it.  Functionalist theory sees the emergence of educational systems as part of the process of modernization.  As societies become more complex, with a greater division of labor and a shift from manual to mental work, the economy requires workers with higher degrees of verbal and cognitive skills.  Elementary, secondary, and higher education arise over time in response to this need.

The history of education in the U.S., however, poses a real problem for this explanation.  American higher education exploded in the 19th century, to the point that there were some 800 colleges in existence by 1880, more than the total number on the entire continent of Europe.  It was the highest rate of colleges per 100,000 population that the world had ever seen.  The problem is that this increase was not in response to increasing demand from employers for college-educated workers.  While the rate of higher schooling was increasing across the century, the skill demands in the workforce were declining.  The growth of factory production was subdividing forms of skilled work, such as shoemaking, into a series of low-skilled tasks on the assembly line.

This being the case, then, how can we understand the explosion of college founding in the 19th century?  Brown provides a compelling explanation, and I lay out his core arguments in my foreword.  I hope you find it illuminating.

 


Preface

In this book, David Brown tackles an important question that has long puzzled scholars who wanted to understand the central role that education plays in American society: When compared with other Western countries, why did the United States experience such extraordinary growth in higher education? Whereas in most societies higher education has long been seen as a privilege that is granted to a relatively small proportion of the population, in the United States it has increasingly come to be seen as a right of the ordinary citizen. Nor was this rapid increase in accessibility a very recent phenomenon. As Brown notes, between 1870 and 1930, the proportion of college-age persons (18 to 21 years old) who attended institutions of higher education rose from 1.7% to 13.0%. And this was long before the proliferation of regional state universities and community colleges made college attendance a majority experience for American youth.

The range of possible answers to this question is considerable, with each carrying its own distinctive image of the nature of American political and social life. For example, perhaps the rapid growth in the opportunity for higher education was an expression of egalitarian politics and a confirmation of the American Dream; or perhaps it was a political diversion, providing ideological cover for persistent inequality; or perhaps it was merely an accident — an unintended consequence of a struggle for something altogether different. In politically charged terrain such as this, one would prefer to seek guidance from an author who doesn’t ask the reader to march behind an ideological banner toward a preordained conclusion, but who instead rigorously examines the historical data and allows for the possibility of encountering surprises. What the reader wants, I think, is an analysis that is both informed by theory and sensitive to historical nuance.

In this book, Brown provides such an analysis. He approaches the subject from the perspective of historical sociology, and in doing so he manages to maintain an unusually effective balance between historical explanation and sociological theory-building. Unlike many sociologists dealing with history, he never oversimplifies the complexity of historical events in the rush toward premature theoretical closure; and unlike many historians dealing with sociology, he doesn’t merely import existing theories into his historical analysis but rather conceives of the analysis itself as a contribution to theory. His aim is therefore quite ambitious – to spell out a theoretical explanation for the spectacular growth and peculiar structure of American higher education, and to ground this explanation in an analysis of the role of college credentials in American life.

Traditional explanations do not hold up very well when examined closely. Structural-functionalist theory argues that an expanding economy created a powerful demand for advanced technical skills (human capital), which only a rapid expansion of higher education could fill. But Brown notes that during this expansion most students pursued programs not in vocational-technical areas but in liberal arts, meaning that the forms of knowledge they were acquiring were rather remote from the economically productive skills supposedly demanded by employers. Social reproduction theory sees the university as a mechanism that emerged to protect the privilege of the upper-middle class behind a wall of cultural capital, during a time (with the decline of proprietorship) when it became increasingly difficult for economic capital alone to provide such protection. But, while this theory points to a central outcome of college expansion, it fails to explain the historical contingencies and agencies that actually produced this outcome. In fact, both of these theories are essentially functionalist in approach, portraying higher education as arising automatically to fill a social need — within the economy, in the first case, and within the class system, in the second.

However, credentialing theory, as developed most extensively by Randall Collins (1979), helps explain the socially reproductive effect of expanding higher education without denying agency. It conceives of higher education diplomas as a kind of cultural currency that becomes attractive to status groups seeking an advantage in the competition for social positions, and therefore it sees the expansion of higher education as a response to consumer demand rather than functional necessity. Upper classes tend to benefit disproportionately from this educational development, not because of an institutional correspondence principle that preordains such an outcome, but because they are socially and culturally better equipped to gain access to and succeed within the educational market.

This credentialist theory of educational growth is the one that Brown finds most compelling as the basis for his own interpretation. However, when he plunges into a close examination of American higher education, he finds that Collins’ formulation of this theory often does not coincide very well with the historical evidence. One key problem is that Collins does not examine the nature of labor market recruitment, which is critical for credentialist theory, since the pursuit of college credentials only makes sense if employers are rewarding degree holders with desirable jobs. Brown shows that between 1800 and 1880 the number of colleges in the United States grew dramatically (as Collins also asserts), but that enrollments at individual colleges were quite modest. He argues that this binge of institution-creation was driven by a combination of religious and market forces but not (contrary to Collins) by the pursuit of credentials. There simply is no good evidence that a college degree was much in demand by employers during this period. Instead, a great deal of the growth in the number of colleges was the result of the desire by religious and ethnic groups to create their own settings for producing clergy and transmitting culture. In a particularly intriguing analysis, Brown argues that an additional spur to this growth came from markedly less elevated sources — local boosterism and land speculation — as development-oriented towns sought to establish colleges as a mechanism for attracting land buyers and new residents.

Brown’s version of credentialing theory identifies a few central factors that are required in order to facilitate a credential-driven expansion of higher education, and by 1880 several of these were already in place. One such factor is substantial wealth. Higher education is expensive, and expanding it for reasons of individual status attainment rather than for societal necessity is a wasteful use of a nation’s resources; it is only feasible for a very wealthy country. The United States was such a country in the late nineteenth century. A second factor is a broad institutional base. At this point, the United States had the largest number of colleges per million residents that the country has ever seen, before or since. When combined with the small enrollments at each college, this meant that there was a great potential for growth within an already existing institutional framework. This potential was reinforced by a third factor, decentralized control. Colleges were governed by local boards rather than central state authorities, thus encouraging entrepreneurial behavior by college leaders, especially in the intensely competitive market environment they faced.

However, three other essential factors for rapid credential-based growth in higher education were still missing in 1880. For one thing, colleges were not going to be able to attract large numbers of new students, who were after all unlikely to be motivated solely by the love of learning, unless they could offer these students both a pleasant social experience and a practical educational experience — neither of which was the norm at colleges for most of the nineteenth century. Another problem was that colleges could not function as credentialing institutions until they had a monopoly over a particular form of credentials, but in 1880 they were still competing directly with high schools for the same students. Finally, their credentials were not going to have any value on the market unless employers began to demonstrate a distinct preference for hiring college graduates, and such a preference was still not obvious at this stage.

According to Brown, the 1880s saw a major shift in all three of these factors. The trigger for this change was a significant oversupply of institutions relative to existing demand. In this life-or-death situation, colleges desperately sought to increase the pool of potential students. It is no coincidence that this period marked the rapid diffusion of efforts to improve the quality of social life on campuses (from the promotion of athletics to the proliferation of fraternities), and also the shift toward a curriculum with a stronger claim to practicality (emphasizing modern languages and science over Latin and Greek). At the same time, colleges sought to guarantee a flow of students from feeder institutions, which required them to establish a hierarchical relationship with high schools. The end of the century was the period in which colleges began requiring completion of a high school course as a prerequisite for college admission instead of the traditional entrance examination. This system provided high schools with a stable outlet for their graduates and colleges with a predictable flow of reasonably well-prepared students. However, none of this would have been possible if the college degree had not acquired significant exchange value in the labor market. Without this, there would have been only social reasons for attending college, and high schools would have had little incentive to submit to college mandates.

Perhaps Brown’s strongest contribution to credential theory is his subtle and persuasive analysis of the reasoning that led employers to assert a preference for college graduates in the hiring process. Until now, this issue has posed a significant, perhaps fatal, problem for credentialing theory, which has asked the reader to accept two apparently contradictory assertions about credentials. First, the theory claims that a college degree has exchange value but not necessarily use value; that is, it is attractive to the consumer because it can be cashed in on a good job more or less independently of any learning that was acquired along the way. Second, this exchange value depends on the willingness of employers to hire applicants based on credentials alone, without direct knowledge of what these applicants know or what they can do. However, this raises a serious question about the rationality of the employer in this process. After all, why would an employer, who presumably cares about the productivity of future employees, hire people based solely on a college’s certification of competence in the absence of any evidence for that competence?

Brown tackles this issue with a nice measure of historical and sociological insight. He notes that the late nineteenth century saw the growing rationalization of work, which led to the development of large-scale bureaucracies to administer this work within both private corporations and public agencies. One result was the creation of a rapidly growing occupational sector for managerial employees who could function effectively within such a rationalized organizational structure. College graduates seemed to fit the bill for this kind of work. They emerged from the top level of the newly developed hierarchy of educational institutions and therefore seemed like natural candidates for management work in the upper levels of the new administrative hierarchy, which was based not on proprietorship or political office but on apparent skill. And what kinds of skills were called for in this line of work? What the new managerial employees needed was not so much the technical skills posited by human capital theory, he argues, but a general capacity to work effectively in a verbally and cognitively structured organizational environment, and also a capacity to feel comfortable about assuming positions of authority over other people.

These were things that the emerging American college could and indeed did provide. The increasingly corporate social structure of student life on college campuses provided good socialization for bureaucratic work, and the process of gaining access to and graduating from college provided students with an institutionalized confirmation of their social superiority and qualifications for leadership. Note that these capacities were substantive consequences of having attended college, but they were not learned as part of the college’s formal curriculum. That is, the characteristics that qualified college graduates for future bureaucratic employment were a side effect of their pursuit of a college education. In this sense, then, the college credential had a substantive meaning for employers that justified them in using it as a criterion for employment — less for the human capital that college provided than for the social capital that college conferred on graduates. This credential, Brown argues, therefore served an important function in the labor market by reducing the uncertainty that plagued the process of bureaucratic hiring. After all, how else was an employer to gain some assurance that a candidate could do this kind of work? A college degree offered a claim to competence, which had enough substance behind it to be credible even if this substance was largely unrelated to the content of the college curriculum.

By the 1890s all the pieces were in place for a rapid expansion of college enrollments, strongly driven by credentialist pressures. Employers had reason to give preference to college graduates when hiring for management positions. As a result, middle-class families had an increasing incentive to provide their children with privileged access to an advantaged social position by sending them to college. For the students themselves, this extrinsic reward for attending college was reinforced by the intrinsic benefits accruing from an attractive social life on campus. All of this created a very strong demand for expanding college enrollments, and the pre-existing institutional conditions in higher education made it possible for colleges to respond to this demand in an aggressive fashion. There were a thousand independent institutions of higher education, accustomed to playing entrepreneurial roles in a competitive educational market, that were eager to capitalize on the surge of interest in attending college and to adapt themselves to the preferences of these new tuition-paying consumers. The result was a powerful and unrelenting surge of expansion in college enrollments that continued for the next century.

 

Brown provides a persuasive answer to the initial question about why American higher education expanded at such a rapid rate. But at this point the reader may well respond by asking the generic question that one should ask of any analyst, and that is, “So what?” More specifically, in light of the particular claims of this analysis, the question becomes: “What difference does it make that this expansion was spurred primarily by the pursuit of educational credentials?” In my view, at least, the answer to that question is clear. The impact of credentialism on both American society and the American educational system has been profound — profoundly negative. Consider some of the problems it has caused.

One major problem is that credentialism is astonishingly inefficient. Education is the largest single public investment made by most modern societies, and this is justified on the grounds that it provides a critically important contribution to the collective welfare. The public value of education is usually calculated as some combination of two types of benefits: the preparation of capable citizens (the political benefit) and the training of productive workers (the economic benefit). However, the credentialist argument advanced by Brown suggests that these public benefits are not necessarily being met and that the primary beneficiaries are in fact private individuals. From this perspective, higher education (and the educational system more generally) exists largely as a mechanism for providing individuals with a cultural commodity that will give them a substantial competitive advantage in the pursuit of social position. In short, education becomes little more than a vast public subsidy for private ambition.

The practical effect of this subsidy is the production of a glut of graduates. The difficulty posed by this outcome is not that the population becomes overeducated (since such a state is difficult to imagine) but that it becomes overcredentialed, since people are pursuing diplomas less for the knowledge they are thereby acquiring than for the access that the diplomas themselves will provide. The result is a spiral of credential inflation; for as each level of education in turn gradually floods with a crowd of ambitious consumers, individuals have to keep seeking ever higher levels of credentials in order to move a step ahead of the pack. In such a system nobody wins. Consumers have to spend increasing amounts of time and money to gain additional credentials, since the swelling number of credential holders keeps lowering the value of credentials at any given level. Taxpayers find an increasing share of scarce fiscal resources going to support an educational chase with little public benefit. Employers keep raising the entry-level education requirements for particular jobs, but they still find that they have to provide extensive training before employees can carry out their work productively. At all levels, this is an enormously wasteful system, one that rich countries like the United States can increasingly ill afford and that less developed countries, which imitate the U.S. educational model, find positively impoverishing.

A second major problem is that credentialism undercuts learning. In both college and high school, students are all too well aware that their mission is to do whatever it takes to acquire a diploma, which they can then cash in on what really matters — a good job. This has the effect of reifying the formal markers of academic progress — grades, credits, and degrees — and encouraging students to focus their attention on accumulating these badges of merit for the exchange value they offer. But at the same time this means directing attention away from the substance of education, reducing student motivation to learn the knowledge and skills that constitute the core of the educational curriculum. Under such conditions, it is quite rational, even if educationally destructive, for students to seek to acquire their badges of merit at a minimum academic cost, to gain the highest grade with the minimum amount of learning. This perspective is almost perfectly captured by a common student question, one that sends chills down the back of the learning-centered teacher but that makes perfect sense for the credential-oriented student: “Is this going to be on the test?” (Sedlak et al., 1986, p. 182). We have credentialism to thank for the aversion to learning that, to a great extent, lies at the heart of our educational system.

A third problem posed by credentialism is social and political more than educational. According to credentialing theory, the connection between social class and education is neither direct nor automatic, as suggested by social reproduction theory. Instead, the argument goes, market forces mediate between the class position of students and their access to and success within the educational system. That is, there is general competition for admission to institutions of higher education and for levels of achievement within these institutions. Class advantage is no guarantee of success in this competition, since such factors as individual ability, motivation, and luck all play a part in determining the result. Market forces also mediate between educational attainment (the acquisition of credentials) and social attainment (the acquisition of a social position). Some college degrees are worth more in the credentials market than others, and they provide privileged access to higher level positions independent of the class origins of the credential holder.

However, in both of these market competitions, one for acquiring the credential and the other for cashing it in, a higher class position provides a significant competitive edge. The economic, cultural, and social capital that come with higher class standing give the bearer an advantage in getting into college, in doing well at college, and in translating college credentials into desirable social outcomes. The market-based competition that characterizes the acquisition and disposition of educational credentials gives the process a meritocratic set of possibilities, but the influence of class on this competition gives it a socially reproductive set of probabilities as well. The danger is that, as a result, a credential-driven system of education can provide meritocratic cover for socially reproductive outcomes. In the single-minded pursuit of educational credentials, both student consumers and the society that supports them can lose sight of an all-too-predictable pattern of outcomes that is masked by the headlong rush for the academic gold.


Mary Metz: Real School

This blog post is a tribute to the classic paper by Mary Metz, “Real School.”  In it she shows how schools follow a cultural script that demonstrates all of the characteristics we want to see in a school.  The argument, in line with neo-institutional theory (see this example by Meyer and Rowan), is that schools are organized around meeting our cultural expectations for the form that schools should take more than around producing particular outcomes.  Following the script keeps us reassured that the school we are associated with — as a parent, student, teacher, administrator, taxpayer, political leader, etc. — is indeed a real school.  It follows that the less effective a school is at producing desirable social outcomes — high scores, graduation rates, college attendance, future social position — the more closely we want it to follow the script.  It’s a lousy high school but it still has an advanced placement program, a football team, a debate team, and a senior prom.  So it’s a real high school.

Here’s the citation and a link to a PDF of the original article:

Metz, Mary H. (1990). Real school: A universal drama amid disparate experience. In Douglas E. Mitchell & Margaret E. Goertz (Eds.), Education Politics for the New Century (pp. 75-91). New York: Falmer.

And here’s a summary of some of its key points.

Roots of real school: the need for reassurance

  • We’re willing to settle for formal over substantive equity in schooling

  • The system provides formal equivalence across school settings, to reassure everyone that all kids get the same educational opportunity

  • Even though this is obviously not the case — as evidenced by how careful parents are about where they send their kids and where they buy a house

  • What’s at stake is institutional legitimacy

  • Teachers, administrators, parents, citizens all want reassurance that their school is a real school

  • If not, then I’m not a real teacher, a real student, so what are we doing here?

This arises from the need for schools to balance conflicting outcomes within the same institution — schools need to provide both access and advantage, both equality and inequality

  • We want it both ways with our schools: we’re all equal, but I’m better than you

  • Both qualities are important for the social functions and public legitimacy of the social system

  • This means that school, on the face of it, needs to give everyone a fair shot

  • But it also means that school, in practice, needs to sort the winners from the losers

  • And winning only has meaning if it appears to be the result of individual merit

  • But who wants to leave this up to chance for their own children?

  • So parents use every tool they’ve got to game the system and get their children a leg up in the competition

  • And upper-middle-class parents have a lot of such tools — cultural capital, social capital, and economic capital

  • Yet they still need the formal equality of schooling as cover for this quest for advantage

So why is it, as Metz shows, that schools that are least effective in producing student learning are the most diligent in doing real school?

  • Teachers and parents in these schools rarely demand the abandonment of real school — a failed model — in favor of something radically different

  • To the contrary, they demand even closer alignment with the real school model

  • They do so because they need to maintain confidence in the system

  • More successful schools can stray a little farther from the script, because parents are more confident they will produce the right outcomes for their kids

  • Education is a confidence game – in both senses of the word: an effort to maintain confidence and an effort to con the consumer

The magic of school formalism

  • Formalism is central to the system and its effectiveness as a place to provide access and advantage at the same time

  • So you focus on structure and form and process more than on substantive learning

  • Meyer and Rowan’s formalistic definition of a school:

    • “A school is an accredited institution where a certified teacher teaches a sanctioned curriculum to a matriculated student who then receives an authorized diploma.”

  • Students can make progress and graduate even if they’re not learning much

  • It helps that the quality of schooling is less visible than the quantity

Enjoy.



The Chronic Failure of Curriculum Reform

This post is about an issue I’ve wrestled with for years, namely why reforming schools in the U.S. is so difficult.  I eventually wrote a book on the subject, Someone Has to Fail: The Zero-Sum Game of Public Schooling, which was published in 2010.  But you may not need to read it if you look at this piece I did for Education Week back in 1999, which later appeared in a book called Lessons of a Century.  Here’s a link to the original.

Education Week Commentary

The Chronic Failure of Curriculum Reform

By David F. Labaree

May 19, 1999

One thing we have learned from examining the history of curriculum in the 20th century is that curriculum reform has had remarkably little effect on the character of teaching and learning in American classrooms. As the century draws to a close, it seems like a good time to think about why this has been the case.

The failure of curriculum reform was certainly not the result of a lack of effort. At various times during the last 100 years, reformers have: issued high-visibility reports proposing dramatic changes in the curriculum (Cardinal Principles of Secondary Education in 1918, A Nation at Risk in 1983); created whole new subject areas (social studies, vocational education, special education); sought to reorganize the curriculum around a variety of new principles (ability grouping, the project method, life adjustment, back to basics, inclusion, critical thinking); and launched movements to reinvent particular subjects (“New Math,” National Council of Teachers of Mathematics math, phonics, whole language).

In spite of all these reform efforts, the basic character of the curriculum that is practiced in American classrooms is strikingly similar to the form that predominated in the early part of the century. As before, the curriculum continues to revolve around traditional academic subjects–which we cut off from practical everyday knowledge, teach in relative isolation from one another, differentiate by ability, sequence by age, ground in textbooks, and deliver in a teacher-centered classroom. So much effort and so little result.

How can we understand this problem? For starters, we can recognize that curriculum means different things at different levels in the educational system, and that curriculum reform has had the greatest impact at the level most remote from teaching and learning in the classroom. Starting at the top of the system and moving toward the bottom, there is the rhetorical curriculum (ideas put forward by educational leaders, policymakers, and professors about what curriculum should be, as embodied in reports, speeches, and college texts), the formal curriculum (written curriculum policies put in place by school districts and embodied in curriculum guides and textbooks), the curriculum-in-use (the content that teachers actually teach in individual classrooms), and the received curriculum (the content that students actually learn in these classrooms).

Each wave of reform dramatically transforms the rhetorical curriculum, by changing the way educational leaders talk about the subject. This gives the feeling that something is really happening, but most often it’s not. Sometimes the reform moves beyond this stage and begins to shape the formal curriculum, getting translated into district-level curriculum frameworks and the textbooks approved for classroom use. Yet this degree of penetration does not guarantee that reform ideas will have an observable effect on the curriculum-in-use. More often than not, teachers respond to reform rhetoric and local curriculum mandates by making only marginal changes in the way they teach subjects. They may come to talk about their practice using the new reform language, but only rarely do they make dramatic changes in their own curriculum practice. And even the rare cases when teachers bring their teaching in line with curriculum reform do not necessarily produce a substantial change in the received curriculum. What students learn is frequently quite different from what the reformers intended. For as curriculum-reform initiatives trickle down from the top to the bottom of the educational system, their power and coherence dissipate, with the result that student learning is likely to show few signs of the outcomes promoted by the original reform rhetoric. As David B. Tyack and Larry Cuban show in their book Tinkering Toward Utopia, the dominant pattern is one of recurring waves of reform rhetoric combined with glacial change in educational practice.

Why has this pattern persisted for so long? Consider a few enduring characteristics of American education that have undermined the impact of curriculum reform on teaching and learning.

Conflicting Goals: One factor is conflict over the goals of education itself. Different curriculum reforms embody different goals. Some promote democratic equality, by seeking to provide all children with the skills and knowledge they will need to function as competent citizens. Others promote social efficiency, by seeking to provide different groups of children with the specific skills they need in order to be productive in the different kinds of jobs required in a complex economy. Still others promote social mobility, by providing individual students with educational advantages in the competition for the best social positions. One result is that reform efforts over time produce a pendulum swing between alternative conceptions of what children need to learn, leading to a sense that reform is both chronic (“steady work,” as Richard Elmore and Milbrey McLaughlin put it) and cyclical (the here-we-go-again phenomenon). Another result is the compromise structure of the curriculum itself, which embodies contradictory purposes and therefore is unable to accomplish any one of these purposes with any degree of effectiveness (the familiar sense of schools as trying to do too much while accomplishing too little).

Credentialing Over Learning: From the perspective of the social-mobility goal, the point of education is not to learn the curriculum but to accumulate the grades, credits, and degrees that provide an edge in competing for jobs. So when this goal begins to play an increasingly dominant role in shaping education–which has been the case during the 20th century in the United States–curriculum reforms come to focus more on sorting and selecting students and less on enhancing learning, more on form than substance. This turns curriculum into a set of labels for differentiating students rather than a body of knowledge that all children should be expected to master, and it erects a significant barrier to any curriculum reforms that take learning seriously.

A Curriculum That Works: Another factor that undermines efforts to reform the curriculum is the comfortable sense among influential people that the current course of study in schools works reasonably well. Middle- and upper-middle-class families have little reason to complain. After graduation, their children for the most part go on to find attractive jobs and live comfortable lives. Judging from these results, schools must be providing these students with an adequate fund of knowledge and skills, so they have little reason to push for curriculum reform as a top priority. In fact, such changes may pose a threat to the social success of these children by changing the rules of the game–introducing learning criteria that they may not be able to meet (such as through performance testing), or eliminating curriculum options that provide special advantage (such as the gifted program). Meanwhile, families at the lower end of the social-class system, who have less reason to be happy about the social consequences of schooling, are not in a powerful position to push for reform.

Preserving the Curriculum of a Real School: Curriculum reform can spur significant opposition from people at all levels of society if it appears to change one of the fundamental characteristics of what Mary Metz calls “real school.” Since all of us have extensive experience as students in school, we all have a strong sense of what makes up a school curriculum. To a significant extent, this curriculum is made up of the elements I mentioned earlier: academic subjects, which are cut off from practical everyday knowledge, taught in relative isolation from one another, stratified by ability, sequenced by age, grounded in textbooks, and delivered in a teacher-centered classroom. If this is our sense of what curriculum is like in a real school, then we are likely to object to any reforms that make substantial changes in any of these defining elements. This shared cultural understanding of the school curriculum exerts a profoundly conservative influence, by blocking program innovations even if they enhance learning and by providing legitimacy for programs that fit the traditional model even if they deter learning.

Preserving Real Teaching: This conservative view of the curriculum is also frequently shared by teachers. Prospective teachers spend an extended “apprenticeship of observation” (in Dan Lortie’s phrase) as students in the K-12 classroom, during which they observe teaching from the little seats and become imprinted with a detailed picture of what the teacher’s curriculum-in-use looks like. They can’t see the reasons that motivate the teacher’s curriculum choices. All they can see is the process, the routines, the forms. So it is not surprising that they bring to their own teaching a sense of curriculum that is defined by textbooks, disconnected categories of knowledge, and academic exercises. Teacher-preparation programs often try to offset the legacy of this apprenticeship by promoting the latest in curriculum-reform perspectives, but they are up against a massive accumulation of experience and sense impression that works to preserve the traditional curriculum.

Organizational Convenience: The traditional curriculum also persists in the face of curriculum-reform efforts because this curriculum is organizationally convenient for both teachers and administrators. It is convenient to focus on academic subjects, which are aligned with university disciplines, thus simplifying teacher preparation. It is convenient to have a curriculum that is differentiated, which allows teachers to specialize. It is convenient to stratify studies by ability and age, which facilitates classroom management by allowing teachers to teach to the whole class at one level rather than adapt the curriculum to the individual needs of learners. It is convenient to ground teaching in textbooks, which reduce the demands on teacher expertise while also reducing the time commitment required for a teacher to develop her own curriculum materials. And it is convenient to run a teacher-centered classroom, which reinforces the teacher’s control and which also simplifies curriculum planning and student monitoring. Curriculum-reform efforts are hard to sell and even more difficult to sustain if they can only succeed if teachers have special capacities, such as: extraordinary subject-matter expertise; the time, will, and skill required to develop their own curriculum materials; the ability to teach widely divergent students effectively; and the ability to maintain control over these students while allowing them freedom to learn on their own.

Loose Coupling of School Systems: Another factor that undercuts the effectiveness of curriculum reform is the loosely coupled nature of American school systems. School administrators exert a lot of control over such matters as personnel, budgets, schedules, and supplies, but they have remarkably little control over the actual process of instruction. In part, this is because teaching takes place behind closed doors, which means that only individual teachers really know the exact nature of the curriculum-in-use in their own classrooms. But in part, this is because administrators have little power to make teachers toe the line instructionally. Most managers can influence employee performance on the job by manipulating traditional mechanisms of fear and greed: Cross me and you’re fired; do the job the way I want, and I’ll offer you a promotion and a pay increase. School administrators can fire teachers only with the greatest difficulty, and pay levels are based on years of service and graduate credits, not job performance. The result is that teachers have considerably more autonomy in the way they perform their fundamental functions than do most employees. And this autonomy makes it hard for administrators to ensure that the formal curriculum becomes the curriculum-in-use in district classrooms.

Adaptability of the School System: Curriculum reform is also difficult to bring about because of another organizational characteristic of the American educational system: its adaptability. As Philip Cusick has shown, the system has a genius for incorporating curriculum change without fundamental reorganization. This happens in two related ways–formalism and segmentation. One is the way that teachers adopt the language and the feel of a reform effort without altering the basic way they do things. The system is flexible about adopting curriculum forms as long as this doesn’t challenge the basic structure of curriculum practice. The other way is inherent in the segmented structure of the school curriculum. The differentiation of subjects frees schools to adopt new programs and courses by the simple process of addition. They can always tack on another segment in the already fragmented curriculum, because these additions require no fundamental restructuring of programs. For this reason, schools are quite tolerant of programs and courses that have contradictory goals. Live and let live is the motto. By abandoning any commitment to coherence of curriculum and compatibility of purpose, schools are able to incorporate new initiatives without forcing collateral changes. The result is that schools appear open to reform while effectively resisting real change.

Weak Link Between Teaching and Learning: Finally, let me return to the problem that faces any curriculum-reform effort in the last analysis, and that is trying to line up the received curriculum with the curriculum-in-use. The problem we confront here is the irreducible weakness of the link between teaching and learning. Even if teachers, against considerable odds, were to transform the curriculum they use in their classrooms to bring it in line with a reform effort, there is little to reassure us that the students in these classes would learn what the reform curriculum was supposed to convey. Students, after all, are willful actors who learn only what they choose to learn. Teachers can’t make learning happen; they can only create circumstances that are conducive to learning. Students may indeed choose to learn what is taught, they may also choose to learn something quite different, or they may decide to resist learning altogether. And their willingness to cooperate in the learning process is complicated further by the fact that they are present in the classroom under duress. The law says they have to attend school until they are 16 years old; the job market pressures them to stay in school even longer than that. But these forces guarantee only attendance, not engagement in the learning process. So this last crucial step in the chain of curriculum reform may be the most difficult one to accomplish in a reliable and predictable manner, since curriculum reform means nothing unless learning undergoes reform as well.

For all the reasons spelled out here, curriculum-reform movements over the course of the 20th century have produced a lot of activity but not very much real change in the curriculum that teachers use in classrooms or in the learning that students accomplish in these classrooms. But isn’t there reason to think that the situation I have described is now undergoing fundamental change? That real curriculum reform may now be on the horizon?

We currently have a substantial movement to set firm curriculum standards, one that is coming at us from all sides. Presidents Bush and Clinton have pushed in this direction; state departments of education are establishing curriculum frameworks for all the districts under their jurisdiction; and individual subject-matter groups have been working out their own sets of standards. This is something new in American educational history. And combined with the standards movement is a movement for systematic testing of what students know–particularly at the state level, but also at the local and federal levels. If in fact we are moving in the direction of a system in which high-stakes tests determine whether students have learned the material required by curriculum standards, this could bring about a more profound level of curriculum reform than we have ever before experienced. Isn’t that right?

Not necessarily. The move toward standards and testing would affect only one or two elements in the long list of factors that impede curriculum reform. If this movement is successful–which is a big if–it would indeed help tighten the links in a system of education that has long been loosely coupled. It might also have an impact on the problem of student motivation, by convincing at least some students (those who see the potential occupational benefit of education) that they need to study the curriculum in order to graduate and get a good job. But this movement has already run into substantial resistance from religious conservatives and supporters of school choice, and it goes against the grain of the deep-seated American tradition of local control of education. In addition, I don’t see how it would have a serious impact on any of the other factors that have for so long deflected efforts to reform the curriculum. Conflicting goals, the power of credentialing over learning, keeping a system that works, preserving the curriculum of the real school, organizational convenience, and system adaptability–all of these elements would be largely unaffected by the current initiatives for standards and testing.

The history of reform during the 20th century thus leaves us with a sobering conclusion: The American educational system seems likely to continue resisting efforts to transform the curriculum.

David F. Labaree, a professor of teacher education at Michigan State University, is the author of How To Succeed in School Without Really Learning and The Making of an American High School, both published by Yale University Press.

Vol. 18, Issue 36, Pages 42-44

Published in Print: May 19, 1999, as The Chronic Failure of Curriculum Reform

 


Schooling the Meritocracy: How Schools Came to Democratize Merit, Formalize Achievement, and Naturalize Privilege

 

This is a new piece I recently wrote, based on a paper I presented last fall at the ISCHE conference in Berlin.  It’s part of a larger project that focuses on the construction of the American meritocracy, which is to say the new American aristocracy of credentials.

Schooling the Meritocracy:

How Schools Came to Democratize Merit, Formalize Achievement, and Naturalize Privilege

David F. Labaree

 

Merit is much in the news these days.  Controversy swirls around the central role that education plays in establishing who has the most merit and thus who gets the best job.  Parents are suing Harvard for purportedly admitting students based on ethnicity rather than academic achievement.  Federal prosecutors are indicting wealthy parents for trying to bribe their children’s way into the most selective colleges.  At the core of these debates is a concern about fairness.  To what extent does the social structure allow people to get what they deserve, based on individual merit rather than social power and privilege?  There’s nothing new about our obsession with establishing merit.  The ancient Greeks and Romans were as concerned with this issue as we are.  What is new, however, is that all the attention now is focused on schools as the great merit dispensers.

Modern systems of public schooling have transformed the concept of merit.  The premodern form of this quality was what Joseph Kett calls essential merit.  This represented a person’s public accomplishments, which were seen as a measure of character.  Such merit was hard won through good works and had to be defended vigorously, even if that meant engaging in duels.  The new kind of merit, which arose in the mid nineteenth century after the emergence of universal public schooling in the U.S., was what Kett calls institutional merit.  This you earned by attending school and advancing through the levels of academic attainment.  It became your personal property, which could not be challenged by others and which granted you privileges in accordance with the level of merit you acquired.

Here I examine three consequences of this shift from essential to institutional merit in the American setting.  First, this change democratized merit by making it, at least theoretically, accessible to anyone and not just the gentry, who in the premodern period had prime access to this reputational good.  Second, it formalized the idea of merit by turning it from a series of publicly visible and substantive accomplishments into the accumulation of the forms that schooling had to offer – grades, credits, and degrees.  Third, following from the first two, it served the social function of naturalizing the privileges of birth by transposing them into academic accomplishments.  The well born, through the medium of schooling, acquired a second nature that transformed ascribed status into achieved status.

 

Essential Merit

 

From the very start, the country’s Founding Fathers were obsessed with essential merit.  To twenty-first century ears, the way they used the term sounds like what we might call character or honor or reputation.  Individuals enacted this kind of merit through public performances, and it referred not just to achievements in general but especially to those that were considered most admirable for public figures.  This put a premium on taking on roles of public service more than private accomplishment and on contributing to the public good.  Such merit might come from demonstrating courage on the battlefield, sacrificing for the community in a position of public leadership, or achieving scientific or literary eminence.  Think Washington, Jefferson, Madison, Hamilton, and Franklin.  It extended well beyond simple self-aggrandizement, although it often spurred that among its most ardent suitors.  It was grounded in depth of achievement, but it also relied heavily on symbolism to underscore its virtue.

Merit was both an enactment and a display.  The most accomplished practitioner of essential merit in the revolutionary period was George Washington.  From his earliest days he sought to craft the iconic persona that has persisted to the present day.  His copybook in school was filled with 110 rules of civility that should govern public behavior.  He constructed a resume of public service that led inevitably from an officer in the colonial militia, to a representative to the continental congress, to commander in chief of the revolutionary army, and then to president.  A tall man in an era of short men, he would tower over a room of ordinary people, and he liked to demonstrate his physical strength and his prowess as an accomplished horseman.  This was a man with a strong sense of his reputation and of how to maintain it.  And he scored the ultimate triumph of essential merit in his last performance in public life, when he chose to step down from the presidency after two terms and return to Mount Vernon – Cincinnatus laying down his awesome powers and going back to the farm.

This kind of merit is what Jefferson meant when he referred to a “natural aristocracy,” arising in the fertile fields of the new world that were uncorrupted by the inheritance of office.  It represents the kinds of traits that made aristocracy a viable form of governance for so many years:  educating men of privilege to take on positions of public leadership, imbued with noblesse oblige, and armed with the skills to be effective in the role.  Merit was a powerful motivator for the Founding Fathers, a spur to emulation for the benefit of the community, a self-generating dynamic for a hyper-accomplishment.  And it was a key source of their broad legitimacy as public leaders.

But essential merit also had its problems.  Although it left room for self-made men to demonstrate their merit – like Franklin and Hamilton – it was largely open to men of leisure, born into the gentry, supported by a plantation full of slaves, and free to serve the public without having to worry about making a living; think Washington, Jefferson, and Madison.  When politics began the transition in the 1820s from the Federalist to the Jacksonian era, the air of aristocracy fit uncomfortably into the emerging democratic ethos that Tocqueville so expertly captured in Democracy in America.

Another problem was that essential merit brought with it unruly competition.   How much essential merit can crowd into a room before a fight breaks out?  How can everyone be a leader?  What happens if you don’t get the respect you think you earned?  One response, quite common at the time, was to engage in a duel.  If your reputation was maligned by someone and that person refused to retract the slur, then your honor compelled you to defend your reputation with your life.  Alexander Hamilton was but one casualty of this lethal side effect of essential merit.  Benedict Arnold is another case in point.  An accomplished military officer and Washington protégé, Arnold was doing everything right on the battlefield to demonstrate his merit.  But when he sought appointment as a major general, politics blocked his path.  This was a slight too much for him to bear.  Instead of a duel (who would he challenge, his mentor Washington?), he opted for treason, plotting to pass along to the British his command of the fort at West Point.  So the dynamic behind essential merit was a powerful driver for behavior that was both socially functional and socially destructive.

 

The Rise of Institutional Merit

 

By the second quarter of the nineteenth century, a new form of merit was arising in the new republic.  In contrast to the high-flown notion of essential merit, grounded in high accomplishment in public life and defended with your life, the new merit was singularly pedestrian.  It meant grades on a report card at school.  Hardly the stuff of stirring biographies.  These grades were originally labeled as measures of merit and demerit in academic work, recording what you did well and what you did badly.  Ironically, ground zero for this new system was Benedict Arnold’s old fort at West Point, which was now the location of the U.S. Military Academy.  The sum of your merits and demerits constituted your academic worth.  Soon the emerging common school system adopted the same mode of evaluation.

The sheer banality of the new merit offered real advantages.  Unlike its predecessor, it did not signal membership in an exclusive club accessible primarily to the well-born but instead arose from a system that governed an entire population within a school.  As a result, it was well suited to a more democratic political culture.  Also, it provided a stimulus sufficiently strong to promote productive competition among students for academic standing, but these marks on a report card were not really worth fighting over.

So institutional merit emerged as a highly functional construct for meeting the organizational needs of the new systems of public schooling that arose in the middle of nineteenth-century America.  What started out as a mechanism for motivating students in a classroom grew into a model for structuring an entire system of education.  Once the principle of ranking by individual achievement was established, it developed into a way of ranking groups of students within schools and then groups of schools within school systems.  The first innovation, as schools became larger and more heterogeneous in both age and ability, was to organize groups of students into homogeneous classrooms with others of the same age and ability.  If you performed with sufficient merit in one grade, you would be promoted with your peers at the end of the year into the next grade up the ladder.  If your merit was not up to standard, you would be held back to repeat the grade.  This allowed teachers to pitch instruction toward a group of students who were at roughly the same level of achievement and development.  It also created a more level playing field that allowed teachers to compare and rank the relative performance of students within the class, which they couldn’t do in a one-room schoolhouse with a wide array of ages and abilities.  So the invention of the grade also led to the invention of the metric that defines some students as above grade level and others as below.  Graded schooling was thus the foundation of the modern meritocracy.

The next step in the development of institutional merit was the erection of a graded system of schooling.  Students would start out in an elementary school for the lower grades, then gain promotion to a grammar school, and (by the end of the nineteenth century) move up to a high school for the final grades.  Entry at one level was dependent on successful completion of the level below.  A clear hierarchy of schooling emerged based on the new merit metric.  And it didn’t stop there.  High school graduation became the criterion for entry into college, and the completion of college became the requirement for entry into graduate school.  A single graded structure guided student progress through each individual school and across the entire hierarchy of schooling, serving as a rationalized and incremental ladder of meritocratic attainment leading from first grade through the most elevated levels of the system.

Consider some of the consequences of the emergence of this finely tuned machinery for arranging students by institutional merit.  When you have a measure of what average progress should look like – annual promotion to the next grade, and periodic promotion to the school at the next level – then you also have a clear measure of failure.  There were three ways for students to demonstrate failure within the system:  to be held back from promotion to the next grade; to be denied the diploma that demonstrated completion of a particular school level; or to leave the system altogether at a particular point in the graded hierarchy.  Thus emerged the chronic problems of the new system – retardation and elimination.

A parallel challenge to the legitimacy of the merit structure occurred at the level of the school.  By the early twentieth century, the level of schooling a person completed became increasingly important in determining access to the best jobs.  As a particular level of schooling began to fill up, as happened to the high school in the first half of the twentieth century, that level of diploma became less able to provide invidious distinction.  For a high school graduate, this meant that the perceived quality of the school became an important factor in determining the relative merit of your diploma compared with those of other high school graduates.  When college enrollments took off in the mid twentieth century and this level of the system emerged as the new zone of universal education, the value of a college degree likewise became dependent on the imputed merit of the institution granting it.  The result was a two-layered hierarchy of merit in the American educational system.  One layer was the formal graded system running from first grade to graduate school.  The other was the informal ranking of institutions at the same formal level.  Both became critical in determining graduates’ level of institutional merit and their position in the queue for the best jobs.  Consider some of the consequences of the dominance of this new form of merit.

Democratizing Merit

As we saw, essential merit had a bias toward privilege.  The founding fathers who displayed the most merit were to the manor born.  They were free to devote themselves to public service because of their birth and wealth.  Yes, it was possible for an outsider to demonstrate essential merit as well, but it wasn’t easy.  Benjamin Franklin was sui generis, and even he acted less as a leader and more as a sage and diplomat.  Alexander Hamilton fought his way to the top, but he never lost his outsider status and ended up dying to defend his honor, which was hard-won but never fully secure.

What gives essential merit face validity is that it is based on what you have actually accomplished.  Your merit is your accomplishments.  That’s hard to beat as a basis for respect, but it’s also hard to attain.  Washington could prove himself as a military officer because his gentry status automatically qualified him to become an officer in the first place.  Jefferson became a political figure because that’s what men of his status did with themselves, and his election was all but assured.  As a result, what made this kind of merit so compelling is also what made it so difficult for anyone but the gentry to demonstrate.

So the move toward institutional merit radically opened up the possibility of attaining it.  It was a system that applied to everyone – not just the people with special access but everyone in the common school classroom.  All students in the class could demonstrate their worth and earn the appropriate merits that measured that worth.  And everyone was measured on the same scale.  If essential merit was the measure of the member of the natural aristocracy, institutional merit was the measure of the citizen in a democracy.  You’ve got to love that part about it.

Another characteristic of institutional merit also made it distinctly democratic.  What it measured was neither intrinsically important nor deeply admirable.  It didn’t measure your valor in battle or your willingness to sacrifice for the public good; instead it reflected how many right answers you got on a weekly spelling test.  No big deal.

But what made this measure of merit so powerful for the average person was its implication.  It measured a trivial accomplishment in the confined academic world of the classroom, but it implied a bright future.  If essential merit measured your real accomplishment in the world, institutional merit offered a prediction of your future accomplishment.  It said, look out for this guy – he’s going to be somebody.  This is a major benefit that derives from the new measure.  Measuring how well you did a job is relatively easy, but predicting in advance how well you will do that job is a very big deal.

Does institutional merit really predict future accomplishment?  Do academic grades, credits, and degrees tell us how people will perform on the job?  Human capital theorists say yes: the skills acquired in school translate into greater productivity in the workforce.  Credentialing theorists say no:  the workforce rewards degrees by demanding them as prerequisites for getting a job, but this doesn’t demonstrate that what is learned in school helps a person in doing the job.  I lean toward the latter group, but for our purposes this debate doesn’t really matter.  As long as the job market treats academic merit as a predictor of job performance, then this form of merit serves as such.  Whether academic learning is useful on the job is irrelevant as long as the measures of academic merit are used to allocate people to jobs.  And a system that offers everyone in a community access to schools that will award them tokens of institutional merit gives everyone a chance to gain any social position.  That’s a very democratic measure indeed.

Formalizing Merit

Part of what makes institutional merit so democratic is that the measure itself is so abstract.  What it’s measuring is not concrete accomplishment – winning a battle or passing a law – but generic accomplishment on a standardized and decontextualized scale.  It’s a score from A to F or 1 to 100 or 0 to 4.  All of these scales are in use in American schools, but which one you use doesn’t matter.  They’re all interchangeable.  All they tell us is how high or low an individual was rated on some academic task.  Then these individual scores are averaged together across a heterogeneous array of such tasks to compute a composite score that tells us – what?  The score says that overall, at the end of the class, you met academic expectations (for that class in that grade) at a high, medium, or low level, or that you failed to meet even the minimum expectation.  And, when compared to the grades that fellow students received in the same class, it shows where your performance ranked relative to your peers.

It’s the sheer abstraction of this measure of merit that gives it so much power.  A verbal description of a student’s performance in the class would be a much richer way of understanding what she learned there:  In her biology class, Joanie demonstrated a strong understanding of heredity and photosynthesis, but she had some trouble with the vascular system.  The problem is that this doesn’t tell you how she compares with her classmates or whether she will qualify to become a banker.  What helps with the latter is that she received a grade of B+ (3.3 on a 4.0 scale) and the class average was B.  The grade tells you much less, but it means a lot more for her and her future.  Especially when it is combined with all of her other grades in classes across her whole career in high school, culminating in her final grade point average and a diploma.  It says she’ll get into college, but it won’t be a very selective one.  She’ll end up in a middle-class job, but she won’t be a top manager.  In terms of her future, this is what really matters, not her mastery of photosynthesis.
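To see just how mechanical this act of abstraction is, here is a minimal sketch in Python of the composite computation.  The point mapping is the conventional 4.0 scale, but the course list (and, of course, Joanie herself) is invented for illustration:

```python
# A minimal sketch of how interchangeable grade scales collapse into a
# single composite number. The point mapping is the conventional 4.0
# scale; the transcript is a hypothetical example, not real data.

GRADE_POINTS = {
    "A": 4.0, "A-": 3.7, "B+": 3.3, "B": 3.0, "B-": 2.7,
    "C+": 2.3, "C": 2.0, "C-": 1.7, "D": 1.0, "F": 0.0,
}

def gpa(transcript):
    """Average grade points across courses, weighted by credit hours.

    Everything contextual (the subject, the teacher, what was actually
    learned) is discarded; only the decontextualized number survives.
    """
    points = sum(GRADE_POINTS[grade] * credits for grade, credits in transcript)
    total_credits = sum(credits for _, credits in transcript)
    return points / total_credits

# Joanie's hypothetical transcript: biology B+, plus three other courses.
joanie = [("B+", 1.0), ("A", 1.0), ("B", 1.0), ("B+", 1.0)]
print(f"Composite GPA: {gpa(joanie):.2f}")  # prints roughly 3.40
```

Whatever the input scale – letters, percentages, points – the output is the same kind of number, which is exactly what makes the scales interchangeable.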

In this way, institutional merit is part of the broad process of rationalization that arose with modernity.  It filters out all of the noise that comes from context and content and qualitative judgments and comes up with a quantitative measure that locates the individual as a point on a normal curve representing everyone in the cohort.  It shows where you rank and predicts where you’re headed.  It becomes a central part of the machinery of disciplinary power.
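The last step of that rationalization – turning a score into a location on the curve – is equally mechanical.  Here is a hedged sketch, again in Python, with a cohort mean and standard deviation invented for the example:

```python
from statistics import NormalDist

# Hypothetical cohort: mean composite GPA of 3.0, standard deviation 0.4.
# Both parameters are invented for illustration.
cohort = NormalDist(mu=3.0, sigma=0.4)

# Joanie's 3.3 composite becomes a position on the cohort's normal curve.
percentile = cohort.cdf(3.3)
print(f"Joanie ranks above roughly {percentile:.0%} of the cohort")  # ~77%
```

The individual disappears into the distribution; what remains is a rank and a predicted trajectory.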

Naturalizing Privilege

Once merit became democratized and formalized, it also became naturalized.  The process of naturalization works like this.  Your merit is so central and so pervasive in a system of universal schooling that it embeds itself within the individual person.  You start saying things like:  I’m smart.  I’m dumb.  I’m a good student.  I’m a bad student.  I’m good at reading but bad at math.  I’m lousy at sports.  The construction of merit is coextensive with the entire experience of growing up, and therefore it comes to constitute the emergent you.  It no longer seems to be something imposed by a teacher or a school but instead comes to be an essential part of your identity.  It’s now less what you do and increasingly who you are.  In this way, the systemic construction of merit begins to disappear and what’s left is a permanent trait of the individual.  You are your grade and your grade is your destiny.

The problem, however – as an enormous amount of research shows – is that the formal measures of merit that schools use are subject to powerful influence from a student’s social origins.  No matter how you measure merit, social origin affects your score.  It shapes your educational attainment.  It also shows up in measures that rank educational institutions by quality and selectivity.  Across the board, your parents’ social class has an enormous impact on the level of merit you are likely to acquire in school.  Students with higher social position end up accumulating a disproportionately large number of academic merit badges.

The correlations between socioeconomic status and school measures of merit are strong and consistent, and the causation is easy to trace.  Being born well has an enormously positive impact on the educational merit you acquire across your life.  Let us count the ways.  Economic capital is one obvious factor.  Wealthy communities can support better schools.  Social capital is another factor.  Families from the upper middle classes have a much broader network of relationships with the larger society than those from the working class, which provides a big advantage for their children’s schooling prospects.  For them, the educational system is not foreign territory but feels like home.

Cultural capital is a third factor, and the most important of all.  School is a place that teaches students the cognitive skills, cultural norms, and forms of knowledge that are required for competent performance in positions of power.  Schools demonstrate a strong disposition toward these capacities over others:  mental over manual skills, theoretical over practical knowledge, decontextualized over contextualized perspectives, mind over body, Gesellschaft over Gemeinschaft.  Parents in the upper middle class are already highly skilled in these cultural capacities, which they deploy in their professional and managerial work on a daily basis.  Their children have grown up in the world of cultural capital.  It’s a language they learn to speak at home.  For working-class children, school is an introduction to a foreign culture and a new language, which, unaccountably, other students already seem to know.  They’re playing catch-up from day one.  Also, it turns out that schools are better at rewarding cultural capital than they are at teaching it.  So kids from the upper middle class can glide through school with little effort while others continually struggle to keep up.  And the longer the two groups remain in school, the larger the achievement gap between them grows.

So, in the wonderful world of academic merit, the fix is in.  Upper-income students have a built-in advantage in acquiring the grades, credits, and degrees that constitute the primary prizes of the school meritocracy.  But – and this is the true magic of the educational process – the merits that these students accumulate at school come in a purified academic form that is independent of their social origins.  They may have entered schooling as people of privilege, but they leave it as people of merit.  They’re good students.  They’re smart.  They’re well educated.  As a result, they’re totally deserving of special access to the best jobs.  They arrive with inherited privilege, but they leave with earned privilege.  So now they fully deserve what they get with their new educational credentials.

In this way, the merit structure of schooling performs a kind of alchemy.  It turns class position into academic merit.  It turns ascribed status into achieved status.  You may have gotten into Harvard by growing up in a rich neighborhood with great schools and by being a legacy.  But when you graduate, you bear the label of a person of merit, whose future accomplishments arise solely from superior ability.  You’ve been given a second nature.

Consequences of Naturalized Privilege: The New Aristocracy

The process by which schools naturalize academic merit brings major consequences to the larger society.  The most important of these is that it legitimizes social inequality.  People who were born on third base get credit for hitting a triple, and people who have to start in the batter’s box face the real possibility of striking out.  According to the educational system, divergent social outcomes are the result of differences in individual merit, so, one way or the other, people get what they deserve.  The fact that a fraction of students from the lower classes manage against the odds to prove themselves in school and move up the social scale only adds further credibility to the existence of a real meritocracy.

In the United States in the last 40 years, we have come to see the broader implications of this system of status attainment through institutional merit.  It has created a new kind of aristocracy.  This is not Jefferson’s natural aristocracy, grounded in public accomplishments, but a caste of meritocratic privilege, grounded in the formalized and naturalized merit signaled by educational credentials.  As with aristocracies of old, the new meritocracy is a system of rule by your betters – no longer defined as those who are better born or more accomplished but as those who are better educated.  Michael Young saw all of this coming back in 1958 in his fable, The Rise of the Meritocracy.  But now we can see that it has truly taken hold.

The core expertise of this new aristocracy is skill in working the system.  You have to know how to play the game of educational merit-getting and pass this knowledge on to your children.  The secret is in knowing that the achievements that earn merit points through the process of schooling are not substantive but formal.  Schooling is not about learning the subject matter; it’s about getting good grades, accumulating course credits, and collecting the diploma on the way out the door.  Degrees pay off, not what you learned in school or even the number of years of schooling you acquired.  What you need to know is what’s going to be on the test and nothing else.  So you need to study strategically and spend a lot of effort working the refs.  Give the teacher what she wants and be sure to get on her good side.  Give the college admissions officers the things they are looking for in your application.  Pump up your test scores with coaching and by learning how to game the questions.

Members of the new aristocracy are particularly aggressive about carrying out a strategy known as opportunity hoarding.  There is no academic advantage too trivial to pursue, and the number of advantages you accumulate can never be enough.  In order to get your children into the right selective college, you need to send them to the right school, get them into the gifted program in elementary school and the right track in high school, hire a tutor, carry out test prep, do the college tour, pursue prizes, develop a well-rounded resume for the student (sports, student leadership, musical instrument, service), pull strings as a legacy and a donor, and on and on and on.

Such behavior by upper-middle-class parents is not as crazy as it seems.  The problem with being at the top is that there’s nowhere to go but down.  If you look at studies of intergenerational mobility in the US, the top quintile of families has a big advantage, with more than 40 percent of children ending up in the same quintile as their parents – twice the rate that would occur by chance, since random placement across five quintiles would put only 20 percent there.  But that still means that nearly 60 percent are going to be downwardly mobile.  The system is just meritocratic enough to keep the most privileged families on edge, worried about having their child bested by a smart poor kid.  As Jerry Karabel puts it in The Chosen, the only thing U.S. education equalizes is anxiety.

As with earlier aristocracies, the new aristocrats of merit cluster together in the same communities, where the schools are like no other.  Their children attend the same elite colleges, where they meet their future mates – a process sociologists call assortative mating – and then transmit their combined cultural, social, and economic capital in concentrated form to their own children.  One consequence of this increased concentration of educational resources is that the achievement gap between low- and high-income students has been rising; Sean Reardon’s study shows the gap growing by 40 percent in the last quarter of the twentieth century.  This is how educational and social inequality grows larger over time.

 

By democratizing, formalizing, and naturalizing merit, schools have played a central role in defining the character of modern society.  In the process they have served to increase social opportunity while also increasing social inequality.  At the same time, they have established a solid educational basis for the legitimacy of this new inequality, and they have fostered the development of a new aristocracy of educational merit whose economic power, social privilege, and cultural cohesion would be the envy of the high nobility in early modern England or France.  Now, as then, the aristocracy assumes its outsized social role as a matter of natural right.

 

Posted in Credentialing, Higher Education, Meritocracy

Michael Lewis: Don’t Eat Fortune’s Cookie

In the last year or so, I’ve been reading and writing about the American meritocracy, and I’m going to be posting some of these pieces here from time to time.  But today I want to post a wonderful statement on the subject by Michael Lewis, which I somehow had missed when it first came out.  It’s his address at Princeton’s baccalaureate ceremony in 2012, called “Don’t Eat Fortune’s Cookie.”  The theme for the new Princeton grads is simple and powerful:  You shouldn’t assume you deserve to be where you are today.

Princeton University’s 2012 Baccalaureate Remarks


Don’t Eat Fortune’s Cookie
Michael Lewis
June 3, 2012 — As Prepared

(NOTE: The video of Lewis’ speech as delivered is available on the Princeton YouTube channel.)

Thank you. President Tilghman. Trustees and Friends. Parents of the Class of 2012. Above all, Members of the Princeton Class of 2012. Give yourself a round of applause. The next time you look around a church and see everyone dressed in black it’ll be awkward to cheer. Enjoy the moment.

Thirty years ago I sat where you sat. I must have listened to some older person share his life experience. But I don’t remember a word of it. I can’t even tell you who spoke. What I do remember, vividly, is graduation. I’m told you’re meant to be excited, perhaps even relieved, and maybe all of you are. I wasn’t. I was totally outraged. Here I’d gone and given them four of the best years of my life and this is how they thanked me for it. By kicking me out.

At that moment I was sure of only one thing: I was of no possible economic value to the outside world. I’d majored in art history, for a start. Even then this was regarded as an act of insanity. I was almost certainly less prepared for the marketplace than most of you. Yet somehow I have wound up rich and famous. Well, sort of. I’m going to explain, briefly, how that happened. I want you to understand just how mysterious careers can be, before you go out and have one yourself.

I graduated from Princeton without ever having published a word of anything, anywhere. I didn’t write for the Prince, or for anyone else. But at Princeton, studying art history, I felt the first twinge of literary ambition. It happened while working on my senior thesis. My adviser was a truly gifted professor, an archaeologist named William Childs. The thesis tried to explain how the Italian sculptor Donatello used Greek and Roman sculpture — which is actually totally beside the point, but I’ve always wanted to tell someone. God knows what Professor Childs actually thought of it, but he helped me to become engrossed. More than engrossed: obsessed. When I handed it in I knew what I wanted to do for the rest of my life: to write senior theses. Or, to put it differently: to write books.

Then I went to my thesis defense. It was just a few yards from here, in McCormick Hall. I listened and waited for Professor Childs to say how well written my thesis was. He didn’t. And so after about 45 minutes I finally said, “So. What did you think of the writing?”

“Put it this way,” he said. “Never try to make a living at it.”

And I didn’t — not really. I did what everyone does who has no idea what to do with themselves: I went to graduate school. I wrote at nights, without much effect, mainly because I hadn’t the first clue what I should write about. One night I was invited to a dinner, where I sat next to the wife of a big shot at a giant Wall Street investment bank, called Salomon Brothers. She more or less forced her husband to give me a job. I knew next to nothing about Salomon Brothers. But Salomon Brothers happened to be where Wall Street was being reinvented—into the place we have all come to know and love. When I got there I was assigned, almost arbitrarily, to the very best job in which to observe the growing madness: they turned me into the house expert on derivatives. A year and a half later Salomon Brothers was handing me a check for hundreds of thousands of dollars to give advice about derivatives to professional investors.

Now I had something to write about: Salomon Brothers. Wall Street had become so unhinged that it was paying recent Princeton graduates who knew nothing about money small fortunes to pretend to be experts about money. I’d stumbled into my next senior thesis.

I called up my father. I told him I was going to quit this job that now promised me millions of dollars to write a book for an advance of 40 grand. There was a long pause on the other end of the line. “You might just want to think about that,” he said.

“Why?”

“Stay at Salomon Brothers 10 years, make your fortune, and then write your books,” he said.

I didn’t need to think about it. I knew what intellectual passion felt like — because I’d felt it here, at Princeton — and I wanted to feel it again. I was 26 years old. Had I waited until I was 36, I would never have done it. I would have forgotten the feeling.

The book I wrote was called “Liar’s Poker.”  It sold a million copies. I was 28 years old. I had a career, a little fame, a small fortune and a new life narrative. All of a sudden people were telling me I was born to be a writer. This was absurd. Even I could see there was another, truer narrative, with luck as its theme. What were the odds of being seated at that dinner next to that Salomon Brothers lady? Of landing inside the best Wall Street firm from which to write the story of an age? Of landing in the seat with the best view of the business? Of having parents who didn’t disinherit me but instead sighed and said “do it if you must?” Of having had that sense of must kindled inside me by a professor of art history at Princeton? Of having been let into Princeton in the first place?

This isn’t just false humility. It’s false humility with a point. My case illustrates how success is always rationalized. People really don’t like to hear success explained away as luck — especially successful people. As they age, and succeed, people feel their success was somehow inevitable. They don’t want to acknowledge the role played by accident in their lives. There is a reason for this: the world does not want to acknowledge it either.

I wrote a book about this, called “Moneyball.” It was ostensibly about baseball but was in fact about something else. There are poor teams and rich teams in professional baseball, and they spend radically different sums of money on their players. When I wrote my book the richest team in professional baseball, the New York Yankees, was then spending about $120 million on its 25 players. The poorest team, the Oakland A’s, was spending about $30 million. And yet the Oakland team was winning as many games as the Yankees — and more than all the other richer teams.

This isn’t supposed to happen. In theory, the rich teams should buy the best players and win all the time. But the Oakland team had figured something out: the rich teams didn’t really understand who the best baseball players were. The players were misvalued. And the biggest single reason they were misvalued was that the experts did not pay sufficient attention to the role of luck in baseball success. Players got given credit for things they did that depended on the performance of others: pitchers got paid for winning games, hitters got paid for knocking in runners on base. Players got blamed and credited for events beyond their control. Where balls that got hit happened to land on the field, for example.

Forget baseball, forget sports. Here you had these corporate employees, paid millions of dollars a year. They were doing exactly the same job that people in their business had been doing forever.  In front of millions of people, who evaluate their every move. They had statistics attached to everything they did. And yet they were misvalued — because the wider world was blind to their luck.

This had been going on for a century. Right under all of our noses. And no one noticed — until it paid a poor team so well to notice that they could not afford not to notice. And you have to ask: if a professional athlete paid millions of dollars can be misvalued, who can’t be? If the supposedly pure meritocracy of professional sports can’t distinguish between lucky and good, who can?

The “Moneyball” story has practical implications. If you use better data, you can find better values; there are always market inefficiencies to exploit, and so on. But it has a broader and less practical message: don’t be deceived by life’s outcomes. Life’s outcomes, while not entirely random, have a huge amount of luck baked into them. Above all, recognize that if you have had success, you have also had luck — and with luck comes obligation. You owe a debt, and not just to your Gods. You owe a debt to the unlucky.

I make this point because — along with this speech — it is something that will be easy for you to forget.

I now live in Berkeley, California. A few years ago, just a few blocks from my home, a pair of researchers in the Cal psychology department staged an experiment. They began by grabbing students, as lab rats. Then they broke the students into teams, segregated by sex. Three men, or three women, per team. Then they put these teams of three into a room, and arbitrarily assigned one of the three to act as leader. Then they gave them some complicated moral problem to solve: say what should be done about academic cheating, or how to regulate drinking on campus.

Exactly 30 minutes into the problem-solving the researchers interrupted each group. They entered the room bearing a plate of cookies. Four cookies. The team consisted of three people, but there were these four cookies. Every team member obviously got one cookie, but that left a fourth cookie, just sitting there. It should have been awkward. But it wasn’t. With incredible consistency the person arbitrarily appointed leader of the group grabbed the fourth cookie, and ate it. Not only ate it, but ate it with gusto: lips smacking, mouth open, drool at the corners of their mouths. In the end all that was left of the extra cookie were crumbs on the leader’s shirt.

This leader had performed no special task. He had no special virtue. He’d been chosen at random, 30 minutes earlier. His status was nothing but luck. But it still left him with the sense that the cookie should be his.

This experiment helps to explain Wall Street bonuses and CEO pay, and I’m sure lots of other human behavior. But it also is relevant to new graduates of Princeton University. In a general sort of way you have been appointed the leader of the group. Your appointment may not be entirely arbitrary. But you must sense its arbitrary aspect: you are the lucky few. Lucky in your parents, lucky in your country, lucky that a place like Princeton exists that can take in lucky people, introduce them to other lucky people, and increase their chances of becoming even luckier. Lucky that you live in the richest society the world has ever seen, in a time when no one actually expects you to sacrifice your interests to anything.

All of you have been faced with the extra cookie. All of you will be faced with many more of them. In time you will find it easy to assume that you deserve the extra cookie. For all I know, you may. But you’ll be happier, and the world will be better off, if you at least pretend that you don’t.

Never forget: In the nation’s service. In the service of all nations.

Thank you.

And good luck.