
Class on History of School Reform in the U.S.

This post contains all of the material for the class on the History of School Reform in the US that I taught at the Stanford Graduate School of Education for 15 years.  In retirement I wanted to make the course available on the internet to anyone who is interested.  If you are a college teacher, feel free to use any of it in whole or in part.  If you are a student or a group of students, you can work your way through the class on your own at your own pace.  Any benefits that accrue are purely intrinsic, since no one will get college credits.  But that also means you’re free to pursue the parts of the class that you want, with no requirements or papers.  How great is that?

I’m posting the full syllabus below.  But it would be more useful to download it as a Word document through this link.  Feel free to share this with anyone you like.

All of the course materials are embedded in the syllabus through hyperlinks to a Google drive.  For each week, this includes a link to tips for approaching the readings, links to the PDFs of the readings, and a link to the slides for that week’s class.  Slides also include links to additional sources.  So the syllabus is all that is needed to gain access to the full class.

What are the central themes of this class? 

One is that reform of schooling is all about trying to improve it.  But whether a particular reform constitutes an improvement or a detriment to school and society rests in the eye of the beholder.  It all depends on what goals you want schools to accomplish, and the fact is that we don’t fully agree on what those goals are.  So reform is not a linear process leading inevitably toward a better future but a cyclical process resulting from efforts to accomplish alternative goals that are in tension with each other.  This produces a series of long-term pendulum swings between alternative visions of education.

A second theme is that examining the history of reform efforts can provide us with rich insight into the nature of the educational system.  Think of reforms as a series of experiments in school improvement.  How the system reacts to each of these experimental interventions tells us something important about how the system operates. 

One conclusion I have drawn from examining this process is that reformers have been overconfident about their understanding of what the problems with schooling are and therefore what solutions are required.  The fact is that a lot of reform efforts make schools worse.  By failing to understand the complexity of the system and the way it has evolved over time, reformers have frequently been throwing a monkey wrench into the works.  They try to solve one problem and in the process create another.  (This is one reason why Dick Elmore and Milbrey McLaughlin call school reform “steady work.”) 

So one message the course sends to students is to show a little humility in your efforts to improve schools.  Learn more about how the system works before tinkering with it.  And consider the possibility that you might make things worse.

I hope you find this useful.

History of School Reform in the US

A Ten-Week Class

David Labaree

Web: http://www.stanford.edu/~dlabaree/

Twitter: @Dlabaree

Blog: https://davidlabaree.com/

Course Description

In this course, we will explore the history of school reform in the United States.  In only 10 weeks we will not be able to pursue a systematic study of this history from beginning to end, so instead we will explore a few of the major issues in this history and examine some pertinent cases of school reform to consider their consequences.  School reform is the intended change of schooling toward accomplishment of a valued goal.  One problem with reform, therefore, is intent.  Education is an extraordinarily complex social institution – involving a vast array of people, structures, and organizations – which means that reforming education in ways that make it produce the intended results is quite difficult.  Frequently reforms unintentionally generate new problems, which then require a new wave of reform to deal with them.  (This is why Elmore and McLaughlin call school reform “steady work.”)  A second problem with reform is that reasonable people can disagree over the goals of schooling, which means that what is a positive reform for some people may be a negative change for others.  The result is that your reaction to the success or failure of a reform effort depends on where you stand on its value, since the failure of a bad reform is a good thing.

Major Issues in the History of School Reform:  Framing our look at the history of reform will be two core books:  Tinkering Toward Utopia, which David Tyack and Larry Cuban wrote in response to what they learned from teaching this class at Stanford for a number of years; and Someone Has to Fail, the book I wrote after teaching the same course for a decade.  We’ll read their book at the start of the class and read mine in pieces across the quarter.  A key theme in Tyack and Cuban is the paradox of school reform, in which it seems that schools are constantly being bounced around by a stream of reform efforts while at the same time they never seem to change.  They unravel this paradox by separating the history of reform into two interacting elements:  the noisy and often contradictory rounds of reform rhetoric that intrude upon schools at irregular intervals, and the slower and steadier process of evolutionary change in the structure of schooling that takes place largely outside of public view.  We will look at both aspects of reform, with special attention to assessing the outcomes of reform in the realm of the structure and practice of schooling itself.  My own book takes a more jaundiced view of reform, examining why the common school movement was such a success and later reforms were such failures.  In the early part of the book, the focus is on how the loosely coupled organization of schooling and the peculiar characteristics of teaching as a practice have put severe limits on the possibilities of reform.  In the latter part, I explore why the failure of reform is largely good news, protecting the system from damaging experiments based on misguided visions of what schools can do to solve social problems.  I argue that schools are a terrible way to solve most of the social problems that they are asked to address.  
I also suggest that schools are doing what educational consumers want from them – providing us with social access and social advantage – even if they don’t do what reformers ask of them.

The class starts with the work of David Cohen, Richard Elmore, and Milbrey McLaughlin, who consider the organizational and pedagogical reasons it has been so difficult to change the basic grammar of schooling through deliberate reform efforts.  Next we read Tyack and Cuban to get an overview of the subject.  Then we look at my account of the two most important reform movements in the history of American schools, one promoting the common school and the other pushing for progressive education.  Next we look in detail at the nature and variety of school reform rhetoric, through a close study of key reform documents from the last 200 years, including pedagogical progressivism, administrative progressivism, desegregation, the standards movement, and school choice.  In succeeding weeks, we explore the core factors that make the school system so resistant to reform and consider some of the kinds of reform practices that are more likely to bring about results.  Then we examine the system’s core social role, showing how the system continually adapts to pressure for greater social access by stratifying instruction in a way that preserves social advantage.  In week 8 we look at issues surrounding race and American schooling.  In week 9, we put school reform in the larger context of state-driven social change efforts, focusing on James Scott’s framework, which examines why it has been so hard over the years for governments to impose order on complex social institutions such as schooling.  For the last class, we read the final chapters in my book and talk about what schools can do and what they can’t do.

What This Class Is and Is Not About:  This class is intended to encourage you to think hard about the things that make educational reform so complex, contradictory, difficult, and often dysfunctional.  Its focus is on analyzing what happens to reform efforts between initial proposals and eventual outcomes.  This means that its aim is not to provide you with a how-to manual that will enable you to be a successful reformer.  I don’t think such a manual exists, and the dream of finding the one right way to fix things has done a lot of damage to schools over the years.  Instead, think of this class as an exercise in realism, a set of cautionary tales that I hope will help you locate your own efforts to improve schools within a useful historical framework.  The idea is to encourage students to develop a rich understanding of the American system of schooling – even a grudging respect for it – before trying to institute reforms, and to instill a little humility into people’s plans for saving the world with better schools.

Audience

This class was originally designed for master’s and doctoral students in education, but it also works for graduate or undergraduate students in any field who are interested in learning about the nature of the American system of education.

Readings

Books:  The following books are used in the course; both are in print.  Also, pirated digital versions of both books can be found online.

Tyack, David & Cuban, Larry. (1995). Tinkering toward utopia: Reflections on a century of public school reform. Cambridge: Harvard University Press.

Labaree, David F. (2010). Someone has to fail: The zero-sum game of public schooling. Cambridge: Harvard University Press.

Assigned Articles and Other Readings:  All other readings are available in PDF on the course Google Drive.

Course Outline

Below are the topics we will cover, week by week, with the readings for each week.  For each week, I provide:  a link to tips for how to approach each week’s readings; links for access to the PDFs of these readings; a link to the class slides for that week.

Week 1

Introduction to course

Tips for week 1 readings

Elmore, Richard F., & McLaughlin, Milbrey W. (1988).  Steady work.  Santa Monica, CA: Rand.

Cohen, David K. (1988). Teaching practice: Plus que ça change.  In Philip W. Jackson (ed.), Contributing to Educational Change (pp. 27-84).  Berkeley: McCutchan.

Labaree, David F. (2010).  Someone has to fail: The zero-sum game of public schooling. Cambridge: Harvard University Press.  Introduction.

Class slides for week 1:  slides 1a, slides 1b, slides 1c

Week 2

The History of Educational Reform:  An Overview

Tips for week 2 readings

Tyack, David & Cuban, Larry. (1995). Tinkering toward utopia: Reflections on a century of public school reform. Cambridge: Harvard University Press.

Metz, Mary H. (1990). Real school: A universal drama amid disparate experience. In Douglas E. Mitchell & Margaret E. Goertz (Eds.), Education Politics for the New Century (pp. 75-91). New York: Falmer.

Class slides for week 2

Week 3

The Two Major Reform Movements – Common School and Progressivism; Schooling and the Meritocracy

Tips for week 3 readings

Labaree.  Someone has to fail.  Chapters 1, 2, and 3.

McClay, William M. (2016). A distant elite: How meritocracy went wrong. The Hedgehog Review 18:2 (Summer).

Class slides for week 3

Week 4

Factors That Make Reform Difficult

Tips for week 4 readings

Labaree.  Someone has to fail.  Chapters 4 and 5

Meyer, John W. & Rowan, Brian. (1983). The structure of educational organizations. In Organizational environments: Ritual and rationality (pp. 71-97), edited by John W. Meyer and William R. Scott. Beverly Hills, CA: Sage.

Cuban, Larry. (2013). Why so many structural changes in schools and so little reform in teaching practice? In Inside the black box of classroom practice: Change without reform in American education (pp. 155-187). Cambridge: Harvard Education Press.

Check out Larry Cuban’s blog on school reform and classroom practice, always a good read: http://larrycuban.wordpress.com/.

Class slides for week 4

Week 5

The Rhetorics of Reform:  Cases in Point

Read any four of these closely; lightly skim the rest.

Tips for week 5 readings

Common School Movement

Mann, Horace. (1848). Twelfth Annual Report to the State Board of Education of Massachusetts.  Selections.

Committee of 10

Committee of 10. (1893). Report to the National Council of Education.  Selections.

Pedagogical Progressivism

Dewey, John. (1902/1990). The child and the curriculum. In Philip W. Jackson (ed.), The school and society and the child and the curriculum (pp. 181-209). Chicago: University of Chicago Press.

Administrative Progressivism

Commission on the Reorganization of Secondary Education. (1918). Cardinal principles of secondary education. Washington, DC: National Education Association.

Desegregation

Brown v. Board of Education of Topeka, 347 U.S. 483 (1954).

Standards Movement 1.0

National Commission on Excellence in Education. (1983). A nation at risk: The imperative for educational reform. Washington, DC: U.S. Department of Education.

School Choice

Walberg, Herbert J. & Bast, Joseph L. (2003). Failure of the public school monopoly. In Education and capitalism: How overcoming our fear of markets and economics can improve America’s schools (pp. 3-32). Stanford: Hoover Institution Press.

Standards Movement 2.0

No Child Left Behind Act.  (2002).  Public Law 107-110.  Title I.

School Choice 2.0

https://drive.google.com/open?id=1-d5LuTZ5YYkLbsL0S268vtaRLFoEqJ_b

Class slides for week 5

Week 6

Making Educational Change

Tips for week 6 readings

Fullan, Michael G. (2001). The new meaning of educational change (3rd ed.). New York: Teachers College Press. Chapter 2.

Fullan, Michael G. (2001). The new meaning of educational change (3rd ed.). New York: Teachers College Press. Chapter 5.

Wolf, Shelby A., Borko, Hilda, Elliott, Rebekah L., & McIver, Monette C. (2000). “That dog won’t hunt!:” Exemplary school change efforts within the Kentucky reform. American Educational Research Journal, 37:2, 349-393.

Delpit, Lisa. (1995).  The silenced dialogue.  In Other people’s children (pp. 21-47).  New York: New Press.

Class slides for week 6

Week 7

Balancing Social Access and Social Advantage

Tips for week 7 readings

Labaree, David F.  (2013).  Balancing access and advantage in the history of American schooling. In Rolf Becker, Patrick Bühler, & Thomas Bühler (Eds.), Bildungsungleichheit und Gerechtigkeit: Wissenschaftliche und Gesellschaftliche Herausforderungen (pp. 101-114).  Bern: Haupt Verlag.

Cohen, David K., & Neufeld, Barbara. (1981). The failure of high schools and the progress of education. Daedalus, 110 (Summer), 69-89.

Labaree, David F.  (1997).  The middle class and the high school.  In How to Succeed in School Without Really Learning: The Credentials Race in American Education (pp. 92-109).  New Haven, CT: Yale University Press.

Ladson-Billings, Gloria.  (1995).  But that’s just good teaching! The case for culturally relevant pedagogy.  Theory into Practice, 34:3, pp. 159-165.

Class slides for week 7

Week 8

Race and American Schooling

Tips for week 8 readings

Nieto, Sonia (1994).  Affirmation, solidarity, and critique: Moving beyond tolerance in multicultural education.  Multicultural Education, 1:4, pp. 9-12, 35-38.

Fine, Michelle. (1986). Why urban adolescents drop into and out of public high school. The Teachers College Record, 87(3), 393-409.

McWhorter, John. (2018). There’s nothing wrong with Black English. The Atlantic. https://www.theatlantic.com/politics/archive/2018/08/who-gets-to-use-black-english/566867/?utm_source=twb.

Recommended:  The Problem We All Live With.  (2015). This American Life Podcast (July 31).  Available in audio and in transcript: http://www.thisamericanlife.org/radio-archives/episode/562/the-problem-we-all-live-with.

Class slides for week 8

Week 9

Problems in Making Systemic Reform of Education

Tips for week 9 readings

Scott, James. (1999).  Seeing like a state.  New Haven: Yale University Press.  Pay close attention to the Introduction, chapters 1-2, and 9-10.  Skim through the rest, looking for examples.

Class slides for week 9

Week 10

Conclusions

Tips for week 10 readings

Labaree.  Someone has to fail.  Chapters 6, 7, and 8.

Cohen, David K. (1990). A revolution in one classroom: The case of Mrs. Oublier. Educational Evaluation and Policy Analysis, 12:3, pp. 311-329.

March, James G. (1975). Education and the pursuit of optimism. Texas Tech Journal of Education, 2:1, 5-17.

Class slides for week 10

Guidelines for Critical Reading

As a critical reader of a particular text (a book, article, speech, proposal), you need to use the following questions as a framework to guide you as you read:

  1. What’s the point? This is the analysis issue: what is the author’s angle?
  2. Who says? This is the validity issue: On what (data, literature) are the claims based?
  3. What’s new? This is the value-added issue: What does the author contribute that we don’t already know?
  4. Who cares? This is the significance issue, the most important issue of all, the one that subsumes all the others: Is this work worth doing?  Is the text worth reading?  Does it contribute something important?

If this is the way critical readers are going to approach a text, then as an analytical writer you need to guide readers toward the desired answers to each of these questions.

Guidelines for Analytical Writing

In writing papers for any course, keep in mind the following points.

  1. Pick an important issue: Make sure that your analysis meets the “so what” test. Why should anyone care about this topic, anyway?  Pick an issue or issues that matter and that you really care about.
  2. Keep focused: Don’t lose track of the point you are trying to make and make sure the reader knows where you are heading and why.
  3. Aim for clarity: Don’t assume that the reader knows what you’re talking about; it’s your job to make your points clearly.  In part this means keeping focused and avoiding distracting clutter.  But in part it means that you need to make more than elliptical references to concepts and sources or to professional experience.  When referring to readings (from the course or elsewhere), explain who said what and why this point is pertinent to the issue at hand.  When drawing on your own experiences or observations, set the context so the reader can understand what you mean.  Proceed as though you were writing for an educated person who is neither a member of this class nor a professional colleague, someone who has not read the material you are referring to.
  4. Provide analysis: A good paper is more than a catalogue of facts, concepts, experiences, or references; it is more than a description of the content of a set of readings; it is more than an expression of your educational values or an announcement of your prescription for what ails education.  A good paper is a logical and coherent analysis of the issues raised within your chosen area of focus.  This means that your paper should aim to explain rather than describe.  If you give examples, be sure to tell the reader what they mean in the context of your analysis.  Make sure the reader understands the connection between the various points in your paper.
  5. Provide depth, insight, and connections: The best papers are ones that go beyond making obvious points, superficial comparisons, and simplistic assertions.  They dig below the surface of the issue at hand, demonstrating a deeper level of understanding and an ability to make interesting connections.
  6. Support your analysis with evidence: You need to do more than simply state your ideas, however informed and useful these may be.  You also need to provide evidence that reassures the reader that you know what you are talking about, thus providing a foundation for your argument.  Evidence comes in part from the academic literature, whether encountered in this course or elsewhere.  Evidence can also come from your own experience.  Remember that you are trying to accomplish two things with the use of evidence.  First, you are saying that it is not just you making this assertion but that authoritative sources and solid evidence back you up.  Second, you are supplying a degree of specificity and detail, which helps to flesh out an otherwise skeletal argument.
  7. Recognize complexity and acknowledge multiple viewpoints. The issues in the history of American education are not simple, and your paper should not propose simple solutions to complex problems.  It should not reduce issues to either/or, black/white, good/bad.  Your paper should give evidence that you understand and appreciate more than one perspective on an issue.  This does not mean you should be wishy-washy.  Instead, you should aim to make a clear point by showing that you have considered alternate views.
  8. Challenge assumptions. The paper should show that you have learned something by doing this paper. There should be evidence that you have been open to changing your mind.
  9. Do not overuse quotation: In a short paper, long quotations (more than a sentence or two in length) are generally not appropriate.  Even in longer papers, quotations should be used sparingly unless they constitute a primary form of data for your analysis.  In general, your paper is more effective if written primarily in your own words, using ideas from the literature but framing them in your own way in order to serve your own analytical purposes.  However, selective use of quotations can be very useful as a way of capturing the author’s tone or conveying a particularly aptly phrased point.
  10. Cite your sources: You need to identify for the reader where particular ideas or examples come from.  Note that citing a source is not sufficient to fulfill the requirement to provide evidence for your argument.  As spelled out in #6 above, you need to transmit to the reader some of the substance of what appears in the source cited, so the reader can understand the connection with the point you are making and can have some meat to chew on.  The best analytical writing provides a real feel for the material and not just a list of assertions and citations.  Depth, insight, and connections count for more than a superficial collection of glancing references.  In other words, don’t just mention an array of sources without drawing substantive points and examples from these sources; and don’t draw on ideas from such sources without identifying the ones you used.
  11. Take care in the quality of your prose: A paper that is written in a clear and effective style makes a more convincing argument than one written in a murky manner, even when both writers start with the same basic understanding of the issues.  However, writing that is confusing usually signals confusion in a person’s thinking.  After all, one key purpose of writing is to put down your ideas in a way that permits you and others to reflect on them critically, to see if they stand up to analysis.  So you should take the time to reflect on your own ideas on paper and revise them as needed.

Du Bois — Of the Coming of John

This post is a classic piece by W. E. B. Du Bois called “Of the Coming of John.”  It’s a chapter from his book The Souls of Black Folk, published in 1903.  Here’s a link to the online version.

It’s a heartbreaking work of fiction filled with a lot of hard truths.  It’s the story of two boys named John, one Black and one white, who grew up together in a small town in Jim Crow Georgia at the turn of the 20th century.  Both went away to college up North, and both came home to visit their families at the same time.

The story is about race and about education.  It tells of a racial divide that education can’t cure, a tragedy just waiting to unfold.  It also tells how education divides people of all races.  Education, it tells us, can be both liberating and alienating.  For the Black John, education showed him a whole new world, different from anything he had ever experienced, and a way of living that was less confining for people like him.  But it also left him an alien in his own hometown, who no longer felt comfortable there and who no longer could communicate with family and friends in the old way.  College also left the other John alienated from his home environment, but it did nothing to change his thinking about the racial divide there.  When the two Johns collided on their home ground, the end was ugly and somehow unavoidable.

Du Bois himself had a different story.  He was born right after the Civil War in an integrated town in Massachusetts and went on to study at the University of Berlin and at Harvard, where he became the first Black person to earn a Harvard Ph.D.  You can hear the echoes of his own education running through the story, from the poem by Elizabeth Barrett Browning at the beginning to the phrase from a German song at the end.

It’s grim to read this story, but it’s also a pleasure to spend some time inside the mind of one of America’s great scholars and civil rights leaders.


Of the Coming of John

W. E. B. Du Bois

What bring they ‘neath the midnight,

Beside the River–sea?

They bring the human heart wherein

No nightly calm can be;

That droppeth never with the wind,

Nor drieth with the dew;

O calm it, God; thy calm is broad

To cover spirits too.

The river floweth on.

MRS. BROWNING.

Carlisle Street runs westward from the centre of Johnstown, across a great black bridge, down a hill and up again, by little shops and meat–markets, past single–storied homes, until suddenly it stops against a wide green lawn. It is a broad, restful place, with two large buildings outlined against the west. When at evening the winds come swelling from the east, and the great pall of the city’s smoke hangs wearily above the valley, then the red west glows like a dreamland down Carlisle Street, and, at the tolling of the supper–bell, throws the passing forms of students in dark silhouette against the sky. Tall and black, they move slowly by, and seem in the sinister light to flit before the city like dim warning ghosts. Perhaps they are; for this is Wells Institute, and these black students have few dealings with the white city below.

And if you will notice, night after night, there is one dark form that ever hurries last and late toward the twinkling lights of Swain Hall,—for Jones is never on time. A long, straggling fellow he is, brown and hard–haired, who seems to be growing straight out of his clothes, and walks with a half–apologetic roll. He used perpetually to set the quiet dining–room into waves of merriment, as he stole to his place after the bell had tapped for prayers; he seemed so perfectly awkward. And yet one glance at his face made one forgive him much,—that broad, good–natured smile in which lay no bit of art or artifice, but seemed just bubbling good–nature and genuine satisfaction with the world.

He came to us from Altamaha, away down there beneath the gnarled oaks of Southeastern Georgia, where the sea croons to the sands and the sands listen till they sink half drowned beneath the waters, rising only here and there in long, low islands. The white folk of Altamaha voted John a good boy,—fine plough–hand, good in the rice–fields, handy everywhere, and always good–natured and respectful. But they shook their heads when his mother wanted to send him off to school. “It’ll spoil him,—ruin him,” they said; and they talked as though they knew. But full half the black folk followed him proudly to the station, and carried his queer little trunk and many bundles. And there they shook and shook hands, and the girls kissed him shyly and the boys clapped him on the back. So the train came, and he pinched his little sister lovingly, and put his great arms about his mother’s neck, and then was away with a puff and a roar into the great yellow world that flamed and flared about the doubtful pilgrim. Up the coast they hurried, past the squares and palmettos of Savannah, through the cotton–fields and through the weary night, to Millville, and came with the morning to the noise and bustle of Johnstown.

And they that stood behind, that morning in Altamaha, and watched the train as it noisily bore playmate and brother and son away to the world, had thereafter one ever–recurring word,—”When John comes.” Then what parties were to be, and what speakings in the churches; what new furniture in the front room,—perhaps even a new front room; and there would be a new schoolhouse, with John as teacher; and then perhaps a big wedding; all this and more—when John comes. But the white people shook their heads.

At first he was coming at Christmas–time,—but the vacation proved too short; and then, the next summer,—but times were hard and schooling costly, and so, instead, he worked in Johnstown. And so it drifted to the next summer, and the next,—till playmates scattered, and mother grew gray, and sister went up to the Judge’s kitchen to work. And still the legend lingered,—”When John comes.”

Up at the Judge’s they rather liked this refrain; for they too had a John—a fair–haired, smooth–faced boy, who had played many a long summer’s day to its close with his darker namesake. “Yes, sir! John is at Princeton, sir,” said the broad–shouldered gray–haired Judge every morning as he marched down to the post–office. “Showing the Yankees what a Southern gentleman can do,” he added; and strode home again with his letters and papers. Up at the great pillared house they lingered long over the Princeton letter,—the Judge and his frail wife, his sister and growing daughters. “It’ll make a man of him,” said the Judge, “college is the place.” And then he asked the shy little waitress, “Well, Jennie, how’s your John?” and added reflectively, “Too bad, too bad your mother sent him off—it will spoil him.” And the waitress wondered.

Thus in the far–away Southern village the world lay waiting, half consciously, the coming of two young men, and dreamed in an inarticulate way of new things that would be done and new thoughts that all would think. And yet it was singular that few thought of two Johns,—for the black folk thought of one John, and he was black; and the white folk thought of another John, and he was white. And neither world thought the other world’s thought, save with a vague unrest.

Up in Johnstown, at the Institute, we were long puzzled at the case of John Jones. For a long time the clay seemed unfit for any sort of moulding. He was loud and boisterous, always laughing and singing, and never able to work consecutively at anything. He did not know how to study; he had no idea of thoroughness; and with his tardiness, carelessness, and appalling good–humor, we were sore perplexed. One night we sat in faculty–meeting, worried and serious; for Jones was in trouble again. This last escapade was too much, and so we solemnly voted “that Jones, on account of repeated disorder and inattention to work, be suspended for the rest of the term.”

It seemed to us that the first time life ever struck Jones as a really serious thing was when the Dean told him he must leave school. He stared at the gray–haired man blankly, with great eyes. “Why,—why,” he faltered, “but—I haven’t graduated!” Then the Dean slowly and clearly explained, reminding him of the tardiness and the carelessness, of the poor lessons and neglected work, of the noise and disorder, until the fellow hung his head in confusion. Then he said quickly, “But you won’t tell mammy and sister,—you won’t write mammy, now will you? For if you won’t I’ll go out into the city and work, and come back next term and show you something.” So the Dean promised faithfully, and John shouldered his little trunk, giving neither word nor look to the giggling boys, and walked down Carlisle Street to the great city, with sober eyes and a set and serious face.

Perhaps we imagined it, but someway it seemed to us that the serious look that crept over his boyish face that afternoon never left it again. When he came back to us he went to work with all his rugged strength. It was a hard struggle, for things did not come easily to him,—few crowding memories of early life and teaching came to help him on his new way; but all the world toward which he strove was of his own building, and he builded slow and hard. As the light dawned lingeringly on his new creations, he sat rapt and silent before the vision, or wandered alone over the green campus peering through and beyond the world of men into a world of thought. And the thoughts at times puzzled him sorely; he could not see just why the circle was not square, and carried it out fifty–six decimal places one midnight,—would have gone further, indeed, had not the matron rapped for lights out. He caught terrible colds lying on his back in the meadows of nights, trying to think out the solar system; he had grave doubts as to the ethics of the Fall of Rome, and strongly suspected the Germans of being thieves and rascals, despite his textbooks; he pondered long over every new Greek word, and wondered why this meant that and why it couldn’t mean something else, and how it must have felt to think all things in Greek. So he thought and puzzled along for himself,—pausing perplexed where others skipped merrily, and walking steadily through the difficulties where the rest stopped and surrendered.

Thus he grew in body and soul, and with him his clothes seemed to grow and arrange themselves; coat sleeves got longer, cuffs appeared, and collars got less soiled. Now and then his boots shone, and a new dignity crept into his walk. And we who saw daily a new thoughtfulness growing in his eyes began to expect something of this plodding boy. Thus he passed out of the preparatory school into college, and we who watched him felt four more years of change, which almost transformed the tall, grave man who bowed to us commencement morning. He had left his queer thought–world and come back to a world of motion and of men. He looked now for the first time sharply about him, and wondered he had seen so little before. He grew slowly to feel almost for the first time the Veil that lay between him and the white world; he first noticed now the oppression that had not seemed oppression before, differences that erstwhile seemed natural, restraints and slights that in his boyhood days had gone unnoticed or been greeted with a laugh. He felt angry now when men did not call him “Mister,” he clenched his hands at the “Jim Crow” cars, and chafed at the color–line that hemmed in him and his. A tinge of sarcasm crept into his speech, and a vague bitterness into his life; and he sat long hours wondering and planning a way around these crooked things. Daily he found himself shrinking from the choked and narrow life of his native town. And yet he always planned to go back to Altamaha,—always planned to work there. Still, more and more as the day approached he hesitated with a nameless dread; and even the day after graduation he seized with eagerness the offer of the Dean to send him North with the quartette during the summer vacation, to sing for the Institute. A breath of air before the plunge, he said to himself in half apology.

It was a bright September afternoon, and the streets of New York were brilliant with moving men. They reminded John of the sea, as he sat in the square and watched them, so changelessly changing, so bright and dark, so grave and gay. He scanned their rich and faultless clothes, the way they carried their hands, the shape of their hats; he peered into the hurrying carriages. Then, leaning back with a sigh, he said, “This is the World.” The notion suddenly seized him to see where the world was going; since many of the richer and brighter seemed hurrying all one way. So when a tall, light–haired young man and a little talkative lady came by, he rose half hesitatingly and followed them. Up the street they went, past stores and gay shops, across a broad square, until with a hundred others they entered the high portal of a great building.

He was pushed toward the ticket–office with the others, and felt in his pocket for the new five–dollar bill he had hoarded. There seemed really no time for hesitation, so he drew it bravely out, passed it to the busy clerk, and received simply a ticket but no change. When at last he realized that he had paid five dollars to enter he knew not what, he stood stockstill amazed. “Be careful,” said a low voice behind him; “you must not lynch the colored gentleman simply because he’s in your way,” and a girl looked up roguishly into the eyes of her fair–haired escort. A shade of annoyance passed over the escort’s face. “You WILL not understand us at the South,” he said half impatiently, as if continuing an argument. “With all your professions, one never sees in the North so cordial and intimate relations between white and black as are everyday occurrences with us. Why, I remember my closest playfellow in boyhood was a little Negro named after me, and surely no two,—WELL!” The man stopped short and flushed to the roots of his hair, for there directly beside his reserved orchestra chairs sat the Negro he had stumbled over in the hallway. He hesitated and grew pale with anger, called the usher and gave him his card, with a few peremptory words, and slowly sat down. The lady deftly changed the subject.

All this John did not see, for he sat in a half–daze minding the scene about him; the delicate beauty of the hall, the faint perfume, the moving myriad of men, the rich clothing and low hum of talking seemed all a part of a world so different from his, so strangely more beautiful than anything he had known, that he sat in dreamland, and started when, after a hush, rose high and clear the music of Lohengrin’s swan. The infinite beauty of the wail lingered and swept through every muscle of his frame, and put it all a–tune. He closed his eyes and grasped the elbows of the chair, touching unwittingly the lady’s arm. And the lady drew away. A deep longing swelled in all his heart to rise with that clear music out of the dirt and dust of that low life that held him prisoned and befouled. If he could only live up in the free air where birds sang and setting suns had no touch of blood! Who had called him to be the slave and butt of all? And if he had called, what right had he to call when a world like this lay open before men?

Then the movement changed, and fuller, mightier harmony swelled away. He looked thoughtfully across the hall, and wondered why the beautiful gray–haired woman looked so listless, and what the little man could be whispering about. He would not like to be listless and idle, he thought, for he felt with the music the movement of power within him. If he but had some master–work, some life–service, hard,—aye, bitter hard, but without the cringing and sickening servility, without the cruel hurt that hardened his heart and soul. When at last a soft sorrow crept across the violins, there came to him the vision of a far–off home, the great eyes of his sister, and the dark drawn face of his mother. And his heart sank below the waters, even as the sea–sand sinks by the shores of Altamaha, only to be lifted aloft again with that last ethereal wail of the swan that quivered and faded away into the sky.

It left John sitting so silent and rapt that he did not for some time notice the usher tapping him lightly on the shoulder and saying politely, “Will you step this way, please, sir?” A little surprised, he arose quickly at the last tap, and, turning to leave his seat, looked full into the face of the fair–haired young man. For the first time the young man recognized his dark boyhood playmate, and John knew that it was the Judge’s son. The White John started, lifted his hand, and then froze into his chair; the black John smiled lightly, then grimly, and followed the usher down the aisle. The manager was sorry, very, very sorry,—but he explained that some mistake had been made in selling the gentleman a seat already disposed of; he would refund the money, of course,—and indeed felt the matter keenly, and so forth, and—before he had finished John was gone, walking hurriedly across the square and down the broad streets, and as he passed the park he buttoned his coat and said, “John Jones, you’re a natural–born fool.” Then he went to his lodgings and wrote a letter, and tore it up; he wrote another, and threw it in the fire. Then he seized a scrap of paper and wrote: “Dear Mother and Sister—I am coming—John.”

“Perhaps,” said John, as he settled himself on the train, “perhaps I am to blame myself in struggling against my manifest destiny simply because it looks hard and unpleasant. Here is my duty to Altamaha plain before me; perhaps they’ll let me help settle the Negro problems there,—perhaps they won’t. ‘I will go in to the King, which is not according to the law; and if I perish, I perish.'” And then he mused and dreamed, and planned a life–work; and the train flew south.

Down in Altamaha, after seven long years, all the world knew John was coming. The homes were scrubbed and scoured,—above all, one; the gardens and yards had an unwonted trimness, and Jennie bought a new gingham. With some finesse and negotiation, all the dark Methodists and Presbyterians were induced to join in a monster welcome at the Baptist Church; and as the day drew near, warm discussions arose on every corner as to the exact extent and nature of John’s accomplishments. It was noontide on a gray and cloudy day when he came. The black town flocked to the depot, with a little of the white at the edges,—a happy throng, with “Good–mawnings” and “Howdys” and laughing and joking and jostling. Mother sat yonder in the window watching; but sister Jennie stood on the platform, nervously fingering her dress, tall and lithe, with soft brown skin and loving eyes peering from out a tangled wilderness of hair. John rose gloomily as the train stopped, for he was thinking of the “Jim Crow” car; he stepped to the platform, and paused: a little dingy station, a black crowd gaudy and dirty, a half–mile of dilapidated shanties along a straggling ditch of mud. An overwhelming sense of the sordidness and narrowness of it all seized him; he looked in vain for his mother, kissed coldly the tall, strange girl who called him brother, spoke a short, dry word here and there; then, lingering neither for handshaking nor gossip, started silently up the street, raising his hat merely to the last eager old aunty, to her open–mouthed astonishment. The people were distinctly bewildered. This silent, cold man,—was this John? Where was his smile and hearty hand–grasp? “‘Peared kind o’ down in the mouf,” said the Methodist preacher thoughtfully. “Seemed monstus stuck up,” complained a Baptist sister. But the white postmaster from the edge of the crowd expressed the opinion of his folks plainly. 
“That damn Nigger,” said he, as he shouldered the mail and arranged his tobacco, “has gone North and got plum full o’ fool notions; but they won’t work in Altamaha.” And the crowd melted away.

The meeting of welcome at the Baptist Church was a failure. Rain spoiled the barbecue, and thunder turned the milk in the ice–cream. When the speaking came at night, the house was crowded to overflowing. The three preachers had especially prepared themselves, but somehow John’s manner seemed to throw a blanket over everything,—he seemed so cold and preoccupied, and had so strange an air of restraint that the Methodist brother could not warm up to his theme and elicited not a single “Amen”; the Presbyterian prayer was but feebly responded to, and even the Baptist preacher, though he wakened faint enthusiasm, got so mixed up in his favorite sentence that he had to close it by stopping fully fifteen minutes sooner than he meant. The people moved uneasily in their seats as John rose to reply. He spoke slowly and methodically. The age, he said, demanded new ideas; we were far different from those men of the seventeenth and eighteenth centuries,—with broader ideas of human brotherhood and destiny. Then he spoke of the rise of charity and popular education, and particularly of the spread of wealth and work. The question was, then, he added reflectively, looking at the low discolored ceiling, what part the Negroes of this land would take in the striving of the new century. He sketched in vague outline the new Industrial School that might rise among these pines, he spoke in detail of the charitable and philanthropic work that might be organized, of money that might be saved for banks and business. Finally he urged unity, and deprecated especially religious and denominational bickering. “To–day,” he said, with a smile, “the world cares little whether a man be Baptist or Methodist, or indeed a churchman at all, so long as he is good and true. What difference does it make whether a man be baptized in river or washbowl, or not at all? Let’s leave all that littleness, and look higher.” Then, thinking of nothing else, he slowly sat down. A painful hush seized that crowded mass. 
Little had they understood of what he said, for he spoke an unknown tongue, save the last word about baptism; that they knew, and they sat very still while the clock ticked. Then at last a low suppressed snarl came from the Amen corner, and an old bent man arose, walked over the seats, and climbed straight up into the pulpit. He was wrinkled and black, with scant gray and tufted hair; his voice and hands shook as with palsy; but on his face lay the intense rapt look of the religious fanatic. He seized the Bible with his rough, huge hands; twice he raised it inarticulate, and then fairly burst into words, with rude and awful eloquence. He quivered, swayed, and bent; then rose aloft in perfect majesty, till the people moaned and wept, wailed and shouted, and a wild shrieking arose from the corners where all the pent–up feeling of the hour gathered itself and rushed into the air. John never knew clearly what the old man said; he only felt himself held up to scorn and scathing denunciation for trampling on the true Religion, and he realized with amazement that all unknowingly he had put rough, rude hands on something this little world held sacred. He arose silently, and passed out into the night. Down toward the sea he went, in the fitful starlight, half conscious of the girl who followed timidly after him. When at last he stood upon the bluff, he turned to his little sister and looked upon her sorrowfully, remembering with sudden pain how little thought he had given her. He put his arm about her and let her passion of tears spend itself on his shoulder.

Long they stood together, peering over the gray unresting water.

“John,” she said, “does it make every one—unhappy when they study and learn lots of things?”

He paused and smiled. “I am afraid it does,” he said.

“And, John, are you glad you studied?”

“Yes,” came the answer, slowly but positively.

She watched the flickering lights upon the sea, and said thoughtfully, “I wish I was unhappy,—and—and,” putting both arms about his neck, “I think I am, a little, John.”

It was several days later that John walked up to the Judge’s house to ask for the privilege of teaching the Negro school. The Judge himself met him at the front door, stared a little hard at him, and said brusquely, “Go ’round to the kitchen door, John, and wait.” Sitting on the kitchen steps, John stared at the corn, thoroughly perplexed. What on earth had come over him? Every step he made offended some one. He had come to save his people, and before he left the depot he had hurt them. He sought to teach them at the church, and had outraged their deepest feelings. He had schooled himself to be respectful to the Judge, and then blundered into his front door. And all the time he had meant right,—and yet, and yet, somehow he found it so hard and strange to fit his old surroundings again, to find his place in the world about him. He could not remember that he used to have any difficulty in the past, when life was glad and gay. The world seemed smooth and easy then. Perhaps,—but his sister came to the kitchen door just then and said the Judge awaited him.

The Judge sat in the dining–room amid his morning’s mail, and he did not ask John to sit down. He plunged squarely into the business. “You’ve come for the school, I suppose. Well John, I want to speak to you plainly. You know I’m a friend to your people. I’ve helped you and your family, and would have done more if you hadn’t got the notion of going off. Now I like the colored people, and sympathize with all their reasonable aspirations; but you and I both know, John, that in this country the Negro must remain subordinate, and can never expect to be the equal of white men. In their place, your people can be honest and respectful; and God knows, I’ll do what I can to help them. But when they want to reverse nature, and rule white men, and marry white women, and sit in my parlor, then, by God! we’ll hold them under if we have to lynch every Nigger in the land. Now, John, the question is, are you, with your education and Northern notions, going to accept the situation and teach the darkies to be faithful servants and laborers as your fathers were,—I knew your father, John, he belonged to my brother, and he was a good Nigger. Well—well, are you going to be like him, or are you going to try to put fool ideas of rising and equality into these folks’ heads, and make them discontented and unhappy?”

“I am going to accept the situation, Judge Henderson,” answered John, with a brevity that did not escape the keen old man. He hesitated a moment, and then said shortly, “Very well,—we’ll try you awhile. Good–morning.”

It was a full month after the opening of the Negro school that the other John came home, tall, gay, and headstrong. The mother wept, the sisters sang. The whole white town was glad. A proud man was the Judge, and it was a goodly sight to see the two swinging down Main Street together. And yet all did not go smoothly between them, for the younger man could not and did not veil his contempt for the little town, and plainly had his heart set on New York. Now the one cherished ambition of the Judge was to see his son mayor of Altamaha, representative to the legislature, and—who could say?—governor of Georgia. So the argument often waxed hot between them. “Good heavens, father,” the younger man would say after dinner, as he lighted a cigar and stood by the fireplace, “you surely don’t expect a young fellow like me to settle down permanently in this—this God–forgotten town with nothing but mud and Negroes?” “I did,” the Judge would answer laconically; and on this particular day it seemed from the gathering scowl that he was about to add something more emphatic, but neighbors had already begun to drop in to admire his son, and the conversation drifted.

“Heah that John is livenin’ things up at the darky school,” volunteered the postmaster, after a pause.

“What now?” asked the Judge, sharply.

“Oh, nothin’ in particulah,—just his almighty air and uppish ways. B’lieve I did heah somethin’ about his givin’ talks on the French Revolution, equality, and such like. He’s what I call a dangerous Nigger.”

“Have you heard him say anything out of the way?”

“Why, no,—but Sally, our girl, told my wife a lot of rot. Then, too, I don’t need to heah: a Nigger what won’t say ‘sir’ to a white man, or—”

“Who is this John?” interrupted the son.

“Why, it’s little black John, Peggy’s son,—your old playfellow.”

The young man’s face flushed angrily, and then he laughed.

“Oh,” said he, “it’s the darky that tried to force himself into a seat beside the lady I was escorting—”

But Judge Henderson waited to hear no more. He had been nettled all day, and now at this he rose with a half–smothered oath, took his hat and cane, and walked straight to the schoolhouse.

For John, it had been a long, hard pull to get things started in the rickety old shanty that sheltered his school. The Negroes were rent into factions for and against him, the parents were careless, the children irregular and dirty, and books, pencils, and slates largely missing. Nevertheless, he struggled hopefully on, and seemed to see at last some glimmering of dawn. The attendance was larger and the children were a shade cleaner this week. Even the booby class in reading showed a little comforting progress. So John settled himself with renewed patience this afternoon.

“Now, Mandy,” he said cheerfully, “that’s better; but you mustn’t chop your words up so: ‘If—the–man—goes.’ Why, your little brother even wouldn’t tell a story that way, now would he?”

“Naw, suh, he cain’t talk.”

“All right; now let’s try again: ‘If the man—’

“John!”

The whole school started in surprise, and the teacher half arose, as the red, angry face of the Judge appeared in the open doorway.

“John, this school is closed. You children can go home and get to work. The white people of Altamaha are not spending their money on black folks to have their heads crammed with impudence and lies. Clear out! I’ll lock the door myself.”

Up at the great pillared house the tall young son wandered aimlessly about after his father’s abrupt departure. In the house there was little to interest him; the books were old and stale, the local newspaper flat, and the women had retired with headaches and sewing. He tried a nap, but it was too warm. So he sauntered out into the fields, complaining disconsolately, “Good Lord! how long will this imprisonment last!” He was not a bad fellow,—just a little spoiled and self–indulgent, and as headstrong as his proud father. He seemed a young man pleasant to look upon, as he sat on the great black stump at the edge of the pines idly swinging his legs and smoking. “Why, there isn’t even a girl worth getting up a respectable flirtation with,” he growled. Just then his eye caught a tall, willowy figure hurrying toward him on the narrow path. He looked with interest at first, and then burst into a laugh as he said, “Well, I declare, if it isn’t Jennie, the little brown kitchen–maid! Why, I never noticed before what a trim little body she is. Hello, Jennie! Why, you haven’t kissed me since I came home,” he said gaily. The young girl stared at him in surprise and confusion,—faltered something inarticulate, and attempted to pass. But a wilful mood had seized the young idler, and he caught at her arm. Frightened, she slipped by; and half mischievously he turned and ran after her through the tall pines.

Yonder, toward the sea, at the end of the path, came John slowly, with his head down. He had turned wearily homeward from the schoolhouse; then, thinking to shield his mother from the blow, started to meet his sister as she came from work and break the news of his dismissal to her. “I’ll go away,” he said slowly; “I’ll go away and find work, and send for them. I cannot live here longer.” And then the fierce, buried anger surged up into his throat. He waved his arms and hurried wildly up the path.

The great brown sea lay silent. The air scarce breathed. The dying day bathed the twisted oaks and mighty pines in black and gold. There came from the wind no warning, not a whisper from the cloudless sky. There was only a black man hurrying on with an ache in his heart, seeing neither sun nor sea, but starting as from a dream at the frightened cry that woke the pines, to see his dark sister struggling in the arms of a tall and fair–haired man.

He said not a word, but, seizing a fallen limb, struck him with all the pent–up hatred of his great black arm, and the body lay white and still beneath the pines, all bathed in sunshine and in blood. John looked at it dreamily, then walked back to the house briskly, and said in a soft voice, “Mammy, I’m going away—I’m going to be free.”

She gazed at him dimly and faltered, “No’th, honey, is yo’ gwine No’th agin?”

He looked out where the North Star glistened pale above the waters, and said, “Yes, mammy, I’m going—North.”

Then, without another word, he went out into the narrow lane, up by the straight pines, to the same winding path, and seated himself on the great black stump, looking at the blood where the body had lain. Yonder in the gray past he had played with that dead boy, romping together under the solemn trees. The night deepened; he thought of the boys at Johnstown. He wondered how Brown had turned out, and Carey? And Jones,—Jones? Why, he was Jones, and he wondered what they would all say when they knew, when they knew, in that great long dining–room with its hundreds of merry eyes. Then as the sheen of the starlight stole over him, he thought of the gilded ceiling of that vast concert hall, heard stealing toward him the faint sweet music of the swan. Hark! was it music, or the hurry and shouting of men? Yes, surely! Clear and high the faint sweet melody rose and fluttered like a living thing, so that the very earth trembled as with the tramp of horses and murmur of angry men.

He leaned back and smiled toward the sea, whence rose the strange melody, away from the dark shadows where lay the noise of horses galloping, galloping on. With an effort he roused himself, bent forward, and looked steadily down the pathway, softly humming the “Song of the Bride,”—

“Freudig gefuhrt, ziehet dahin.”

Amid the trees in the dim morning twilight he watched their shadows dancing and heard their horses thundering toward him, until at last they came sweeping like a storm, and he saw in front that haggard white–haired man, whose eyes flashed red with fury. Oh, how he pitied him,—pitied him,—and wondered if he had the coiling twisted rope. Then, as the storm burst round him, he rose slowly to his feet and turned his closed eyes toward the Sea.

And the world whistled in his ears.

Posted in Meritocracy, Populism, Welfare

Hochschild — Strangers in Their Own Land

This post is a reflection on a book by Arlie Russell Hochschild, Strangers in Their Own Land: Anger and Mourning on the American Right.  In it she provides one of the most compelling and persuasive explanations for the turn toward right-wing populism in American politics and the peculiar appeal of Donald Trump.  As she puts it in her subtitle, this is “A Journey to the Heart of Our Political Divide.”

The book, published in 2016, is based on intensive interviews that she did in Louisiana with people on the populist right, long before Trump launched his campaign for president.  At the time, the political movement was the Tea Party, but her subjects ended up providing her an advance look at the deep issues that led voters to support Trump.

There is no substitute for reading the book, which I strongly recommend.  But to whet your appetite, I provide some of the key points below and some of the most telling quotes.  You’ll find that a lot of her analysis aligns with the analysis by Michael Sandel in The Tyranny of Merit, which I commented on recently.


Here’s the heart of what people were telling her:

You are a stranger in your own land. You do not recognize yourself in how others see you. It is a struggle to feel seen and honored. And to feel honored you have to feel—and feel seen as—moving forward. But through no fault of your own, and in ways that are hidden, you are slipping backward.

As Sandel noted, the meritocracy leaves the uncredentialed with no basis for public respect.  Without SATs and fancy degrees, it’s as if you don’t count or you don’t even exist.  This used to be your country, and there used to be honor in simply doing your job, going to church, obeying the law, and raising a family, but none of that seems to be true any more.  Respect now seems to go only to those who are moving ahead in the new knowledge economy, while you and the people around you seem to be barely holding your own or falling behind.  

How do you handle this situation?  Not by playing the victim card; that’s for a different kind of person.  “Like nearly everyone I spoke with, Donny was not one to think of himself as a victim. That was the language of the ‘poor me’s’ asking for government handouts. The very word ‘victim’ didn’t sit right.”  Instead, you take a stoic stance, adopting one of three versions of what Hochschild calls the “endurance self.”

I was discovering three distinct expressions of this endurance self in different people around Lake Charles—the Team Loyalist, the Worshipper, and the Cowboy, as I came to see them. Each kind of person expresses the value of endurance and expresses a capacity for it. Each attaches an aspect of self to this heroism. The Team Loyalist accomplishes a team goal, supporting the Republican Party. The Worshipper sacrifices a strong wish. The Cowboy affirms a fearless self. 

Each identity involves holding on in spite of the sacrifices you have to make.  The Loyalist sticks by the Republican Party even though it keeps betraying you time and again, as is so often the case in Louisiana.  Republicans allow companies to pollute your environment and skimp on their taxes, but they’re still all you’ve got.  

The Worshipper keeps the faith even though it means giving up something you really care about.

But sometimes you had to do without what you wanted. You couldn’t have both the oil industry and clean lakes, she thought, and if you had to choose, you had to choose oil. “Oil’s been pretty darned good to us,” she said. “I don’t want a smaller house. I don’t want to drive a smaller car.”

So you hang in there.  The Cowboy understands character as a willingness to take risks and live with the consequences.  You can make it on your own, without having to rely on welfare and special privileges.

To Donny, the Cowboy expressed high moral virtue. Equating creativity with daring—the stuff of great explorers, inventors, generals, winners—Donny honored the capacity to take risk and face fear. He could take hard knocks like a man. He could endure. 

The people she spoke with had a deep suspicion of the state.

“The state always seems to come down on the little guy,” he notes. “Take this bayou. If your motorboat leaks a little gas into the water, the warden’ll write you up. But if companies leak thousands of gallons of it and kill all the life here? The state lets them go. If you shoot an endangered brown pelican, they’ll put you in jail. But if a company kills the brown pelican by poisoning the fish he eats? They let it go. I think they overregulate the bottom because it’s harder to regulate the top.”

For liberals, this stance is hard to fathom, because for them the institutions of the state are the key guardians of the public square, which is central to their values.  And this space is now under threat.

…In the liberal deep story, an alarming event occurs; marauders invade the public square, recklessly dismantle it, and selfishly steal away bricks and concrete chunks from the public buildings at its center. Seeing insult added to injury, those guarding the public square watch helplessly as those who’ve dismantled it construct private McMansions with the same bricks and pieces of concrete, privatizing the public realm. That’s the gist of the liberal deep story, and the right can’t understand the deep pride liberals take in their creatively designed, hard-won public sphere as a powerful integrative force in American life. Ironically, you may have more in common with the left than you imagine, for many on the left feel like strangers in their own land too.

For right-wing populists, the federal government is the biggest threat.  For those in the West, the feds are the ones who seem to own all the land and regulate what you can do with it.  In the South, the resentment runs even deeper.

After the Civil War, the North replaced Southern state governments with its own hand-picked governors. The profit-seeking carpetbaggers came, it seemed to those I interviewed, as agents of the dominating North. Exploiters from the North, an angry, traumatized black population at home, and moral condemnation from all—this was the scene some described to me. When the 1960s began sending Freedom Riders and civil rights activists, pressing for new federal laws to dismantle Jim Crow, there they came again, it seemed, the moralizing North. And again, Obamacare, global warming, gun control, abortion rights—did these issues, too, fall into the emotional grooves of history? Does it feel like another strike from the North, from Washington, that has put the brown pelican ahead of the Tea Partier waiting in line?

And then there’s the last issue:  waiting in line.  Hochschild identifies a deep story that runs through all of the accounts she heard, and at its heart is a sense of resentment about being treated unfairly in the pursuit of the American Dream.  The dream is all about the possibilities for getting ahead, and this means an orderly process of status advancement in which people wait in line until it’s their turn.  The core problem is that suddenly they find other people cutting in front of them in line, and the federal government is helping them do it.

Look! You see people cutting in line ahead of you! You’re following the rules. They aren’t. As they cut in, it feels like you are being moved back. How can they just do that? Who are they? Some are black. Through affirmative action plans, pushed by the federal government, they are being given preference for places in colleges and universities, apprenticeships, jobs, welfare payments, and free lunches…. Women, immigrants, refugees, public sector workers—where will it end? Your money is running through a liberal sympathy sieve you don’t control or agree with. These are opportunities you’d have loved to have had in your day—and either you should have had them when you were young or the young shouldn’t be getting them now. It’s not fair.

You’re a compassionate person. But now you’ve been asked to extend your sympathy to all the people who have cut in front of you. So you have your guard up against requests for sympathy. People complain: Racism. Discrimination. Sexism. You’ve heard stories of oppressed blacks, dominated women, weary immigrants, closeted gays, desperate refugees, but at some point, you say to yourself, you have to close the borders to human sympathy—especially if there are some among them who might bring you harm. You’ve suffered a good deal yourself, but you aren’t complaining about it.

Posted in Academic writing, Writing

Dumitrescu: How to Write Well

This post is a review essay by Irina Dumitrescu about five books that explore how to write well.  It appeared in the Times Literary Supplement, March 20, 2020.  Here’s a link to the original.

She’s reviewing five books about writing.  Is there any writing task more fraught with peril than trying to write about writing?  Anything less than superlative literary style would constitute an abject failure.  Fortunately this author is up to the challenge.

Here are some of my favorite passages.

She reminds us of an enduring truth about writing.  Everyone starts by imitating others.  You need models to work from:

Shakespeare patterned his comedies on Terence’s Latin romps, and Terence stole his plots from the Greek Menander. Milton copied Virgil, who plagiarized Homer. The history of literature is a catwalk on which the same old skeletons keep coming out in new clothes.

On the other hand,

Style unsettles this pedagogy of models and moulds. As the novelist Elizabeth McCracken once told Ben Yagoda in an interview, “A writer’s voice lives in his or her bad habits … the trick is to make them charming bad habits”. Readers longing for something beyond mere information – verbal fireworks, the tremor of an authentic connection, a touch of quiet magic – will do well to find the rule-breakers on the bookshop shelf. Idiosyncrasies (even mistakes) account for the specific charm of a given author, and they slyly open the door to decisions of taste.

One author makes the case against turgid academic writing in a book that she admiringly calls

an inspiring mess, a book that in its haphazard organization is its own argument for playfulness and improvisation. Like Warner, Kumar cannot stand “the five-paragraph costume armor of the high school essay”. Nor does he have much patience for other formulaic aspects of academic writing: didactic topic sentences, or jargony vocabulary such as “emergence” and “post-capitalist hegemony”. In his description of a website that produces meaningless theoretical prose at the touch of a button, Kumar notes that “the academy is the original random sentence generator”.

Of all the books she discusses, my favorite (and hers, I think) is 

Joe Moran’s exquisite book First You Write a Sentence…. As befits a cultural historian, Moran compares writing sentences to crafting other artisanal objects – they are artworks and spaces of refuge, gifts with which an author shapes the world to be more beautiful and capacious and kind. Like a town square or a city park, “a well-made sentence shows … silent solicitude for others. It cares”.

Moran’s own sentences are so deliciously epigrammatic that I considered giving up chocolate in favour of re-reading his book. Because he has dedicated an entire volume to one small form, he has the leisure to attend to fine details. As he explores sentences from every angle, he describes the relative heat of different verbs, the delicately shading nuances of punctuation choices, how short words feel in the mouth, the opportunity of white space. “Learn to love the feel of sentences,” he writes with a connoisseur’s delight, “the arcs of anticipation and suspense, the balancing phrases, the wholesome little snap of the full stop.”

Enjoy.

How to write well

Rules, style and the ‘well-made sentence’

By Irina Dumitrescu

IN THIS REVIEW
WHY THEY CAN’T WRITE
Killing the five-paragraph essay and other necessities
288pp. Johns Hopkins University Press. £20.50 (US $27.95).
John Warner
WRITING TO PERSUADE
How to bring people over to your side
224pp. Norton. £18.99 (US $26.95).
Trish Hall
EVERY DAY I WRITE THE BOOK
Notes on style
256pp. Duke University Press. Paperback, £20.99 (US $24.95).
Amitava Kumar
FIRST YOU WRITE A SENTENCE
The elements of reading, writing … and life
240pp. Penguin. Paperback, £9.99.
Joe Moran
MEANDER, SPIRAL, EXPLODE
Design and pattern in narrative
272pp. Catapult. Paperback, $16.95.
Jane Alison

In high school a close friend told me about a lesson her father had received when he was learning to write in English. Any essay could be improved by the addition of one specific phrase: “in a world tormented by the spectre of thermonuclear holocaust”. We thought it would be hilarious to surprise our own teachers with this gem, but nothing came of it. Twenty years later, as I looked through the files on an old computer, I discovered my high school compositions. There, at the end of an essay on Hugo Grotius and just war theory I must have written for this purpose alone, was that irresistible rhetorical flourish.

As much as we might admire what is fresh and innovative, we all learn by imitating patterns. Babies learning to speak do not immediately acquire the full grammar of their mother tongue and a vocabulary to slot into it, but inch slowly into the language by repeating basic phrases, then varying them. Adults learning a foreign language are wise to do the same. Pianists run through exercises to train their dexterity, basketball players run through their plays, dancers rehearse combos they can later slip into longer choreographies. To be called “formulaic” is no compliment, but whenever people express themselves or take action in the world, they rely on familiar formulas.

Writing advice is caught in this paradox. Mavens of clear communication know that simple rules are memorable and easy to follow. Use a verb instead of a noun. Change passive to active. Cut unnecessary words. Avoid jargon. No aspiring author will make the language dance by following these dictates, but they will be understood, and that is something. The same holds for structure. In school, pupils are drilled in the basic shapes of arguments, such as the “rule of three”, the “five-paragraph essay” or, à l’américaine, the Hamburger Essay (the main argument being the meat). Would-be novelists weigh their Fichtean Curves against their Hero’s Journeys, and screenwriters can buy software that will ensure their movie script hits every beat prescribed by Blake Snyder in his bestselling book Save the Cat! (2005). And why not? Shakespeare patterned his comedies on Terence’s Latin romps, and Terence stole his plots from the Greek Menander. Milton copied Virgil, who plagiarized Homer. The history of literature is a catwalk on which the same old skeletons keep coming out in new clothes.

Style unsettles this pedagogy of models and moulds. As the novelist Elizabeth McCracken once told Ben Yagoda in an interview, “A writer’s voice lives in his or her bad habits … the trick is to make them charming bad habits”. Readers longing for something beyond mere information – verbal fireworks, the tremor of an authentic connection, a touch of quiet magic – will do well to find the rule-breakers on the bookshop shelf. Idiosyncrasies (even mistakes) account for the specific charm of a given author, and they slyly open the door to decisions of taste. Think of David Foster Wallace’s endless sentences, George R. R. Martin’s neologisms, the faux-naivety of Gertrude Stein. In his book on literary voice, The Sound on the Page (2004), Yagoda argues that style reveals “something essential” and impossible to conceal about an author’s character. The notion that the way a person arranges words is inextricably tied to their moral core has a long history, but its implication for teaching writing is what interests me here: convince or compel writers to cleave too closely to a set of prescribed rules, and you chip away at who they are.

This explains why John Warner’s book about writing, Why They Can’t Write: Killing the five-paragraph essay and other necessities, contains almost no advice on how to write. A long-time college instructor, Warner hints at his argument in his subtitle: his is a polemical take on American standardized testing practices, socioeconomic conditions, and institutions of learning that destroy any love or motivation young people might have for expressing themselves in writing. Against the perennial assumption that today’s students are too lazy and precious to work hard, Warner holds firm: “Students are not entitled or coddled. They are defeated”. The symbol of the US’s misguided approach to education is the argumentative structure drilled into each teenager as a shortcut for thinking and reflection. “If writing is like exercise,” he quips, “the five-paragraph essay is like one of those ab belt doohickeys that claim to electroshock your core into a six-pack.”

What is to blame for students’ bad writing? According to Warner, the entire context in which it is taught. He rails against school systems that privilege shallow “achievement” over curiosity and learning, a culture of “surveillance and compliance” (including apps that track students’ behaviour and report it to parents in real time), an obsession with standardized testing that is fundamentally inimical to thoughtful reading and writing, and a love of faddish psychological theories and worthless digital learning projects.

It is easy for a lover of good writing to share Warner’s anger at the shallow and mechanistic culture of public education in the United States, easy to smile knowingly when he notes that standardized tests prize students’ ability to produce “pseudo-academic BS”, meaningless convoluted sentences cobbled together out of sophisticated-sounding words. Warner’s argument against teaching grammar is harder to swallow. Seeing in grammar yet another case of rules and correctness being put ahead of thoughtful engagement, Warner claims, “the sentence is not the basic skill or fundamental unit of writing. The idea is”. Instead of assignments, he gives his students “writing experiences”, interlocked prompts designed to hone their ability to observe, analyse and communicate. His position on grammatical teaching is a step too far: it can be a tool as much as a shackle. Still, writers may recognize the truth of Warner’s reflection that “what looks like a problem with basic sentence construction may instead be a struggle to find an idea for the page”.

Trish Hall shares Warner’s belief that effective writing means putting thinking before craft. Hall ran the New York Times’s op-ed page for half a decade, and in Writing To Persuade she shows us how to succeed at one kind of formula, the short newspaper opinion piece. The book is slim, filled out with personal recollections in muted prose, and enlivened by the occasional celebrity anecdote. Her target audience seems to be the kind of educated professionals who regularly read the New York Times, who may even write as part of their work, but who have not thought about what it means to address those who do not share their opinions. Hall does offer useful, sometimes surprising, tips on avoiding jargon, finding a writerly voice, and telling a story, but most of the book is dedicated to cultivating the humanity beneath the writing.

“I can’t overstate the value of putting down your phone and having conversations with people”, she writes. Persuasion is not simply a matter of hammering one’s own point through with unassailable facts and arguments. It is a question of listening to other people, cultivating empathy for their experience, drawing on shared values to reach common ground. It also demands vulnerability; Hall praises writers who “reveal something almost painfully personal even as they connect to a larger issue or story that feels both universal and urgent”.

Much of her advice would not have surprised a classical rhetorician. She even quotes Cicero’s famous remark about it being a mistake to try “to compel others to believe and live as we do”, a mantra for this book. At her best, Hall outlines a rhetoric that is also a guide to living peaceably with others: understanding their desires, connecting. A simple experiment – not finishing other people’s sentences even when you think you know what they will say – exemplifies this understated wisdom. At her worst, Hall is too much the marketer, as when she notes that strong emotions play well on social media and enjoins her readers to “stay away from depressing images and crying people”. There ought to be enough space in a newspaper for frankly expressed opinions about the suffering of humanity. What she demonstrates, however, is that writing for an audience is a social act. Writing To Persuade is a stealth guide to manners for living in a world where conversations are as likely to take place in 280 characters on a screen as they are at a dinner table.

In Hall’s hands, considering other people means following a programmatic set of writing instructions. Amitava Kumar, a scholar who has written well-regarded works of memoir and journalism, thinks another way is possible. In Every Day I Write the Book: Notes on style, he breaks out of the strictures of academic prose by creating a virtual community of other writers on his pages. The book is a collection of short meditations on different topics related to writing, its form and practice, primarily in the university. Kumar’s style is poised and lyrical elsewhere, but here he takes on a familiar, relaxed persona, and he often lets his interlocutors have the best lines. Selections from his reading bump up against email conversations, chats on the Vassar campus, and Facebook comments; it is a noisy party where everyone has a bon mot at the ready. The book itself is assembled like a scrapbook, filled with reproductions of photographs, screenshots, handwritten notes and newspaper clippings Kumar has gathered over the years.

It is, in other words, an inspiring mess, a book that in its haphazard organization is its own argument for playfulness and improvisation. Like Warner, Kumar cannot stand “the five-paragraph costume armor of the high school essay”. Nor does he have much patience for other formulaic aspects of academic writing: didactic topic sentences, or jargony vocabulary such as “emergence” and “post-capitalist hegemony”. In his description of a website that produces meaningless theoretical prose at the touch of a button, Kumar notes that “the academy is the original random sentence generator”. He is not anti-intellectual; his loyalties lie with the university, even as he understands its provinciality too well. But he asks his fellow writers to hold on fiercely to the weird and whimsical elements in their own creations, to be “inventive in our use of language and in our search for form”.

This means many things in practice. Kumar includes a section of unusual writing exercises, many of them borrowed from other authors: rewriting a brilliant passage badly to see what made it work; scribbling just what will fit on a Post-it Note to begin a longer piece; writing letters to public figures. Other moments are about connection. In a chapter on voice, he quotes the poet and novelist Bhanu Kapil’s description of how she began a series of interviews with Indian and Pakistani women: “The first question I asked, to a young Muslim woman … Indian parents, thick Glaswegian accent, [was] ‘Who was responsible for the suffering of your mother?’ She burst into tears”. That one question could fill many libraries. Invention also means embracing collaboration with editors, and understanding writing as “a practice of revision and extension and opening”. Kumar calls for loyalty to one’s creative calling, wherever it may lead. The reward? Nothing less than freedom and immortality.

But surely craft still matters? We may accept that writing is rooted in the ethical relationships between teachers, students, writers, editors and those silent imagined readers. Does this mean that the skill of conveying an idea in language in a clear and aesthetically pleasing fashion is nothing but the icing on the cake? Joe Moran’s exquisite book First You Write a Sentence: The elements of reading, writing … and life suggests otherwise. As befits a cultural historian, Moran compares writing sentences to crafting other artisanal objects – they are artworks and spaces of refuge, gifts with which an author shapes the world to be more beautiful and capacious and kind. Like a town square or a city park, “a well-made sentence shows … silent solicitude for others. It cares”.

Moran’s own sentences are so deliciously epigrammatic that I considered giving up chocolate in favour of re-reading his book. Because he has dedicated an entire volume to one small form, he has the leisure to attend to fine details. As he explores sentences from every angle, he describes the relative heat of different verbs, the delicately shading nuances of punctuation choices, how short words feel in the mouth, the opportunity of white space. “Learn to love the feel of sentences,” he writes with a connoisseur’s delight, “the arcs of anticipation and suspense, the balancing phrases, the wholesome little snap of the full stop.”

The book is full of advice, but Moran’s rules are not meant to inhibit. He will happily tell you how to achieve a style clear as glass, then praise the rococo rhetorician who “wants to forge reality through words, not just gaze at it blankly through a window”. He is more mentor than instructor, slowly guiding us to notice and appreciate the intricacies of a well-forged phrase. And he does so with tender generosity towards the unloved heroism of “cussedly making sentences that no one asked for and no one will be obliged to read”. As pleasurable as it is to watch Moran unfold the possibilities of an English sentence, his finest contribution is an understanding of the psychology – fragile, labile – of the writer. He knows that a writer must fight distraction, bad verbal habits, and the cheap appeal of early drafts to find their voice. There it is! “It was lost amid your dishevelled thoughts and wordless anxieties, until you pulled it out of yourself, as a flowing line of sentences.”

Human beings take pleasure in noticing nature’s patterns, according to Moran, and these patterns help them to thrive, sometimes in unforeseen ways. A sentence is also form imposed on chaos, and his suggestion that it has an organic role in the survival of the species might seem bold. (Though how many of us owe our lives to a parent who said the right words in a pleasing order?) The novelist Jane Alison’s invigorating book Meander, Spiral, Explode: Design and pattern in narrative follows a similar impulse, seeking the elegant forms that order nature in the structures of stories and novels. Her bugbear is the dramatic arc, the shape that Aristotle noticed in the tragedies of his time but that has become a tyrant of creative writing instruction. “Something that swells and tautens until climax, then collapses? Bit masculo-sexual, no?” Alison has other ideas for excitement.

In brief, compelling meditations on contemporary fiction, she teases out figures we might expect to spy from a plane window or in the heart of a tree. Here are corkscrews and wavelets and fractals and networks of cells. Is this forced? Alison recognizes the cheekiness of her project, knows her readings of form may not convince every reader. Her aim is not to classify tales, to pin them like butterflies on a styrofoam board. She knows, for example, that any complex literary narrative will create a network of associations in the reader’s mind. Her goal is to imagine how a reader might experience a story, looking for “structures that create an inner sensation of traveling toward something and leave a sense of shape behind, so that the stories feel organized”.

Shapes appear in Alison’s mind as clusters of images, so what begins as literary analysis condenses into a small poem. For “meander”, Alison asks us to “picture a river curving and kinking, a snake in motion, a snail’s silver trail, or the path left by a goat”. She speaks of the use of colour in narrative “as a unifying wash, a secret code, or a stealthy constellation”. The point is not ornamentation, though Alison can write a sentence lush enough to drown in, but tempting fiction writers to render life more closely. Against the grand tragedy of the narrative arc, she proposes small undulations: “Dispersed patterning, a sense of ripple or oscillation, little ups and downs, might be more true to human experience than a single crashing wave”. These are the shifting moods of a single day, the temporary loss of the house keys, the sky a sunnier hue than expected.

The Roman educator Quintilian once insisted that an orator must be a good man. It was a commonplace of his time. The rigorous study of eloquence, he thought, required a mind undistracted by vice. The books discussed here inherit this ancient conviction that the attempt to write well is a bettering one. Composing a crisp sentence demands attention to fine detail and a craftsmanlike dedication to perfection. Deciding what to set to paper requires the ability to imagine where a reader might struggle or yawn. In a world tormented by spectres too reckless to name, care and empathy are welcome strangers.

Irina Dumitrescu is Professor of English Medieval Studies at the University of Bonn

Posted in Empire, History, Modernity

Mikhail — How the Ottomans Shaped the Modern World

This post is a reflection on the role that the Ottoman Empire played in shaping the modern world.  It draws on a new book by Alan Mikhail, God’s Shadow: Sultan Selim, His Ottoman Empire, and the Making of the Modern World.  

The Ottomans are the Rodney Dangerfields of empires: They don’t get no respect.  If we picture them at all, it’s either the exotic image of turbans and concubines in Topkapi Palace or the sad image of the “sick man of Europe” in the days before World War I, which finally put them out of their misery.  Neither does them justice.  For a long time, they were the most powerful empire in the world, which dramatically shaped life on three continents — Europe, Asia, and Africa. 

But what makes their story so interesting is that it is more than just an account of some faded glory in the past.  As Mikhail points out, the Ottomans left an indelible stamp on the modern world.  It was their powerful presence in the middle of Eurasia that pushed the minor but ambitious states of Western Europe to set sail for the East and West Indies.  The Dutch, Portuguese, Spanish, and English couldn’t get to the treasures of China and India by land because of the impassable presence of the Ottomans.  So they either had to sail east around Africa to get there or forge a new path to the west, which led them to the Americas.  In fact, they did both, and the result was the riches that turned them into imperial powers who came to dominate much of the known world.  

Without the Ottomans, there would not have been the massive expansion of world trade, the Spanish empire, the riches and technological innovations that spurred the industrial revolution and empowered the English and American empires.


Here are some passages from the book that give you a feel of the impact the Ottomans had:

For half a century before 1492, and for centuries afterward, the Ottoman Empire stood as the most powerful state on earth: the largest empire in the Mediterranean since ancient Rome, and the most enduring in the history of Islam. In the decades around 1500, the Ottomans controlled more territory and ruled over more people than any other world power. It was the Ottoman monopoly of trade routes with the East, combined with their military prowess on land and on sea, that pushed Spain and Portugal out of the Mediterranean, forcing merchants and sailors from these fifteenth-century kingdoms to become global explorers as they risked treacherous voyages across oceans and around continents—all to avoid the Ottomans.

From China to Mexico, the Ottoman Empire shaped the known world at the turn of the sixteenth century. Given its hegemony, it became locked in military, ideological, and economic competition with the Spanish and Italian states, Russia, India, and China, as well as other Muslim powers. The Ottomans influenced in one way or another nearly every major event of those years, with reverberations down to our own time. Dozens of familiar figures, such as Columbus, Vasco da Gama, Montezuma, the reformer Luther, the warlord Tamerlane, and generations of popes—as well as millions of other greater and lesser historical personages—calibrated their actions and defined their very existence in reaction to the reach and grasp of Ottoman power.

Other facts, too, have blotted out our recognition of the Ottoman influence on our own history. Foremost, we tend to read the history of the last half-millennium as “the rise of the West.” (This anachronism rings as true in Turkey and the rest of the Middle East as it does in Europe and America.) In fact, in 1500, and even in 1600, there was no such thing as the now much-vaunted notion of “the West.” Throughout the early modern centuries, the European continent consisted of a fragile collection of disparate kingdoms and small, weak principalities locked in constant warfare. The large land-based empires of Eurasia were the dominant powers of the Old World, and, apart from a few European outposts in and around the Caribbean, the Americas remained the vast domain of its indigenous peoples. The Ottoman Empire held more territory in Europe than did most European-based states. In 1600, if asked to pick a single power that would take over the world, a betting man would have put his money on the Ottoman Empire, or perhaps China, but certainly not on any European entity.

The sheer scope of the empire at its height was extraordinary:

For close to four centuries, from 1453 until well into the exceedingly fractured 1800s, the Ottomans remained at the center of global politics, economics, and war. As European states rose and fell, the Ottomans stood strong. They battled Europe’s medieval and early modern empires, and in the twentieth century continued to fight in Europe, albeit against vastly different enemies. Everyone from Machiavelli to Jefferson to Hitler—quite an unlikely trio—was forced to confront the challenge of the Ottomans’ colossal power and influence. Counting from their first military victory, at Bursa, they ruled for nearly six centuries in territories that today comprise some thirty-three countries. Their armies would control massive swaths of Europe, Africa, and Asia; some of the world’s most crucial trade corridors; and cities along the shores of the Mediterranean, Red, Black, and Caspian seas, the Indian Ocean, and the Persian Gulf. They held Istanbul and Cairo, two of the largest cities on earth, as well as the holy cities of Mecca, Medina, and Jerusalem, and what was the world’s largest Jewish city for over four hundred years, Salonica (Thessaloniki in today’s Greece). From their lowly beginnings as sheep-herders on the long, hard road across Central Asia, the Ottomans ultimately succeeded in proving themselves the closest thing to the Roman Empire since the Roman Empire itself.

One of the interesting things about the Ottomans was how cosmopolitan and relatively tolerant they were.  The Spanish threw the Muslims and Jews out of Spain, but the Ottomans welcomed a variety of peoples, cultures, languages, and religions.  It wasn’t until relatively late that the empire came to be predominantly Muslim.

Although all religious minorities throughout the Mediterranean were subjected to much hardship, the Ottomans, despite what Innocent thought, never persecuted non-Muslims in the way that the Inquisition persecuted Muslims and Jews—and, despite the centuries of calls for Christian Crusades, Muslims never attempted a war against the whole of Christianity. While considered legally inferior to Muslims, Christians and Jews in the Ottoman Empire (as elsewhere in the lands of Islam) had more rights than other religious minorities around the world. They had their own law courts, freedom to worship in the empire’s numerous synagogues and churches, and communal autonomy. While Christian Europe was killing its religious minorities, the Ottomans protected theirs and welcomed those expelled from Europe. Although the sultans of the empire were Muslims, the majority of the population was not. Indeed, the Ottoman Empire was effectively the Mediterranean’s most populous Christian state: the Ottoman sultan ruled over more Christian subjects than the Catholic pope.

The sultan who moved the Ottoman empire into the big leagues — tripling its size — was Selim the Grim, who is the central figure of this book (look at his image on the book’s cover and you’ll see how he earned the name).  His son was Suleyman the Magnificent, whose long rule made him the lasting symbol of the empire at its peak.  Another sign of the heterogeneous nature of the Ottomans is that the sultans themselves were of mixed blood.

Because, in this period, Ottoman sultans and princes produced sons not from their wives but from their concubines, all Ottoman sultans were the sons of foreign, usually Christian-born, slaves like Gülbahar [Selim’s mother].

In the exceedingly cosmopolitan empire, the harem ensured that a non-Turkish, non-Muslim, non-elite diversity was infused into the very bloodline of the imperial family. As the son of a mother with roots in a far-off land, a distant culture, and a religion other than Islam, Selim viscerally experienced the ethnically and religiously amalgamated nature of the Ottoman Empire, and grew up in provincial Amasya with an expansive outlook on the fifteenth-century world.

Posted in History, Liberal democracy, Philosophy

Fukuyama — Liberalism and Its Discontents

This post is a brilliant essay by Francis Fukuyama, “Liberalism and Its Discontents.”  In it, he explores the problems facing liberal democracy today.  As always, it is threatened by autocratic regimes around the world.  But what’s new since the fall of the Soviet Union is the threat from illiberal democracy, both at home and abroad, in the form of populism of the right and the left.  
His argument is a strong defense of the liberal democratic order, but it is also a very smart analysis of how liberal democracy has sowed the seeds of its own downfall.  He shows how much it depends on the existence of a vibrant civil society and robust social capital, both of which its own emphasis on individual liberty tends to undermine.  He also shows how its stress on free markets has fostered the rise of the neoliberal religion, which seeks to subordinate the once robust liberal state to the market.  And he notes how its tolerance of diverse viewpoints leaves it vulnerable to illiberal views that seek to wipe it out of existence.
This essay was published in the inaugural issue of the magazine American Purpose on October 5, 2020.  Here’s a link to the original.
It’s well worth your while to give this essay a close read.


Liberalism and Its Discontents

The challenges from the left and the right.

Francis Fukuyama

Today, there is a broad consensus that democracy is under attack or in retreat in many parts of the world. It is being contested not just by authoritarian states like China and Russia, but by populists who have been elected in many democracies that seemed secure.

The “democracy” under attack today is a shorthand for liberal democracy, and what is really under greatest threat is the liberal component of this pair. The democracy part refers to the accountability of those who hold political power through mechanisms like free and fair multiparty elections under universal adult franchise. The liberal part, by contrast, refers primarily to a rule of law that constrains the power of government and requires that even the most powerful actors in the system operate under the same general rules as ordinary citizens. Liberal democracies, in other words, have a constitutional system of checks and balances that limits the power of elected leaders.

Democracy itself is being challenged by authoritarian states like Russia and China that manipulate or dispense with free and fair elections. But the more insidious threat arises from populists within existing liberal democracies who are using the legitimacy they gain through their electoral mandates to challenge or undermine liberal institutions. Leaders like Hungary’s Viktor Orbán, India’s Narendra Modi, and Donald Trump in the United States have tried to undermine judicial independence by packing courts with political supporters, have openly broken laws, or have sought to delegitimize the press by labeling mainstream media as “enemies of the people.” They have tried to dismantle professional bureaucracies and to turn them into partisan instruments. It is no accident that Orbán puts himself forward as a proponent of “illiberal democracy.”

The contemporary attack on liberalism goes much deeper than the ambitions of a handful of populist politicians, however. They would not be as successful as they have been were they not riding a wave of discontent with some of the underlying characteristics of liberal societies. To understand this, we need to look at the historical origins of liberalism, its evolution over the decades, and its limitations as a governing doctrine.

What Liberalism Was

Classical liberalism can best be understood as an institutional solution to the problem of governing over diversity. Or to put it in slightly different terms, it is a system for peacefully managing diversity in pluralistic societies. It arose in Europe in the late 17th and 18th centuries in response to the wars of religion that followed the Protestant Reformation, wars that lasted for 150 years and killed major portions of the populations of continental Europe.

While Europe’s religious wars were driven by economic and social factors, they derived their ferocity from the fact that the warring parties represented different Christian sects that wanted to impose their particular interpretation of religious doctrine on their populations. This was a period in which the adherents of forbidden sects were persecuted—heretics were regularly tortured, hanged, or burned at the stake—and their clergy hunted. The founders of modern liberalism like Thomas Hobbes and John Locke sought to lower the aspirations of politics, not to promote a good life as defined by religion, but rather to preserve life itself, since diverse populations could not agree on what the good life was. This was the distant origin of the phrase “life, liberty, and the pursuit of happiness” in the Declaration of Independence. The most fundamental principle enshrined in liberalism is one of tolerance: You do not have to agree with your fellow citizens about the most important things, but only that each individual should get to decide what those things are without interference from you or from the state. The limits of tolerance are reached only when the principle of tolerance itself is challenged, or when citizens resort to violence to get their way.

Understood in this fashion, liberalism was simply a pragmatic tool for resolving conflicts in diverse societies, one that sought to lower the temperature of politics by taking questions of final ends off the table and moving them into the sphere of private life. This remains one of its most important selling points today: If diverse societies like India or the United States move away from liberal principles and try to base national identity on race, ethnicity, or religion, they are inviting a return to potentially violent conflict. The United States suffered such conflict during its Civil War, and Modi’s India is inviting communal violence by shifting its national identity to one based on Hinduism.

There is however a deeper understanding of liberalism that developed in continental Europe that has been incorporated into modern liberal doctrine. In this view, liberalism is not simply a mechanism for pragmatically avoiding violent conflict, but also a means of protecting fundamental human dignity.

The ground of human dignity has shifted over time. In aristocratic societies, it was an attribute only of warriors who risked their lives in battle. Christianity universalized the concept of dignity based on the possibility of human moral choice: Human beings had a higher moral status than the rest of created nature but lower than that of God because they could choose between right and wrong. Unlike beauty or intelligence or strength, this characteristic was universally shared and made human beings equal in the sight of God. By the time of the Enlightenment, the capacity for choice or individual autonomy was given a secular form by thinkers like Rousseau (“perfectibility”) and Kant (a “good will”), and became the ground for the modern understanding of the fundamental right to dignity written into many 20th-century constitutions. Liberalism recognizes the equal dignity of every human being by granting them rights that protect individual autonomy: rights to speech, to assembly, to belief, and ultimately to participate in self-government.

Liberalism thus protects diversity by deliberately not specifying higher goals of human life. This disqualifies religiously defined communities as liberal. Liberalism also grants equal rights to all people considered full human beings, based on their capacity for individual choice. Liberalism thus tends toward a kind of universalism: Liberals care not just about their rights, but about the rights of others outside their particular communities. Thus the French Revolution carried the Rights of Man across Europe. From the beginning the major arguments among liberals were not over this principle, but rather over who qualified as rights-bearing individuals, with various groups—racial and ethnic minorities, women, foreigners, the propertyless, children, the insane, and criminals—excluded from this magic circle.

A final characteristic of historical liberalism was its association with the right to own property. Property rights and the enforcement of contracts through legal institutions became the foundation for economic growth in Britain, the Netherlands, Germany, the United States, and other states that were not necessarily democratic but protected property rights. For that reason liberalism was strongly associated with economic growth and modernization. Rights were protected by an independent judiciary that could call on the power of the state for enforcement. Properly understood, rule of law referred both to the application of day-to-day rules that governed interactions between individuals and to the design of political institutions that formally allocated political power through constitutions. The class that was most committed to liberalism historically was the class of property owners, not just agrarian landlords but the myriads of middle-class business owners and entrepreneurs that Karl Marx would label the bourgeoisie.

Liberalism is connected to democracy, but is not the same thing as it. It is possible to have regimes that are liberal but not democratic: Germany in the 19th century and Singapore and Hong Kong in the late 20th century come to mind. It is also possible to have democracies that are not liberal, like the ones Viktor Orbán and Narendra Modi are trying to create that privilege some groups over others. Liberalism is allied to democracy through its protection of individual autonomy, which ultimately implies a right to political choice and to the franchise. But it is not the same as democracy. From the French Revolution on, there were radical proponents of democratic equality who were willing to abandon liberal rule of law altogether and vest power in a dictatorial state that would equalize outcomes. Under the banner of Marxism-Leninism, this became one of the great fault lines of the 20th century. Even in avowedly liberal states, like many in late 19th- and early 20th-century Europe and North America, there were powerful trade union movements and social democratic parties that were more interested in economic redistribution than in the strict protection of property rights.

Liberalism also saw the rise of another competitor besides communism: nationalism. Nationalists rejected liberalism’s universalism and sought to confer rights only on their favored group, defined by culture, language, or ethnicity. As the 19th century progressed, Europe reorganized itself from a dynastic to a national basis, with the unification of Italy and Germany and with growing nationalist agitation within the multiethnic Ottoman and Austro-Hungarian empires. In 1914 this exploded into the Great War, which killed millions of people and laid the kindling for a second global conflagration in 1939.

The defeat of Germany, Italy, and Japan in 1945 paved the way for a restoration of liberalism as the democratic world’s governing ideology. Europeans saw the folly of organizing politics around an exclusive and aggressive understanding of nation, and created the European Community and later the European Union to subordinate the old nation-states to a cooperative transnational structure. For its part, the United States played a powerful role in creating a new set of international institutions, including the United Nations (and affiliated Bretton Woods organizations like the World Bank and IMF), GATT and the World Trade Organization, and cooperative regional ventures like NATO and NAFTA.

The largest threat to this order came from the former Soviet Union and its allied communist parties in Eastern Europe and the developing world. But the former Soviet Union collapsed in 1991, as did the perceived legitimacy of Marxism-Leninism, and many former communist countries sought to incorporate themselves into existing international institutions like the EU and NATO. This post-Cold War world would collectively come to be known as the liberal international order.

But the period from 1950 to the 1970s was the heyday of liberal democracy in the developed world. Liberal rule of law abetted democracy by protecting ordinary people from abuse: The U.S. Supreme Court, for example, was critical in breaking down legal racial segregation through decisions like Brown v. Board of Education. And democracy protected the rule of law: When Richard Nixon engaged in illegal wiretapping and use of the CIA, it was a democratically elected Congress that helped drive him from power. Liberal rule of law laid the basis for the strong post-World War II economic growth that then enabled democratically elected legislatures to create redistributive welfare states. Inequality was tolerable in this period because most people could see their material conditions improving. In short, this period saw a largely happy coexistence of liberalism and democracy throughout the developed world.

Discontents

Liberalism has been a broadly successful ideology, and one that is responsible for much of the peace and prosperity of the modern world. But it also has a number of shortcomings, some of which were triggered by external circumstances, and others of which are intrinsic to the doctrine. The first lies in the realm of economics, the second in the realm of culture.

The economic shortcomings have to do with the tendency of economic liberalism to evolve into what has come to be called “neoliberalism.” Neoliberalism is today a pejorative term used to describe a form of economic thought, often associated with the University of Chicago or the Austrian school, and economists like Friedrich Hayek, Milton Friedman, George Stigler, and Gary Becker. They sharply denigrated the role of the state in the economy, and emphasized free markets as spurs to growth and efficient allocators of resources. Many of the analyses and policies recommended by this school were in fact helpful and overdue: Economies were overregulated, state-owned companies inefficient, and governments responsible for the simultaneous high inflation and low growth experienced during the 1970s.

But valid insights about the efficiency of markets evolved into something of a religion, in which state intervention was opposed not based on empirical observation but as a matter of principle. Deregulation produced lower airline ticket prices and shipping costs for trucks, but also laid the ground for the great financial crisis of 2008 when it was applied to the financial sector. Privatization was pushed even in cases of natural monopolies like municipal water or telecom systems, leading to travesties like the privatization of Mexico’s TelMex, where a public monopoly was transformed into a private one. Perhaps most important, the fundamental insight of trade theory, that free trade leads to higher wealth for all parties concerned, neglected the further insight that this was true only in the aggregate, and that many individuals would be hurt by trade liberalization. The period from the 1980s onward saw the negotiation of both global and regional free trade agreements that shifted jobs and investment away from rich democracies to developing countries, increasing within-country inequalities. In the meantime, many countries starved their public sectors of resources and attention, leading to deficiencies in a host of public services from education to health to security.

The result was the world that emerged by the 2010s in which aggregate incomes were higher than ever but inequality within countries had also grown enormously. Many countries around the world saw the emergence of a small class of oligarchs, multibillionaires who could convert their economic resources into political power through lobbyists and purchases of media properties. Globalization enabled them to move their money to safe jurisdictions easily, starving states of tax revenue and making regulation very difficult. Globalization also entailed liberalization of rules concerning migration. Foreign-born populations began to increase in many Western countries, abetted by crises like the Syrian civil war that sent more than a million refugees into Europe. All of this paved the way for the populist reaction that became clearly evident in 2016 with Britain’s Brexit vote and the election of Donald Trump in the United States.

The second discontent with liberalism as it evolved over the decades was rooted in its very premises. Liberalism deliberately lowered the horizon of politics: A liberal state will not tell you how to live your life, or what a good life entails; how you pursue happiness is up to you. This produces a vacuum at the core of liberal societies, one that often gets filled by consumerism or pop culture or other random activities that do not necessarily lead to human flourishing. This has been the critique of a group of (mostly) Catholic intellectuals including Patrick Deneen, Sohrab Ahmari, Adrian Vermeule, and others, who feel that liberalism offers “thin gruel” for anyone with deeper moral commitments.

This leads us to a deeper stratum of discontent. Liberal theory, both in its economic and political guises, is built around individuals and their rights, and the political system protects their ability to make these choices autonomously. Indeed, in neoclassical economic theory, social cooperation arises only as a result of rational individuals deciding that it is in their self-interest to work with other individuals. Among conservative intellectuals, Patrick Deneen has gone the furthest by arguing that this whole approach is deeply flawed precisely because it is based on this individualistic premise, and sanctifies individual autonomy above all other goods. Thus for him, the entire American project based as it was on Lockean individualistic principles was misfounded. Human beings for him are not primarily autonomous individuals, but deeply social beings who are defined by their obligations and ties to a range of social structures, from families to kin groups to nations.

This social understanding of human nature was a truism taken for granted by most thinkers prior to the Western Enlightenment. It is also one supported by a great deal of recent research in the life sciences that shows that human beings are hard-wired to be social creatures: Many of our most salient faculties are ones that lead us to cooperate with one another in groups of various sizes and types. This cooperation does not arise necessarily from rational calculation; it is supported by emotional faculties like pride, guilt, shame, and anger that reinforce social bonds. The success of human beings over the millennia that has allowed our species to completely dominate its natural habitat has to do with this aptitude for following norms that induce social cooperation.

By contrast, the kind of individualism celebrated in liberal economic and political theory is a contingent development that emerged in Western societies over the centuries. Its history is long and complicated, but it originated in the inheritance rules set down by the Catholic Church in early medieval times which undermined the extended kinship networks that had characterized Germanic tribal societies. Individualism was further validated by its functionality in promoting market capitalism: Markets worked more efficiently if individuals were not constrained by obligations to kin and other social networks. But this kind of individualism has always been at odds with the social proclivities of human beings. It also does not come naturally to people in certain other non-Western societies like India or the Arab world, where kin, caste, or ethnic ties are still facts of life.

The implication of these observations for contemporary liberal societies is straightforward. Members of such societies want opportunities to bond with one another in a host of ways: as citizens of a nation, members of an ethnic or racial group, residents of a region, or adherents to a particular set of religious beliefs. Membership in such groups gives their lives meaning and texture in a way that mere citizenship in a liberal democracy does not.

Many of the critics of liberalism on the right feel that it has undervalued the nation and traditional national identity: Thus Viktor Orbán has asserted that Hungarian national identity is based on Hungarian ethnicity and on maintenance of traditional Hungarian values and cultural practices. New nationalists like Yoram Hazony celebrate nationhood and national culture as the rallying cry for community, and they bemoan liberalism’s dissolving effect on religious commitment, yearning for a thicker sense of community and shared values, underpinned by virtues in service of that community.

There are parallel discontents on the left. Juridical equality before the law does not mean that people will be treated equally in practice. Racism, sexism, and anti-gay bias all persist in liberal societies, and those injustices have become identities around which people could mobilize. The Western world has seen the emergence of a series of social movements since the 1960s, beginning with the civil rights movement in the United States, and movements promoting the rights of women, indigenous peoples, the disabled, the LGBT community, and the like. The more progress that has been made toward eradicating social injustices, the more intolerable the remaining injustices seem, and thus the stronger the moral imperative to mobilize to correct them. The complaint of the left is different in substance but similar in structure to that of the right: Liberal society does not do enough to root out deep-seated racism, sexism, and other forms of discrimination, so politics must go beyond liberalism. And, as on the right, progressives want the deeper bonding and personal satisfaction of associating—in this case, with people who have suffered from similar indignities.

This instinct for bonding and the thinness of shared moral life in liberal societies have shifted global politics on both the right and the left toward a politics of identity and away from the liberal world order of the late 20th century. Liberal values like tolerance and individual freedom are prized most intensely when they are denied: People who live in brutal dictatorships want the simple freedom to speak, associate, and worship as they choose. But over time life in a liberal society comes to be taken for granted and its sense of shared community seems thin. Thus in the United States, arguments between right and left increasingly revolve around identity, and particularly racial identity issues, rather than around economic ideology and questions about the appropriate role of the state in the economy.

There is another significant issue that liberalism fails to grapple adequately with, which concerns the boundaries of citizenship and rights. The premises of liberal doctrine tend toward universalism: Liberals worry about human rights, and not just the rights of Englishmen, or white Americans, or some other restricted class of people. But rights are protected and enforced by states which have limited territorial jurisdiction, and the question of who qualifies as a citizen with voting rights becomes a highly contested one. Some advocates of migrant rights assert a universal human right to migrate, but this is a political nonstarter in virtually every contemporary liberal democracy. At the present moment, the issue of the boundaries of political communities is settled by some combination of historical precedent and political contestation, rather than being based on any clear liberal principle.

Conclusion

Vladimir Putin told the Financial Times that liberalism has become an “obsolete” doctrine. While it may be under attack from many quarters today, it is in fact more necessary than ever.

It is more necessary because it is fundamentally a means of governing over diversity, and the world is more diverse than it ever has been. Democracy disconnected from liberalism will not protect diversity, because majorities will use their power to repress minorities. Liberalism was born in the mid-17th century as a means of resolving religious conflicts, and it was reborn again after 1945 to solve conflicts between nationalisms. Any illiberal effort to build a social order around thick ties defined by race, ethnicity, or religion will exclude important members of the community, and down the road will lead to conflict. Russia itself retains liberal characteristics: Russian citizenship and nationality is not defined by either Russian ethnicity or the Orthodox religion; the Russian Federation’s millions of Muslim inhabitants enjoy equal juridical rights. In situations of de facto diversity, attempts to impose a single way of life on an entire population are a formula for dictatorship.

The only other way to organize a diverse society is through formal power-sharing arrangements among different identity groups that give only a nod toward shared nationality. This is the way that Lebanon, Iraq, Bosnia, and other countries in the Middle East and the Balkans are governed. This type of consociationalism leads to very poor governance and long-term instability, and works poorly in societies where identity groups are not geographically based. This is not a path down which any contemporary liberal democracy should want to tread.

That being said, the kinds of economic and social policies that liberal societies should pursue is today a wide-open question. The evolution of liberalism into neoliberalism after the 1980s greatly reduced the policy space available to centrist political leaders, and permitted the growth of huge inequalities that have been fueling populisms of the right and the left. Classical liberalism is perfectly compatible with a strong state that seeks social protections for populations left behind by globalization, even as it protects basic property rights and a market economy. Liberalism is necessarily connected to democracy, and liberal economic policies need to be tempered by considerations of democratic equality and the need for political stability.

I suspect that most religious conservatives critical of liberalism today in the United States and other developed countries do not fool themselves into thinking that they can turn the clock back to a period when their social views were mainstream. Their complaint is a different one: that contemporary liberals are ready to tolerate any set of views, from radical Islam to Satanism, other than those of religious conservatives, and that they find their own freedom constrained.

This complaint is a serious one: Many progressives on the left have shown themselves willing to abandon liberal values in pursuit of social justice objectives. There has been a sustained intellectual attack on liberal principles over the past three decades coming out of academic pursuits like gender studies, critical race theory, postcolonial studies, and queer theory, that deny the universalistic premises underlying modern liberalism. The challenge is not simply one of intolerance of other views or “cancel culture” in the academy or the arts. Rather, the challenge is to basic principles that all human beings were born equal in a fundamental sense, or that a liberal society should strive to be color-blind. These different theories tend to argue that the lived experiences of specific and ever-narrower identity groups are incommensurate, and that what divides them is more powerful than what unites them as citizens. For some in the tradition of Michel Foucault, foundational approaches to cognition coming out of liberal modernity like the scientific method or evidence-based research are simply constructs meant to bolster the hidden power of racial and economic elites.

The issue here is thus not whether progressive illiberalism exists, but rather how great a long-term danger it represents. In countries from India and Hungary to the United States, nationalist conservatives have actually taken power and have sought to use the power of the state to dismantle liberal institutions and impose their own views on society as a whole. That danger is a clear and present one.

Progressive anti-liberals, by contrast, have not succeeded in seizing the commanding heights of political power in any developed country. Religious conservatives are still free to worship in any way they see fit, and indeed are organized in the United States as a powerful political bloc that can sway elections. Progressives exercise power in different and more nuanced ways, primarily through their dominance of cultural institutions like the mainstream media, the arts, and large parts of academia. The power of the state has been enlisted behind their agenda on such matters as striking down via the courts conservative restrictions on abortion and gay marriage and in the shaping of public school curricula. An open question for the future is whether cultural dominance today will ultimately lead to political dominance in the future, and thus a more thoroughgoing rollback of liberal rights by progressives.

Liberalism’s present-day crisis is not new; since its invention in the 17th century, liberalism has been repeatedly challenged by thick communitarians on the right and progressive egalitarians on the left. Liberalism properly understood is perfectly compatible with communitarian impulses and has been the basis for the flourishing of deep and diverse forms of civil society. It is also compatible with the social justice aims of progressives: One of its greatest achievements was the creation of modern redistributive welfare states in the late 20th century. Liberalism’s problem is that it works slowly through deliberation and compromise, and never achieves its communal or social justice goals as completely as their advocates would like. But it is hard to see how the discarding of liberal values is going to lead to anything in the long term other than increasing social conflict and ultimately a return to violence as a means of resolving differences.

Francis Fukuyama, chairman of the editorial board of American Purpose, directs the Center on Democracy, Development and the Rule of Law at Stanford University.

Posted in History, History of education, War

An Affair to Remember: America’s Brief Fling with the University as a Public Good

This post is an essay about the brief but glorious golden age of the US university during the three decades after World War II.  

American higher education rose to fame and fortune during the Cold War, when both student enrollments and funded research shot upward. Prior to World War II, the federal government showed little interest in universities and provided little support. The war spurred a large investment in defense-based scientific research in universities, and the emergence of the Cold War expanded federal investment exponentially. Unlike a hot war, the Cold War offered an extended period of federally funded research and public subsidy for expanding student enrollments. The result was the golden age of the American university. The good times continued for about 30 years and then began to go bad. The decline was triggered by the combination of a decline in the perceived Soviet threat and a taxpayer revolt against high public spending, both trends culminating with the fall of the Berlin Wall in 1989. With no money and no enemy, the Cold War university fell as quickly as it arose. Instead of seeing the Cold War university as the norm, we need to think of it as the exception. What we are experiencing now in American higher education is a regression to the mean, in which, over the long haul, Americans have understood higher education to be a distinctly private good.

I originally presented this piece in 2014 at a conference at the Catholic University of Leuven, Belgium.  It was then published in the Journal of Philosophy of Education in 2016 (here’s a link to the JOPE version) and then became a chapter in my 2017 book, A Perfect Mess.  Waste not, want not.  Hope you enjoy it.


An Affair to Remember:

America’s Brief Fling with the University as a Public Good

David F. Labaree

            American higher education rose to fame and fortune during the Cold War, when both student enrollments and funded research shot upward.  Prior to World War II, the federal government showed little interest in universities and provided little support.  The war spurred a large investment in defense-based scientific research in universities for reasons of both efficiency and necessity:  universities had the researchers and infrastructure in place and the government needed to gear up quickly.  With the emergence of the Cold War in 1947, the relationship continued and federal investment expanded exponentially.  Unlike a hot war, the Cold War offered a long timeline for global competition between communism and democracy, which meant institutionalizing the wartime model of federally funded research and building a set of structures for continuing investment in knowledge whose military value was unquestioned. At the same time, the communist challenge provided a strong rationale for sending a large number of students to college.  These increased enrollments would educate the skilled workers needed by the Cold War economy, produce informed citizens to combat the Soviet menace, and demonstrate to the world the broad social opportunities available in a liberal democracy.  The result of this enormous public investment in higher education has become known as the golden age of the American university.

            Of course, as is so often the case with a golden age, it didn’t last.  The good times continued for about 30 years and then began to go bad.  The decline was triggered by the combination of a decline in the perceived Soviet threat and a taxpayer revolt against high public spending, both trends culminating with the fall of the Berlin Wall in 1989.  With no money and no enemy, the Cold War university fell as quickly as it arose. 

            In this paper I try to make sense of this short-lived institution.  But I want to avoid the note of nostalgia that pervades many current academic accounts, in which professors and administrators grieve for the good old days of the mid-century university and spin fantasies of recapturing them.  Barring another national crisis of the same dimension, however, it just won’t happen.  Instead of seeing the Cold War university as the norm that we need to return to, I suggest that it’s the exception.  What we’re experiencing now in American higher education is, in many ways, a regression to the mean. 

            My central theme is this:  Over the long haul, Americans have understood higher education as a distinctly private good.  The period from 1940 to 1970 was the one time in our history when the university became a public good.  And now we are back to the place we have always been, where the university’s primary role is to provide individual consumers a chance to gain social access and social advantage.  Since students are the primary beneficiaries, they should also foot the bill; so state subsidies are hard to justify.

            Here is my plan.  First, I provide an overview of the long period before 1940 when American higher education functioned primarily as a private good.  During this period, the beneficiaries changed from the university’s founders to its consumers, but private benefit was the steady state.  This is the baseline against which we can understand the rapid postwar rise and fall of public investment in higher education.  Next, I look at the huge expansion of public funding for higher education starting with World War II and continuing for the next 30 years.  Along the way I sketch how the research university came to enjoy a special boost in support and rising esteem during these decades.  Then I examine the fall from grace toward the end of the century when the public-good rationale for higher ed faded as quickly as it had emerged.  And I close by exploring the implications of this story for understanding the American system of higher education as a whole. 

            During most of its history, the central concern driving the system has not been what it can do for society but what it can do for me.  In many ways, this approach has been highly beneficial.  Much of its success as a system – as measured by wealth, rankings, and citations – derives from its core structure as a market-based system producing private goods for consumers rather than a politically-based system producing public goods for state and society.  But this view of higher education as private property is also a key source of the system’s pathologies.  It helps explain why public funding for higher education is declining and student debt is rising; why private colleges are so much richer and more prestigious than public colleges; why the system is so stratified, with wealthy students attending the exclusive colleges at the top where social rewards are high and with poor students attending the inclusive colleges at the bottom where such rewards are low; and why quality varies so radically, from colleges that ride atop the global rankings to colleges that drift in intellectual backwaters.

The Private Origins of the System

            One of the peculiar aspects of the history of American higher education is that private colleges preceded public.  Another, which in part follows from the first, is that private colleges are also more prestigious.  Nearly everywhere else in the world, state-supported and governed universities occupy the pinnacle of the national system while private institutions play a small and subordinate role, supplying degrees of less distinction and serving students of less ability.  But in the U.S., the top private universities produce more research, gain more academic citations, attract better faculty and students, and graduate more leaders of industry, government, and the professions.  According to the 2013 Shanghai rankings, 16 of the top 25 universities in the U.S. are private, and the concentration is even higher at the top of this list, where private institutions make up 8 of the top 10 (Institute of Higher Education, 2013). 

            This phenomenon is rooted in the conditions under which colleges first emerged in the U.S.  American higher education developed into a system in the early 19th century, when three key elements were in place:  the state was weak, the market was strong, and the church was divided.  The federal government at the time was small and poor, surviving largely on tariffs and the sale of public lands, and state governments were strapped simply trying to supply basic public services.  Colleges were a low priority for government since they served no compelling public need – unlike public schools, which states saw as essential for producing citizens for the republic.  So colleges only emerged when local promoters requested and received a  corporate charter from the state.  These were private not-for-profit institutions that functioned much like any other corporation.  States provided funding only sporadically and only if an institution’s situation turned dire.  And after the Dartmouth College decision in 1819, the Supreme Court made clear that a college’s corporate charter meant that it could govern itself without state interference.  Therefore, in the absence of state funding and control, early American colleges developed a market-based system of higher education. 

            If the roots of the American system were private, they were also extraordinarily local.  Unlike the European university, with its aspirations toward universality and its history of cosmopolitanism, the American college of the nineteenth century was a home-town entity.  Most often, it was founded to advance the parochial cause of promoting a particular religious denomination rather than to promote higher learning.  In a setting where no church was dominant and all had to compete for visibility, stature, and congregants, founding colleges was a valuable way to plant the flag and promote the faith.  This was particularly true when the population was rapidly expanding into new territories to the west, which meant that no denomination could afford to cede the new terrain to competitors.  Starting a college in Ohio was a way to ensure denominational growth, prepare clergy, and spread the word.

            At the same time, colleges were founded with an eye toward civic boosterism, intended to shore up a community’s claim to be a major cultural and commercial center rather than a sleepy farm town.  With a college, a town could claim that it deserved to gain lucrative recognition as a stop on the railroad line, the site for a state prison, the county seat, or even the state capital.  These consequences would elevate the value of land in the town, which would work to the benefit of major landholders.  In this sense, the nineteenth century college, like much of American history, was in part the product of a land development scheme.  In general, these two motives combined: colleges emerged as a way to advance both the interests of particular sects and also the interests of the towns where they were lodged.  Often ministers were also land speculators.  It was always better to have multiple rationales and sources of support than just one (Brown, 1995; Boorstin, 1965; Potts, 1971).  In either case, however, the benefits of founding a college accrued to individual landowners and particular religious denominations and not to the larger public.

            As a result of these incentives, church officials and civic leaders around the country scrambled to get a state charter for a college, establish a board of trustees made up of local notables, and install a president.  The latter (usually a clergyman) would rent a local building, hire a small and not very accomplished faculty, and serve as the CEO of a marginal educational enterprise, one that sought to draw tuition-paying students from the area in order to make the college a going concern.  With colleges arising to meet local and sectarian needs, the result was the birth of a large number of small, parochial, and weakly funded institutions in a very short period of time in the nineteenth century, which meant that most of these colleges faced a difficult struggle to survive in the competition with peer institutions.  In the absence of reliable support from church or state, these colleges had to find a way to get by on their own. 

            Into this mix of private colleges, state and local governments began to introduce public institutions.  First came a series of universities established by individual states to serve their local populations.  Here too competition was a bigger factor than demand for learning, since a state government increasingly needed to have a university of its own in order to keep up with its neighbors.  Next came a group of land-grant colleges that began to emerge by midcentury.  Funded by grants of land from the federal government, these were public institutions that focused on providing practical education for occupations in agriculture and engineering.  Finally came an array of normal schools, which aimed at preparing teachers for the expanding system of public elementary education.  Like the private colleges, these public institutions emerged to meet the economic needs of towns that eagerly sought to house them.  And although these colleges were creatures of the state, they had only limited public funding and had to rely heavily on student tuition and private donations.

            The rate of growth of this system of higher education was staggering.  At the beginning of the American republic in 1790 the country had 19 institutions calling themselves colleges or universities (Tewksbury, 1932, Table 1; Collins, 1979, Table 5.2).  By 1880, it had 811, which doesn’t even include the normal schools.  As a comparison, this was five times as many institutions as existed that year in all of Western Europe (Ruegg, 2004).  To be sure, the American institutions were for the most part colleges in name only, with low academic standards, an average student body of 131 (Carter et al., 2006, Table Bc523) and faculty of 14 (Carter et al., 2006, Table Bc571).  But nonetheless this was a massive infrastructure for a system of higher education. 

            At a density of 16 colleges per million of population, the U.S. in 1880 had the most overbuilt system of higher education in the world (Collins, 1979, Table 5.2).  Created in order to meet the private needs of land speculators and religious sects rather than the public interest of state and society, the system got way ahead of demand for its services.  That changed in the 1880s.  By adopting parts of the German research university model (in form if not in substance), the top level of the American system acquired a modicum of academic respectability.  In addition – and this is more important for our purposes here – going to college finally came to be seen as a good investment for a growing number of middle-class student-consumers. 

            Three factors came together to make college attractive.  Primary among these was the jarring change in the structure of status transmission for middle-class families toward the end of the nineteenth century.  The tradition of passing on social position to your children by transferring ownership of the small family business was under dire threat, as factories were driving independent craft production out of the market and department stores were making small retail shops economically marginal.  Under these circumstances, middle class families began to adopt what Burton Bledstein calls the “culture of professionalism” (Bledstein, 1976).  Pursuing a profession (law, medicine, clergy) had long been an option for young people in this social stratum, but now this attraction grew stronger as the definition of profession grew broader.  With the threat of sinking into the working class becoming more likely, families found reassurance in the prospect of a form of work that would buffer their children from the insecurity and degradation of wage labor.  This did not necessarily mean becoming a traditional professional, where the prospects were limited and entry costs high, but instead it meant becoming a salaried employee in a management position that was clearly separated from the shop floor.  The burgeoning white-collar work opportunities as managers in corporate and government bureaucracies provided the promise of social status, economic security, and protection from downward mobility.  And the best way to certify yourself as eligible for this kind of work was to acquire a college degree. 

            Two other factors added to the attractions of college.  One was that a high school degree – once a scarce commodity that became a form of distinction for middle class youth during the nineteenth century – was in danger of becoming commonplace.  Across the middle of the century, enrollments in primary and grammar schools were growing fast, and by the 1880s they were filling up.  By 1900, the average American 20-year-old had eight years of schooling, which meant that political pressure was growing to increase access to high school (Goldin & Katz, 2008, p. 19).  This started to happen in the 1880s, and for the next 50 years high school enrollments doubled every decade.  The consequences were predictable.  If the working class was beginning to get a high school education, then middle class families felt compelled to preserve their advantage by pursuing college.

            The last piece that fell into place to increase the drawing power of college for middle class families was the effort by colleges in the 1880s and 90s to make undergraduate enrollment not just useful but enjoyable.  Ever desperate to find ways to draw and retain students, colleges responded to competitive pressure by inventing the core elements that came to define the college experience for American students in the twentieth century.  These included fraternities and sororities, pleasant residential halls, a wide variety of extracurricular entertainments, and – of course – football.  College life became a major focus of popular magazines, and college athletic events earned big coverage in newspapers.  In remarkably short order, going to college became a life stage in the acculturation of middle class youth.  It was the place where you could prepare for a respectable job, acquire sociability, learn middle class cultural norms, have a good time, and meet a suitable spouse.  And, for those who were so inclined, there was the potential fringe benefit of getting an education.

            Spurred by student desire to get ahead or stay ahead, college enrollments started growing quickly.  They were at 116,000 in 1879, 157,000 in 1889, 238,000 in 1899, 355,000 in 1909, 598,000 in 1919, 1,104,000 in 1929, and 1,494,000 in 1939 (Carter et al., 2006, Table Bc523).  This was a rate of increase of more than 50 percent a decade – not as fast as the increases that would come at midcentury, but still impressive.  During this same 60-year period, total college enrollment as a proportion of the population 18-to-24 years old rose from 1.6 percent to 9.1 percent (Carter et al., 2006, Table Bc524).  By 1930, the U.S. had three times the population of the U.K. and 20 times the number of college students (Levine, 1986, p. 135).  And the reason they were enrolling in such numbers was clear.  According to studies in the 1920s, almost two-thirds of undergraduates were there to get ready for a particular job, mostly in the lesser professions and middle management (Levine, 1986, p. 40).  Business and engineering were the most popular majors and the social sciences were on the rise.  As David Levine put it in his important book about college in the interwar years, “Institutions of higher learning were no longer content to educate; they now set out to train, accredit, and impart social status to their students” (Levine, 1986, p. 19).

            Enrollments were growing in public colleges faster than in private colleges, but only by a small amount.  In fact it wasn’t until 1931 – for the first time in the history of American higher education – that the public sector finally accounted for a majority of college students (Carter et al., 2006, Tables Bc531 and Bc534).  The increases occurred across all levels of the system, including the top public research universities; but the largest share of enrollments flowed into the newer institutions at the bottom of the system:  the state colleges that were emerging from normal schools, urban commuter colleges (mostly private), and an array of public and private junior colleges that offered two-year vocational programs. 

            For our purposes today, the key point is this:  The American system of colleges and universities that emerged in the nineteenth century and continued until World War II was a market-driven structure that construed higher education as a private good.  Until around 1880, the primary benefits of the system went to the people who founded individual institutions – the land speculators and religious sects for whom a new college brought wealth and competitive advantage.  This explains why colleges emerged in such remote places long before there was substantial student demand.  The role of the state in this process was muted.  The state was too weak and too poor to provide strong support for higher education, and there was no obvious state interest that argued for doing so.  Until the decade before the war, most student enrollments were in the private sector, and even at the war’s start the majority of institutions in the system were private (Carter et al., 2006, Tables Bc510 to Bc520).  

            After 1880, the primary benefits of the system went to the students who enrolled.  For them, it became the primary way to gain entry to the relatively secure confines of salaried work in management and the professions.  For middle class families, college in this period emerged as the main mechanism for transmitting social advantage from parents to children; and for others, it became the object of aspiration as the place to get access to the middle class.  State governments put increasing amounts of money into support for public higher education, not because of the public benefits it would produce but because voters demanded increasing access to this very attractive private good.

The Rise of the Cold War University

            And then came the Second World War.  There is no need here to recount the devastation it brought about or the nightmarish residue it left.  But it’s worth keeping in mind the peculiar fact that this conflict is remembered fondly by Americans, who often refer to it as the Good War (Terkel, 1997).  The war cost a lot of American lives and money, but it also brought a lot of benefits.  It didn’t hurt, of course, to be on the winning side and to have all the fighting take place on foreign territory.  And part of the positive feeling associated with the war comes from the way it thrust the country into a new role as the dominant world power.  But perhaps even more the warm feeling arises from the memory of this as a time when the country came together around a common cause.  For citizens of the United States – the most liberal of liberal democracies, where private liberty is much more highly valued than public loyalty – it was a novel and exciting feeling to rally around the federal government.  Usually viewed with suspicion as a threat to the rights of individuals and a drain on private wealth, the American government in the 1940s took on the mantle of good in the fight against evil.  Its public image became the resolute face of a white-haired man dressed in red, white, and blue, who pointed at the viewer in a famous recruiting poster.  Its slogan: “Uncle Sam Wants You.” 

            One consequence of the war was a sharp increase in the size of the U.S. government.  The historically small federal state had started to grow substantially in the 1930s as a result of the New Deal effort to spend the country out of a decade-long economic depression, a time when spending doubled.  But the war raised the level of federal spending by a factor of seven, from $1,000 to $7,000 per capita.  After the war, the level dropped back to $2,000; and then the onset of the Cold War sent federal spending into a sharp, and this time sustained, increase – reaching $3,000 in the 50s, $4,000 in the 60s, and regaining the previous high of $7,000 in the 80s, during the last days of the Soviet Union (Garrett & Rhine, 2006, figure 3). 

            If for Americans in general World War II carries warm associations, for people in higher education it marks the beginning of the Best of Times – a short but intense period of generous public funding and rapid expansion.  Initially, of course, the war brought trouble, since it sent most prospective college students into the military.  Colleges quickly adapted by repurposing their facilities for military training and other war-related activities.  But the real long-term benefits came when the federal government decided to draw higher education more centrally into the war effort – first, as the central site for military research and development; and second, as the place to send veterans when the war was over.  Let me say a little about each.

            In the first half of the twentieth century, university researchers had to scrabble around looking for funding, forced to rely on a mix of foundations, corporations, and private donors.  The federal government saw little benefit in employing their services.  In a particularly striking case at the start of World War I, the professional association of academic chemists offered its help to the War Department, which declined “on the grounds that it already had a chemist in its employ” (Levine, 1986, p. 51).[1]  The existing model was for government to maintain its own modest research facilities instead of relying on the university. 

            The scale of the next war changed all this.  At the very start, a former engineering dean from MIT, Vannevar Bush, took charge of mobilizing university scientists behind the war effort as head of the Office of Scientific Research and Development.  The model he established for managing the relationship between government and researchers set the pattern for university research that still exists in the U.S. today: Instead of setting up government centers, the idea was to farm out research to universities.  Issue a request for proposals to meet a particular research need; award the grant to the academic researchers who seemed best equipped to meet this need; and pay 50 percent or more overhead to the university for the facilities that researchers would use.  This method drew on the expertise and facilities that already existed at research universities, which both saved the government from having to maintain a costly permanent research operation and also gave it the flexibility to draw on the right people for particular projects.  For universities, it provided a large source of funds, which enhanced their research reputations, helped them expand faculty, and paid for infrastructure.  It was a win-win situation.  It also established the entrepreneurial model of the university researcher in perpetual search for grant money.  And for the first time in the history of American higher education, the university was being considered a public good, whose research capacity could serve the national interest by helping to win a war. 

            If universities could meet one national need during the war by providing military research, they could meet another national need after the war by enrolling veterans.  The GI Bill of Rights, passed by Congress in 1944, was designed to pay off a debt and resolve a manpower problem.  Its official name, the Servicemen’s Readjustment Act of 1944, reflects both aims.  By the end of the war there were 15 million men and women who had served in the military, who clearly deserved a reward for their years of service to the country.  The bill offered them the opportunity to continue their education at federal expense, which included attending the college of their choice.  This opportunity also offered another public benefit, since it responded to deep concern about the ability of the economy to absorb this flood of veterans.  The country had been sliding back into depression at the start of the war, and the fear was that massive unemployment at war’s end was a real possibility.  The strategy worked.  Under the GI Bill, about two million veterans eventually attended some form of college.  By 1948, when veteran enrollment peaked, American colleges and universities had one million more students than 10 years earlier (Geiger, 2004, pp. 40-41; Carter et al., 2006, Table Bc523).  This was another win-win situation.  The state rewarded national service, headed off mass unemployment, and produced a pile of human capital for future growth.  Higher education got a flood of students who could pay their own way.  The worry, of course, was what was going to happen when the wartime research contracts ended and the veterans graduated. 

            That’s where the Cold War came in to save the day.  And the timing was perfect.  The first major action of the new conflict – the Berlin Blockade – came in 1948, the same year that veteran enrollments at American colleges reached their peak.  If World War II was good for American higher education, the Cold War was a bonanza.  The hot war meant boom and bust – providing a short surge of money and students followed by a sharp decline.  But the Cold War was a prolonged effort to contain Communism.  It was sustainable because actual combat was limited and often carried out by proxies.  For universities this was a gift that, for 30 years, kept on giving.  The military threat was massive in scale – nothing less than the threat of nuclear annihilation.  And supplementing it was an ideological challenge – the competition between two social and political systems for hearts and minds.  As a result, the government needed top universities to provide it with massive amounts of scientific research that would support the military effort.  And it also needed all levels of the higher education system to educate the large numbers of citizens required to deal with the ideological menace.  We needed to produce the scientists and engineers who would allow us to compete with Soviet technology.  We needed to provide high-level human capital in order to promote economic growth and demonstrate the economic superiority of capitalism over communism.  And we needed to provide educational opportunity for our own racial minorities and lower classes in order to show that our system is not only effective but also fair and equitable.  This would be a powerful weapon in the effort to win over the third world with the attractions of the American Way.  
The Cold War American government treated the higher education system as a highly valuable public good, which would make a large contribution to the national interest; and the system was pleased to be the object of so much federal largesse (Loss, 2012).

            On the research side, the impact of the Cold War on American universities was dramatic.  The best way to measure this is by examining patterns of federal research and development spending over the years, which traces the ebb and flow of national threats across the last 60 years.  Funding rose slowly from $13 billion in 1953 (in constant 2014 dollars) until the Sputnik crisis (after the Soviets succeeded in placing the first satellite in earth orbit), when funding jumped to $40 billion in 1959 and rose rapidly to a peak of $88 billion in 1967.  Then the amount backed off to $66 billion in 1975, climbing to a new peak of $104 billion in 1990 just before the collapse of the Soviet Union and then dropping off.  It started growing again in 2002 after the attack on the twin towers, reaching an all-time high of $151 billion in 2010 and has been declining ever since (AAAS, 2014).[2] 

            Initially, defense funding accounted for 85 percent of federal research funding, gradually falling back to about half in 1967, as nondefense funding increased, but remaining in a solid majority position up until the present.  For most of the period after 1957, however, the largest element in nondefense spending was research on space technology, which arose directly from the Soviet Sputnik threat.  If you combine defense and space appropriations, this accounts for about three-quarters of federal research funding until 1990.  Defense research closely tracked perceived threats in the international environment, dropping by 20 percent after 1989 and then making a comeback in 2001.  Overall, federal funding during the Cold War for research of all types grew in constant dollars from $13 billion in 1953 to $104 billion in 1990, an increase of 700 percent.  These were good times for university researchers (AAAS, 2014).

            At the same time that research funding was growing rapidly, so were college enrollments.  The number of students in American higher education grew from 2.4 million in 1949 to 3.6 million in 1959; but then came the 1960s, when enrollments more than doubled, reaching 8 million in 1969.  The number hit 11.6 million in 1979 and then began to slow down – creeping up to 13.5 million in 1989 and leveling off at around 14 million in the 1990s (Carter et al., 2006, Table Bc523; NCES, 2014, Table 303.10).  During the 30 years between 1949 and 1979, enrollments increased by more than 9 million students, a growth of almost 400 percent.  And the bulk of the enrollment increases in the last two decades were in part-time students and at two-year colleges.  Among four-year institutions, the primary growth occurred not at private or flagship public universities but at regional state universities, the former normal schools.  The Cold War was not just good for research universities; it was also great for institutions of higher education all the way down the status ladder.

            In part we can understand this radical growth in college enrollments as an extension of the long-term surge in consumer demand for American higher education as a private good.  Recall that enrollments started accelerating late in the nineteenth century, when college attendance started to provide an edge in gaining middle class jobs.  This meant that attending college gave middle-class families a way to pass on social advantage while attending high school gave working-class families a way to gain social opportunity.  But by 1940, high school enrollments had become universal.  So for working-class families, the new zone of social opportunity became higher education.  This increase in consumer demand provided a market-based explanation for at least part of the flood of postwar enrollments.

            At the same time, however, the Cold War provided a strong public rationale for broadening access to college.  In 1946, President Harry Truman appointed a commission to provide a plan for expanding access to higher education, which was the first time in American history that a president sought advice about education at any level.  The result was a six-volume report with the title Higher Education for American Democracy.  It’s no coincidence that the report was issued in 1947, the starting point of the Cold War.  The authors framed the report around the new threat of atomic war, arguing that “It is essential today that education come decisively to grips with the world-wide crisis of mankind” (President’s Commission, 1947, vol. 1, p. 6).  What they proposed as a public response to the crisis was a dramatic increase in access to higher education.

            The American people should set as their ultimate goal an educational system in which at no level – high school, college, graduate school, or professional school – will a qualified individual in any part of the country encounter an insuperable economic barrier to the attainment of the kind of education suited to his aptitudes and interests.
        This means that we shall aim at making higher education equally available to all young people, as we now do education in the elementary and high schools, to the extent that their capacity warrants a further social investment in their training (President’s Commission, 1947, vol. 1, p. 36).

Tellingly, the report devotes a lot of space to exploring the existing barriers to educational opportunity posed by class and race – exactly the kinds of issues that were making liberal democracies look bad in light of the egalitarian promise of communism.

Decline of the System’s Public Mission

            So in the mid twentieth century, Americans went through an intense but brief infatuation with higher education as a public good.  Somehow college was going to help save us from the communist menace and the looming threat of nuclear war.  Like World War II, the Cold War brought together a notoriously individualistic population around the common goal of national survival and the preservation of liberal democracy.  It was a time when every public building had an area designated as a bomb shelter.  In the elementary school I attended in the 1950s, I can remember regular air raid drills.  The alarm would sound and teachers would lead us downstairs to the basement, whose concrete-block walls were supposed to protect us from a nuclear blast.  Although the drills did nothing to preserve life, they did serve an important social function.  Like Sunday church services, these rituals drew individuals together into communities of faith where we enacted our allegiance to a higher power. 

            For American college professors, these were the glory years, when fear of annihilation gave us a glamorous public mission and what seemed like an endless flow of public funds and funded students.  But it did not – and could not – last.  Wars can bring great benefits to the home front, but then they end.  The Cold War lasted longer than most, but this longevity came at the expense of intensity.  By the 1970s, the U.S. had lived with the nuclear threat for 30 years without any sign that the worst case was going to materialize.  You can only stand guard for so long before attention begins to flag and ordinary concerns start to push back to the surface.  In addition, waging war is extremely expensive, draining both public purse and public sympathy.  The two Cold War conflicts that engaged American troops cost a lot, stirred strong opposition, and ended badly, providing neither the idealistic glow of the Good War nor the satisfying closure of unconditional surrender by the enemy.  Korea ended with a stalemate and the return to the status quo ante bellum.  Vietnam ended with defeat and the humiliating image in 1975 of the last Americans being plucked off a rooftop in Saigon – which the victors then promptly renamed Ho Chi Minh City.

            The Soviet menace and the nuclear threat persisted, but in a form that – after the grim experience of war in the rice paddies – seemed distant and slightly unreal.  Add to this the problem that, as a tool for defeating the enemy, the radical expansion of higher education by the 70s did not appear to be a cost-effective option.  Higher ed is a very labor-intensive enterprise, in which size brings few economies of scale, and its public benefits in the war effort were hard to pin down.  As the national danger came to seem more remote, the costs of higher ed became more visible and more problematic.  Look around any university campus, and the primary beneficiaries of public largesse seem to be private actors – the faculty and staff who work there and the students whose degrees earn them higher income.  So about 30 years into the Cold War, the question naturally arose:  Why should the public pay so much to provide cushy jobs for the first group and to subsidize the personal ambition of the second?  If graduates reap the primary benefits of a college education, shouldn’t they be paying for it rather than the beleaguered taxpayer?

            The 1970s marked the beginning of the American tax revolt, and not surprisingly this revolt emerged first in the bellwether state of California.  Fueled by booming defense plants and high immigration, California had a great run in the decades after 1945.  During this period, the state developed the most comprehensive system of higher education in the country.  In 1960 it formalized this system with a Master Plan that offered every Californian the opportunity to attend college in one of three state systems.  The University of California focused on research, graduate programs, and educating the top high school graduates.  California State University (developed mostly from former teachers colleges) focused on undergraduate programs for the second tier of high school graduates.  The community college system offered the rest of the population two-year programs for vocational training and possible transfer to one of the two university systems.  By 1975, there were 9 campuses in the University of California, 23 in California State University, and xx in the community college system, with a total enrollment across all systems of 1.5 million students – accounting for 14 percent of the college students in the U.S. (Carter et al., 2006, Table Bc523; Douglass, 2000, Table 1).  Not only was the system enormous, but the Master Plan declared it illegal to charge California students tuition.  The biggest and best public system of higher education in the country was free.

            And this was the problem.  What allowed the system to grow so fast was a state fiscal regime that was quite rare in the American context – one based on high public services supported by high taxes.  After enjoying the benefits of this combination for a few years, taxpayers suddenly woke up to the realization that this approach to paying for higher education was at core un-American.  For a country deeply grounded in liberal democracy, the system of higher ed for all at no cost to the consumer looked a lot like socialism.  So, of course, it had to go.  In the mid-1970s the country’s first taxpayer revolt emerged in California, culminating in a successful campaign in 1978 to pass a state-wide initiative that put a limit on increases in property taxes.  Other tax limitation initiatives followed (Martin, 2008).  As a result, the average state appropriation per student at the University of California dropped from about $3,400 (in 1960 dollars) in 1987 to $1,100 in 2010, a decline of 68 percent (UC Data Analysis, 2014).  This quickly led to a steady increase in fees charged to students at California’s colleges and universities.  (It turned out that tuition was illegal but demanding fees from students was not.)  In 1960 dollars, the annual fees for in-state undergraduates at the University of California rose from $317 in 1987 to $1,122 in 2010, an increase of more than 250 percent (UC Data Analysis, 2014).  This pattern of tax limitations and tuition increases spread across the country.  Nationwide during the same period, the average state appropriation per student at a four-year public college fell from $8,500 to $5,900 (in 2012 dollars), a decline of 31 percent, while average undergraduate tuition doubled, rising from $2,600 to $5,200 (SHEEO, 2013, Figure 3).

            The decline in the state share of higher education costs was most pronounced at the top public research universities, which had a wider range of income sources.  By 2009, the average such institution was receiving only 25 percent of its revenue from state government (National Science Board, 2012, Figure 5).  An extreme case is the University of Virginia, where in 2013 the state provided less than six percent of the university’s operating budget (University of Virginia, 2014).

            While these changes were happening at the state level, the federal government was also backing away from its Cold War generosity to students in higher education.  Legislation such as the National Defense Education Act (1958) and the Higher Education Act (1965) had provided support for students through a roughly equal balance of grants and loans.  But in 1980 the election of Ronald Reagan as president meant that the push to lower taxes would become national policy.  At this point, student aid shifted from grants to federally guaranteed loans.  The idea was that a college degree was a great investment for students, one that would pay long-term economic dividends, so they should shoulder an increasing share of the cost.  The proportion of total student support in the form of loans was 54 percent in 1975, 67 percent in 1985, and 78 percent in 1995, and the ratio has remained at that level ever since (McPherson & Schapiro, 1998, Table 3.3; College Board, 2013, Table 1).  By 1995, students were borrowing $41 billion to attend college, a figure that grew to $89 billion in 2005 (College Board, 2014, Table 1).  At present, about 60 percent of all students accumulate college debt, most of it in the form of federal loans, and the total student debt load has passed $1 trillion.

            At the same time that the federal government was cutting back on funding for college students, it was also reducing funding for university research.  As I mentioned earlier, federal research grants in constant dollars peaked at about $100 billion in 1990, the year after the fall of the Berlin Wall – a good marker for the end of the Cold War.  At this point defense accounted for about two-thirds of all university research funding – three-quarters if you include space research.  Defense research declined by about 20 percent during the 90s and didn’t start rising again substantially until 2002, the year after the fall of the Twin Towers and the beginning of the new existential threat known as the War on Terror.  Defense research reached a new peak in 2009 at a level about a third above the Cold War high, and it has been declining steadily ever since.  Increases in nondefense research compensated for only part of the loss of defense funds (AAAS, 2014).

Conclusion

            The American system of higher education came into existence as a distinctly private good.  It arose in the nineteenth century to serve the pursuit of sectarian advantage and land speculation, and then in the twentieth century it evolved into a system for providing individual consumers a way to get ahead or stay ahead in the social hierarchy.  Quite late in the game, it took World War II to give higher education an expansive national mission and reconstitute it as a public good.  But hot wars are unsustainable for long, so in 1945 the system was sliding quickly back toward public irrelevance before it was saved by the timely arrival of the Cold War.  As I have shown, the Cold War was very, very good for the American system of higher education.  It produced a massive increase in funding by federal and state governments, both for university research and for college student subsidies, and – more critically – it sustained this support for a period of three decades.  But these golden years gradually gave way before a national wave of taxpayer fatigue and the surprise collapse of the Soviet Union.  With the nation strapped for funds and its global enemy dissolved, it no longer had an urgent need to enlist America’s colleges and universities in a grand national cause.  The result was a decade of declining research support and static student enrollments.  In 2002 the wars in Afghanistan and Iraq brought a momentary surge in both, but the surge peaked after only eight years and then went again into decline.  Increasingly, higher education is returning to its roots as a private good.

            So what are we to take away from this story of the rise and fall of the Cold War university?  One conclusion is that the golden age of the American university in the mid twentieth century was a one-off event.  Wars may be endemic, but the Cold War was unique.  So American university administrators and professors need to stop pining for a return to the good old days and learn how to live in the post-Cold-War era.  The good news is that the surge in public investment in higher education left the system in a radically stronger condition than it was in before World War II.  Enrollments have gone from 1.5 million to 21 million; federal research funding has gone from zero to $135 billion; federal grants and loans to college students have gone from zero to $170 billion (NCES, 2014, Table 303.10; AAAS, 2014; College Board, 2014, Table 1).  And the American system of colleges and universities went from an international also-ran to a powerhouse in the world economy of higher education.  Even though all of the numbers are now dropping, they are dropping from a very high level, which is the legacy of the Cold War.  So really, we should stop whining.  We should just say thanks to the bomb for all that it did for us and move on.

            The bad news, of course, is that the numbers really are going down.  Government funding for research is declining and there is no prospect for a turnaround in the foreseeable future.  This is a problem because the federal government is the primary source of funds for basic research in the U.S.; corporations are only interested in investing in research that yields immediate dividends.  During the Cold War, research universities developed a business plan that depended heavily on external research funds to support faculty, graduate students, and overhead.  That model is now broken.  The cost of pursuing a college education is increasingly being borne by the students themselves, as states are paying a declining share of the costs of higher education.  Tuition is rising and as a result student loans are rising.  Public research universities are in a particularly difficult position because their state funding is falling most rapidly.  According to one estimate, at the current rate of decline the average state fiscal support for public higher education will reach zero in 2059 (Mortenson, 2012). 

            But in the midst of all of this bad news, we need to keep in mind that the American system of higher education has a long history of surviving and even thriving under conditions of at best modest public funding.  At its heart, this is a system of higher education based not on the state but on the market.  In the hardscrabble nineteenth century, the system developed mechanisms for getting by without the steady support of funds from church or state.  It learned how to attract tuition-paying students, give them the college experience they wanted, get them to identify closely with the institution, and then milk them for donations after they graduate.  Football, fraternities, logo-bearing T-shirts, and fund-raising operations all paid off handsomely.  It learned how to adapt quickly to trends in the competitive environment, whether through the adoption of intercollegiate football, the establishment of research centers to capitalize on funding opportunities, or the provision of food courts and rock-climbing walls.  Public institutions have a long history of behaving much like private institutions because they were never able to count on continuing state funding.

            This system has worked well over the years.  Along with the Cold War, it has enabled American higher education to achieve an admirable global status.  By the measures of citations, wealth, drawing power, and Nobel prizes, the system has been very effective.  But it comes with enormous costs.  Private universities have serious advantages over public universities, as we can see from university rankings.  The system is the most stratified structure of higher education in the world.  Top universities in the U.S. get an unacknowledged subsidy from the colleges at the bottom of the hierarchy, which receive less public funding, charge less tuition, and receive less generous donations.  And students sort themselves into institutions whose position in the college hierarchy parallels their own position in the status hierarchy.  Students with more cultural and economic capital gain greater social benefit from the system than those with less, since they go to college more often, attend the best institutions, and graduate at a much higher rate.  Nearly everyone can go to college in the U.S., but the colleges that are most accessible provide the least social advantage.

            So, conceived and nurtured into maturity as a private good, the American system of higher education remains a market-based organism.  It took the threat of nuclear war to turn it – briefly – into a public good.  But these days seem as remote as the time when schoolchildren huddled together in a bomb shelter. 

References

American Association for the Advancement of Science. (2014). Historical Trends in Federal R & D: By Function, Defense and Nondefense R & D, 1953-2015.  http://www.aaas.org/page/historical-trends-federal-rd (accessed 8-21-14).

Bledstein, B. J. (1976). The Culture of Professionalism: The Middle Class and the Development of Higher Education in America. New York:  W. W. Norton.

Boorstin, D. J. (1965). Culture with Many Capitals: The Booster College. In The Americans: The National Experience (pp. 152-161). New York: Knopf Doubleday.

Brown, D. K. (1995). Degrees of Control: A Sociology of Educational Expansion and Occupational Credentialism. New York: Teachers College Press.

Carter, S. B., et al. (2006). Historical Statistics of the United States, Millennial Edition Online. New York: Cambridge University Press.

College Board. (2013). Trends in student aid, 2013. New York: The College Board.

College Board. (2014). Trends in Higher Education: Total Federal and Nonfederal Loans over Time.  https://trends.collegeboard.org/student-aid/figures-tables/growth-federal-and-nonfederal-loans-over-time (accessed 9-4-14).

Collins, R. (1979). The Credential Society: An Historical Sociology of Education and Stratification. New York: Academic Press.

Douglass, J. A. (2000). The California Idea and American Higher Education: 1850 to the 1960 Master Plan. Stanford, CA: Stanford University Press.

Garrett, T. A., & Rhine, R. M. (2006).  On the Size and Growth of Government. Federal Reserve Bank of St. Louis Review, 88:1 (pp. 13-30).

Geiger, R. L. (2004). To Advance Knowledge: The Growth of American Research Universities, 1900-1940. New Brunswick: Transaction.

Goldin, C. & Katz, L. F. (2008). The Race between Education and Technology. Cambridge: Belknap Press of Harvard University Press.

Institute of Higher Education, Shanghai Jiao Tong University.  (2013).  Academic Ranking of World Universities – 2013.  http://www.shanghairanking.com/ARWU2013.html (accessed 6-11-14).

Levine, D. O. (1986). The American College and the Culture of Aspiration, 1914-1940. Ithaca: Cornell University Press.

Loss, C. P.  (2011).  Between Citizens and the State: The Politics of American Higher Education in the 20th Century. Princeton, NJ: Princeton University Press.

Martin, I. W. (2008). The Permanent Tax Revolt: How the Property Tax Transformed American Politics. Stanford, CA: Stanford University Press.

McPherson, M. S. & Schapiro, M. O.  (1999).  Reinforcing Stratification in American Higher Education:  Some Disturbing Trends.  Stanford: National Center for Postsecondary Improvement.

Mortenson, T. G. (2012).  State Funding: A Race to the Bottom.  The Presidency (winter).  http://www.acenet.edu/the-presidency/columns-and-features/Pages/state-funding-a-race-to-the-bottom.aspx (accessed 10-18-14).

National Center for Education Statistics. (2014). Digest of Education Statistics, 2013. Washington, DC: US Government Printing Office.

National Science Board. (2012). Diminishing Funding Expectations: Trends and Challenges for Public Research Universities. Arlington, VA: National Science Foundation.

Potts, D. B. (1971).  American Colleges in the Nineteenth Century: From Localism to Denominationalism. History of Education Quarterly, 11: 4 (pp. 363-380).

President’s Commission on Higher Education. (1947). Higher Education for American Democracy: A Report. Washington, DC: US Government Printing Office.

Rüegg, W. (2004). European Universities and Similar Institutions in Existence between 1812 and the End of 1944: A Chronological List: Universities.  In Walter Rüegg, A History of the University in Europe, vol. 3. London: Cambridge University Press.

State Higher Education Executive Officers (SHEEO). (2013). State Higher Education Finance, FY 2012. www.sheeo.org/sites/default/files/publications/SHEF-FY12.pdf (accessed 9-8-14).

Terkel, S. (1997). The Good War: An Oral History of World War II. New York: New Press.

Tewksbury, D. G. (1932). The Founding of American Colleges and Universities before the Civil War. New York: Teachers College Press.

UC Data Analysis. (2014). UC Funding and Fees Analysis.  http://ucpay.globl.org/funding_vs_fees.php (accessed 9-2-14).

University of Virginia. (2014). Financing the University 101. http://www.virginia.edu/finance101/answers.html (accessed 9-2-14).

[1] Under pressure of the war effort, the department eventually relented and enlisted the help of chemists to study gas warfare.  But the initial response is telling.

[2] Not all of this funding went into the higher education system.  Some went to stand-alone research organizations such as the Rand Corporation and the American Institutes for Research.  But these organizations in many ways function as an adjunct to higher education, with researchers moving freely between them and the university.

Posted in Academic writing, Course Syllabus, Writing Class

Class on Academic Writing

This is the syllabus for a class on academic writing for clarity and grace, which I originally posted more than a year ago.  It is designed as a 10-week class, with weekly readings, slides, and texts for editing.  It’s aimed at doctoral students who are preparing to become researchers and publish their scholarship.  Ideally you can take the class with a group of peers, where you give each other feedback on your own writing projects in progress.  But you can also take the class by yourself.

Below is the syllabus, which includes links to all readings, class slides, and texts for editing.  Here’s a link to the Word document with all of the links, which is easier to work with.

I’ve also constructed a 6-week version of the class, which is aimed at graduate and undergraduate students who want to work on their writing for whatever purpose they choose.  Here’s a link to that syllabus as a Word document.

 

“The effort the writer does not put into writing, the reader has to put into reading.”

Stephen Toulmin

Academic Writing for Clarity and Grace

A Ten-Week Class

David Labaree                            

Web: http://www.stanford.edu/~dlabaree/

Twitter: @Dlabaree

Blog: https://davidlabaree.com/                                                     

                                                Course Description

            The title sounds like a joke, since academics (especially in the social sciences) do not have a reputation for writing with either clarity or grace, much less both.  But I hope in this class to draw students into my own and every other academic’s lifelong quest to become a better writer.  The course will bring in a wide range of reference works that I have found useful over the years in working on my own writing and in helping students with theirs.  The idea is not that a 10-week class will make students good writers; many of us have been working at this for 40 years or more and we’re just getting started.  Instead, the plan is to provide students with some helpful strategies, habits, and critical faculties; increase their sense of writing as an extended process of revision; and leave them with a set of books that will support them in their own lifelong pursuit of good writing.

This online course is based on one I used to teach at Stanford for graduate students in education who wanted to work on their writing.  It was offered in the ten-week format of the university’s quarter system, and I’m keeping that format.  But you can use it in any way that works for you. 

Some may want to treat it as a weekly class, doing the readings for each week, reviewing the PowerPoint slides for that week, and working through some of the exercises.  If you’re treating it this way, it would work best if you can do it with a writing group made up of other students with similar interests.  That way you can take advantage of the workshop component of the class, in which members of the group exchange sections of a paper they are working on, giving and receiving feedback.

Others may use it as a general source of information about writing, diving into particular readings or slide decks as needed.

Classes include some instruction on particular skills and particular aspects of the writing process:  developing an analytical angle on a subject; writing a good sentence; getting started in the writing process; working out the logic of the argument; developing the forms of validation for the argument; learning what your point is from the process of writing rather than as a precursor to writing; and revising, revising, revising.  We spend another part of the class working as a group doing exercises in spotting and fixing problems.  For these purposes we will use some helpful examples from the Williams book and elsewhere that focus on particular skills, but you can use the work produced within your own writing group. 

Work in your writing group:  Everyone needs to recognize the value of getting critical feedback from others on their work in progress, so you should be exchanging papers and working at editing each other’s work.  Student work outside of class will include reading required texts, editing other students’ work around particular areas of concern, and working on revising your own paper or papers.  Every week you will submit a piece of written work to your writing group, which will involve repeated efforts to edit a particular text of your own; and every week you will provide feedback to others in your group about their own texts.

Much of class time will focus on working on particular texts around a key issue of the day – like framing, wordiness, clarity, sentence rhythm.  These texts will be examples from the readings and also papers by students, on which they would like to get feedback from the class as a whole.  Topics will include things like:

  • Framing an argument, writing the introduction to a paper
  • Elements of rhetoric
  • Sentence rhythm and music
  • Emphasis – putting the key element at the end of a sentence or paragraph; delivering the punch line
  • Concision – eliminating wordiness
  • Clarity – avoiding nominalizations; opting for Anglo-Saxon words; clearing up murky syntax
  • Focusing on action and actors
  • Metaphor and imagery
  • Correct usage: punctuation, common grammatical errors, word use
  • Avoiding the most common academic tics: jargon, isms, Latinate constructions, nominalizations, abstraction, hiding from view behind passive voice and third person
  • The basics of making an argument
  • Using quotes – integrating them into your argument, and commenting on them instead of assuming they make the point on their own.
  • Using data – how to integrate data into a text and explain its meaning and significance
  • The relation of writing and thought
  • Revision – of writing and thinking
  • The relation of grammar and mechanics to rhetorical effect
  • Sentence style
  • The relation of style to audience
  • Disciplinary conventions for style, organization, modes of argument, evidence
  • Authority and voice

            Writing is a very personal process and the things we write are expressions of who we are, so it is important for everyone in the class to keep focused on being constructive in their comments and being tolerant of criticism from others.  Criticism from others is very important for writers, but no one likes it.  I have a ritual every time I get feedback on a paper or manuscript – whether blind reviews from journals or publishers or personal comments from colleagues.  I let the review sit for a while until I’m in the right mood.  Then I open it and skim it quickly to get the overall impression of how positive or negative it is.  At that point I set it aside, cursing the editors for sending the paper to such an incompetent reviewer or reconsidering my formerly high opinion of the particular colleague-critic; then I finally come back a few days later (after a vodka or two) to read the thing carefully and assess the damage.  Neurotic, I know, but most writers are neurotic about their craft.  It’s hard not to take criticism personally.  Beyond all reason, I always expect the reviewers to say, “Don’t change a word; publish it immediately!”  But somehow they never do.  So I’m asking all members of the class both to recognize the vulnerability of their fellow writers and to open themselves up to the criticism of these colleagues in the craft.

Course Texts

Books listed with an * are ones where older editions are available; it’s ok to use one of these editions instead of the most recent version.

*Williams, Joseph M. & Bizup, Joseph.  (2016). Style: Lessons in clarity and grace (12th ed.).  New York: Longman.  

*Becker, Howard S.  (2007).  Writing for social scientists:  How to start and finish your thesis, book, or article (2nd ed.).  Chicago: University of Chicago Press.

*Graff, Gerald, & Birkenstein, Cathy. (2014). “They say, I say:” The moves that matter in academic writing (3rd ed.). New York: Norton.

Sword, Helen.  (2012).  Stylish academic writing. Cambridge: Harvard University Press.

*Garner, Bryan A.  (2016). Garner’s modern English usage (4th ed.). New York: Oxford University Press.  (Any earlier edition is fine to use.)

Other required readings are available in PDF on a Google drive. 

Course Outline

Week 1:  Introduction to Course; Writing Rituals; Writing Well, or at Least Less Badly

Zinsser, William. (2010). Writing English as a second language.  Point of Departure (Winter). Americanscholar.org.

Munger, Michael C. (2010). 10 tips for how to write less badly. Chronicle of Higher Education (Sept. 6).  Chronicle.com.

Lepore, Jill. (2009). How to write a paper for this class. History Department, Harvard University.

Lamott, Anne. (2005). Bird by bird: Some instructions on writing and life. In English 111 Reader.  Miami University Department of English.

Zuckerman, Ezra W. (2008). Tips to article writers. http://web.mit.edu/ewzucker/www/Tips%20to%20article%20writers.pdf.

Slides for week 1 class

Week 2:  Clarity

Williams, Joseph M. & Bizup, Joseph.  (2016).  Style: Lessons in clarity and grace (12th ed.).  New York: Longman. Lessons One, Two, Three, Four, Five, and Six.  It’s ok to use any earlier edition of this book.

Slides for week 2 class

Week 3:  Structuring the Argument in a Paper

Graff, Gerald, & Birkenstein, Cathy. (2014). “They say, I say:” The moves that matter in academic writing (3rd ed.). New York: Norton.  You can use any earlier edition of this book.

Wroe, Ann. (2011). In the beginning was the sound. Intelligent Life Magazine, Spring. http://moreintelligentlife.com/content/arts/ann-wroe/beginning-was-sound.

Slides for week 3 class

Week 4:  Grace

Williams, Joseph M. & Bizup, Joseph.  (2016).  Style: Lessons in clarity and grace (12th ed.).  New York: Longman. Lessons Seven, Eight, and Nine.

Orwell, George. (1946). Politics and the English Language. Horizon.

Lipton, Peter. (2007). Writing Philosophy.

Slides for week 4 class

Week 5:  Stylish Academic Writing

Sword, Helen.  (2012).  Stylish academic writing. Cambridge: Harvard University Press.

Check out Helen Sword’s website, Writer’s Diet, which allows you to paste in a text of your own and get back an analysis of how flabby or fit it is: http://www.writersdiet.com/WT.php.

Haslett, Adam. (2011). The art of good writing. Financial Times (Jan. 22).  Ft.com.

Slides for week 5 class

Week 6:  Writing in the Social Sciences

Becker, Howard S.  (2007).  Writing for social scientists:  How to start and finish your thesis, book, or article (2nd ed.).  Chicago: University of Chicago Press.  It’s fine to use any earlier edition of this book.

Slides for week 6 class

Week 7:  Usage

Garner, Bryan A.  (2016). Garner’s modern English usage (4th ed.). New York: Oxford University Press.  Selections.  Any earlier edition of this book is fine to use.

Wallace, David Foster. (2001). Tense present: Democracy, English, and the wars over usage. Harper's (April), 39-58.

Slides for week 7 class

Week 8:  Writing with Clarity and Grace

Limerick, Patricia. (1993). Dancing with professors: The trouble with academic prose.

Brauer, Scott. (2014). Writing instructor, skeptical of automated grading, pits machine vs. machine. Chronicle of Higher Education, April 28.

Pinker, Steven. (2014). Why academics stink at writing. Chronicle of Higher Education, Sept. 26.

Labaree, David F. (2018). The Five-Paragraph Fetish. Aeon.

Slides for week 8 class

Week 9:  Clarity of Form

Williams, Joseph M. & Bizup, Joseph.  (2016).  Style: Lessons in clarity and grace (12th ed.).  New York: Longman. Lessons Ten, Eleven, and Twelve.

Yagoda, Ben. (2011). The elements of clunk. Chronicle of Higher Education (Jan. 2).  Chronicle.com.

Slides for week 9 class

Week 10:  Writing with Clarity and Grace

March, James G. (1975). Education and the pursuit of optimism. Texas Tech Journal of Education, 2:1, 5-17.

Gladwell, Malcolm. (2000). The art of failure: Why some people choke and others panic. New Yorker (Aug. 21 and 28).  Gladwell.com

Labaree, David F. (2012). Sermon on educational research. Bildungsgeschichte: International Journal for the Historiography of Education, 2:1, 78-87.

Slides for week 10 class

Posted in Capitalism, Higher Education, Meritocracy, Politics

Sandel: The Tyranny of Merit

This post is a reflection on Michael Sandel’s new book, The Tyranny of Merit: What’s Become of the Common Good?  He’s a philosopher at Harvard and this is his analysis of the dangers posed by the American meritocracy.  The issue is one I’ve been exploring here for the last two years in a variety of posts (here, here, here, here, here, here, and here.)

I find Sandel’s analysis compelling, both in the ways it resonates with other takes on the subject and also in his distinctive contributions to the discussion.  My only complaint is that the whole discussion could have been carried out more effectively in a single magazine article.  The book tends to be repetitive, and it also gets into the weeds on some philosophical issues that blur its focus and undercut its impact.  Here I present what I think are the key points.  I hope you find it useful.

Sandel Cover

The good news and the bad news about meritocracy both stem from its promise of opportunity for all based on individual merit rather than the luck of birth.  It's hard to hate a principle that frees us from the tyranny of inheritance. 

The meritocratic ideal places great weight on the notion of personal responsibility. Holding people responsible for what they do is a good thing, up to a point. It respects their capacity to think and act for themselves, as moral agents and as citizens. But it is one thing to hold people responsible for acting morally; it is something else to assume that we are, each of us, wholly responsible for our lot in life.

The problem is that simply calling the new model of status attainment “achievement” rather than “ascription” doesn’t mean that your ability to get ahead is truly free of circumstances beyond your control.  

But the rhetoric of rising now rings hollow. In today’s economy, it is not easy to rise. Americans born to poor parents tend to stay poor as adults. Of those born in the bottom fifth of the income scale, only about one in twenty will make it to the top fifth; most will not even rise to the middle class. It is easier to rise from poverty in Canada or Germany, Denmark, and other European countries than it is in the United States.

The meritocratic faith argues that the social structure of inequality provides a powerful incentive for individuals to work hard to get ahead in order to escape from a bad situation and move on to something better.  The more inequality, such as in the US, the more incentive to move up.  The reality, however, is quite different.

But today, the countries with the highest mobility tend to be those with the greatest equality. The ability to rise, it seems, depends less on the spur of poverty than on access to education, health care, and other resources that equip people to succeed in the world of work.

Sandel goes on to point out additional problems with meritocracy beyond the difficulties in trying to get ahead all on your own: 1) demoralizing the losers in the race; 2) denigrating those without a college degree; and 3) turning politics into the realm of the expert rather than the citizen.

The tyranny of merit arises from more than the rhetoric of rising. It consists in a cluster of attitudes and circumstances that, taken together, have made meritocracy toxic. First, under conditions of rampant inequality and stalled mobility, reiterating the message that we are responsible for our fate and deserve what we get erodes solidarity and demoralizes those left behind by globalization. Second, insisting that a college degree is the primary route to a respectable job and a decent life creates a credentialist prejudice that undermines the dignity of work and demeans those who have not been to college; and third, insisting that social and political problems are best solved by highly educated, value-neutral experts is a technocratic conceit that corrupts democracy and disempowers ordinary citizens.

Consider the first point. Meritocracy fosters triumphalism for the winners and despair for the losers.  If you succeed or fail, you alone get the credit or the blame.  This was not the case in the bad old days of aristocrats and peasants.

If, in a feudal society, you were born into serfdom, your life would be hard, but you would not be burdened by the thought that you were responsible for your subordinate position. Nor would you labor under the belief that the landlord for whom you toiled had achieved his position by being more capable and resourceful than you. You would know he was not more deserving than you, only luckier.

If, by contrast, you found yourself on the bottom rung of a meritocratic society, it would be difficult to resist the thought that your disadvantage was at least partly your own doing, a reflection of your failure to display sufficient talent and ambition to get ahead. A society that enables people to rise, and that celebrates rising, pronounces a harsh verdict on those who fail to do so.

This triumphalist aspect of meritocracy is a kind of providentialism without God, at least without a God who intervenes in human affairs. The successful make it on their own, but their success attests to their virtue. This way of thinking heightens the moral stakes of economic competition. It sanctifies the winners and denigrates the losers.

One key issue that makes meritocracy potentially toxic is its assumption that we deserve the talents that earn us such great rewards.

There are two reasons to question this assumption. First, my having this or that talent is not my doing but a matter of good luck, and I do not merit or deserve the benefits (or burdens) that derive from luck. Meritocrats acknowledge that I do not deserve the benefits that arise from being born into a wealthy family. So why should other forms of luck—such as having a particular talent—be any different? 

Second, that I live in a society that prizes the talents I happen to have is also not something for which I can claim credit. This too is a matter of good fortune. LeBron James makes tens of millions of dollars playing basketball, a hugely popular game. Beyond being blessed with prodigious athletic gifts, LeBron is lucky to live in a society that values and rewards them. It is not his doing that he lives today, when people love the game at which he excels, rather than in Renaissance Florence, when fresco painters, not basketball players, were in high demand.

The same can be said of those who excel in pursuits our society values less highly. The world champion arm wrestler may be as good at arm wrestling as LeBron is at basketball. It is not his fault that, except for a few pub patrons, no one is willing to pay to watch him pin an opponent’s arm to the table.

He then moves on to the second point, about the central role of college in determining who’s got merit. 

Should colleges and universities take on the role of sorting people based on talent to determine who gets ahead in life?

There are at least two reasons to doubt that they should. The first concerns the invidious judgments such sorting implies for those who get sorted out, and the damaging consequences for a shared civic life. The second concerns the injury the meritocratic struggle inflicts on those who get sorted in and the risk that the sorting mission becomes so all-consuming that it diverts colleges and universities from their educational mission. In short, turning higher education into a hyper-competitive sorting contest is unhealthy for democracy and education alike.

Predicting which talents are most socially beneficial is especially difficult for the complex array of skills that people pick up in college.  Which ones matter most for determining a person's ability to make an important contribution to society and which don't?  How do we know if an elite college provides more of those skills than an open-access college?  This matters because a graduate from the former gets a much higher reward than one from the latter.  The pretense that a prestigious college degree is the best measure of future performance is particularly hard to sustain because success and degree are conflated.  Graduates of top colleges get the best jobs and thus seem to have the greatest impact, whereas non-grads never get the chance to show what they can do.

Another sports analogy helps to make this point.

Consider how difficult it is to assess even more narrowly defined talents and skills. Nolan Ryan, one of the greatest pitchers in the history of baseball, holds the all-time record for most strikeouts and was elected on the first ballot to baseball’s Hall of Fame. When he was eighteen years old, he was not signed until the twelfth round of the baseball draft; teams chose 294 other, seemingly more promising players before he was chosen. Tom Brady, one of the greatest quarterbacks in the history of football, was the 199th draft pick. If even so circumscribed a talent as the ability to throw a baseball or a football is hard to predict with much certainty, it is folly to think that the ability to have a broad and significant impact on society, or on some future field of endeavor, can be predicted well enough to justify fine-grained rankings of promising high school seniors.

And then there’s the third point, the damage that meritocracy does to democratic politics.  One element of this is that it turns politics into an arena for credentialed experts, consigning ordinary citizens to the back seat.  How many political leaders today are without a college degree?  Vanishingly few.  Another is that meritocracy not only bars non-grads from power but also bars them from social respect.  

Grievances arising from disrespect are at the heart of the populist movement that has swept across Europe and the US.  Sandel calls this a “politics of humiliation.”

The politics of humiliation differs in this respect from the politics of injustice. Protest against injustice looks outward; it complains that the system is rigged, that the winners have cheated or manipulated their way to the top. Protest against humiliation is psychologically more freighted. It combines resentment of the winners with nagging self-doubt: perhaps the rich are rich because they are more deserving than the poor; maybe the losers are complicit in their misfortune after all.

This feature of the politics of humiliation makes it more combustible than other political sentiments. It is a potent ingredient in the volatile brew of anger and resentment that fuels populist protest.

Sandel draws on a wonderful book by Arlie Hochschild, Strangers in Their Own Land, in which she interviews Trump supporters in Louisiana.

Hochschild offered this sympathetic account of the predicament confronting her beleaguered working-class hosts:

You are a stranger in your own land. You do not recognize yourself in how others see you. It is a struggle to feel seen and honored. And to feel honored you have to feel—and feel seen as—moving forward. But through no fault of your own, and in ways that are hidden, you are slipping backward.

One consequence of this for those left behind is a rise in “deaths of despair.”

The overall death rate for white men and women in middle age (ages 45–54) has not changed much over the past two decades. But mortality varies greatly by education. Since the 1990s, death rates for college graduates declined by 40 percent. For those without a college degree, they rose by 25 percent. Here then is another advantage of the well-credentialed. If you have a bachelor’s degree, your risk of dying in middle age is only one quarter of the risk facing those without a college diploma. 

Deaths of despair account for much of this difference. People with less education have long been at greater risk than those with college degrees of dying from alcohol, drugs, or suicide. But the diploma divide in death has become increasingly stark. By 2017, men without a bachelor’s degree were three times more likely than college graduates to die deaths of despair.

Sandel offers two relatively modest reforms that might help mitigate the tyranny of meritocracy.  One focuses on elite college admissions.  

Of the 40,000-plus applicants, winnow out those who are unlikely to flourish at Harvard or Stanford, those who are not qualified to perform well and to contribute to the education of their fellow students. This would leave the admissions committee with, say, 30,000 qualified contenders, or 25,000, or 20,000. Rather than engage in the exceedingly difficult and uncertain task of trying to predict who among them are the most surpassingly meritorious, choose the entering class by lottery. In other words, toss the folders of the qualified applicants down the stairs, pick up 2,000 of them, and leave it at that.

This helps get around two problems:  the difficulty in trying to predict merit; and the outsize rewards of a winner-take-all admissions system.  But good luck trying to get this put in place over the howls of outrage from upper-middle-class parents, who have learned how to game the system to their advantage.  Consider this one small example of the reaction when an elite Alexandria high school proposed random admission from a pool of the most qualified.

Another reform is more radical and even harder to imagine putting into practice.  It begins with reconsideration of what we mean by the “common good.”

The contrast between consumer and producer identities points to two different ways of understanding the common good. One approach, familiar among economic policy makers, defines the common good as the sum of everyone’s preferences and interests. According to this account, we achieve the common good by maximizing consumer welfare, typically by maximizing economic growth. If the common good is simply a matter of satisfying consumer preferences, then market wages are a good measure of who has contributed what. Those who make the most money have presumably made the most valuable contribution to the common good, by producing the goods and services that consumers want.

A second approach rejects this consumerist notion of the common good in favor of what might be called a civic conception. According to the civic ideal, the common good is not simply about adding up preferences or maximizing consumer welfare. It is about reflecting critically on our preferences—ideally, elevating and improving them—so that we can live worthwhile and flourishing lives. This cannot be achieved through economic activity alone. It requires deliberating with our fellow citizens about how to bring about a just and good society, one that cultivates civic virtue and enables us to reason together about the purposes worthy of our political community.

If we can carry out this deliberation — a big if indeed — then we can proceed to implement a system for shifting the basis for individual compensation from what the market is willing to pay to what we collectively feel is most valuable to society.  

Thinking about pay, most would agree that what people make for this or that job often overstates or understates the true social value of the work they do. Only an ardent libertarian would insist that the wealthy casino magnate’s contribution to society is a thousand times more valuable than that of a pediatrician. The pandemic of 2020 prompted many to reflect, at least fleetingly, on the importance of the work performed by grocery store clerks, delivery workers, home care providers, and other essential but modestly paid workers. In a market society, however, it is hard to resist the tendency to confuse the money we make with the value of our contribution to the common good.

To implement a system based on public benefit rather than marketability would require completely revamping our structure of determining salaries and taxes. 

The idea is that the government would provide a supplementary payment for each hour worked by a low-wage employee, based on a target hourly-wage rate. The wage subsidy is, in a way, the opposite of a payroll tax. Rather than deduct a certain amount of each worker’s earnings, the government would contribute a certain amount, in hopes of enabling low-income workers to make a decent living even if they lack the skills to command a substantial market wage.

Generally speaking, this would mean shifting the tax burden from work to consumption and speculation. A radical way of doing so would be to lower or even eliminate payroll taxes and to raise revenue instead by taxing consumption, wealth, and financial transactions. A modest step in this direction would be to reduce the payroll tax (which makes work expensive for employers and employees alike) and make up the lost revenue with a financial transactions tax on high-frequency trading, which contributes little to the real economy.

This is how Sandel ends his book:

The meritocratic conviction that people deserve whatever riches the market bestows on their talents makes solidarity an almost impossible project. For why do the successful owe anything to the less-advantaged members of society? The answer to this question depends on recognizing that, for all our striving, we are not self-made and self-sufficient; finding ourselves in a society that prizes our talents is our good fortune, not our due. A lively sense of the contingency of our lot can inspire a certain humility: “There, but for the grace of God, or the accident of birth, or the mystery of fate, go I.” Such humility is the beginning of the way back from the harsh ethic of success that drives us apart. It points beyond the tyranny of merit toward a less rancorous, more generous public life.