Posted in Democracy, Inequality, Meritocracy, Public Good

What the Old Establishment Can Teach the New Tech Elite

It is unlikely that Mark Zuckerberg, Jeff Bezos and the other lords and ladies of Silicon Valley spend any time in English churchyards. But if they were to visit these delightfully melancholic places, the first things that they would encounter would be monuments to the fallen of the Great War. Their initial emotion, like anybody else’s looking at these morbid plinths, would rightly be one of relief. It is good that the West’s young men are no longer herded into uniform and marched toward machine guns.

If they looked harder, however, today’s elite would spot something else in these cemeteries. The whole of society is commemorated in stone: The baronet’s heir was shot to pieces in Flanders alongside the gamekeeper’s son. Recall that in the controversial D.H. Lawrence novel “Lady Chatterley’s Lover,” Lady Chatterley is driven into the arms of the local gamekeeper in part because her husband, Sir Clifford, was paralyzed from the waist down in the Great War.

Such monuments to the dead, which can be found across Europe, are a reminder that a century ago the elite, whatever its other sins, believed in public service. The rich shared common experiences with the poor, rooted in a common love of their country and a common willingness to sacrifice life and limb for something bigger.

That bond survived until the 1960s. Most young men in Europe did a version of what was called “national service”: They had to serve in the armed forces for a couple of years and learned the rudiments of warfare in case total war struck again. The U.S. called on people of all classes to fight in World War II—including John F. Kennedy and George H.W. Bush, who were both nearly killed serving their country—and the Korean War.

The economic elites and the political elites were intertwined. In Britain, a “magic circle” of Old Etonians helped choose the leader of the Conservative Party, convening over lunch at the Beefsteak Club or dinner at Pratt’s to discuss the fate of the nation, as well as the quality of that year’s hunting. What became the European Union was constructed behind closed doors by the continent’s ruling class, while Charles de Gaulle set up the Ecole Nationale d’Administration for the purpose of training a new ruling elite for a new age. American presidents turned to “wise men” of the East Coast Establishment, such as Averell Harriman, the son of a railroad tycoon, or one of the Rockefellers. The “best and the brightest” were supposed to do a stint in Washington.

A memorial to soldiers who died in the two world wars, Oxfordshire, U.K.

PHOTO: TIM GRAHAM/GETTY IMAGES

The Establishment on both sides of the Atlantic was convinced that good government mattered more than anything else. Mess up government and you end up with the Depression and Hitler.

That sense has gone. The New Establishment of Wall Street and the City of London and the New New Establishment of Silicon Valley have precious little to do with Washington or Whitehall. The public sector is for losers. As today’s elite see it, the best thing that government can do is to get out of the way of the really talented people and let them exercise their wealth-creating magic. Pester them too much or tax them too heavily and they will pick up their sticks and take their game elsewhere.

As for common experiences, the smart young people who go from the Ivy League or Oxbridge to work at Google or Goldman Sachs are often as distant from the laboring masses as the class that H.G. Wells, in “The Time Machine,” called the Eloi—pampered, ethereal, childlike creatures that the time traveler discovers at the end of his long journey into the future. Separated from the masses by elite education and pricey lifestyles in fashionable enclaves, today’s elite often have few ties to the country they work in. One former British spy points out that his children are immensely better educated than he was and far more tolerant, but the only time they meet the working class is when their internet shopping arrives; they haven’t shared a barracks with them.

Does this matter? Again, many will point to progress. The old elite was overwhelmingly male and white (with a few exceptions, such as Lady Violet Bonham Carter and Katharine Graham, who often wielded power through dinner parties). It often made a hash of things. Britain’s “magic circle” didn’t cope well with the swinging ’60s—most catastrophically with the Profumo sex scandal, which greatly damaged the Conservative Party—while America’s whiz kids hardly excelled in Vietnam. By the 1960s, the very term “The Establishment” had become an insult.

Modern money is also far cleaner than old money. The officers who were mowed down at the Somme often came from grand homes, but those homes were built with the grubby proceeds of coal, slavery and slaughter. (Clifford Chatterley, in his wife’s view, treated miners “as objects rather than men.”) Say what you like against monopolistic tech barons, greedy hedge-fund managers or tax-dodging real estate tycoons, they aren’t sinners in the same league. Men like Mr. Bezos and Mr. Zuckerberg build great businesses and often give away their money to worthy causes. What more should they do?

Quite a lot, actually.

Lieutenant John F. Kennedy, right, and his PT 109 crew in the South Pacific, July 1943.

PHOTO: ASSOCIATED PRESS

The idea that the elite has a responsibility to tend to the state was brilliantly set out by Plato more than 2,000 years ago. In “The Republic” he likened the state to a ship that can easily founder on the rocks or head in the wrong direction. He argued that for a voyage to succeed, you need a captain who has spent his life studying “the seasons of the years, the skies, the stars and other professional subjects.” He wanted to trust his state to a group of Guardians, selected for their wisdom and character and trained, through an austere and demanding education, in the arts of government.

Covid-19 is a wake-up call for the West, especially for its elite. This year could mark a reversal in history. Five hundred years ago, Europe was a bloody backwater while China was the most advanced country in the world, with the world’s most sophisticated civil service, selected by rigorous examination from across the whole country. The West overtook the East because its leaders mastered the art of government, producing a succession of powerful innovations—the nation-state, the liberal state, the welfare state—while the Chinese state ossified, its Mandarin elite unaware that it was even in competition with anyone else. By the 1960s, America was putting a man on the moon while millions of Chinese were dying of starvation.

Since the 1960s, however, this process has been reversed. Led by Singapore, Asia has been improving its state machinery while the West has ossified. Covid-19 shows just how far this change in the balance of competence has gone. Countries like South Korea, Singapore and even China have done far better at protecting their citizens than either the U.S. or Britain, where governments have conspicuously failed to work.

The elite bears much of the responsibility for this sorry state of affairs. The 1960s were the last time that it had a marked sense of public duty. What followed might be called the great abandonment. The Vietnam War discredited “wise men” such as McGeorge Bundy, a self-styled Platonic Guardian who served as national security adviser to both JFK and LBJ. The Establishment split into warring tribes of progressives and conservatives who were so divided by the culture wars that they seldom came together to fix anything. The explosion of pay in the private sector drew talent away from government. The constant refrain from the Right that the state is a parasite on the productive economy eroded what remained of the public ethic, while the Left, drugged by its ties to public sector unions, lost its appetite for reform. Government became a zombie, preserved and indeed inflated by its staff and clients, but robbed of ideas and talent.

National Service recruits in the U.K. line up to be issued caps, 1953.

PHOTO: POPPERFOTO/GETTY IMAGES

The difference with the East is marked. Singapore has put a Platonic premium on public service. It recruits the brightest young people for the government, makes sure they move frequently between the public and private sectors, and pays them well: Its top civil servants can earn more than a million dollars a year. (It stops short of forbidding its Guardians to marry and laying on orgies for them, as Plato advised, but it does force them to live in public housing.) Other Asian dragons have recruited a cadre of elite civil servants. China’s attempt to follow suit is complicated by the corruption and secrecy that surround the regime, but at its best it is learning from Singapore, creating a new class of mandarins, this time trained in technical fields and science rather than the classics.

What could the West do to rebind the elite to the state? Better pay for civil servants is one answer, especially if it comes with a keenness to shed poor performers in the public sector, as Singapore does. The idea of giving students generous university scholarships in exchange for working for the civil service for a number of years was pioneered by Thomas Jefferson. An even more ambitious idea would be to reintroduce nonmilitary national service, an idea that Emmanuel Macron has raised for France.

But the biggest change that is needed is a change of mind-set. Unlike the dead aristocrats in the churchyards, the geeks who run Google and Facebook have no sense of guilt to give them pause and few ties of blood and soil to connect them to a particular patch of land. They believe that their fortunes are the product of nothing but their own innate genius. They owe the rest of us nothing.

This needs to change. Over the past decade both the Democratic Party and the Republican Party have been shaken by the forces of populism. The shaking will only get worse if the elites don’t play a more active role in politics. Since the Covid-19 outbreak, we have been reminded that good government can make the difference between life and death. Look at the two cities where the Western elite feel most at home: New York has lost more than 20,000 people, London 6,000 (at times the mortality rate was higher than during the Blitz). By contrast, in Seoul, a bigger city with subways, nightclubs and everything else, only around 30 have died.

We live in a knowledge economy. For elites, exercising social responsibility should mean more than giving away money, though that is an admirable thing. It should mean sharing your brain—serving, not just giving. Michael Bloomberg did that as mayor of New York during the difficult decade after 9/11 (disclosure: Mr. Bloomberg employs one of us), and Bill Gates is the greatest philanthropist of his time not just because of the amount of money he has spent but because he devotes so much time to designing and driving his philanthropic work.

The habit must be set from early adulthood. More bright young things need to remember John F. Kennedy’s call to duty and think not of what their country can do for them but of what they can do for their country. If more of the young flowing out of the Ivy League and Oxbridge worked in the public sector, its technology wouldn’t be so shoddy and its ethos so sluggish.

There is a twist in the dystopian tale that H.G. Wells told in “The Time Machine” more than a century ago. The Eloi seem to live wonderful lives. They frolic above the ground, subsisting on a diet of fruit and living in futuristic (if deteriorating) buildings, while the Morlocks, brutish, apelike creatures, lurk underground, tending machinery and occasionally surfacing to feed and clothe the Eloi. But this is an illusion. The Morlocks are in fact farming the Eloi as a food source, just as we farm cattle, sheep and pigs.

Unless the ethic of public service is once again reignited, the American world order will ossify, just as other empires did before it. That is the message today’s Eloi should take from English churchyards.

Mr. Micklethwait is the editor in chief of Bloomberg and Mr. Wooldridge is the political editor of The Economist. This essay is adapted from their new book, “The Wake Up Call: Why the Pandemic Has Exposed the Weakness of the West, and How to Fix It,” published by Harper Via (which, like The Wall Street Journal, is owned by News Corp).

Posted in Higher Education

Kroger: In Praise of American Higher Education

This post is my effort to be upbeat for a change, looking at what’s good about US education.  It’s a recent essay by John Kroger, “In Praise of American Higher Education,” which was published in Inside Higher Ed.  Here’s a link to the original.  

Hope you enjoy it.  All is not bleak.

In Praise of American Higher Education

By John Kroger

September 14, 2020

These are grim times, filled with bad news. Nationally, the death toll from COVID-19 has passed 190,000. Political polarization has reached record levels, with some scholars openly fearing a fascist future for America. In my hometown of Portland, Ore., we have been buffeted by business closures, violent clashes between protesters and police, and out-of-control wildfires that have killed an unknown number of our fellow citizens, destroyed over a thousand homes and filled our streets with smoke. And in the higher education community, we are struggling. Our campuses are now COVID-19 hot spots, hundreds of institutions have implemented layoffs and furloughs impacting a reported 50,000 persons, and many commentators predict a complete financial meltdown for the sector. As I started to write this essay, a friend asked, “Is there any good news to report?”

In America today, we love to bash higher education. The negative drumbeat is incessant. Tuition, we hear, is too high. Students have to take too many loans. College does not prepare students for work. Inequality and racism are widespread. Just look at recent book titles: “The Breakdown of Higher Education”; “Crisis in Higher Education”; “Intro to Failure”; “The Quiet Crisis: How Higher Education Is Failing America”; “Higher Education Under Fire”; “The Dream Is Over”; “Cracks in the Ivory Tower”; “The Moral Mess of Higher Education”; and “The Coddling of the American Mind.” Jeesh.

So, for good news today, I want to remind everyone that despite all the criticism, the United States possesses a remarkable higher education system. Yes, we have our problems, which we need to address. The government and our colleges and universities need to partner to expand access to college, make it more affordable and decrease loan burdens; we need to ensure that our students graduate with valuable job skills; we need to tackle inequality and systemic racism in admission, hiring and the curriculum. But let us not lose sight of the remarkable things we have achieved and the very real strengths our system possesses — the very strengths that will allow us to tackle and solve the problems we have identified. Consider the following:

The United States has, by far, the largest number of great universities in the world. In the latest Times World University Rankings, the United States is dominant, possessing 14 of the top 20 universities in the world. These universities — places like Yale, UC Berkeley and Johns Hopkins — provide remarkable undergraduate and graduate educations combined with world-leading research outcomes. That reputation for excellence has made the United States the international gold standard for higher education.

We provide remarkable value to our students. As a recent Brookings Institution report noted, “Higher education provides extensive benefits to students, including higher wages, better health, and a lower likelihood of requiring disability payments. A population that is more highly educated also confers wide-ranging benefits to the economy, such as lower rates of unemployment and higher wages even for workers without college degrees. A postsecondary degree can also serve as a buffer against unemployment during economic downturns. Those with postsecondary degrees saw more steady employment through the Great Recession, and the vast majority of net jobs created during the economic recovery went to college-educated workers.”

Our higher education capacity is massive. At last count, almost 20 million students are enrolled in college. This is one reason we are fourth (behind Canada, Japan and South Korea) out of all OECD nations in higher education degree attainment, far ahead of nations like Germany and France. If we believe that mass education is critical to the future of our economy and democracy, this high number — and the fact that most of our institutions could easily grow — should give us great hope.

The United States dominates global research (though China is gaining). As The Economist reported in 2018, “Since the first Nobel prizes were bestowed in 1901, American scientists have won a whopping 269 medals in the fields of chemistry, physics and physiology or medicine. This dwarfs the tallies of America’s nearest competitors, Britain (89), Germany (69) and France (31).” In a recent global ranking of university innovation — “a list that identifies and ranks the educational institutions doing the most to advance science, invent new technologies and power new markets and industries” — U.S. institutions grabbed eight out of the top 10 spots.

We possess an amazing network of community colleges offering very low-cost, high-quality foundational and continuing education to virtually every American. No matter where you live in the United States, a low-cost community college and a world of learning is just a few miles away. This network provides a great foundation for our effort to expand economic opportunity and reach underserved populations. As Secretary of Education Arne Duncan once remarked, “About half of all first-generation college students and minority students attend community colleges. It is a remarkable record. No other system of higher education in the world does so much to provide access and second-chance opportunities as our community colleges.”

We are nimble. Though higher education is often bashed for refusing to change, our ability to do so is remarkable. When COVID-19 broke out in spring 2020, almost every U.S. college and university pivoted successfully to online education in a matter of weeks. Faculty, staff and administrators, often criticized for failing to work together, collectively made this happen overnight. Now, no matter what the future holds, our colleges and universities have the ability to deliver education effectively through both traditional in-person and new online models.

We have a great tradition, starting with the GI Bill, of federal government support for college education. No one in Congress is calling for an end to Pell Grants, one of the few government programs to enjoy overwhelming bipartisan government support in this highly fractured political era. Instead, the only question is the degree to which those grants need to increase and whether that increase should be linked to cost containment by institutions or not. This foundation of political support is vital as we look to ways to expand college access and affordability.

Finally, we have amazing historically Black colleges and universities, with excellent academic programs, outstanding faculty and proud histories. As the nation begins to confront its history of racism and discrimination, these institutions provide a remarkable asset to help the nation come to terms with its past, provide transformational education in the present and move toward a better future.

So, as we go through tough times, and we continue to subject our institutions to necessary and valuable self-criticism, it is important to keep our failures and limitations in perspective. Yes, American higher education could be better. But it is remarkable, valuable and praiseworthy all the same.

Posted in Higher Education, History of education, Organization Theory, Sociology

College: What Is It Good For?

This post is the text of a lecture I gave in 2013 at the annual meeting of the John Dewey Society.  It was published the following year in the Society’s journal, Education and Culture.  Here’s a link to the published version.           

The story I tell here is not a philosophical account of the virtues of the American university but a sociological account of how those virtues arose as unintended consequences of a system of higher education that arose for less elevated reasons.  Drawing on the analysis in the book I was writing at the time, A Perfect Mess, I show how the system emerged in large part out of two impulses that had nothing to do with advancing knowledge.  One was the competition among religious groups, seeking to plant the denominational flag on the growing western frontier and provide clergy for the newly arriving flock.  Another was the competition among frontier towns to attract settlers who would buy land, using a college as a sign that this town was not just another dusty farm village but a true center of culture.

The essay then goes on to explore how the current positive social benefits of the US higher ed system are supported by the peculiar institutional form that characterizes American colleges and universities. 

My argument is that the true hero of the story is the evolved form of the American university, and that all the good things like free speech are the side effects of a structure that arose for other purposes.  Indeed, I argue that the institution – an intellectual haven in a heartless utilitarian world – depends on attributes that we would publicly deplore:  opacity, chaotic complexity, and hypocrisy.

In short, I’m portraying the system as one that is infused with irony, from its early origins through to its current functions.  Hope you enjoy it.

A Perfect Mess Cover

College — What Is It Good For?

David F. Labaree

            I want to say up front that I’m here under false pretenses.  I’m not a Dewey scholar or a philosopher; I’m a sociologist doing history in the field of education.  And the title of my lecture is a bit deceptive.   I’m not really going to talk about what college is good for.  Instead I’m going to talk about how the institution we know as the modern American university came into being.  As a sociologist I’m more interested in the structure of the institution than in its philosophical aims.  It’s not that I’m opposed to these aims.  In fact, I love working in a university where these kinds of pursuits are open to us:   Where we can enjoy the free flow of ideas; where we explore any issue in the sciences or humanities that engages us; and where we can go wherever the issue leads without worrying about utility or orthodoxy or politics.  It’s a great privilege to work in such an institution.  And this is why I want to spend some time examining how this institution developed its basic form in the improbable context of the United States in the nineteenth century. 

            My argument is that the true hero of the story is the evolved form of the American university, and that all the good things like free speech are the side effects of a structure that arose for other purposes.  Indeed, I argue that the institution – an intellectual haven in a heartless utilitarian world – depends on attributes that we would publicly deplore:  opacity, chaotic complexity, and hypocrisy.

            I tell this story in three parts.  I start by exploring how the American system of higher education emerged in the nineteenth century, without a plan and without any apparent promise that it would turn out well.  I show how, by 1900, all the pieces of the current system had come together.  This is the historical part.  Then I show how the combination of these elements created an astonishingly strong, resilient, and powerful structure.  I look at the way this structure deftly balances competing aims – the populist, the practical, and the elite.  This is the sociological part.  Then I veer back toward the issue raised in the title, to figure out what the connection is between the form of American higher education and the things that it is good for. This is the vaguely philosophical part.  I argue that the form serves the extraordinarily useful functions of protecting those of us in the faculty from the real world, protecting us from each other, and hiding what we’re doing behind a set of fictions and veneers that keep anyone from knowing exactly what is really going on. 

           In this light, I look at some of the things that could kill it for us.  One is transparency.  The current accountability movement directed toward higher education could ruin everything by shining a light on the multitude of conflicting aims, hidden cross-subsidies, and forbidden activities that constitute life in the university.  A second is disaggregation.  I’m talking about current proposals to pare down the complexity of the university in the name of efficiency:  Let online modules take over undergraduate teaching; eliminate costly residential colleges; closet research in separate institutes; and get rid of football.  These changes would destroy the synergy that comes from the university’s complex structure.  A third is principle.  I argue that the university is a procedural institution, which would collapse if we all acted on principle instead of form.   I end with a call for us to retreat from substance and stand shoulder-to-shoulder in defense of procedure.

Historical Roots of the System

            The origins of the American system of higher education could not have been more humble or less promising of future glory.  It was a system, but it had no overall structure of governance and it did not emerge from a plan.  It just happened, through an evolutionary process that had direction but no purpose.  We have a higher education system in the same sense that we have a solar system, each of which emerged over time according to its own rules.  These rules shaped the behavior of the system but they were not the product of Intelligent Design. 

            Yet something there was about this system that produced extraordinary institutional growth.  When George Washington assumed the presidency of the new republic in 1789, the U.S. already had 19 colleges and universities (Tewksbury, 1932, Table 1; Collins, 1979, Table 5.2).  By 1830 the numbers rose to 50 and then growth accelerated, with the total reaching 250 in 1860, 563 in 1870, and 811 in 1880.  To give some perspective, the number of universities in the United Kingdom between 1800 and 1880 rose from 6 to 10 and in all of Europe from 111 to 160 (Rüegg, 2004).  So in 1880 this upstart system had 5 times as many institutions of higher education as did the entire continent of Europe.  How did this happen?

            Keep in mind that the university as an institution was born in medieval Europe in the space between the dominant sources of power and wealth, the church and the state, and it drew  its support over the years from these two sources.  But higher education in the U.S. emerged in a post-feudal frontier setting where the conditions were quite different.  The key to understanding the nature of the American system of higher education is that it arose under conditions where the market was strong, the state was weak, and the church was divided.  In the absence of any overarching authority with the power and money to support a system, individual colleges had to find their own sources of support in order to get started and keep going.  They had to operate as independent enterprises in the competitive economy of higher education, and their primary reasons for being had little to do with higher learning.

            In the early- and mid-nineteenth century, the modal form of higher education in the U.S. was the liberal arts college.  This was a non-profit corporation with a state charter and a lay board, which would appoint a president as CEO of the new enterprise.  The president would then rent a building, hire a faculty, and start recruiting students.  With no guaranteed source of funding, the college had to make a go of it on its own, depending heavily on tuition from students and donations from prominent citizens, alumni, and religious sympathizers.  For college founders, location was everything.  However, whereas European universities typically emerged in major cities, these colleges in the U.S. arose in small towns far from urban population centers.  Not a good strategy if your aim was to draw a lot of students.  But the founders had other things in mind.

            One central motive for founding colleges was to promote religious denominations.  The large majority of liberal arts colleges in this period had a religious affiliation and a clergyman as president.  The U.S. was an extremely competitive market for religious groups seeking to spread the faith, and colleges were a key way to achieve this end.  With colleges, denominations could prepare their own clergy and provide higher education for their members; and these goals were particularly important on the frontier, where the population was growing and the possibilities for denominational expansion were the greatest.  Every denomination wanted to plant the flag in the new territories, which is why Ohio came to have so many colleges.  The denomination provided a college with legitimacy, students, and a built-in donor pool but with little direct funding.

            Another motive for founding colleges was closely allied with the first, and that was land speculation.  Establishing a college in town was not only a way to advance the faith, it was also a way to raise property values.  If town fathers could attract a college, they could make the case that the town was no mere agricultural village but a cultural center, the kind of place where prospective land buyers would want to build a house, set up a business, and raise a family.  Starting a college was cheap and easy.  It would bear the town’s name and serve as its cultural symbol.  With luck it would give the town leverage to become a county seat or gain a station on the rail line.  So a college was a good investment in a town’s future prosperity (Brown, 1995).

            The liberal arts college was the dominant but not the only form that higher education took in nineteenth century America.  Three other types of institutions emerged before 1880.  One was state universities, which were founded and governed by individual states but which received only modest state funding.  Like liberal arts colleges, they arose largely for competitive reasons.  They emerged in the new states as the frontier moved westward, not because of huge student demand but because of the need for legitimacy.  You couldn’t be taken seriously as a state unless you had a state university, especially if your neighbor had just established one. 

            The second form of institution was the land-grant college, which arose from federal efforts to promote land sales in the new territories by providing public land as a founding grant for new institutions of higher education.  Turning their backs on the classical curriculum that had long prevailed in colleges, these schools had a mandate to promote practical learning in fields such as agriculture, engineering, military science, and mining. 

            The third form was the normal school, which emerged in the middle of the century as state-founded high-school-level institutions for the preparation of teachers.  It wasn’t until the end of the century that these schools evolved into teachers colleges; and in the twentieth century they continued that evolution, turning first into full-service state colleges and then by midcentury into regional state universities. 

            Unlike liberal arts colleges, all three of these types of institutions were initiated and governed by states, and all received some public funding.  But this funding was not nearly enough to keep them afloat, so they faced the same challenges as the liberal arts colleges, since their survival depended heavily on their ability to bring in student tuition and draw donations.  In short, the liberal arts college established the model for survival in a setting with a strong market, weak state, and divided church; and the newer public institutions had to play by the same rules.

            By 1880, the structure of the American system of higher education was well established.  It was a system made up of lean and adaptable institutions, with a strong base in rural communities, and led by entrepreneurial presidents, who kept a sharp eye out for possible threats and opportunities in the highly competitive higher-education market.  These colleges had to attract and keep the loyalty of student consumers, whose tuition was critical for paying the bills and who had plenty of alternatives in towns nearby.  And they also had to maintain a close relationship with local notables, religious peers, and alumni, who provided a crucial base of donations.

            The system was only missing two elements to make it workable in the long term.  It lacked sufficient students, and it lacked academic legitimacy.  On the student side, this was the most overbuilt system of higher education the world has ever seen.  In 1880, 811 colleges were scattered across a thinly populated countryside, which amounted to 16 colleges per million of population (Collins, 1979, Table 5.2).  The average college had only 131 students and 14 faculty and granted 17 degrees per year (Carter et al., 2006, Table Bc523, Table Bc571; U.S. Bureau of the Census, 1975, Series H 751).  As I have shown, these colleges were not established in response to student demand, but nonetheless they depended on students for survival.  Without a sharp growth in student enrollments, the whole system would have collapsed. 

            On the academic side, these were colleges in name only.  They were parochial in both senses of the word: small-town institutions stuck in the boondocks, able to make no claim to advancing the boundaries of knowledge.  They were not established to promote higher learning, and they lacked both the intellectual and economic capital required to carry out such a mission.  Many high schools had stronger claims to academic prowess than these colleges.  European visitors in the nineteenth century had a field day ridiculing the intellectual poverty of these institutions.  The system was on death watch.  If it was going to survive, it needed a transfusion that would provide both student enrollments and academic legitimacy. 

            That transfusion arrived just in time from a new European import, the German research university.  This model offered everything that was lacking in the American system.  It reinvented university professors as the best minds of their generation, whose expertise was certified by the new entry-level degree, the Ph.D., and who were pushing back the frontiers of knowledge through scientific research.  And it brought graduate students to the college campus, selected for their high academic promise and trained to follow in the footsteps of their faculty mentors. 

            And at the same time that the German model offered academic credibility to the American system, the peculiarly Americanized form of this model made university enrollment attractive for undergraduates, whose focus was less on higher learning than on jobs and parties.  The remodeled American university provided credible academic preparation in the cognitive skills required for professional and managerial work; and it provided training in the social and political skills required for corporate employment, through the process of playing the academic game and taking on roles in intercollegiate athletics and on-campus social clubs.  It also promised a social life in which one could have a good time and meet a suitable spouse. 

            By 1900, with the arrival of the research university as the capstone, nearly all of the core elements of the current American system of higher education were in place.  Subsequent developments focused primarily on extending the system downward, adding layers that would make it more accessible to larger numbers of students – as normal schools evolved into regional state universities and as community colleges emerged as the open-access base of an increasingly stratified system.  Here ends the history portion of this account. Now we move on to the sociological part of the story.

Sociological Traits of the System

            When the research university model arrived to save the day in the 1880s, the American system of higher education was in desperate straits.  But at the same time this system had an enormous reservoir of potential strengths that prepared it for its future climb to world dominance.  Let’s consider some of these strengths.  First it had a huge capacity in place, the largest in the world by far:  campuses, buildings, faculty, administration, curriculum, and a strong base in the community.  All it needed was students and credibility. 

            Second, it consisted of a group of institutions that had figured out how to survive under dire Darwinian circumstances, where supply greatly exceeded demand and where there was no secure stream of funding from church or state.  In order to keep the enterprises afloat, they had learned how to hustle for market position, troll for students, and dun donors.  Imagine how well this played out when students found a reason to line up at their doors and donors suddenly saw themselves investing in a winner with a soaring intellectual and social mission. 

            Third, they had learned to be extraordinarily sensitive to consumer demand, upon which everything depended.  Fourth, as a result they became lean and highly adaptable enterprises, which were not bounded by the politics of state policy or the dogma of the church but could take advantage of any emerging possibility for a new program, a new kind of student or donor, or a new area of research.  Not only were they able to adapt but they were forced to do so quickly, since otherwise the competition would jump on the opportunity first and eat their lunch.

            By the time the research university arrived on the scene, the American system of higher education was already firmly established and governed by its own peculiar laws of motion and its own evolutionary patterns.  The university did not transform the system.  Instead it crowned the system and made it viable for a century of expansion and elevation.  Americans could not simply adopt the German university model, since this model depended heavily on strong state support, which was lacking in the U.S.  And the American system would not sustain a university as elevated as the German university, with its tight focus on graduate education and research at the expense of other functions.  American universities that tried to pursue this approach – such as Clark University and Johns Hopkins – found themselves quickly trailing the pack of institutions that adopted a hybrid model grounded in the preexisting American system.  In the U.S., the research university provided a crucial add-on rather than a transformation.  In this institutionally complex, market-based system, the research university became embedded within a convoluted but highly functional structure of cross-subsidies, interwoven income streams, widely dispersed political constituencies, and a bewildering array of goals and functions. 

            At the core of the system is a delicate balance among three starkly different models of higher education.  These three roughly correspond to Clark Kerr’s famous characterization of the American system as a mix of the British undergraduate college, the American land-grant college, and the German research university (Kerr, 2001, p. 14).  The first is the populist element, the second is the practical element, and the third is the elite element.  Let me say a little about each of these and make the case for how they work to reinforce each other and shore up the overall system.  I argue that these three elements are unevenly distributed across the system: the populist and practical parts are strongest in the lower tiers, where access is easy and job utility is central, while the elite element is strongest in the upper tier.  But I also argue that all three are present in the research university at the top of the system.  Consider how all these elements come together in a prototypical flagship state university.

            The populist element has its roots in the British residential undergraduate college, which the colonists had in mind when they established the first American colleges; but the changes that emerged in the U.S. in the early nineteenth century were critical.  Key was the fact that American colleges during this period were broadly accessible in a way that British universities never were until the postwar expansion of higher education in the U.K.  American colleges were located not in fashionable areas of major cities but in small towns in the hinterland.  There were far too many of them for them to be elite, and the need for students meant that tuition and academic standards both had to be kept relatively low.  The American college never exuded the odor of class privilege to the same degree as Oxbridge; its clientele was largely middle class.  For the new research university, this legacy meant that the undergraduate program provided critical economic and political support. 

            From the economic perspective, undergrads paid tuition, which – through large classes and thus the need for graduate teaching assistants – supported graduate programs and the larger research enterprise.  Undergrads, who were socialized in the rituals of football and fraternities, were also the ones who identified most closely with the university, which meant that in later years they became the most loyal donors.  As doers rather than thinkers, they were also the wealthiest group of alumni donors.  Politically, the undergraduate program gave the university a broad base of community support.  Since anyone could conceive of attending the state university, the institution was never as remote or alien as the German model.  Its athletic teams and academic accomplishments were a point of pride for state residents, whether or not they or their children ever attended.  They wore the school colors and cheered for it on game days.

            The practical element has its roots in the land-grant college.  The idea here was that the university was not just an enterprise for providing liberal education to the elite but one that could also provide useful occupational skills for ordinary people.  Since the institution needed to attract a large group of students to pay the bills, the American university left no stone unturned when it came to developing programs that students might want.  It promoted itself as a practical and reliable mechanism for getting a good job.  This not only boosted enrollment but also sent a message to the citizens of the state that the university was making itself useful to the larger community, producing the teachers, engineers, managers, and dental hygienists that they needed.  

            This practical bent also extended to the university’s research effort, which was not focused solely on ivory-tower pursuits.  Its researchers were working hard to design safer bridges, more productive crops, better vaccines, and more reliable student tests.  For example, when I taught at Michigan State I planted my lawn with Spartan grass seed, which was developed at the university.  These forms of applied research led to patents that brought substantial income back to the institution, but their most important function was to provide a broad base of support for the university among people who had no connection with it as an instructional or intellectual enterprise.  The idea was compelling: This is your university, working for you.

            The elite element has its roots in the German research university.  This is the component of the university formula that gives the institution academic credibility at the highest level.  Without it the university would just be a party school for the intellectually challenged and a trade school for job seekers.  From this angle, the university is the haven for the best thinkers, where professors can pursue intellectual challenges of the first order, develop cutting edge research in a wide array of domains, and train graduate students who will carry on these pursuits in the next generation.  And this academic aura envelops the entire enterprise, giving the lowliest freshman exposure to the most distinguished faculty and allowing the average graduate to sport a diploma burnished by the academic reputations of the best and the brightest.  The problem, of course, is that supporting professorial research and advanced graduate study is enormously expensive; research grants only provide a fraction of the needed funds. 

            So the populist and practical domains of the university are critically important components of the larger university package.  Without the foundation of fraternities and football, grass seed and teacher education, the superstructure of academic accomplishment would collapse of its own weight.  The academic side of the university can’t survive without both the financial subsidies and political support that come from the populist and the practical sides.  And the populist and practical sides rely on the academic legitimacy that comes from the elite side.  It’s the mixture of the three that constitutes the core strength of the American system of higher education.  This is why it is so resilient, so adaptable, so wealthy, and so powerful.  This is why its financial and political base is so broad and strong.  And this is why American institutions of higher education enjoy so much autonomy:  They respond to many sources of power in American society and they rely on many sources of support, which means they are not the captive of any single power source or revenue stream.

The Power of Form

            So my story about the American system of higher education is that it succeeded by developing a structure that allowed it to become both economically rich and politically autonomous.  It could tap multiple sources of revenue and legitimacy, which allowed it to avoid becoming the wholly owned subsidiary of the state, the church, or the market.  And by virtue of its structurally reinforced autonomy, college is good for a great many things.

            At last we come back to our topic.  What is college good for?  For those of us on the faculties of research universities, universities provide several core benefits that we see as especially important.  At the top of the list is that they preserve and promote free speech.  They are zones where faculty and students can feel free to pursue any idea, any line of argument, and any intellectual pursuit that they wish – free of the constraints of political pressure, cultural convention, or material interest.  Closely related is the fact that universities become zones where play is not only permissible but even desirable, where it’s okay to pursue an idea just because it’s intriguing, even though the pursuit promises no apparent practical benefit.

            This, of course, is a rather idealized version of the university.  In practice, as we know, politics, convention, and economics constantly intrude on the zone of autonomy in an effort to shape the process and limit these freedoms.  This is particularly true in the lower strata of the system.  My argument is not that the ideal is met but that the structure of American higher education – especially in the top tier of the system – creates a space of relative autonomy, where these constraining forces are partially held back, allowing the possibility for free intellectual pursuits that cannot be found anywhere else. 

            Free intellectual play is what we in the faculty tend to care about, but others in American society see other benefits arising from higher education that justify the enormous time and treasure we devote to supporting the system.  Policymakers and employers put primary emphasis on higher education as an engine of human capital production, which provides the economically relevant skills that drive increases in worker productivity and growth in GDP.  They also hail it as a site of knowledge production, where people develop valuable technologies, theories, and inventions that can feed directly into the economy.  And companies use it as a place to outsource much of their workforce training and research and development. 

            These pragmatic benefits that people see coming from the system of higher education are real.  Universities truly are socially useful in such ways.  But it’s important to keep in mind that these social benefits can only arise if the university remains a preserve for free intellectual play.  Universities are much less useful to society if they restrict themselves to training individuals for particular present-day jobs, or to producing research that solves only current problems.  They are most useful when they function as storehouses of knowledge, skills, technologies, and theories for which there is no current application but which may turn out to be enormously useful in the future.  They are the mechanism by which modern societies build capacity to deal with issues that have not yet emerged but sooner or later are likely to do so.

            But that is a discussion for another speech by another scholar.  The point I want to make today about the American system of higher education is that it is good for a lot of things, but it was established to accomplish none of them.  As I have shown, the system that arose in the nineteenth century was not trying to store knowledge, build capacity, or increase productivity.  It wasn’t trying to promote free speech or encourage play with ideas.  It wasn’t even trying to preserve institutional autonomy.  These things happened as the system developed, but they were all unintended consequences.  What drove the development of the system was a clash of competing interests, all of which saw colleges as a useful medium for meeting particular ends.  Religious denominations saw them as a way to spread the faith.  Town fathers saw them as a way to promote local development and increase property values.  The federal government saw them as a way to spur the sale of federal lands.  State governments saw them as a way to establish credibility in competition with other states.  College presidents and faculty saw them as a way to promote their own careers.  And at the base of the whole process of system development were the consumers, the students, without whose enrollment, tuition, and donations the system could not have persisted.  The consumers saw the college as useful in a number of ways:  as a medium for seeking social opportunity and achieving social mobility; as a medium for preserving social advantage and avoiding downward mobility; as a place to have a good time, enjoy an easy transition to adulthood, pick up some social skills, and meet a spouse; even, sometimes, as a place to learn. 

            The point is that the primary benefits of the system of higher education derive from its form, but this form did not arise in order to produce these benefits.  We need to preserve the form in order to continue enjoying these benefits, but unfortunately the organizational foundations upon which the form is built are, on the face of it, absurd.  And each of these foundational qualities is currently under attack from alternative visions that, in contrast, have a certain face validity.  If the attackers accomplish their goals, the system’s form, which has been so enormously productive over the years, will collapse, and with this collapse will come the end of the university as we know it.  I didn’t promise this lecture would end well, did I?

            Let me spell out three challenges that would undercut the core autonomy and synergy that make the system so productive in its current form.  On the surface, each of the proposed changes seems quite sensible and desirable.  Only by examining the implications of actually pursuing these changes can we see how they threaten the foundational qualities that currently undergird the system.  The system’s foundations are so paradoxical, however, that mounting a public defense of them would be difficult indeed.  Yet it is precisely these traits of the system that we need to defend in order to preserve the current highly functional form of the university.  In what follows, I draw inspiration from the work of Suzanne Lohmann (2004, 2006), a political scientist at UCLA who has addressed these issues most astutely.

            One challenge comes from prospective reformers of American higher education who want to promote transparency.  Who can be against that?  This idea derives from the accountability movement, which has already swept across K-12 education and is now pounding the shores of higher education.  It simply asks universities to show people what they’re doing.  What is the university doing with its money and its effort?  Who is paying for what?  How do the various pieces of the complex structure of the university fit together?  And are they self-supporting or drawing resources from elsewhere?  What is faculty credit-hour production?  How is tuition related to instructional costs?  And so on.   These demands make a lot of sense. 

            The problem, however, as I have shown today, is that the autonomy of the university depends on its ability to shield its inner workings from public scrutiny.  It relies on opacity.  Autonomy will end if the public can see everything that is going on and what everything costs.  Consider all of the cross subsidies that keep the institution afloat:  undergraduates support graduate education, football supports lacrosse, adjuncts subsidize professors, rich schools subsidize poor schools.  Consider all of the instructional activities that would wilt in the light of day; consider all of the research projects that could be seen as useless or politically unacceptable.  The current structure keeps the inner workings of the system obscure, which protects the university from intrusions on its autonomy.  Remember, this autonomy arose by accident not by design; its persistence depends on keeping the details of university operations out of public view.

            A second and related challenge comes from reformers who seek to promote disaggregation.  The university is an organizational nightmare, they say, with all of those institutes and centers, departments and schools, programs and administrative offices.  There are no clear lines of authority, no mechanisms to promote efficiency and eliminate duplication, no tools to achieve economies of scale.  Transparency is one step in the right direction, they say, but the real reform that is needed is to take apart the complex interdependencies and overlapping responsibilities within the university and then figure out how each of these tasks could be accomplished in the most cost-effective and outcome-effective manner.  Why not have a few star professors tape lectures and then offer Massive Open Online Courses at colleges across the country?  Why not have institutions specialize in what they’re best at – remedial education, undergraduate instruction, vocational education, research production, or graduate student training?  Putting them all together in a single institution is expensive and grossly inefficient. 

            But recall that it is precisely the aggregation of purposes and functions – the combination of the populist, the practical, and the elite – that has made the university so strong, so successful, and, yes, so useful.  This combination creates a strong base both financially and politically and allows for forms of synergy that cannot happen with a set of isolated educational functions.  The fact is that this institution can’t be disaggregated without losing what makes it the kind of university that students, policymakers, employers, and the general public find so compelling.  A key organizational element that makes the university so effective is its chaotic complexity.

            A third challenge comes not from reformers intruding on the university from the outside but from faculty members meddling with it from the inside.  The threat here arises from the dangerous practice of acting on academic principle.  Fortunately, this is not very common in academe.  But the danger is lurking in the background of every decision about faculty hires.  Here’s how it works.  You review a finalist for a faculty position in a field not closely connected to your own, and you find to your horror that the candidate’s intellectual domain seems absurd on the face of it (how can anyone take this type of work seriously?) and the candidate’s own scholarship doesn’t seem credible.  So you decide to speak against hiring the candidate and organize colleagues to support your position.  But then you happen to read a paper by Suzanne Lohmann, who points out something very fundamental about how universities work. 

            Universities are structured in a manner that protects the faculty from the outside world (that is, from the forces of transparency and disaggregation), but they are also organized in a manner that protects the faculty from each other.  The latter is the reason we have such an enormous array of departments and schools in universities.  If every historian had to meet the approval of geologists and every psychologist had to meet the approval of law faculty, no one would ever be hired. 

           The simple fact is that part of what keeps universities healthy and autonomous is hypocrisy.  Because of the Balkanized structure of university organization, we all have our own protected spaces to operate in and we all pass judgment only on our own peers within that space.  To do otherwise would be disastrous.  We don’t have to respect each other’s work across campus, we merely need to tolerate it – grumbling about each other in private and making nice in public.  You pick your faculty, we’ll pick ours.  Lohmann (2006) calls this core procedure of the academy “log-rolling.”  If we all operated on principle, if we all only approved scholars we respected, then the university would be a much diminished place.  Put another way, I wouldn’t want to belong to a university that consisted only of people I found worthy.  Gone would be the diversity of views, paradigms, methodologies, theories, and world views that makes the university such a rich place.  The result is incredibly messy, and it permits a lot of quirky – even ridiculous – research agendas, courses, and instructional programs.  But in aggregate, this libertarian chaos includes an extraordinary range of ideas, capacities, theories, and social possibilities.  It’s exactly the kind of mess we need to treasure and preserve and defend against all opponents.

            So here is the thought I’m leaving you with.  The American system of higher education is enormously productive and useful, and it’s a great resource for students, faculty, policymakers, employers, and society.  What makes it work is not its substance but its form.  Crucial to its success is its devotion to three formal qualities:  opacity, chaotic complexity, and hypocrisy.  Embrace these forms and they will keep us free.

Posted in Academic writing, Writing, Writing Class

Rothman: Why Is Academic Writing So Academic?

In this post, Joshua Rothman addresses the problem of academic writing by comparing it to what’s going on in journalistic writing.  As a journalist who was once a graduate student in English, he knows both worlds well.  So instead of the usual diatribe against academics for being obscure and deadly, he explores the issue structurally, showing how journalism and academia have drifted apart from each other in the last 50 years.  While academia has become more inward turning and narrow, journalism has become more populist, seeking a large audience at any cost.  In the process, both fields have lost something important.  The piece first appeared in the New Yorker in 2014.  Here’s a link to the original.

Rothman throws up his hands at the end, suggesting that writers in both fields are trapped in a situation that offers no escape for anyone who wants to remain a member in good standing of one field or the other.  But I partially disagree with this assessment.  Yes, the structural pressures in both domains to constrain your writing are strong, but they’re not irresistible.  Journalists can find venues like the New Yorker and the Atlantic that allow them to avoid pandering to the click-happy internet reader.  And academics can push back against the pressures that make disinterested research uninteresting and colorless.  

There are still plenty of scholars who publish articles in top academic journals and books with major university presses that feature lucid prose, lively style, and a clear personal voice.  Doing so does not tarnish their academic reputation or employability; on the contrary, it wins them a broader academic audience, more citations, and more intellectual impact.  I’ve posted some examples here by scholars such as Jim March, Mary Metz, Peter Rossi, E.P. Thompson, and Max Weber.  

For lots of examples of good academic prose and stellar advice about how to become a stylish scholarly writer, you should read Helen Sword’s book, Stylish Academic Writing.  I used this book to good effect in my class on academic writing.  (Here is the syllabus for this class, which includes links to all of the readings and my class slides.)  I also strongly suggest checking out her website, where, among other things, you can plug your own text into the Writer’s Diet Test, which will show how flabby or fit your prose is.

Enjoy.

Why Is Academic Writing So Academic?

by Joshua Rothman

Feb. 21, 2014

A few years ago, when I was a graduate student in English, I presented a paper at my department’s American Literature Colloquium. (A colloquium is a sort of writing workshop for graduate students.) The essay was about Thomas Kuhn, the historian of science. Kuhn had coined the term “paradigm shift,” and I described how this phrase had been used and abused, much to Kuhn’s dismay, by postmodern insurrectionists and nonsensical self-help gurus. People seemed to like the essay, but they were also uneasy about it. “I don’t think you’ll be able to publish this in an academic journal,” someone said. He thought it was more like something you’d read in a magazine.

Was that a compliment, a dismissal, or both? It’s hard to say. Academic writing is a fraught and mysterious thing. If you’re an academic in a writerly discipline, such as history, English, philosophy, or political science, the most important part of your work—practically and spiritually—is writing. Many academics think of themselves, correctly, as writers. And yet a successful piece of academic prose is rarely judged so by “ordinary” standards. Ordinary writing—the kind you read for fun—seeks to delight (and, sometimes, to delight and instruct). Academic writing has a more ambiguous mission. It’s supposed to be dry but also clever; faceless but also persuasive; clear but also completist. Its deepest ambiguity has to do with audience. Academic prose is, ideally, impersonal, written by one disinterested mind for other equally disinterested minds. But, because it’s intended for a very small audience of hyper-knowledgeable, mutually acquainted specialists, it’s actually among the most personal writing there is. If journalists sound friendly, that’s because they’re writing for strangers. With academics, it’s the reverse.

Professors didn’t sit down and decide to make academic writing this way, any more than journalists sat down and decided to invent listicles. Academic writing is the way it is because it’s part of a system. Professors live inside that system and have made peace with it. But every now and then, someone from outside the system swoops in to blame professors for the writing style that they’ve inherited. This week, it was Nicholas Kristof, who set off a rancorous debate about academic writing with a column, in the Times, called “Professors, We Need You!” The academic world, Kristof argued, is in thrall to a “culture of exclusivity” that “glorifies arcane unintelligibility while disdaining impact and audience”; as a result, there are “fewer public intellectuals on American university campuses today than a generation ago.”

The response from the professoriate was swift, severe, accurate, and thoughtful. A Twitter hashtag, #engagedacademics, sprang up, as if to refute Kristof’s claim that professors don’t use enough social media. Professors pointed out that the brainiest part of the blogosphere is overflowing with contributions from academics; that, as teachers, professors already have an important audience in their students; and that the Times itself frequently benefits from professorial ingenuity, which the paper often reports as news. (A number of the stories in the Sunday Review section, in which Kristof’s article appeared, were written by professors.) To a degree, some of the responses, though convincingly argued, inadvertently bolstered Kristof’s case because of the style in which they were written: fractious, humorless, self-serious, and defensively nerdy. As writers, few of Kristof’s interlocutors had his pithy, winning ease. And yet, if they didn’t win with a knockout blow, the professors won on points. They showed that there was something outdated, and perhaps solipsistic, in Kristof’s yearning for a new crop of sixties-style “public intellectuals.”

As a one-time academic, I spent most of the week rooting for the profs. But I have a lot of sympathy for Kristof, too. I think his heart’s in the right place. (His column ended on a wistful note: “I write this in sorrow, for I considered an academic career.”) My own theory is that he got the situation backward. The problem with academia isn’t that professors are, as Kristof wrote, “marginalizing themselves.” It’s that the system that produces and consumes academic knowledge is changing, and, in the process, making academic work more marginal.

It may be that being a journalist makes it unusually hard for Kristof to see what’s going on in academia. That’s because journalism, which is in the midst of its own transformation, is moving in a populist direction. There are more writers than ever before, writing for more outlets, including on their own blogs, Web sites, and Twitter streams. The pressure on established journalists is to generate traffic. New and clever forms of content are springing up all the time—GIFs, videos, “interactives,” and so on. Dissenters may publish op-eds encouraging journalists to abandon their “culture of populism” and write fewer listicles, but changes in the culture of journalism are, at best, only a part of the story. Just as important, if not more so, are economic and technological developments having to do with subscription models, revenue streams, apps, and devices.

In academia, by contrast, all the forces are pushing things the other way, toward insularity. As in journalism, good jobs are scarce—but, unlike in journalism, professors are their own audience. This means that, since the liberal-arts job market peaked, in the mid-seventies, the audience for academic work has been shrinking. Increasingly, to build a successful academic career you must serially impress very small groups of people (departmental colleagues, journal and book editors, tenure committees). Often, an academic writer is trying to fill a niche. Now, the niches are getting smaller. Academics may write for large audiences on their blogs or as journalists. But when it comes to their academic writing, and to the research that underpins it—to the main activities, in other words, of academic life—they have no choice but to aim for very small targets. Writing a first book, you may have in mind particular professors on a tenure committee; miss that mark and you may not have a job. Academics know which audiences—and, sometimes, which audience members—matter.

It won’t do any good, in short, to ask professors to become more populist. Academic writing and research may be knotty and strange, remote and insular, technical and specialized, forbidding and clannish—but that’s because academia has become that way, too. Today’s academic work, excellent though it may be, is the product of a shrinking system. It’s a tightly packed, super-competitive jungle in there. The most important part of Kristof’s argument was, it seemed to me, buried in the blog post that he wrote to accompany his column. “When I was a kid,” he wrote, “the Kennedy administration had its ‘brain trust’ of Harvard faculty members, and university professors were often vital public intellectuals.” But the sixties, when the baby boom led to a huge expansion in university enrollments, was also a time when it was easier to be a professor. If academic writing is to become expansive again, academia will probably have to expand first.

Posted in Higher Education, Meritocracy, Philosophy

Alain de Botton: On Asking People What They ‘Do’?

This lovely essay explores the most common question that modernity prompts strangers to ask each other:  What do you do?  The author is the philosopher Alain de Botton, who explains that this question is freighted with moral judgment.  In a meritocracy, what you do for a living is not only who you are; it’s also where you stand in the hierarchy of public esteem.  Are you somebody or nobody, a winner or a loser?  Should I suck up to you or should I scorn you?

The argument here resonates with a number of recent pieces I’ve posted here about the downside of the academic meritocracy.  At the core is this problem:  when we say the social system is responsive to merit rather than birth, we place personal responsibility on individuals for their social outcomes.  It’s no longer legitimate to blame fate or luck or the gods for your lowly status, because the fault is all yours.

This essay is from his website The School of Life.  Here’s a link to the original.

On Asking People What They ‘Do’?

Alain de Botton

The world became modern when people who met for the first time shifted from asking each other (as they had always done) where they came from – to asking each other what they did.

To try to position someone by their area of origin is to assume that personal identity is formed first and foremost by membership of a geographical community; we are where we are from. We’re the person from the town by the lake, we’re from the village between the forest and the estuary. But to want to know our job is to imagine that it’s through our choice of occupation, through our distinctive way of earning money, that we become most fully ourselves; we are what we do.

The difference may seem minor but it has significant implications for the way we stand to be judged and therefore how pained the question may make us feel. We tend not to be responsible for where we are from. The universe landed us there and we probably stayed. Furthermore, entire communities are seldom viewed as either wholly good or bad; it’s assumed they will contain all sorts of people, about whom blanket judgements would be hard. One is unlikely to be condemned simply on the basis of the region or city one hails from. But we have generally had far more to do with the occupation we are engaged in. We’ll have studied a certain way, gained particular qualifications and made specific choices in order to end up, perhaps, a dentist or a cleaner, a film producer or a hospital porter. And to such choices, targeted praise or blame can be attached. 

It turns out that in being asked what we do, we are not being asked what we do, we’re being asked what we are worth – and more precisely, whether or not we are worth knowing. In modernity, there are right and wrong answers, and the wrong ones will swiftly strip us of the psychological ingredient we crave as much as we do heat, food or rest: respect. We long to be treated with dignity and kindness, for our existence to matter to others and for our particularity to be noticed and honoured. We may do almost as much damage to a person by ignoring them as by punching them in the stomach.

But respect will not be available to those who cannot give a sufficiently elevated answer to the question of what they do. The modern world is snobbish. The term is associated with a quaint aristocratic value system that emphasises bloodlines and castles. But stripped to its essence snobbery merely indicates any way of judging another human whereby one takes a relatively small section of their identity and uses it to come to a total and fixed judgement on their entire worth. For the music snob, we are what we listen to, for the clothes snob, we are our trousers. And according to the predominant kind of snobbery at large in the modern world, which is job snobbery, we are nothing but what is on our business card.

The opposite of a snob might be a parent or lover; someone who cares about who one is, not what one does. But for the majority, our existence will be weighed up according to far narrower criteria. We will exist in so far as we have performed adequately in the market place. Our longing for respect will only be satisfied through the right sort of rank. It is easy to accuse modern humans of being materialistic. This seems wrong. We may have high levels of interest in possessions and salaries, but we are not on that basis ‘materialistic’. We are simply living in a world where the possession of certain material goods has become the only conduit to the emotional rewards that are what, deep down, we crave. It isn’t the objects and titles we are after; it is, more poignantly, the feeling of being ‘seen’ and liked which will only be available to us via material means.

Not only does the modern world want to know what we do, it also has to hand some punitive explanations of why we have not done well. It promotes the idea of ‘meritocracy’, that is, a belief in a system which should allow each person to rise through classes in order to take up the place they deserve. No longer should tradition or family background limit what one can achieve. But the idea of meritocracy carries with it a nasty sting, for if we truly believe in a world in which those who deserve to get to the top get to the top, then by implication, we must also believe in a world in which those who get to the bottom deserve to get to the bottom. In other words, a world which takes itself to be meritocratic will suppose that failure and success in the professional game are not mere accidents, but always and invariably indications of genuine value.

It had not always felt quite as definitive. Premodern societies believed in the intervention of divine forces in human affairs. A successful Roman trader or soldier would have looked up and thanked Mercury or Mars for their good fortune. They knew themselves to be only ever partially responsible for what happened to them, for good or ill, and would remember as much when evaluating others. The poor weren’t necessarily indolent or sinful; the gods might just have never looked favourably on them. But we have done away with the idea of divine intervention – or of its less directly superstitious cousin, luck. We don’t accept that someone might fail for reasons of mere bad luck. We have little patience for nuanced stories or attenuating facts; narratives that could set the bare bones of a biography in a richer context, that could explain that though someone ended up in a lowly place, they had to deal with an illness, an ailing relative, a stock market crash or a very difficult childhood. Winners make their own luck. And losers their own defeat.

No wonder that the consequences of underachievement feel especially punishing. There are fewer explanations and fewer ways of tolerating oneself. A society that assumes that what happens to an individual is the responsibility of the individual is a society that doesn’t want to hear any so-called excuses that would less closely identify a person with elements of their CV. It is a society that may leave some of the losers feeling – in extremis – that they have no right to exist. Suicide rates rise.

In the past, in the era of group identity, we might value ourselves in part for things which we had not done entirely ourselves. We might feel proud that we came from a society that had built a particularly fine cathedral or temple. Our sense of self could be bolstered by belonging to a city or nation that placed great store on athletic prowess or literary talent. Modernity has sharply weakened our ability to lean on such supports. It has tied us punitively closely to what we have personally done – or not.

At the same time, it has pointed out that the opportunities for individual achievement have never been greater. We – at last – are able to do anything. We might make a fortune, rise to the top of politics, write a hit song. There should be no limits on ambition. And therefore, any failure starts to feel even more of a damning verdict on who we are. It’s one thing to have failed in an era when failure seemed like the norm, quite another to have failed when success has been made to feel like an ongoing and universal possibility.

Even as it raised living standards across the board, the modern world has managed to make the psychological consequences of failure harder to bear. It has eroded our sense that our identity could rest on broader criteria than our professional performance. It has also made it imperative for psychological survival that we try to find a way of escaping the claustrophobia of individualism, that we recall that workplace success and failure are always relative markers, not conclusive judgements, that in reality, no one is in fact ever either a loser or a winner, that we are all bewildering mixtures of the beautiful and the ugly, the impressive and the mediocre, the idiotic and the sharp. Going forward, in a fight against the spirit of the age, we might do well to ask all new acquaintances not so much what they do but – more richly – what they happen to have been thinking about recently.

Posted in Ed schools, Higher Education, History

Too Easy a Target: The Trouble with Ed Schools and the Implications for the University

This post is a piece I published in Academe (the journal of the AAUP) in 1999.  It provides an overview of the argument in my 2004 book, The Trouble with Ed Schools. I reproduce it here as a public service:  if you read this, you won’t need to read my book, much less buy it.  You’re welcome.  Also, looking through it 20 years later, I was pleasantly surprised to find that it was kind of a fun read.  Here’s a link to the original.

The book and the article tell the story of the poor beleaguered ed school, maligned by one and all.  It’s a story of irony, in which an institution does what everyone asked of it and is thoroughly punished for the effort.  And it’s also a reverse Horatio Alger story, in which the beggar boy never makes it.  Here’s a glimpse of the argument, which starts with the ed school’s terrible reputation:

So how did things get this bad? No occupational group or subculture acquires a label as negative as this one without a long history of status deprivation. Critics complain about the weakness and irrelevance of teacher ed, but they rarely look at the reasons for its chronic status problems. If they did, they might find an interesting story, one that presents a more sympathetic, if not more flattering, portrait of the education school. They would also find, however, a story that portrays the rest of academe in a manner that is less self-serving than in the standard account. The historical part of this story focuses on the way that American policy makers, taxpayers, students, and universities collectively produced exactly the kind of education school they wanted. The structural part focuses on the nature of teaching as a form of social practice and the problems involved in trying to prepare people to pursue this practice.

Enjoy.


Too Easy a Target:

The Trouble with Ed Schools and the Implications for the University

By David F. Labaree

This is supposed to be the era of political correctness on American university campuses, a time when speaking ill of oppressed minorities is taboo. But while academics have to tiptoe around most topics, there is still one subordinate group that can be shelled with impunity — the sad sacks who inhabit the university’s education school. There is no need to take aim at this target because it is too big to miss, and there is no need to worry about hitting innocent bystanders because everyone associated with the ed school is understood to be guilty as charged.

Of course, education in general is a source of chronic concern and an object of continuous criticism for most Americans. Yet, as the annual Gallup Poll of attitudes toward education shows, citizens give good grades to their local schools at the same time that they express strong fears about the quality of public education elsewhere in the country. The vision is one of general threats to education that have not yet reached the neighborhood school but may do so in the near future. These threats include everything from the multicultural curriculum to the decline in the family, the influence of television, and the consequences of chronic poverty.

One such threat is the hapless education school, whose alleged incompetence and supposedly misguided ideas are seen as producing poorly prepared teachers and inadequate curricula. For the public, this institution is remote enough to be suspect (unlike the local school) and accessible enough to be scorned (unlike the more arcane university). For the university faculty, it is the ideal scapegoat, allowing blame for problems with schools to fall upon teacher education in particular rather than higher education in general.

For years, writers from right to left have been making the same basic complaints about the inferior quality of education faculties, the inadequacy of education students, and, to quote James Koerner’s 1963 classic, The Miseducation of American Teachers, their “puerile, repetitious, dull, and ambiguous” curriculum. This kind of complaining about ed schools is as commonplace as griping about the cold in the middle of winter. But something new has arisen in the defamatory discourse about these beleaguered institutions: the attacks are now coming from their own leaders. The victims are joining the victimizers.

So how did things get this bad? No occupational group or subculture acquires a label as negative as this one without a long history of status deprivation. Critics complain about the weakness and irrelevance of teacher ed, but they rarely look at the reasons for its chronic status problems. If they did, they might find an interesting story, one that presents a more sympathetic, if not more flattering, portrait of the education school. They would also find, however, a story that portrays the rest of academe in a manner that is less self-serving than in the standard account. The historical part of this story focuses on the way that American policy makers, taxpayers, students, and universities collectively produced exactly the kind of education school they wanted. The structural part focuses on the nature of teaching as a form of social practice and the problems involved in trying to prepare people to pursue this practice.

Decline of Normal Schools

Most education schools grew out of the normal schools that emerged in the second half of the nineteenth century. Their founders initially had heady dreams that these schools could become model institutions that would establish high-quality professional preparation for teachers along with a strong professional identity. For a time, some of the normal schools came close to realizing these dreams.

Soon, however, burgeoning enrollments in the expanding common schools produced an intense demand for new teachers to fill a growing number of classrooms, and the normal schools turned into teacher factories. They had to produce many teachers quickly and cheaply, or else school districts around the country would hire teachers without this training — or perhaps any form of professional preparation. So normal schools adapted by stressing quantity over quality, establishing a disturbing but durable pattern of weak professional preparation and low academic standards.

At the same time, normal schools had to confront a strong consumer demand from their own students, many of whom saw the schools as an accessible form of higher education rather than as a site for teacher preparation. Located close to home, unlike the more centrally located state universities and land grant colleges, the normal schools were also easier to get into and less costly. As a result, many students enrolled who had little or no interest in teaching; instead, they wanted an advanced educational credential that would gain them admission to attractive white-collar positions. They resisted being trapped within a single vocational track — the teacher preparation program — and demanded a wide array of college-level liberal arts classes and programs. Since normal schools depended heavily on tuition for their survival, they had little choice but to comply with the demands of their “customers.”

This compliance reinforced the already-established tendency toward minimizing the extent and rigor of teacher education. It also led the normal schools to transform themselves into the model of higher education that their customers wanted, first by changing into teachers’ colleges (with baccalaureate programs for nonteachers), then into state liberal-arts colleges, and finally into the general-purpose regional state universities they are today.

As the evolving colleges moved away from being normal schools, teacher education programs became increasingly marginal within their own institutions, which were coming to imitate the multipurpose university by giving pride of place to academic departments, graduate study, and preparation for the more prestigious professions. Teacher education came to be perceived as every student’s second choice, and the ed school professors came to be seen as second-class citizens in the academy.

Market Pressures in the Present

Market pressures on education schools have changed over the years, but they have not declined. Teaching is a very large occupation in the United States, with about 3 million practitioners in total. To fill all the available vacancies, approximately one in every five college graduates must enter teaching each year. If education schools do not prepare enough candidates, state legislators will authorize alternative routes into the profession (requiring little or no professional education), and school boards will hire such prospects to place warm bodies in empty classrooms.

Education schools that try to increase the duration and rigor of teacher preparation by focusing more intensively on smaller cohorts of students risk leaving the bulk of teaching in the hands of practitioners who are prepared at less demanding institutions or who have not been prepared at all. In addition, such efforts run into strong opposition from within the university, which needs ed students to provide the numbers that bring legislative appropriations and tuition payments. Subsidies from the traditionally cost-effective teacher-education factories support the university’s more prestigious, but less lucrative, endeavors. As a result, universities do not want their ed schools to turn into boutique programs for the preparation of a few highly professionalized teachers.

Another related source of institutional resistance arises whenever education schools try to promote quality over quantity. This resistance comes from academic departments, which have traditionally relied on the ability of their universities to provide teaching credentials as a way to induce students to major in “impractical” subjects. Departments such as English, history, and music have sold themselves to undergraduates for years with the argument that “you can always teach” these subjects. As a result, these same departments become upset when the education school starts to talk about upgrading, downsizing, or limiting access.

Stigmatized Populations and Soft Knowledge

The fact that education schools serve stigmatized populations aggravates the market pressures that have seriously undercut the status and the role of these schools. One such population is women, who currently account for about 70 percent of American teachers. Another is the working class, whose members have sought out the respectable knowledge-based white-collar work of teaching as a way to attain middle-class standing. Children make up a third stigmatized population. In a society that rewards contact with adults more than contact with children, and in a university setting that is more concerned with serious adult matters than with kid stuff, education schools lose out, because they are indelibly associated with children.

Teachers also suffer from an American bias in favor of doing over thinking. Teachers are the largest and most visible single group of intellectual workers in the United States — that is, people who make their living through the production and transmission of ideas. More accessible than the others in this category, teachers constitute the street-level intellectuals of our society. As the only intellectuals with whom most people will ever have close contact, teachers take the brunt of the national prejudice against book learning and those pursuits that are scornfully labeled as “academic.”

Another problem facing education schools is the low status of the knowledge they deal with: it is soft rather than hard, applied rather than pure. Hard disciplines (which claim to produce findings that are verifiable, definitive, and cumulative) outrank soft disciplines (whose central problem is interpretation and whose findings are always subject to debate and reinterpretation by others). Likewise, pure intellectual pursuits (which are oriented toward theory and abstracted from particular contexts) outrank those that are applied (which concentrate on practical work and concrete needs).

Knowledge about education is necessarily soft. Education is an extraordinarily complex social activity carried out by quirky and willful actors, and it steadfastly resists any efforts to reduce it to causal laws or predictive theories. Researchers cannot even count on being able to build on the foundation of other people’s work, since the validity of this work is always only partially established. Instead, they must make the best of a difficult situation. They try to interpret what is going on in education, but the claims they make based on these interpretations are highly contingent. Education professors can rarely speak with unclouded authority about their area of expertise or respond definitively when others challenge their authority. Outsiders find it child’s play to demonstrate the weaknesses of educational research and hold it up for ridicule for being inexact, contradictory, and impotent.

Knowledge about education is also necessarily applied. Education is not a discipline, defined by a theoretical apparatus and a research methodology, but an institutional area. As a result, education schools must focus their energies on the issues that arise from this area and respond to the practical concerns confronting educational practitioners in the field — even if doing so leads them into areas in which their constructs are less effective and their chances for success less promising. This situation unavoidably undermines the effectiveness and the intellectual coherence of educational research and thus also calls into question the academic stature of the faculty members who produce that research.

No Prestige for Practical Knowledge

Another related knowledge-based problem faces the education school. A good case can be made for the proposition that American education — particularly higher education — has long placed a greater emphasis on the exchange value of the educational experience (providing usable credentials that can be cashed in for a good job) than on its use value (providing usable knowledge). That is, what consumers have sought and universities have sold in the educational marketplace is not the content of the education received at the university (what the student actually learns there) but the form of this education (what the student can buy with a university degree).

One result of this commodification process is that universities have a strong incentive to promote research over teaching, for publications raise the visibility and prestige of the institution much more effectively than does instruction (which is less visible and more difficult to measure). And a prestigious faculty raises the exchange value of the university’s diploma, independently of whatever is learned in the process of acquiring this diploma. By relying heavily on its faculty’s high-status work in fields of hard knowledge, the university’s marketing effort does not leave an honored role for an education school that produces soft knowledge about practical problems.

A Losing Status, but a Winning Role?

What all of this suggests is that education schools are poorly positioned to play the university status game. They serve the wrong clientele and produce the wrong knowledge; they bear the mark of their modest origins and their traditionally weak programs. And yet they are pressured by everyone from their graduates’ employers to their university colleagues to stay the way they are, since they fulfill so many needs for so many constituencies.

But consider for a moment what would happen if we abandoned the status perspective in establishing the value of higher education. What if we focused instead on the social role of the education school rather than its social position in the academic firmament? What if we considered the possibility that education schools — toiling away in the dark basement of academic ignominy — in an odd way have actually been liberated by this condition from the constraints of academic status attainment? Is it possible that ed schools may have stumbled on a form of academic practice that could serve as a useful model for the rest of the university? What if the university followed this model and stopped selling its degrees on the basis of institutional prestige grounded in the production of abstract research and turned its focus on instruction in usable knowledge?

Though the university status game, with its reliance on raw credentialism — the pursuit of university degrees as a form of cultural currency that can be exchanged for social position — is not likely to go away soon, it is now under attack. Legislators, governors, business executives, and educational reformers are beginning to declare that indeed the emperor is wearing no clothes: that there is no necessary connection between university degrees and student knowledge or between professorial production and public benefit; that students need to learn something when they are in the university; that the content of what they learn should have some intrinsic value; that professors need to develop ideas that have a degree of practical significance; and that the whole university enterprise will have to justify the huge public and private investment it currently requires.

The market-based pattern of academic life has always had an element of the confidence game, since the whole structure depends on a network of interlocking beliefs that are tenuous at best: the belief that graduates of prestigious universities know more and can do more than other graduates; the belief that prestigious faculty make for a good university; and the belief that prestigious research makes for a good faculty. The problem is, of course, that when confidence in any of these beliefs is shaken, the whole structure can come tumbling down. And when it does, the only recourse is to rebuild on the basis of substance rather than reputation, demonstrations of competence rather than symbols of merit.

This dreaded moment is at hand. The fiscal crisis of the state, the growing political demand for accountability and utility, and the intensification of competition in higher education are all undermining the credibility of the current pattern of university life. Today’s relentless demand for lower taxes and reduced public services makes it hard for the university to justify a high level of public funding on the grounds of prestige alone. State governments are demanding that universities produce measurable beneficial outcomes for students, businesses, and other taxpaying sectors of the community. And, by withholding higher subsidies, states are throwing universities into a highly competitive situation in which they vie with one another to see who can attract the most tuition dollars and the most outside research grants, and who can keep the tightest control over internal costs.

In this kind of environment, education schools have a certain advantage over many other colleges and departments in the university. Unlike their competitors across campus, they offer traditionally low-cost programs designed explicitly to be useful, both to students and to the community. They give students practical preparation for and access to a large sector of employment opportunities. Their research focuses on an area about which Americans worry a great deal, and they offer consulting services and policy advice. In short, their teaching, research, and service activities are all potentially useful to students and community alike. How many colleges of arts and letters can say the same?

But before we get carried away with the counterintuitive notion that ed schools might serve as a model for a university under fire, we need to understand that these brow-beaten institutions will continue to gain little credit for their efforts to serve useful social purposes, in spite of the current political saliency of such efforts. One reason for that is the peculiar nature of the occupation – teaching — for which ed schools are obliged to prepare candidates. Another is the difficulty that faces any academic unit that tries to walk the border between theory and practice.

A Peculiar Kind of Professional

Teaching is an extraordinarily complex job. Researchers have estimated that the average teacher makes upward of 150 conscious instructional decisions during the course of the day, each of which has potentially significant consequences for the students involved. From the standpoint of public relations, however, the key difficulty is that, for the outsider, teaching looks all too easy. Its work is so visible, the skills required to do it seem so ordinary, and the knowledge it seeks to transmit is so generic. Students spend a long time observing teachers at work. If you figure that the average student spends 6 hours a day in school for 180 days a year over the course of 12 years, that means that a high school graduate will have logged about 13,000 hours watching teachers do their thing. No other social role (with the possible exception of parent) is so well known to the general public. And certainly no other form of paid employment is so well understood by prospective practitioners before they take their first day of formal professional education.

By comparison, consider other occupations that require professional preparation in the university. Before entering medical, law, or business school, students are lucky if they have spent a dozen hours in close observation of a doctor, lawyer, or businessperson at work. For these students, professional school provides an introduction to the mysteries of an arcane and remote field. But for prospective teachers, the education school seems to offer at best a gloss on a familiar topic and at worst an unnecessary hurdle for twelve-year apprentices who already know their stuff.

Not only have teacher candidates put in what one scholar calls a long “apprenticeship of observation,” but they have also noted during this apprenticeship that the skills a teacher requires are no big deal. For one thing, ordinary adult citizens already know the subject matter that elementary and secondary school teachers seek to pass along to their students — reading, writing, and math; basic information about history, science, and literature; and so on. Because there is nothing obscure about these materials, teaching seems to have nothing about it that can match the mystery and opaqueness of legal contracts, medical diagnoses, or business accounting.

Of course, this perception by the prospective teacher and the public about the skills involved in teaching leaves out the crucial problem of how a teacher goes about teaching ordinary subjects to particular students. Reading is one thing, but knowing how to teach reading is another matter altogether. Ed schools seek to fill this gap in knowledge by focusing on the pedagogy of teaching particular subjects to particular students, but they do so over the resistance of teacher candidates who believe they already know how to teach and a public that fails to see pedagogy as a meaningful skill.

Compounding this resistance to the notion that teachers have special pedagogical skills is the student’s general experience (at least in retrospect) that learning is not that hard — and, therefore, by extension, that teaching is not hard either. Unlike doctors and lawyers, who use their arcane expertise for the benefit of the client without passing along the expertise itself, teachers are in the business of giving away their expertise. Their goal is to empower the student to the point at which the teacher is no longer needed and the student can function effectively without outside help. The best teachers make learning seem easy and make their own role in the learning process seem marginal. As a result, it is easy to underestimate the difficulty of being a good teacher — and of preparing people to become good teachers.

Finally, the education school does not have exclusive rights to the subject matter that teachers teach. The only part of the teacher’s knowledge over which the ed school has some control is the knowledge about how to teach. Teachers learn about English, history, math, biology, music, and other subjects from the academic departments at the university in charge of these areas of knowledge. Yet, despite the university’s shared responsibility for preparing teachers, ed schools are held accountable for the quality of the teachers and other educators they produce, often taking the blame for the deficiencies of an inadequate university education.

The Border Between Theory and Practice

The intellectual problem facing American education schools is as daunting as the instructional problem, for the territory in which ed schools do research is the mine-strewn border between theory and practice. Traditionally, the university’s peculiar area of expertise has been theory, while the public school is a realm of practice.  In reality, the situation is more complicated, since neither institution can function without relying on both forms of knowledge. Education schools exist, in part, to provide a border crossing between these two countries, each with its own distinctive language and culture and its own peculiar social structure. When an ed school is working well, it presents a model of fluid interaction between university and school and encourages others on both sides of the divide to follow suit. The ideal is to encourage the development of teachers and other educators who can draw on theory to inform their instructional practice, while encouraging university professors to become practice-oriented theoreticians, able to draw on issues from practice in their theory building and to produce theories with potential use value.

In reality, no education school (or any other institution, for that matter) can come close to meeting this ideal. The tendency is to fall on one side of the border or the other — where life is more comfortable and the responsibilities more clear cut — rather than to hold the middle ground and retain the ability to work well in both domains.

But because of their location in the university and their identification with elementary and secondary schools, ed schools have had to keep working along the border. In the process, they draw unrelenting fire from both sides. The university views colleges of education as nothing but trade schools, which supply vocational training but no academic curriculum. Students, complaining that ed-school courses are too abstract and academic, demand more field experience and fewer course requirements. From one perspective, ed-school research is too soft, too applied, and totally lacking in academic rigor, while from another, it is impractical and irrelevant, serving a university agenda while being largely useless to the schools.

Of course, both sides may be right. After years of making and attending presentations at the annual meeting of the American Educational Research Association, I am willing to concede that much of the work produced by educational researchers is lacking in both intellectual merit and practical application. But I would also argue that there is something noble and necessary about the way that the denizens of ed schools continue their quest for a workable balance between theory and practice. If only others in the academy would try to accomplish a marriage of academic elegance and social impact.

A Model for Academe

So where does this leave us in thinking about the poor beleaguered ed school? And what lessons, if any, can be learned from its checkered history?

The genuine instructional and intellectual weakness of ed schools results from the way the schools did what was demanded of them, which, though understandable, was not exactly honorable. Even so, much of the scorn that has come down on the ed school stems from its lowly status rather than from any demonstrable deficiencies in the educational role it has played. But then institutional status has a circular quality about it, which means that predictions of high or low institutional quality become self-fulfilling.

In some ways, ed schools have been doing things right. They have wrestled vigorously (if not always to good effect) with the problems of public education, an area that is of deep concern to most citizens. This has meant tackling social problems of great complexity and practical importance, even though the university does not place much value on the production of this kind of messy, indeterminate, and applied knowledge.

Oddly enough, the rest of the university could learn a lot from the example of the ed school. The question, however, is whether others in the university will see the example of the ed school as positive or negative. If academics consider this story in light of the current political and fiscal climate, then the ed school could serve as a model for a way to meet growing public expectations for universities to teach things that students need to know and to generate knowledge that benefits the community.

But it seems more likely that academics will consider this story a cautionary tale about how risky and unrewarding such a strategy can be. After all, education schools have demonstrated that they are neither very successful at accomplishing the marriage of theory and practice nor well rewarded for trying. In fact, the odor of failure and disrespect continues to linger in the air around these institutions. In light of such considerations, academics are likely to feel more comfortable placing their chips in the university’s traditional confidence game, continuing to pursue academic status and to market educational credentials. And from this perspective, the example of the ed school is one they should avoid like the plague. 

Posted in Meritocracy, Politics

Sandel: Disdain for the Less Educated Is the Last Acceptable Prejudice

This post is an op-ed by Michael Sandel, drawing on his new book, The Tyranny of Merit. It was originally published in the New York Times on September 2, 2020.  Here’s a link to the original.

He’s talking about a critical problem that arises from the American meritocracy. What it’s supposed to do is allocate positions according to individual merit instead of birth.  But the default approach is to measure merit by the amount of education you acquire (AA, BA, MA, MD, etc.) and, within each degree level, by the prestige (read: exclusivity) of the college you attended.  Not only does this make for a situation where small differences in merit (college rank, for example) yield large differences in reward.  It also constructs a value system that grants inflated esteem to the work done by the highly educated and denigrates the work done by the less educated.  As a result, Sandel points out, we find ourselves in a cultural setting where disdain for the less educated is a perfectly acceptable form of prejudice.  This has poisoned our politics, fueling populist anger and dysfunctional government.

This critique of the meritocracy resonates with a number of pieces I have posted on this subject:  here, here, here, here, and here.

It’s having a corrosive effect on American life — and hurting the Democratic Party.

By Michael Sandel

Joe Biden has a secret weapon in his bid for the presidency: He is the first Democratic nominee in 36 years without a degree from an Ivy League university.

This is a potential strength. One of the sources of Donald Trump’s political appeal has been his ability to tap into resentment against meritocratic elites. By the time of Mr. Trump’s election, the Democratic Party had become a party of technocratic liberalism more congenial to the professional classes than to the blue-collar and middle-class voters who once constituted its base. In 2016, two-thirds of whites without a college degree voted for Mr. Trump, while Hillary Clinton won more than 70 percent of voters with advanced degrees.

Being untainted by the Ivy League credentials of his predecessors may enable Mr. Biden to connect more readily with the blue-collar workers the Democratic Party has struggled to attract in recent years. More important, this aspect of his candidacy should prompt us to reconsider the meritocratic political project that has come to define contemporary liberalism.

At the heart of this project are two ideas: First, in a global, technological age, higher education is the key to upward mobility, material success and social esteem. Second, if everyone has an equal chance to rise, those who land on top deserve the rewards their talents bring.

This way of thinking is so familiar that it seems to define the American dream. But it has come to dominate our politics only in recent decades. And despite its inspiring promise of success based on merit, it has a dark side.

Building a politics around the idea that a college degree is a precondition for dignified work and social esteem has a corrosive effect on democratic life. It devalues the contributions of those without a diploma, fuels prejudice against less-educated members of society, effectively excludes most working people from elective government and provokes political backlash.

Here is the basic argument of mainstream political opinion, especially among Democrats, that dominated in the decades leading up to Mr. Trump and the populist revolt he came to represent: A global economy that outsources jobs to low-wage countries has somehow come upon us and is here to stay. The central political question is not how to change it but how to adapt to it, to alleviate its devastating effect on the wages and job prospects of workers outside the charmed circle of elite professionals.

The answer: Improve the educational credentials of workers so that they, too, can “compete and win in the global economy.” Thus, the way to contend with inequality is to encourage upward mobility through higher education.

The rhetoric of rising through educational achievement has echoed across the political spectrum — from Bill Clinton to George W. Bush to Barack Obama to Hillary Clinton. But the politicians espousing it have missed the insult implicit in the meritocratic society they are offering: If you did not go to college, and if you are not flourishing in the new economy, your failure must be your own fault.

Posted in Educational goals, Inequality, Schooling, Social mobility

Guhin: How Covid Can Change What Schools Are For

This post is a short essay by Jeffrey Guhin published on August 27, 2020 in Hedgehog Review.  In it he puts forth an argument about the purpose of schooling that resonates with some of my own work, including recent posts here such as this, this, and this.  Here’s a link to the original.


How COVID Can Change What Schools Are For

What if the purpose of education has nothing to do with social mobility?

Jeffrey Guhin

As school years start up again, just about everyone agrees COVID is hurting education, except, perhaps, those with healthy financial stakes in online learning. Parents are exhausted, students are bored, and teachers and staff are overwhelmed, plus terrified of getting sick. Yet if COVID does nothing else for education, it might force all of us to spend a bit more time examining what all this educational effort is actually for.

Listen to most policymakers, and you’d probably guess the purpose of education is social mobility—lifting poor kids out of poverty and getting middle-class kids into Harvard, even if schools aren’t achieving that goal. Meanwhile, the more radical among us argue that schools are simply doing what they were always intended to do, not fix inequality but maintain it, all the while convincing both winners and losers that where they wind up is where they deserve to be.

But what if—and just go with me here for a second—the purpose of education has nothing to do with social mobility? What if we let schools off the hook for fixing social inequalities and just fixed those inequalities instead? What if we took money from the wealthy and took privileges from the entrenched and we gave these boons to those who needed them more? There are dozens of ways such changes could take shape: wealth taxes, reparations for black and indigenous Americans, stronger unions, and universal child care, to name just a few.

As things are, a focus on social mobility pits students, families, and schools against each other for ever-diminishing resources, making it easy to forget that education could just as easily be about community as it is about competition. Whether it’s school choice and vouchers or simply ensuring it’s your kids who get the best teachers and COVID pods, schooling as the solution to social mobility helps to reinforce that education, and, well, life, are about each individual getting ahead.  And that cynicism seeps into the experience of learning itself.

When we’re obsessed with schools as the primary solution to social inequality, the content of learning—everything from fractions to Franz Ferdinand—becomes a means to an end. Some of that content might be useful (like learning to read or touch-type). Students might even find some of it meaningful, like the elegance of a complex math solution or the thrill of a well-crafted experiment. Yet all too often that content is treated as a checkbox to complete, with each lesson bringing students one step closer to the degree, the credential they really need.

Education becomes no longer about what students do but rather where students arrive, and so it’s no wonder young people feel alienated by all the time between beginning and end. And, because of our unwillingness to consider more dramatic solutions to inequality, these students and their families know they have no other choice. Families might believe the strangely radical idea that your value as a person is entirely separate from your achievement in school. But until we live in a society that sees inequality as a problem rather than a justification, that kind of commitment to human dignity is a pretty idea that can’t pay the bills.

Concerns about social mobility also dominate discussions about COVID and education. Marginalized students fall behind benchmarks while privileged students get further ahead, whether via pods, readily available parents, or simply the certainty they have the medical care and financial support to handle whatever COVID University sends their way. Fixing these inequalities is as necessary as ever, and they highlight the vital role schools play in the lives of our nation’s children. Schools, often quite literally, feed those who need to be fed.

Yet too often schools are tasked not simply with caring for their students but with repairing an entire social order. Schools can do so much we do not ask of them, like developing solidarity, fostering political responsibility, and ensuring a love of learning for its own sake. Yet the one thing we are most insistent they accomplish, the ensuring of “equal opportunity,” is something even the best school is simply not capable of achieving. We’re asking our schools to do the wrong things, and then blaming them, and their students, when they fail. And now we’re going to try it all again while remote learning.

It doesn’t have to be that way. The potential freedom new COVID syllabi and pedagogies provide us could give room for different ways to think about education, both while we’re in this mess and maybe even when we get back to our classrooms. In her vital work on the purpose of schools, bell hooks provides a model for such rethinking, echoing the landmark work of John Dewey, Anna J. Cooper, and W.E.B. Du Bois.

hooks helps us recognize that what we learn must be connected to how we live, centered within the relationships students forge with their teachers and with each other. Learning should always be meaningful—capable, at any moment, of bringing us to moral and political transformations. It is through this context of a safe and trusting community that students can learn about injustice and privilege, about problems they can see in their own stories or recognize as their own responsibilities. Ironically, educating for justice becomes much easier when education is no longer considered the only means of building a just society.

As we get ready for another COVID semester, the stress of combining full-time work and full-time de facto homeschooling is matched by the sadness of smushing all the power and beauty of education into the meritocratic ideology it has come to represent. Our students, our children, are more than achieving automatons. Yet this is where our focus on schools as agents of social mobility has brought them, and us. Don’t let the crisis go to waste. Fix inequality in whatever ways we can. And then we can let education actually be about education, even if we’re still just doing it at home.

Posted in Credentialing, Higher Education, Meritocracy

Rampell — It Takes a B.A. to Find a Job as a File Clerk

This blog post is a still salient 2013 article from the New York Times about credential inflation in the American job market. Turns out that if you want to be a file clerk or runner at a law firm these days, you’re going to need a four-year college degree. Here’s a link to the original.


February 19, 2013

It Takes a B.A. to Find a Job as a File Clerk

By CATHERINE RAMPELL

ATLANTA — The college degree is becoming the new high school diploma: the new minimum requirement, albeit an expensive one, for getting even the lowest-level job.

Consider the 45-person law firm of Busch, Slipakoff & Schuh here in Atlanta, a place that has seen tremendous growth in the college-educated population. Like other employers across the country, the firm hires only people with a bachelor’s degree, even for jobs that do not require college-level skills.

This prerequisite applies to everyone, including the receptionist, paralegals, administrative assistants and file clerks. Even the office “runner” — the in-house courier who, for $10 an hour, ferries documents back and forth between the courthouse and the office — went to a four-year school.

“College graduates are just more career-oriented,” said Adam Slipakoff, the firm’s managing partner. “Going to college means they are making a real commitment to their futures. They’re not just looking for a paycheck.”

Economists have referred to this phenomenon as “degree inflation,” and it has been steadily infiltrating America’s job market. Across industries and geographic areas, many other jobs that didn’t use to require a diploma — positions like dental hygienists, cargo agents, clerks and claims adjusters — are increasingly requiring one, according to Burning Glass, a company that analyzes job ads from more than 20,000 online sources, including major job boards and small- to midsize-employer sites.

This up-credentialing is pushing the less educated even further down the food chain, and it helps explain why the unemployment rate for workers with no more than a high school diploma is more than twice that for workers with a bachelor’s degree: 8.1 percent versus 3.7 percent.

Some jobs, like those in supply chain management and logistics, have become more technical, and so require more advanced skills today than they did in the past. But more broadly, because so many people are going to college now, those who do not graduate are often assumed to be unambitious or less capable.

Plus, it’s a buyer’s market for employers.

“When you get 800 résumés for every job ad, you need to weed them out somehow,” said Suzanne Manzagol, executive recruiter at Cardinal Recruiting Group, which does headhunting for administrative positions at Busch, Slipakoff & Schuh and other firms in the Atlanta area.

Of all the metropolitan areas in the United States, Atlanta has had one of the largest inflows of college graduates in the last five years, according to an analysis of census data by William Frey, a demographer at the Brookings Institution. In 2012, 39 percent of job postings for secretaries and administrative assistants in the Atlanta metro area requested a bachelor’s degree, up from 28 percent in 2007, according to Burning Glass.

“When I started recruiting in ’06, you didn’t need a college degree, but there weren’t that many candidates,” Ms. Manzagol said.

Even if they are not exactly applying the knowledge they gained in their political science, finance and fashion marketing classes, the young graduates employed by Busch, Slipakoff & Schuh say they are grateful for even the rotest of rote office work they have been given.

“It sure beats washing cars,” said Landon Crider, 24, the firm’s soft-spoken runner.

He would know: he spent several years, while at Georgia State and in the months after graduation, scrubbing sedans at Enterprise Rent-a-Car. Before joining the law firm, he was turned down for a promotion to rental agent at Enterprise — a position that also required a bachelor’s degree — because the company said he didn’t have enough sales experience.

His college-educated colleagues had similarly limited opportunities, working at Ruby Tuesday or behind a retail counter while waiting for a better job to open up.

“I am over $100,000 in student loan debt right now,” said Megan Parker, who earns $37,000 as the firm’s receptionist. She graduated from the Art Institute of Atlanta in 2011 with a degree in fashion and retail management, and spent months waiting on “bridezillas” at a couture boutique, among other stores, while churning out office-job applications.

“I will probably never see the end of that bill, but I’m not really thinking about it right now,” she said. “You know, this is a really great place to work.”

The risk with hiring college graduates for jobs they are supremely overqualified for is, of course, that they will leave as soon as they find something better, particularly as the economy improves.

Mr. Slipakoff said his firm had little turnover, though, largely because of its rapid expansion. The company has grown to more than 30 lawyers from five in 2008, plus a support staff of about 15, and promotions have abounded.

“They expect you to grow, and they want you to grow,” said Ashley Atkinson, who graduated from Georgia Southern University in 2009 with a general studies degree. “You’re not stuck here under some glass ceiling.”

Within a year of being hired as a file clerk, around Halloween 2011, Ms. Atkinson was promoted twice to positions in marketing and office management. Mr. Crider, the runner, was given additional work last month, helping with copying and billing claims. He said he was taking the opportunity to learn more about the legal industry, since he plans to apply to law school next year.

The firm’s greatest success story is Laura Burnett, who in less than a year went from being a file clerk to being the firm’s paralegal for the litigation group. The partners were so impressed with her filing wizardry that they figured she could handle it.

“They gave me a raise, too,” said Ms. Burnett, a 2011 graduate of the University of West Georgia.

The typical paralegal position, which has traditionally offered a path to a well-paying job for less educated workers, requires no more than an associate degree, according to the Labor Department’s occupational handbook, but the job is still a step up from filing. Of the three daughters in her family, Ms. Burnett reckons that she has the best job. One sister, a fellow West Georgia graduate, is processing insurance claims; another, who dropped out of college, is one of the many degree-less young people who still cannot find work.

Besides the promotional pipelines it creates, setting a floor of college attainment also creates more office camaraderie, said Mr. Slipakoff, who handles most of the firm’s hiring and is especially partial to his fellow University of Florida graduates. There is a lot of trash-talking of each other’s college football teams, for example. And this year the office’s Christmas tree ornaments were a colorful menagerie of college mascots — Gators, Blue Devils, Yellow Jackets, Wolves, Eagles, Tigers, Panthers — in which just about every staffer’s school was represented.

“You know, if we had someone here with just a G.E.D. or something, I can see how they might feel slighted by the social atmosphere here,” he says. “There really is something sort of cohesive or binding about the fact that all of us went to college.”

Posted in Democracy, History, Liberty, Race

The Central Link between Liberty and Slavery in American History

In this post, I explore insights from two important books about the peculiar way in which liberty and slavery jointly emerged from the context of colonial America. One is a new book by David Stasavage, The Decline and Rise of Democracy. The other is a 1992 book by Toni Morrison, Playing in the Dark: Whiteness and the Literary Imagination. The core point I draw from Stasavage is that the same factors that nurtured the development of political liberty in the American context also led to the development of slavery. The related point I draw from Morrison is that the existence of slavery was fundamental in energizing the colonists’ push for self-rule.

Stasavage Cover

The Stasavage book explores the history of democracy in the world, starting with early forms that emerged in premodern North America, Europe, and Africa and then fell into decline, followed by the rise of modern parliamentary democracy.  He contrasts this with an alternative form of governance, autocracy, which grew up in a large number of times and places but appeared earliest and most enduringly in China.

He argues that three conditions were necessary for the emergence of early democracy. One is small scale, which allows people to confer as a group instead of relying on a distant leader.  Another is that rulers lacked knowledge of what people were producing, the kind of knowledge an administrative bureaucracy could provide, which meant they needed to share power in order to levy taxes effectively.  But I want to focus on the third factor — the existence of an exit option — which is most salient to the colonial American case.  Here’s how he describes it:

The third factor that led to early democracy involved the balance between how much rulers needed their people and how much people could do without their rulers. When rulers had a greater need for revenue, they were more likely to accept governing in a collaborative fashion, and this was even more likely if they needed people to fight wars. With inadequate means of simply compelling people to fight, rulers offered them political rights. The flip side of all this was that whenever the populace found it easier to do without a particular ruler—say by moving to a new location—then rulers felt compelled to govern more consensually. The idea that exit options influence hierarchy is, in fact, so general it also applies to species other than humans. Among species as diverse as ants, birds, and wasps, social organization tends to be less hierarchical when the costs of what biologists call “dispersal” are low.

The central factor that supported the development of democracy in the British colonies was the scarcity of labor:

A broad manhood suffrage took hold in the British part of colonial North America not because of distinctive ideas but for the simple reason that in an environment where land was abundant and labor was scarce, ordinary people had good exit options. This was the same fundamental factor that had favored democracy in other societies.

And this was also the factor that promoted slavery: “Political rights for whites and slavery for Africans derived from the same underlying environmental condition of labor scarcity.” Because of this scarcity, agricultural enterprises in the North American colonies needed a way to ensure a flow of laborers to the colonies and a way to keep them on the job once they got there. The central mechanisms for doing so were indentured servitude and slavery. Some indentured servants were recruited in Britain with the promise of free passage to the New World in return for a contract to work for a certain number of years. Others were simply kidnapped, shipped, and then forced to work off their passage. At the same time, Africans initially came to the colonies in a variety of statuses, but this increasingly shifted toward full slavery. Here’s how he describes the situation in the Tidewater colonies.

The early days of forced shipment of English to Virginia sounds like it would have been an environment ripe for servitude once they got there. In fact, it did not always work that way. Once they finished their period of indenture, many English migrants established farms of their own. This exit option must have been facilitated by the fact that they looked like Virginia’s existing British colonists, and they also sounded like them. They would have also shared a host of other cultural commonalities. In other words, they had a good outside option.

Now consider the case of Africans in Virginia, Maryland, and the other British colonies in North America who began arriving in 1619. The earliest African arrivals to Virginia and Maryland came in a variety of situations. Some were free and remained so, some were indentured under term contracts analogous to those of many white migrants, and some came entirely unfree. Outside options also mattered for Africans, and for several obvious reasons they were much worse than those for white migrants. Africans looked different than English people, they most often would not have arrived speaking English, or being aware of English cultural practices, and there is plenty of evidence that people in Elizabethan and Jacobean England associated dark skin with inferiority or other negative qualities. Outside options for Africans were remote to nonexistent. The sustainability of slavery in colonies like Virginia and Maryland depended on Africans not being able to escape and find labor elsewhere. For slave owners it of course helped that they had the law on their side. This law evolved quickly to define exactly what a “slave” was, there having been no prior juridical definition of the term. Africans were now to be slaves whereas kidnapped British boys were bound by “the custom of the country,” meaning that eventual release could be expected.

So labor scarcity and the existence of an attractive exit option provided the formative conditions for developing both white self-rule and Black enslavement.


Toni Morrison’s book is a reflection on the enduring impact of whiteness and blackness in shaping American literature. In the passage below, from the chapter titled “Romancing the Shadow,” she is talking about the romantic literary tradition in the U.S.

There is no romance free of what Herman Melville called “the power of blackness,” especially not in a country in which there was a resident population, already black, upon which the imagination could play; through which historical, moral, metaphysical, and social fears, problems, and dichotomies could be articulated. The slave population, it could be and was assumed, offered itself up as surrogate selves for meditation on problems of human freedom, its lure and its elusiveness. This black population was available for meditations on terror — the terror of European outcasts, their dread of failure, powerlessness, Nature without limits, natal loneliness, internal aggression, evil, sin, greed. In other words, this slave population was understood to have offered itself up for reflections on human freedom in terms other than the abstractions of human potential and the rights of man.

The ways in which artists — and the society that bred them — transferred internal conflicts to a “blank darkness,” to conveniently bound and violently silenced black bodies, is a major theme in American literature. The rights of man, for example, an organizing principle upon which the nation was founded, was inevitably yoked to Africanism. Its history, its origin is permanently allied with another seductive concept: the hierarchy of race…. The concept of freedom did not emerge in a vacuum. Nothing highlighted freedom — if it did not in fact create it — like slavery.

Black slavery enriched the country’s creative possibilities. For in that construction of blackness and enslavement could be found not only the not-free but also, with the dramatic polarity created by skin color, the projection of the not-me. The result was a playground for the imagination. What rose up out of collective needs to allay internal fears and to rationalize external exploitation was an American Africanism — a fabricated brew of darkness, otherness, alarm, and desire that is uniquely American.

Such a lovely passage describing such an ugly distinction. She’s saying that for Caucasian plantation owners in the Tidewater colonies, the presence of Black slaves was a vivid and visceral reminder of what it means to be not-free and thus decidedly not-me. For people like Jefferson and Washington and Madison, the most terrifying form of unfreedom was in their faces every day. More than their pale brethren in the Northern colonies, they had a compelling desire never to be treated by the king even remotely the way they treated their own slaves.

“The concept of freedom did not emerge in a vacuum. Nothing highlighted freedom — if it did not in fact create it — like slavery.”