How the Ivy League Broke America
NEWS | 16 November 2024
Every coherent society has a social ideal—an image of what the superior person looks like. In America, from the late 19th century until sometime in the 1950s, the superior person was the Well-Bred Man. Such a man was born into one of the old WASP families that dominated the elite social circles on Fifth Avenue, in New York City; the Main Line, outside Philadelphia; Beacon Hill, in Boston. He was molded at a prep school like Groton or Choate, and came of age at Harvard, Yale, or Princeton. In those days, you didn’t have to be brilliant or hardworking to get into Harvard, but it really helped if you were “clubbable”—good-looking, athletic, graceful, casually elegant, Episcopalian, and white. It really helped, too, if your dad had gone there.

Once on campus, studying was frowned upon. Those who cared about academics—the “grinds”—were social outcasts. But students competed ferociously to get into the elite social clubs: Ivy at Princeton, Skull and Bones at Yale, the Porcellian at Harvard. These clubs provided the well-placed few with the connections that would help them ascend to white-shoe law firms, to prestigious banks, to the State Department, perhaps even to the White House. (From 1901 to 1921, every American president went to Harvard, Yale, or Princeton.) People living according to this social ideal valued not academic accomplishment but refined manners, prudent judgment, and the habit of command. This was the age of social privilege.

And then a small group of college administrators decided to blow it all up. The most important of them was James Conant, the president of Harvard from 1933 to 1953.
Conant looked around and concluded that American democracy was being undermined by a “hereditary aristocracy of wealth.” American capitalism, he argued, was turning into “industrial feudalism,” in which a few ultrarich families had too much corporate power. Conant did not believe the United States could rise to the challenges of the 20th century if it was led by the heirs of a few incestuously interconnected Mayflower families. So Conant and others set out to get rid of admissions criteria based on bloodlines and breeding and replace them with criteria centered on brainpower. His system was predicated on the idea that the highest human trait is intelligence, and that intelligence is revealed through academic achievement. By shifting admissions criteria in this way, he hoped to realize Thomas Jefferson’s dream of a natural aristocracy of talent, culling the smartest people from all ranks of society.

Conant wanted to create a nation with more social mobility and less class conflict. He presided during a time, roughly the middle third of the 20th century, when people had lavish faith in social-engineering projects and central planning—in using scientific means to, say, run the Soviet economy, or build new cities like Brasília, or construct a system of efficiency-maximizing roadways that would have cut through Greenwich Village.

In trying to construct a society that maximized talent, Conant and his peers were governed by the common assumptions of the era: Intelligence, that highest human trait, can be measured by standardized tests and the ability to do well in school from ages 15 to 18. Universities should serve as society’s primary sorting system, segregating the smart from the not smart. Intelligence is randomly distributed across the population, so sorting by intelligence will yield a broad-based leadership class. Intelligence is innate, so rich families won’t be able to buy their kids higher grades.

As Conant put it, “At least half of higher education, I believe, is a matter of selecting, sorting, and classifying students.” By reimagining college-admissions criteria, Conant hoped to spark a social and cultural revolution. The age of the Well-Bred Man was vanishing. The age of the Cognitive Elite was here.

At first, Conant’s record did not match his rhetoric. He couldn’t afford to offend the rich families who supplied Harvard with its endowment. In 1951, 18 years into his presidency, the university was still accepting 94 percent of its legacy applicants. When Jews with high grades and test scores began to flood in, Harvard limited the number of applicants it would consider from New Jersey and parts of New York—places that had a lot of Jews.

But eventually Conant’s vision triumphed and helped comprehensively refashion American life. If you control the choke points of social mobility, then you control the nation’s culture. And if you change the criteria for admission at places such as Harvard, Yale, and Princeton, then you change the nation’s social ideal. When universities like Harvard shifted their definition of ability, large segments of society adjusted to meet that definition. The effect was transformative, as though someone had turned on a powerful magnet and filaments across wide swaths of the culture suddenly snapped to attention in the same direction.

Status markers changed. In 1967, the sociologist Daniel Bell noted that the leadership in the emerging social order was coming from “the intellectual institutions.” “Social prestige and social status,” he foresaw, “will be rooted in the intellectual and scientific communities.”

Family life changed as parents tried to produce the sort of children who could get into selective colleges.
Over time, America developed two entirely different approaches to parenting. Working-class parents still practice what the sociologist Annette Lareau, in her book Unequal Childhoods, called “natural growth” parenting. They let kids be kids, allowing them to wander and explore. College-educated parents, in contrast, practice “concerted cultivation,” ferrying their kids from one supervised skill-building, résumé-enhancing activity to another. It turns out that if you put parents in a highly competitive status race, they will go completely bonkers trying to hone their kids into little avatars of success.

Elementary and high schools changed too. The time dedicated to recess, art, and shop class was reduced, in part so students could spend more of their day enduring volleys of standardized tests and Advanced Placement classes. Today, even middle-school students have been so thoroughly assessed that they know whether the adults have deemed them smart or not. The good test-takers get funneled into the meritocratic pressure cooker; the bad test-takers learn, by about age 9 or 10, that society does not value them the same way. (Too often, this eventually leads them to simply check out from school and society.) By 11th grade, the high-IQ students and their parents have spent so many years immersed in the college-admissions game that they, like 18th-century aristocrats evaluating which family has the most noble line, are able to make all sorts of fine distinctions about which universities have the most prestige: Princeton is better than Cornell; Williams is better than Colby. Universities came to realize that the more people they reject, the more their cachet soars. Some of these rejection academies run marketing campaigns to lure more and more applicants—and then brag about turning away 96 percent of them.

America’s opportunity structure changed as well. It’s gotten harder to secure a good job if you lack a college degree, especially an elite college degree.
When I started in journalism, in the 1980s, older working-class reporters still roamed the newsroom. Today, journalism is a profession reserved almost exclusively for college grads, especially elite ones. A 2018 study found that more than 50 percent of the staff writers at The New York Times and The Wall Street Journal had attended one of the 34 most elite universities or colleges in the nation. A broader study, published in Nature this year, looked at high achievers across a range of professions—lawyers, artists, scientists, business and political leaders—and found the same phenomenon: 54 percent had attended the same 34 elite institutions. The entire upper-middle-class job market now looks, as the writer Michael Lind has put it, like a candelabrum: “Those who manage to squeeze through the stem of a few prestigious colleges and universities,” Lind writes, “can then branch out to fill leadership positions in almost every vocation.”

When Lauren Rivera, a sociologist at Northwestern, studied how elite firms in finance, consulting, and law select employees, she found that recruiters are obsessed with college prestige, typically identifying three to five “core” universities where they will do most of their recruiting—perhaps Harvard, Yale, Princeton, Stanford, and MIT. Then they identify five to 15 additional schools—the likes of Amherst, Pomona, and Berkeley—from which they will more passively accept applications. The résumés of students from other schools will almost certainly never even get read. “Number one people go to number one schools” is how one lawyer explained her firm’s recruiting principle to Rivera. That’s it, in a sentence: Conant’s dream of universities as the engines of social and economic segregation has been realized.

Did We Get a Better Elite?

Conant’s reforms should have led to an American golden age. The old WASP aristocracy had been dethroned. A more just society was being built. Some of the fruits of this revolution are pretty great.
Over the past 50 years, the American leadership class has grown smarter and more diverse. Classic achiever types such as Hillary Clinton, Barack Obama, Jamie Dimon, Ketanji Brown Jackson, Lin-Manuel Miranda, Pete Buttigieg, Julián Castro, Sundar Pichai, Jeff Bezos, and Indra Nooyi have been funneled through prestigious schools and now occupy key posts in American life. The share of well-educated Americans has risen, and the amount of bigotry—against women, Black people, the LGBTQ community—has declined. Researchers at the University of Chicago and Stanford measured America’s economic growth per person from 1960 to 2010 and concluded that up to two-fifths of America’s increased prosperity during that time can be explained by better identification and allocation of talent.

And yet it’s not obvious that we have produced either a better leadership class or a healthier relationship between our society and its elites. Generations of young geniuses were given the most lavish education in the history of the world, and then decided to take their talents to finance and consulting. For instance, Princeton’s unofficial motto is “In the nation’s service and the service of humanity”—and yet every year, about a fifth of its graduating class decides to serve humanity by going into banking or consulting or some other well-remunerated finance job.

Would we necessarily say that government, civic life, the media, or high finance work better now than in the mid-20th century? We can scorn the smug WASP blue bloods from Groton and Choate—and certainly their era’s retrograde views of race and gender—but their leadership helped produce the Progressive movement, the New Deal, victory in World War II, the Marshall Plan, NATO, and the postwar Pax Americana.
After the meritocrats took over in the 1960s, we got quagmires in Vietnam and Afghanistan, needless carnage in Iraq, the 2008 financial crisis, the toxic rise of social media, and our current age of political dysfunction. Today, 59 percent of Americans believe that our country is in decline, 69 percent believe that the “political and economic elite don’t care about hard-working people,” 63 percent think experts don’t understand their lives, and 66 percent believe that America “needs a strong leader to take the country back from the rich and powerful.” In short, under the leadership of our current meritocratic class, trust in institutions has plummeted to the point where, three times since 2016, a large mass of voters has shoved a big middle finger in the elites’ faces by voting for Donald Trump.

The Six Sins of the Meritocracy

I’ve spent much of my adult life attending or teaching at elite universities. They are impressive institutions filled with impressive people. But they remain stuck in the apparatus that Conant and his peers put in place before 1950. In fact, all of us are trapped in this vast sorting system. Parents can’t unilaterally disarm, lest their children get surpassed by the children of the tiger mom down the street. Teachers can’t teach what they love, because the system is built around teaching to standardized tests. Students can’t focus on the academic subjects they’re passionate about, because the gods of the grade point average demand that they get straight A’s. Even being a well-rounded kid with multiple interests can be self-defeating, because admissions officers are seeking the proverbial “spiky” kids—the ones who stand out for having cultivated some highly distinct skill or identity. All of this militates against a childhood full of curiosity and exploration.

Most admissions officers at elite universities genuinely want to see each candidate as a whole person.
They genuinely want to build a campus with a diverse community and a strong learning environment. But they, like the rest of us, are enmeshed in the mechanism that segregates not by what we personally admire, but by what the system, typified by the U.S. News & World Report college rankings, demands. (In one survey, 87 percent of admissions officers and high-school college counselors said the U.S. News rankings force schools to take measures that are “counterproductive” to their educational mission.) In other words, we’re all trapped in a system that was built on a series of ideological assumptions that were accepted 70 or 80 years ago but that now look shaky or just plain wrong. The six deadly sins of the meritocracy have become pretty obvious.

1. The system overrates intelligence.

Conant’s sorting mechanism was based primarily on intelligence, a quality that can ostensibly be measured by IQ tests or other standardized metrics. Under the social regime that Conant pioneered, as the historian Nathaniel Comfort has put it, “IQ became a measure not of what you do, but of who you are—a score for one’s inherent worth as a person.” Today’s elite school admissions officers might want to look at the whole person—but they won’t read your beautiful essay if you don’t pass the first threshold of great intelligence, as measured by high grades and sparkling SAT or ACT scores.

Intelligence is important. Social scientists looking at large populations of people consistently find that high IQ correlates with greater academic achievement in school and higher incomes in adulthood. The Study of Mathematically Precocious Youth, based at Vanderbilt, found that high SAT scores at 12 or 13 correlate with the number of doctorates earned and patents issued.
Many elite colleges that had dropped standardized testing as an application requirement are now mandating it again, precisely because the scores do provide admissions officers with a reliable measure of the intellectual abilities that correlate with academic performance and with achievement later in life.

But intelligence is less important than Conant and his peers believed. Two people with identical IQ scores can vary widely in their life outcomes. If you rely on intelligence as the central proxy for ability, you will miss 70 percent of what you want to know about a person. You will also leach some of the humanity from the society in which you live.

Starting in the 1920s, the psychologist Lewis Terman and his colleagues at Stanford tracked roughly 1,500 high-IQ kids through life. The Termites, as the research subjects were known, did well in school settings. The group earned 97 Ph.D.s, 55 M.D.s, and 92 law degrees. But as the decades went on, no transcendent geniuses emerged from the group. These brilliant young people grew up to have perfectly respectable jobs as doctors, lawyers, and professors, but there weren’t any transformational figures, no world changers or Nobel Prize winners. The whiz kids didn’t grow up to become whiz adults. As the science journalist Joel Shurkin, who has written a book on the Terman study, concluded, “Whatever it was the IQ test was measuring, it was not creativity.”

Similarly, in a 2019 paper, the Vanderbilt researchers looked at 677 people whose SAT scores at age 13 were in the top 1 percent. The researchers estimated that 12 percent of these adolescents had gone on to achieve “eminence” in their careers by age 50. That’s a significant percentage. But that means 88 percent did not achieve eminence.
(The researchers defined eminence as reaching the pinnacle of a field—becoming a full professor at a major research university, a CEO of a Fortune 500 company, a leader in biomedicine, a prestigious judge, an award-winning writer, and the like.) The bottom line is that if you give somebody a standardized test when they are 13 or 18, you will learn something important about them, but not necessarily whether they will flourish in life, nor necessarily whether they will contribute usefully to society’s greater good.

Intelligence is not the same as effectiveness. The cognitive psychologist Keith E. Stanovich coined the term dysrationalia in part to describe the phenomenon of smart people making dumb or irrational decisions. Being smart doesn’t mean that you’re willing to try on alternative viewpoints, or that you’re comfortable with uncertainty, or that you can recognize your own mistakes. It doesn’t mean you have insight into your own biases. In fact, one thing that high-IQ people might genuinely be better at than other people is convincing themselves that their own false views are true.

2. Success in school is not the same thing as success in life.

University administrators in the Conant mold assumed that people who could earn high grades would continue to excel later in their career. But school is not like the rest of life. Success in school is about jumping through the hoops that adults put in front of you; success in life can involve charting your own course. In school, a lot of success is individual: How do I stand out? In life, most success is team-based: How can we work together? Grades reveal who is persistent, self-disciplined, and compliant—but they don’t reveal much about emotional intelligence, relationship skills, passion, leadership ability, creativity, or courage. In short, the meritocratic system is built on a series of non sequiturs. We train and segregate people by ability in one setting, and then launch them into very different settings.
“The evidence is clear,” the University of Pennsylvania organizational psychologist Adam Grant has written. “Academic excellence is not a strong predictor of career excellence. Across industries, research shows that the correlation between grades and job performance is modest in the first year after college and trivial within a handful of years.” For that reason, Google and other companies no longer look at the grade point average of job applicants.

Students who got into higher-ranking colleges, which demand high secondary-school GPAs, are not substantially more effective after they graduate. In one study of 28,000 young students, those attending higher-ranking universities did only slightly better on consulting projects than those attending lower-ranked universities. Grant notes that this would mean, for instance, that a Yale student would have been only about 1.9 percent more proficient than a student from Cleveland State when measured by the quality of their work. The Yale student would also have been more likely to be a jerk: The researchers found that students from higher-ranking colleges and universities, while nominally more effective than other students, were more likely to pay “insufficient attention to interpersonal relationships,” and in some instances to be “less friendly,” “more prone to conflict,” and “less likely to identify with their team.”

Also, we have now, for better or worse, entered the Age of Artificial Intelligence. AI is already good at regurgitating information from a lecture. AI is already good at standardized tests. AI can already write papers that would get A’s at Harvard. If you’re hiring the students who are good at those things, you’re hiring people whose talents might soon be obsolete.

3. The game is rigged.

The meritocracy was supposed to sort people by innate ability. But what it really does is sort people according to how rich their parents are.
As the meritocracy has matured, affluent parents have invested massively in their children so they can win in the college-admissions arms race. The gap between what rich parents and even middle-class parents spend—let’s call it the wealth surplus—is huge. According to the Yale Law professor Daniel Markovits, the author of The Meritocracy Trap, if the typical family in the top 1 percent of earners were to take that surplus—all the excess money they spend, beyond what a middle-class family spends, on their child’s education in the form of private-school tuition, extracurricular activities, SAT-prep courses, private tutors, and so forth—and simply invest it in the markets, it would be worth $10 million or more as a conventional inheritance. But such is the perceived status value of a fancy college pedigree that rich families believe they’ll be better able to transmit elite standing to their kids by spending that money on education.

The children of the affluent have advantages every step of the way. A 3-year-old who grows up with parents making more than $100,000 a year is about twice as likely to attend preschool as a 3-year-old with parents who make less than $60,000. By eighth grade, children from affluent families are performing four grade levels higher than children from poor families, a gap that has widened by 40 to 50 percent in recent decades. According to College Board data from this year, by the time students apply to college, children from families making more than $118,000 a year score 171 points higher on their SATs than students from families making $72,000 to $90,000 a year, and 265 points higher than children from families making less than $56,000.
As Markovits has noted, the academic gap between the rich and the poor is larger than the academic gap between white and Black students in the final days of Jim Crow.

Conant tried to build a world in which colleges weren’t just for the children of the affluent. But today’s elite schools are mostly for the children of the affluent. In 1985, according to the writer William Deresiewicz, 46 percent of the students at the most selective 250 colleges came from the top quarter of the income distribution. By 2000, it was 55 percent. By 2006 (based on a slightly smaller sample), it was 67 percent. Research findings by the Harvard economist Raj Chetty and others put this even more starkly: In a 2017 paper, they reported that students from families in the top 1 percent of earners were 77 times more likely to attend an Ivy League–level school than students who came from families making $30,000 a year or less. Many elite schools draw more students from the top 1 percent of earners than from the bottom 60 percent.

In some ways, we’ve just reestablished the old hierarchy rooted in wealth and social status—only the new elites possess greater hubris, because they believe that their status has been won by hard work and talent rather than by birth. The sense that they “deserve” their success for having earned it can make them feel more entitled to the fruits of it, and less called to the spirit of noblesse oblige. Those early administrators dreamed that talent, as they defined it, would be randomly scattered across the population. But talent is rarely purely innate. Talent and even effort cannot, as the UCLA Law School professor Joseph Fishkin has observed, “be isolated from circumstances of birth.”

4. The meritocracy has created an American caste system.

After decades of cognitive segregation, a chasm divides the well educated from the less well educated.
The average high-school graduate will earn about $1 million less over their lifetime than the average four-year-college graduate. The average person without a four-year college degree lives about eight years less than the average four-year-college grad. Thirty-five percent of high-school graduates are obese, compared with 27 percent of four-year-college grads. High-school grads are much less likely to get married, and women with high-school degrees are about twice as likely to divorce within 10 years of marrying as women with college degrees. Nearly 60 percent of births to women with a high-school degree or less happen out of wedlock; that’s roughly five times higher than the rate for women with at least a bachelor’s degree. The opioid death rate for those with a high-school degree is about 10 times higher than for those with at least a bachelor’s degree.

The most significant gap may be social. According to an American Enterprise Institute study, nearly a quarter of people with a high-school degree or less say they have no close friends, whereas only 10 percent of those with college degrees or more say that. Those whose education doesn’t extend past high school spend less time in public spaces, less time in hobby groups and sports leagues. They’re less likely to host friends and family in their home.

The advantages of elite higher education compound over the generations. Affluent, well-educated parents marry each other and confer their advantages on their kids, who then go to fancy colleges and marry people like themselves. As in all caste societies, the segregation benefits the segregators. And as in all caste societies, the inequalities involve inequalities not just of wealth but of status and respect.

The whole meritocracy is a system of segregation. Segregate your family into a fancy school district.
If you’re a valedictorian in Ohio, don’t go to Ohio State; go to one of the coastal elite schools where all the smart rich kids are.

It should be noted that this segregation by education tends to overlap with and contribute to segregation by race, a problem that is only deepening after affirmative action’s demise. Black people constitute about 14 percent of the U.S. population but only 9 percent of Princeton’s current freshman class, according to the school’s self-reported numbers, and only 3 percent of Amherst’s and 4.7 percent of Tufts’s, according to federal reporting guidelines. (Princeton has declined to reveal what that number would be based on those federal guidelines.) In the year after the Supreme Court ended affirmative action, MIT says, the share of Black students in its freshman class dropped from 15 percent to 5 percent.

For the past 50 years or so, the cognitive elite has been withdrawing from engagement with the rest of American society. Since about 1974, as the Harvard sociologist Theda Skocpol has noted, college-educated Americans have been leaving organizations, such as the Elks Lodge and the Kiwanis Club, where they might rub shoulders with non-educated-class people, and instead have been joining groups, such as the Sierra Club and the ACLU, that are dominated by highly educated folks like themselves.

“We now have a single route into a single dominant cognitive class,” the journalist David Goodhart has written. And because members of the educated class dominate media and culture, they possess the power of consecration, the power to determine what gets admired and what gets ignored or disdained.
Goodhart notes further that over the past two decades, it’s been as though “an enormous social vacuum cleaner has sucked up status from manual occupations, even skilled ones,” and reallocated that status to white-collar jobs, even low-level ones, in “prosperous metropolitan centers and university towns.” This has had terrible social and political consequences.

5. The meritocracy has damaged the psyches of the American elite.

The meritocracy is a gigantic system of extrinsic rewards. Its gatekeepers—educators, corporate recruiters, and workplace supervisors—impose a series of assessments and hurdles upon the young. Students are trained to be good hurdle-clearers. We shower them with approval or disapproval depending on how they measure up on any given day. Childhood and adolescence are thus lived within an elaborate system of conditional love. Students learn to ride an emotional roller coaster—congratulating themselves for clearing a hurdle one day and demoralized by their failure the next. This leads to an existential fragility: If you don’t keep succeeding by somebody else’s metrics, your self-worth crumbles.

Some young people get overwhelmed by the pressure and simply drop out. Others learn to become shrewd players of the game, interested only in doing what’s necessary to get good grades. People raised in this sorting system tend to become risk-averse, consumed by the fear that a single failure will send them tumbling out of the race.

At the core of the game is the assumption that the essence of life fulfillment is career success. The system has become so instrumentalized—How can this help me succeed?—that deeper questions about meaning or purpose are off the table, questions like: How do I become a generous human being? How do I lead a life of meaning? How do I build good character?

6. The meritocracy has provoked a populist backlash that is tearing society apart.

Teachers behave differently toward students they regard as smart.
Years of research have shown that they smile and nod more at those kids, offer them more feedback, and allow them more time to ask questions. Students who have been treated as smart since elementary school may go off to private colleges that spend up to $350,000 per student per year. Meanwhile, many of the less gifted students, who quickly perceive that teachers don’t value them the same way, will end up at community colleges that may spend only $17,000 per pupil per year. By adulthood, the highly educated and the less educated work in different professions, live in different neighborhoods, and have different cultural and social values.

Many people who have lost the meritocratic race have developed contempt for the entire system, and for the people it elevates. This has reshaped national politics. Today, the most significant political divide is along educational lines: Less educated people vote Republican, and more educated people vote Democratic. In 1960, John F. Kennedy lost the white college-educated vote by two to one and rode to the White House on the backs of the working class. In 2020, Joe Biden lost the white working-class vote by two to one and rode to the White House on the backs of the college-educated.

Wherever the Information Age economy showers money and power onto educated urban elites, populist leaders have arisen to rally the less educated: not just Donald Trump in America but Marine Le Pen in France, Viktor Orbán in Hungary, Recep Tayyip Erdoğan in Turkey, Nicolás Maduro in Venezuela. These leaders understand that working-class people resent the know-it-all professional class, with their fancy degrees, more than they do billionaire real-estate magnates or rich entrepreneurs. Populist leaders worldwide traffic in crude exaggerations, gross generalizations, and bald-faced lies, all aimed at telling the educated class, in effect: Screw you and the epistemic regime you rode in on.
When income level is the most important division in a society, politics is a struggle over how to redistribute money. When a society is more divided by education, politics becomes a war over values and culture. In country after country, people differ by education level on immigration, gender issues, the role of religion in the public square, national sovereignty, diversity, and whether you can trust experts to recommend a vaccine. As working-class voters have shifted to the right, progressivism has become an entry badge to the elite. To cite just one example, a study of opinion pieces in The Harvard Crimson found that they became three and a half times more progressive from 2001 to 2023. By 2023, 65 percent of seniors at Harvard, the richest school in the world, identified as progressive or very progressive. James Conant and his colleagues dreamed of building a world with a lot of class-mixing and relative social comity; we ended up with a world of rigid caste lines and pervasive cultural and political war. Conant dreamed of a nation ruled by brilliant leaders. We ended up with President Trump. How to Replace the Current Meritocracy From time to time, someone, usually on the progressive left, will suggest that we dismantle the meritocracy altogether. Any sorting system, they argue, is inherently elitist and unjust. We should get rid of selective admissions. We should get rid of the system that divides elite from non-elite. All students should be treated equally and all schools should have equal resources. I appreciate that impulse. But the fact is that every human society throughout history has been hierarchical. (If anything, that’s been especially true for those societies, such as Soviet Russia and Maoist China, that professed to be free of class hierarchy.) 
What determines a society’s health is not the existence of an elite, but the effectiveness of the elite, and whether the relationship between the elites and everybody else is mutually respectful. And although the current system may overvalue IQ, we do still need to find and train the people best equipped to be nuclear physicists and medical researchers. If the American meritocracy fails to identify the greatest young geniuses and educate them at places such as Caltech and MIT, China—whose meritocracy has for thousands of years been using standardized tests to cull the brightest of the bright—could outpace us in chip manufacturing, artificial intelligence, and military technology, among other fields. And for all the American education system’s flaws, our elite universities are doing pioneering research, generating tremendous advances in fields such as biotech, launching bright students into the world, and driving much of the American economy. Our top universities remain the envy of the world. The challenge is not to end the meritocracy; it’s to humanize and improve it. A number of recent developments make this even more urgent—while perhaps also making the present moment politically ripe for broad reform. First, the Supreme Court’s ending of affirmative action constrained colleges’ ability to bring in students from less advantaged backgrounds. Under affirmative action, admissions officers had the freedom to shift some weight from a narrow evaluation of test scores to a broader assessment of other qualities—for instance, the sheer drive a kid had to possess in order to accomplish what they did against great odds. If colleges still want to compose racially diverse classes, and bring in kids from certain underrepresented backgrounds, they will have to find new ways to do that. 
Second, as noted, much of what the existing cognitive elite do can already be done as well as or better by AI—so shouldn’t colleges be thinking about how to find and train the kind of creative people we need not just to shape and constrain AI, but to do what AI (at least as of now) cannot? Third, the recent uproar over Gaza protests and anti-Semitism on campus has led to the defenestration of multiple Ivy League presidents, and caused a public-relations crisis, perhaps even lasting brand damage, at many elite universities. Some big donors are withholding funds. Republicans in Congress are seizing the opportunity to escalate their war on higher education. Now would be a good time for college faculty and administrators to revisit first principles in service of building a convincing case for the value that their institutions provide to America. Fourth, the ongoing birth dearth is causing many schools to struggle with enrollment shortfalls. This demographic decline will require some colleges not just to rebrand themselves, but to reinvent themselves in creative ways if they are to remain financially afloat. In a reformed meritocracy, perhaps colleges now struggling with declining enrollments might develop their own distinctive niches in the ecosystem, their own distinctive ways of defining and nurturing talent. This in turn could help give rise to an educational ecosystem in which colleges are not all arrayed within a single status hierarchy, with Harvard, Yale, and Princeton on top and everyone else below. If we could get to the point where being snobby about going to Stanford seems as ridiculous as being snobby about your great-grandmother’s membership in the Daughters of the American Revolution, this would transform not just college admissions but American childhood. The crucial first step is to change how we define merit. The history of the meritocracy is the history of different definitions of ability. 
But how do we come up with a definition of ability that is better and more capacious than the one Conant left us? We can start by noting the flaws at the core of his definition. He and his peers were working at a time when people were optimistic that the rational application of knowledge in areas such as statistics, economics, psychology, management theory, and engineering could solve social problems. They admired technicians who valued quantification, objectification, optimization, efficiency. They had great faith in raw brainpower and naturally adopted a rationalist view of humans: Reason is separate from emotions. Economists and political scientists of the era gravitated toward models that were based on the idea that you could view people as perfectly rational actors maximizing their utility, and accurately predict their behavior based on that. Social engineers with this mindset can seem impressively empirical. But over the course of the 20th century, the rationalist planning schemes—the public-housing projects in America’s cities, the central economic planning in the Soviet Union—consistently failed. And they failed for the same reason: The rationalists assumed that whatever can’t be counted and measured doesn’t matter. But it does. Rationalist schemes fail because life is too complex for their quantification methods. In Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed, James C. Scott, the late political scientist and anthropologist, describes a 19th-century German effort to improve the nation’s lumber industry. To make forests amenable to scientific quantification, planners had to redefine what forest meant. Trees became timber, and everything not a tree was designated as underbrush—useless stuff that got in the way when workers tried to efficiently harvest the timber. The German rationalists reorganized the forests, planting new trees in neat rows and clearing away all the underbrush. At first, everything seemed to go well. 
But as the Germans discovered too late, the trees needed the underbrush to thrive. Without the organic messiness that the rationalists had deemed superfluous, the trees’ nutrient cycle got out of whack. They began ailing. A new word entered the German language—Waldsterben, or “forest death.” By focusing on only those parts of the forest that seemed instrumental to their uses, the planners failed to see the forest accurately. In trying to standardize and control the growth process, the planners murdered the trees. The modern meritocracy misunderstands human beings the same way the German rationalists misunderstood trees. To make people legible to the sorting system, researchers draw a distinction between what they call “cognitive” and “noncognitive” skills. Cognitive skills are the “hard” ones that can be easily measured, such as IQ and scores on an algebra test. Noncognitive skills are fuzzier, harder-to-quantify things, such as emotional flexibility, grit, social agility, and moral qualities. But of course all mental actions are cognitive. What this categorization method reveals is how little the rationalists care about the abilities that lie beyond IQ. The modern meritocracy treats the noncognitive realm the way the German planners treated the underbrush; it discounts it. But the putatively “noncognitive” skills can be more important than cognitive ones. Having a fast mental processor upstairs is great, but other traits may do more to determine how much you are going to contribute to society: Do you try hard? Can you build relationships? Are you curious? Are you trustworthy? How do you perform under pressure? The meritocracy as currently constituted seems to want you to be self-centered and manipulative. We put students in competitive classrooms, where the guiding questions are “How am I measuring up?” and “Where am I on the curve?” The importance of noncognitive traits shows up everywhere. 
Chetty, the Harvard economist, wanted to understand the effect that good teachers have on their pupils. He and his colleagues discovered that what may most differentiate good teachers is not necessarily their ability to produce higher math and reading scores. Rather, what the good teachers seem to impart most effectively are “soft skills”—how to get along with others, how to stay on task. In fact, the researchers found that these soft skills, when measured in the fourth grade, are 2.4 times more important than math and reading scores in predicting a student’s future income. The organizational-leadership expert Mark Murphy discovered something similar when he studied why people get fired. In Hiring for Attitude, he reports that only 11 percent of the people who failed at their jobs—that is, were fired or got a bad performance review—did so because of insufficient technical competence. For the other 89 percent, the failures were due to social or moral traits that affected their job performance—sour temperament, uncoachability, low motivation, selfishness. They failed because they lacked the right noncognitive skills. Murphy’s study tracked 20,000 new hires and found that 46 percent of them failed within 18 months. Given how painful and expensive it is for an organization to replace people, this is a cataclysmic result. Why aren’t firms better at spotting the right people? Why do we have such a distorted and incomplete view of what constitutes human ability? The Humanist Turn In reconceiving the meritocracy, we need to take more account of these noncognitive traits. Our definition of ability shouldn’t be narrowly restricted to who can ace intelligence tests at age 18. We need to stop treating people as brains on a stick and pay more attention to what motivates people: What does this person care about, and how driven are they to get good at it? 
We shouldn’t just be looking for skillful teenage test-takers; we want people with enough intrinsic desire to learn and grow all the days of their life. Leslie Valiant, a computer-science professor at Harvard who has studied human cognition for years, has written that “notions like smartness and intelligence are almost like nonsense,” and that what matters more for civilizational progress is “educability,” the ability to learn from experience. If I were given the keys to the meritocracy, I’d redefine merit around four crucial qualities. Curiosity. Kids are born curious. One observational study that followed four children between the ages of 14 months and 5 years found that they made an average of 107 inquiries an hour. Little kids ask tons of questions. Then they go to school, and the meritocracy does its best to stamp out their curiosity. In research for her book The Hungry Mind, the psychologist Susan Engel found that in kindergarten, students expressed curiosity only 2.4 times every two hours of class time. By fifth grade, that was down to 0.48 times. What happened? Although teachers like the idea of curiosity, our current system doesn’t allow it to blossom. A typical school wants its students to score well on standardized tests, which in turn causes the school to encourage teachers to march through a certain volume of content in each class period. If a student asks a question because she is curious about something, she threatens to take the class off course. Teachers learn to squelch such questions so the class can stay on task. In short, our current meritocracy discourages inquiry in favor of simply shoveling content with the goal of improving test scores. And when children have lost their curiosity by age 11, Engel believes, they tend to remain incurious for the rest of their life. This matters. 
You can sometimes identify a bad leader by how few questions they ask; they think they already know everything they need to. In contrast, history’s great achievers tend to have an insatiable desire to learn. In his study of such accomplished creative figures, the psychologist Frank Barron found that abiding curiosity was essential to their success; their curiosity helped them stay flexible, innovative, and persistent. Our meritocratic system encourages people to focus narrowly on cognitive tasks, but curiosity demands play and unstructured free time. If you want to understand how curious someone is, look at how they spend their leisure time. In their book, Talent: How to Identify Energizers, Creatives, and Winners Around the World, the venture capitalist Daniel Gross and the economist Tyler Cowen argue that when hiring, you should look for the people who write on the side, or code on the side, just for fun. “If someone truly is creative and inspiring,” they write, “it will show up in how they allocate their spare time.” In job interviews, the authors advise hiring managers to ask, “What are the open tabs on your browser right now?” A sense of drive and mission. When the Austrian neurologist and psychiatrist Viktor Frankl was imprisoned in Nazi concentration camps, he noticed that the men who tended to survive the longest had usually made a commitment to something outside the camps—a spouse, a book project, a vision of a less evil society they hoped to create. Their sense that life had meaning, Frankl concluded, sustained them even in the most dehumanizing circumstances. A sense of meaning and commitment has value even in far less harrowing conditions. People with these qualities go to where the problems are. They’re willing to run through walls. Some such people are driven by moral emotions—indignation at injustice, compassion for the weak, admiration for an ideal. They have a strong need for a life of purpose, a sense that what they are doing really matters. 
As Frankl recognized, people whose lives have a transcendent meaning or a higher cause have a sense of purpose that drives them forward. You can recognize such people because they have an internal unity—the way, say, the social-justice crusader Bryan Stevenson’s whole life has a moral coherence to it. Other people are passionate about the pursuit of knowledge or creating beautiful tools that improve life: Think of Albert Einstein’s lifelong devotion to understanding the universe, or Steve Jobs’s obsession with merging beauty and function. I once asked a tech CEO how he hires people. He told me that after each interview, he asks himself, “Is this person a force of nature? Do they have spark, willpower, dedication?” A successful meritocracy will value people who see their lives as a sacred mission. Social intelligence. When Boris Groysberg, an organizational-behavior professor at Harvard Business School, looked at the careers of hundreds of investment analysts who had left one financial firm to work at another, he discovered something surprising: The “star equity analysts who switched employers paid a high price for jumping ship relative to comparable stars who stayed put,” he reports in Chasing Stars: The Myth of Talent and the Portability of Performance. “Overall, their job performance plunged sharply and continued to suffer for at least five years after moving to a new firm.” These results suggest that sometimes talent inheres in the team, not the individual. In an effective meritocracy, we’d want to find people who are fantastic team builders, who have excellent communication and bonding skills. Coaches sometimes talk about certain athletes as “glue guys,” players who have that ineffable ability to make a team greater than the sum of its parts. This phenomenon has obvious analogies outside sports. 
The Harvard economist David Deming has shown that across recent decades, the value of social skills—of being a workplace “glue guy”—has increased as a predictor of professional success, while the value of cognitive ability has modestly declined. Research has shown, however, that what makes certain teams special is not primarily the intelligence of their smartest members but rather how well their leaders listen, how frequently their members take turns talking, how well they adjust to one another’s moves, how they build reciprocity. If even one team member hogs airtime, that can impede the flow of interaction that teams need to be most effective. Based on cognitive skills alone, Franklin D. Roosevelt, probably the greatest president of the 20th century, would never get into Harvard today. As Oliver Wendell Holmes Jr. observed, he had only “a second-class intellect.” But that was paired, Holmes continued, with a “first-class temperament.” That temperament, not his IQ, gave Roosevelt the ability to rally a nation. Agility. In chaotic situations, raw brainpower can be less important than sensitivity of perception. The ancient Greeks had a word, metis, that means having a practiced eye, the ability to synthesize all the different aspects of a situation and discern the flow of events—a kind of agility that enables people to anticipate what will come next. Academic knowledge of the sort measured by the SATs doesn’t confer this ability; inert book learning doesn’t necessarily translate into forecasting how complex situations will play out. The University of Pennsylvania psychologist and political scientist Philip E. 
Tetlock has found that experts are generally terrible at making predictions about future events. In fact, he’s found that the more prominent the expert, the less accurate their predictions. Tetlock says this is because experts’ views are too locked in—they use their knowledge to support false viewpoints. People with agility, by contrast, can switch among mindsets and riff through alternative perspectives until they find the one that best applies to a given situation. Possessing agility helps you make good judgments in real time. The neuroscientist John Coates used to be a financial trader. During the bull-market surges that preceded big crashes, Coates noticed that the traders who went on to suffer huge losses had gotten overconfident in ways that were physically observable. They flexed their muscles and even walked differently, failing to understand the meaning of the testosterone they felt coursing through their bodies. Their “assessment of risk is replaced by judgments of certainty—they just know what is going to happen,” Coates writes in The Hour Between Dog and Wolf. The traders, in other words, got swept up in an emotional cascade that warped their judgment. The ones who succeeded in avoiding big losses were not the ones with higher IQs but the ones who were more sensitively attuned to their surging testosterone and racing hearts, and were able to understand the meaning of those sensations. Good traders, Coates observes, “do not just process information, they feel it.” The physicist and science writer Leonard Mlodinow puts the point more broadly. 
“While IQ scores may correlate to cognitive ability,” he writes in Emotional: How Feelings Shape Our Thinking, “control over and knowledge of one’s emotional state is what is most important for professional and personal success.” If we can orient our meritocracy around a definition of human ability that takes more account of traits like motivation, generosity, sensitivity, and passion, then our schools, families, and workplaces will readjust in fundamental ways. Rebuilding the Meritocracy When the education scholars Jal Mehta and Sarah Fine toured America’s best high schools for their book, In Search of Deeper Learning, they found that even at many of these top schools, most students spent the bulk of their day bored, disengaged, not learning; Mehta and Fine didn’t find much passionate engagement in classrooms. They did, however, find some in noncore electives and at the periphery of the schools—the debate team, the drama club, the a cappella groups, and other extracurriculars. During these activities, students were directing their own learning, teachers served as coaches, and progress was made in groups. The students had more agency, and felt a sense of purpose and community. As it happens, several types of schools are trying to make the entire school day look more like extracurriculars—where passion is aroused and teamwork is essential. Some of these schools are centered on “project-based learning,” in which students work together on real-world projects. The faculty-student relationships at such schools are more like the one between a master and an apprentice than that between a lecturer and a listener. To succeed, students must develop leadership skills and collaboration skills, as well as content knowledge. They learn to critique one another and exchange feedback. They teach one another, which is a powerful way to learn. Mehta and Fine profiled one high school in a network of 14 project-based charter schools serving more than 5,000 students. 
The students are drawn by lottery, representing all social groups. They do not sit in rows taking notes. Rather, grouped into teams of 50, they work together on complicated interdisciplinary projects. Teachers serve as coaches and guides. At the school Mehta and Fine reported on, students collaborated on projects such as designing exhibits for local museums and composing cookbooks with recipes using local ingredients. At another project-based-learning school, High Tech High in San Diego, which is featured in the documentary Most Likely to Succeed, one group of students built a giant wooden model with gears and gizmos to demonstrate how civilizations rise and fall; another group made a film about how diseases get transmitted through the bloodstream. In these project-based-learning programs, students have more autonomy. These schools allow students to blunder, to feel like they are lost and flailing—a feeling that is the predicate of creativity. Occasional failure is a feature of this approach; it cultivates resilience, persistence, and deeper understanding. Students also get to experience mastery, and the self-confidence that comes with tangible achievement. Most important, the students get an education in what it feels like to be fully engaged in a project with others. Their school days are not consumed with preparing for standardized tests or getting lectured at, so their curiosity is enlarged, not extinguished. Of course, effective project-based learning requires effective teachers, and as a country we need to invest much more in teacher training and professional development at the elementary- and secondary-school levels. But emerging evidence suggests that the kids enrolled in project-based-learning programs tend to do just as well as, if not better than, their peers on standardized tests, despite not spending all their time preparing for them. 
This alone ought to reassure parents—even, and perhaps especially, those imprisoned in the current elite college-competition mindset—and ought to make investing aggressively in project-based and other holistic learning approaches across American education politically feasible. Building a school system geared toward stimulating curiosity, passion, generosity, and sensitivity will require us to change the way we measure student progress and spot ability. Today we live in the world of the transcript—grades, test scores, awards. But a transcript doesn’t tell you if a student can lead a dialogue with others, or whether a kid is open-minded or closed-minded. Helpfully, some of these project-based-learning schools are pioneering a different way to assess kids. Students don’t graduate with only report cards and test scores; they leave with an electronic portfolio of their best work—their papers, speeches, projects—which they can bring to prospective colleges and employers to illustrate the kind of work they are capable of. At some schools, students take part in “portfolio defenses,” comparable to a grad student’s dissertation defense. The portfolio method enlarges our understanding of what assessment can look like. Roughly 400 high schools are now part of an organization called the Mastery Transcript Consortium, which uses an alternative assessment mechanism. Whereas a standard report card conveys how much a student knows relative to their classmates on a given date, the mastery transcript shows with much greater specificity how far the student has progressed toward mastering a given content area or skill set. Teachers can determine not only who’s doing well in math, but who’s developing proficiency in statistical reasoning or getting good at coming up with innovative experiment designs. The mastery report also includes broader life skills—who is good at building relationships, who is good at creative solutions. 
No single assessment can perfectly predict a person’s potential. The best we can do is combine assessment techniques: grades and portfolios, plus the various tests that scholars have come up with to measure noncognitive skills—the Grit Scale, the Moral Character Questionnaire, social-and-emotional-learning assessments, the High Potential Trait Indicator. All of these can be informative, but what’s important is that none of them is too high-stakes. We are using these assessments to try to understand a person, not to rank her. Data are good for measuring things, but for truly knowing people, stories are better. In an ideal world, high-school teachers, guidance counselors, and coaches would collaborate each year on, say, a five-page narrative about each student’s life. Some schools do this now, to great effect. College-admissions officers may not have time to carefully study a five-page narrative about each applicant, nor will every high-school teacher or college counselor have time to write one. But a set of tools and institutions is emerging that can help with this. In Australia, for example, some schools use something called the Big Picture Learning Credential, which evaluates the traits that students have developed in and out of the classroom—communication skills, goal setting, responsibility, self-awareness. Creating a network of independent assessment centers in this country that use such tools could help students find the college or training program best suited to their core interests. The centers could help college-admissions officers find the students who are right for their institution. They could help employers find the right job applicants. In short, they could help everybody in the meritocracy make more informed decisions. These assessment methods would inevitably be less “objective” than an SAT or ACT score, but that’s partly the point. Our current system is built around standardization. 
Its designers wanted to create a system in which all human beings could be placed on a single scale, neatly arrayed along a single bell curve. As the education scholar Todd Rose writes in The End of Average, this system is built upon “the paradoxical assumption that you could understand individuals by ignoring their individuality.” The whole system says to young people: You should be the same as everyone else, only better. The reality is that there is no single scale we can use to measure human potential, or the capacity for effective leadership. We need an assessment system that prizes the individual over the system, which is what a personal biography and portfolio would give us—at least in a fuller way than a transcript does. The gatekeepers of a more effective meritocracy would ask not just “Should we accept or reject this applicant?” and “Who are the stars?” but also “What is each person great at, and how can we get them into the appropriate role?” A new, broader definition of merit; wider adoption of project-based and similar types of learning; and more comprehensive kinds of assessments—even all of this together gets us only so far. To make the meritocracy better and fairer, we need to combine these measures with a national overhaul of what UCLA’s Joseph Fishkin calls the “opportunity structure,” the intersecting lattice of paths and hurdles that propel people toward one profession or way of life and away from others. Right now, America’s opportunity structure is unitary. To reach commanding heights, you have to get excellent grades in high school, score well on standardized tests, go to college, and, in most cases, get a graduate degree. Along the way, you must navigate the various channels and bottlenecks that steer and constrain you. Historically, when reformers have tried to make pathways to the elite more equal, they’ve taken the existing opportunity structure for granted, trying to give select individuals, or groups of individuals, a boost. 
This is what affirmative action did. Fishkin argues that we need to refashion the opportunity structure itself, to accommodate new channels and create what he calls opportunity pluralism. “The goal needs to be to give people access to a broader range of paths they can pursue,” Fishkin writes in Bottlenecks: A New Theory of Equal Opportunity, “so that each of us is then able to decide—in a more autonomous way and from a richer set of choices—what combinations of things we actually want to try to do with our lives.” With greater opportunity pluralism, the gatekeepers will have less power and the individuals striving within the structure will have more. If the meritocracy had more channels, society would no longer look like a pyramid, with a tiny, exclusive peak at the top; it would look like a mountain range, with many peaks. Status and recognition in such a society would be more broadly distributed, diminishing populist resentment and making cultural cohesion more likely. As a social ideal to guide our new meritocracy, we could do worse than opportunity pluralism. It aspires to generate not equal opportunity but maximum opportunity, a wide-enough array of pathways to suit every living soul. Achieving that ideal will require a multifaceted strategy, starting with the basic redefinition of merit itself. Some of the policy levers we might pull include reviving vocational education, making national service mandatory, creating social-capital programs, and developing a smarter industrial policy. Let’s consider vocational education first. From 1989 to 2016, every single American president took measures to reform education and prepare students for the postindustrial “jobs of the future.” This caused standardized testing to blossom further while vocational education, technical education, and shop class withered. As a result, we no longer have enough skilled workers to staff our factories. Schools should prepare people to build things, not just to think things. 
Second, yes, trotting out national service as a solution to this or that social ailment has become a cliché. But a true national-service program would yield substantial benefits. Raj Chetty and his colleagues have found that cross-class friendships—relationships between people from different economic strata—powerfully boost social mobility. Making national service a rite of passage after high school might also help shift how status gets allocated among various job categories.

Third, heretical though this may sound, we should aim to shrink the cultural significance of school in American society. By age 18, Americans have spent only 13 percent of their time in school. Piles of research across 60 years have suggested that neighborhoods, peers, and family background may have a greater influence on a person’s educational success than the quality of their school. Let’s invest more in local civic groups, so a greater number of kids can grow up in neighborhoods with community organizations where they can succeed at nonacademic endeavors—serving others, leading meetings, rallying neighbors for a cause.

Fourth, although sending manufacturing jobs overseas may have pleased the efficiency-loving market, if we want to live in an economy that rewards a diversity of skills, then we should support economic policies, such as the CHIPS and Science Act, that boost the industrial sector. This will help give people who can’t or don’t want to work in professional or other office jobs alternative pathways to achievement.

If we sort people only by superior intelligence, we’re sorting people by a quality few possess; we’re inevitably creating a stratified, elitist society. We want a society run by people who are smart, yes, but who are also wise, perceptive, curious, caring, resilient, and committed to the common good.
If we can figure out how to select for people’s motivation to grow and learn across their whole lifespan, then we are sorting people by a quality that is more democratically distributed, a quality that people can control and develop, and we will end up with a fairer and more mobile society.

In 1910, the U.S. ambassador to the Netherlands wrote a book in which he said: “The Spirit of America is best known in Europe by one of its qualities—energy.” What you assess is what you end up selecting for and producing. We should want to create a meritocracy that selects for energy and initiative as much as for brainpower.

After all, what’s really at the core of a person? Is your IQ the most important thing about you? No. I would submit that it’s your desires—what you are interested in, what you love. We want a meritocracy that will help each person identify, nurture, and pursue the ruling passion of their soul.
Author: David Brooks.