The Case Against Credentialism

The Atlantic Monthly

December, 1985

"The Case Against Credentialism"

By James Fallows

In 1961, David McClelland, a psychologist at Harvard, published The Achieving Society, an extravagantly ambitious attempt to discover why certain cultures "worked" better than others. Why, among West African tribes, were the Ashanti and the Ibo so economically dominant? Why was so much of the commerce of Southeast Asia run by expatriate Chinese, and so little by the Malays among whom they lived? Why had Jewish immigrants to the United States risen faster than southern Italians?

McClelland's answer involved a value he called n Achievement, which varied from culture to culture and gave members of different societies ways to view the working of fate. Some cultures taught that struggle was fruitless, since success or failure ultimately depended on destiny and the gods. Others conveyed to their children the view that every person could control or at least influence his outcome in life. Luck mattered, but a prudent man could make his own luck. The odds might be long but they were rarely insuperable. Indeed, in "achieving" societies people regularly underestimated the odds against them and launched ventures that on the facts seemed quite likely to fail. For indications about the n Achievement level of various cultures McClelland looked at nursery rhymes, children's stories, folktales, and other vehicles for the unconscious transmission of values.

When American culture was viewed through this lens, its folktales seemed to promote an astronomically high level of n Achievement. Benjamin Franklin, Abraham Lincoln, Ulysses S. Grant, Thomas Edison, Andrew and even Dale Carnegie—these and countless other self-made men proved that hard work might well be rewarded and sights could never be set too high. Starting with Poor Richard's Almanac and running through the Horatio Alger series and such inspirational business tracts as Acres of Diamonds and A Message to Garcia, the flourishing self-improvement industry reflected the American faith that each person held the keys to success in his own hands.

With their regression charts and biographical data, sociologists have demonstrated that the saga of the self-made success was partly myth. The American business titan of the late nineteenth century was more likely to have been born to a comfortable, educated, urban family than to have been a son of toil. Still, McClelland never claimed that the folktales he analyzed—about bad fairies and magic spiders and friendly giants—were literally true. What mattered was that they were told and heard, and that they shaped a culture's attitude. Repeated in schoolrooms and parlors, emphasized in speeches, novels, and popular magazines, the folktales of American business successes emboldened the impressionable public to try. When the sociologist Ely Chinoy studied a community of autoworkers in the 1950s, many people told him that they viewed their jobs in the factory as temporary. Their real dream was to strike out on their own, with a farm, a gas station, or a store.

McClelland's most important point was probably his initial one: that there is a deep connection between the ways we hope to advance as individuals and the economic resiliency the entire culture displays. The nation's bookshelves now groan with analyses of America's productivity problems and competitive woes. Might part of what they seek to explain lie in the changing folklore of success and the private concepts of ambition?

To judge by the recent celebration of entrepreneurs, the American business folklore would seem to be as robustly n Achievement-laden as ever. Not since the 1920s has there been so little cynicism and so much public piety about the person who takes a risk, goes out on his own, makes it all work. But once we move past the admiring profiles of software titans and biotechnology kings, the idea that the United States has given itself over to a resurgent entrepreneurial culture is hard to believe. In fact, we are seeing a war between two quite different cultures of achievement, with quite different implications for America's economic ability to adapt and pay its way.

One is the assortment of informal, outside-normal-channels, no-guarantee, and low-prestige activities that is glossed over and glamorized by the term entrepreneurialism. Most of the entrepreneurs who rise to public notice have, of course, already proved themselves successful. When we read the inspiring chronicles of Jack Kilby, a co-inventor of the silicon chip, or Fred Smith, who founded Federal Express, we know the early risks will eventually seem prudent and the early scoffers will have the joke turned on them. But the thousands of people who are trying to develop tomorrow's new industries have no such certainty: they can't be sure whether they're starting the next Xerox or the next Osborne Computers. Perhaps more important, the world seems to suspect the worst of entrepreneurs. The term inventor still conjures up a character with a garage full of gadgets; how much more dignified is the sound of banker, lawyer, or manager at IBM. No one brags to friends about children who have signed up for the learn-computer-repair schools advertised on matchbook covers, even though such self-help courses epitomize the n Achievement idea that individuals can improve their standing and control their own fate.

Parents brag, instead, about the son who has finished college or the daughter who has been accepted to law school. Even as modern America honors the successful entrepreneur, it reflects the tremendous pull exerted by the security, dignity, and order of the professionalized world. The basic tenet of this culture of achievement is that he who goes further in school will go further in life. American society is often described as a meritocracy, in the sense that those who show the most pluck and academic merit will prevail. The Houston housewife who labored in obscure solitude on her first novel, picked an agent's name out of a magazine, and then sold her book last summer for $350,000 is a figure from the first culture, that of self-help; if she uses the money to send her son to Andover, Yale, and Harvard Law, he will be a citizen of the second, the meritocracy.

The rise to professional status is one of the most familiar and cherished parts of the American achievement ideal. What immigrant saga would be complete without the peddler's grandson receiving his M.D.? But such an ideal is also at odds with most analyses of what the society as a whole needs if it is to continue to achieve. If everyone has the tenure and security that come with professional status, who will take the risks?

NOWHERE IS THE TENSION BETWEEN THE TWO CULTURES, the entrepreneurial and the professional, more evident at the moment than in American business. At just the time when American business is said to need the flexibility and the lack of hierarchy that an entrepreneurial climate can create, more and more businessmen seem to feel that their chances for personal success will be greatest if they become not entrepreneurs but professionals, with advanced educational degrees.

In the past twenty years enrollment in graduate business schools has increased by a factor of ten. Next spring 67,000 new M.B.A.s will take their degrees to the marketplace. Alert to the workings of supply and demand, some business-school officials have predicted a glut; already, newer, weaker schools have been retrenching, and some recent graduates have settled for less attractive jobs than they might once have hoped to get. Still, overall enrollment continues to rise, and graduates of the most prominent schools are heavily in demand. The business-school community closely studies each school's "return on investment" or "value added" ratio—how much an M.B.A. degree adds to a person's salary, compared with how much it costs to obtain. At Dartmouth's Amos Tuck School, the nation's oldest graduate business college, tuition this year is $11,000, and the average starting salary for graduates is around $43,000. "That four-to-one ratio has been constant for at least the fifteen or twenty years I've been aware of it," Colin Blaydon, Amos Tuck's dean, says. Harvard also reports a four-to-one ratio, down from the heady seven-to-one ratio of 1969, but not so far that Harvard has any trouble filling its admissions quotas.
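(A rough gloss on the arithmetic behind that figure, assuming the ratio the deans describe is simply the starting salary set against a year's tuition:

\[
\text{return on investment} \;\approx\; \frac{\text{average starting salary}}{\text{annual tuition}} \;=\; \frac{\$43{,}000}{\$11{,}000} \;\approx\; 3.9 \;\approx\; 4:1
\]

Harvard's reported four-to-one figure would follow from the same computation on its own numbers.)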

The rise of the M.B.A. has occurred during precisely the era in which, as anyone who follows business magazines is aware, the content of graduate business training has come under increasing attack. "We have created a monster," H. Edward Wrapp, of the University of Chicago's business school, wrote in 1980, in Dun's Review. "The business schools have done more to insure the success of the Japanese and West German invasion of America than any one thing I can think of." "I'd close every one of the graduate schools of business," Michael Thomas, an investment banker and author, wrote in The New York Times.

The specific case against business schools is that they have neglected certain skills and outlooks that are essential to America's commercial renaissance while inculcating values that can do harm. The traditional strength of business education has been to provide students with a broad view of many varied business functions—marketing, finance, production, and so forth. But like sociology and political science, business training has gotten all wrapped up in mathematical models and such ideas as can be boiled down to numbers. This shift has led schools to play down two fundamental but hard-to-quantify business imperatives: creating the conditions that will permit the design and production of high-quality goods, and waging the constant struggle to inspire, cajole, discipline, lead, and in general persuade employees to work in common cause.

IT IS BY NOW CONVENTIONAL WISDOM that an underemphasis on production and on leadership lay behind many of America's industrial difficulties in the 1970s. "Business is principally about design and building and selling, and basic management is not paying attention to those functions," says Thomas Peters, a co-author of the legendarily successful In Search of Excellence. "In business and in business schools we don't focus enough on how to lead people. Until you have managed people, you don't have any idea how complex it is. The difficulty now is that the youngster comes to business school without any experience or taste about managing people, so he or she can't ask challenging questions. So in the class you've got the standard business-school professor, who got a Ph.D. in statistics at age twenty and a half, talking to students who got 800s on their GMATs. It sure feels good to both parties, but it doesn't have much to do with business. I wouldn't let anybody in the place before age thirty."

In addition to what business school, because of its academic and theoretical emphasis, neglects, it is said to inculcate attitudes hostile to the flexibility and daring celebrated in today's entrepreneurial heroes. "The students I see are very concerned about resume value, and very risk-averse," says Roger Muller, who got his M.B.A. in 1973 and is now the director of placement at Amos Tuck. "They don't want to take any chances. Someone will come in wanting to be an investment banker. He decides, I've got to work for Goldman, Sachs this summer, because that will give me a better shot at the job with Goldman, Sachs when I get out of here. They want everything to line up just right and are easily frustrated when things don't."

Goldman, Sachs is not an idly chosen example: investment banking and consulting firms are the most popular outlet for M.B.A.s, especially from the top schools, and they pay the highest salaries. When the Harvard Business School surveyed its class of 1984, nearly 40 percent said they had taken jobs in consulting or investment banking. Consulting, with a median starting salary of $52,300, was the most lucrative field. Investment banking, at $45,000, was second highest. Manufacturing attracted only a quarter of the class, and its most common starting salary was under $40,000. The perversity of such a preference is that students are hoping to find security in the very pursuits that add such instability to the American financial structure. This fall Business Week featured a report on the "Casino Economy"—the tremendous increase in speculation, merger, corporate rearrangement, tax avoidance, and other forms of financial churning that make fortunes for investment bankers while ratcheting up the level of corporate debt. To such efforts are the best and brightest now drawn.

From the student's point of view, the continuing migration into business school and from there onward to consulting firms and banks is hardly mysterious. This is where the money is. But when we think about our culture and its parables of ambition, the rise of M.B.A.s and consultants raises a question like that posed by the prestige and prominence of the legal establishment. Why is so much raw talent creamed off for pursuits of such dubious economic value? Why are so many of our smartest people induced to spend their adult lives waging merger wars against one another and doing battle over the tax code? Even the factory workers who once dreamed of opening their own stores have, it seems, reset their sights. When Richard Sennett and Jonathan Cobb interviewed a group of working-class parents in the 1970s, the parents "did not speak about the good life for their children in terms of small business. It exists, most of them believe, in the professions, in medicine or college teaching or architecture...."

Why did the professions become so attractive, and independent business so unappealing? Why has there been such a surge in the most "professionalized" form of business, investment banking, and such a decline (despite the current romance of high tech) in designing, building, and selling America's goods? Through the years certain cultures have rewarded behavior that eventually proved ruinous to the society as a whole—the British upper class's desire to be free of the taint of commerce is the most famous example. Is a similar perverse process at work here?

One way to understand the professionalization of business is to step back from strictly commercial concerns and follow the course of an enormous change in American society over the past hundred years. The connection between education and occupation is now so firmly ingrained as to seem almost a fact of nature. To get a good job, you get a diploma: at one time a high school diploma sufficed, and then a B.A., but now you're better off with a J.D. or an M.B.A. When Richard Herrnstein, a Harvard psychologist, wrote a book called I.Q. in the Meritocracy, in 1973, parts of his argument were controversial but not his assertion that success in school was and should be a prerequisite to success in later working life. "The gradient of occupation is, then, a natural, essentially unconscious expression of the current social consensus," Herrnstein said. Society had to select and conserve its talent, and the best way to do that was through the schools.

Yet this familiar system, far from evolving "naturally" or "unconsciously," is the product of distinct cultural changes in American history. The process that left it in our landscape is less like the slow raising of a mountain range or the growth of oxbows on the Mississippi, and more like the construction of a dam. Three changes, which took place in the past hundred years, produced the system that is now producing M.B.A.s. They were the conversion of jobs into "professions," the scientific measurement of intelligence, and the use of government power to "channel" people toward certain occupations.

THE FIRST CHANGE WAS PROVOKED BY THE GENERAL social chaos of the late nineteenth century. In fond recollection this is the era of ice-cream socials and horse-and-buggy outings and white linen suits, but for those alive then, it seems to have featured one moment of terrifying uncertainty after another. Between the end of the Civil War and the beginning of the First World War the nation's population grew faster and migrated more frequently than ever before or since. Tens of millions of people poured through Ellis Island and into the New World; millions more left farms in Wisconsin and Tennessee to work in stockyards and steel mills in such brash new boomtowns as Chicago, Cleveland, and Detroit. Men and women who had grown up on farms or in small towns where everyone knew his neighbors, and where behavior was constrained by the knowledge that nothing could be kept secret for very long, now found themselves kowtowing to impersonal foremen and brushing shoulders with people who had only recently lived in Calabria or Minsk.

The social order and the traditional sources of security were repeatedly called into question. When the transcontinental railroad network was completed, the United States was for the first time something like a national market. Small-town merchants found they couldn't compete with the big chains operating out of Chicago and New York. With the growth of steamship lines and the cultivation of vast new tracts in Australia, Canada, and South America, farmers were exposed not just to a national but to a world-wide economy. A farm family in Kansas could till, sow, pray for rain, and harvest—only to find that a bumper crop in Argentina had destroyed the price for wheat. At the time of the Civil War more than half of the American work force could still be found on the farm. By the turn of the century only a third was still there. With the decline of the village and the farm, doors were closing on the man who wanted to work for himself and opening to those who were willing to sign on with Armour or Union Pacific or Standard Oil.

"An age never lent itself more readily to sweeping, uniform description: nationalization, industrialization, mechanization, urbanization," the historian Robert Wiebe wrote in his classic study of the era.

Yet to almost all of the people who created them, these themes meant only dislocation and bewilderment. America in the late nineteenth century was a society without a core... A feeling was suddenly acute across the land that local America stood at bay, besieged by giant forces abroad and beset by subversion at home.

Wiebe's book was called The Search for Order; it stressed the different ways in which different groups struggled to recover the social and economic security they had lost. The farmers joined ranks in the anti-foreign, anti-immigrant, anti-bank, and eventually anti-black protests of the Populist movement. Immigrant and other industrial workers fought for protection through labor unions. The traditional American aristocracy of Roosevelts and Cabots tried both to hold off the immigrants who were reaching for control of city politics and to erect barriers of snobbery and taste with which to separate themselves from the grasping plutocrats of the Gilded Age.

For the middling rank of dislocated merchants, craftsmen, and semi-professionals, the most promising route to security was to enhance the prestige of their occupations. Through the nineteenth century "anyone with a bag of pills and a bottle of syrup could pass for a doctor," as Wiebe put it; many doctors were socially ill-regarded beings, with earnings that fluctuated wildly and were chronically below those of businessmen. Lawyers, teachers, and engineers had similar problems. But a more complicated society had more demand for technical skills, and in the decades after the Civil War nearly every group now thought of as "professional," from lawyers to librarians to accountants to mechanical engineers, organized itself in an attempt to raise its standards and its status.

The economic advantages to be had from professional organization were most concisely explained by Mark Twain, who in Life on the Mississippi described the riverboat pilots' attempt to make themselves into a monopoly. At mid-century, when westward expansion caused the steamboat business to boom, the pilots' pay unaccountably began to fall. The reason, as the pilots soon deduced, was that any fool off the farm could sign on as an apprentice pilot, increasing competition and depressing the market. A few of the pilots formed a guild, or "association," asking an inflated wage. They slowly recruited members and agreed to exchange information about the river's constantly changing snags and sandbars only with other members of the guild.

"Now come the perfectly logical result, "Twain wrote, with admiration. "The outsiders began to ground steamboats, sink them, and get into all sorts of trouble, whereas accidents seemed to keep entirely away from the association men." Insurance companies began to plump for association pilots; the steamship owners agreed to one wage raise after another, passing on the diffeence (and then some) in freight. Since no one could become a pilot without the recommendation of two existing pilots, the association could regulate its own competition. The pilots prospered until the entire, now overpriced industry was destroyed by the railroads and "the association and the noble science of piloting were things of the dead andpathetic past!"

The difference between the pilots' association and the countless other guilds that sprang up and survived was that the pilots were tied to one specialized industry and could be completely displaced, unlike doctors, teachers, lawyers, and engineers. But the economic logic that lay behind the pilots' association shaped the other organizations as well. They controlled entry into their fields, they often raised professional standards, and they sheltered their members from the more chaotic side of the marketplace.

The newly organizing groups could call themselves professions, and not simply resurrected medieval guilds, because their members' mastery of a new body of knowledge gave them claims to a competence beyond the amateur's reach. Doctors could take advantage of the new breakthroughs in germ theory and anesthesia, engineers of refinements in industrial technology. "A strong profession requires a real technical skill that produces demonstrable results and can be taught," a sociologist named Randall Collins wrote in a history of educational credentials. "The skill must be difficult enough to require training and reliable enough to produce results. But it cannot be too reliable, for then outsiders can judge work by its results." Indeed, when historians try to explain why engineers have never become as prestigious and independent as doctors or lawyers, one of their answers is that the engineer's competence is too clearly on display. (When a patient dies, the doctor might not be to blame, but if a bridge falls down, the engineer is.)

As a means of transmitting the knowledge on which their authority was based--and reserving to themselves control over who would enter the field--the professions dramatically increased the educational requirements for new aspirants near the turn of the century. Before, practically anyone could declare himself a doctor or a teacher or a lawyer, and the choice about who prospered and who failed would be left to "the market," including people who died after trying to cure their cholera with snake oil. Afterward, those who wanted to enter the professions had to go to school, and once they had their credentials they enjoyed a near-tenured status they had previously been denied. Before the First World War not a single state required that its lawyers have attended law school, and fewer than a third of all North American medical schools required even a high school diploma for admission. By the Second World War professionals without advanced degrees were becoming an oddity.

Business managers began "professionalizing" about the same time that the other groups did, but their alliance with educational institutions developed more slowly. The new body of knowledge that turned business into a profession was created by the rise of huge, complex, integrated corporations. With the coming of railroads and telegraphs and nationwide trading firms, businessmen couldn't keep schedules or accounts in their heads any longer, as the small-town merchant had done. Resources had to be coordinated, inventory traced from place to place, new systems of accounting worked out. In his history of American management, The Visible Hand, Alfred Chandler, of the Harvard Business School, described how the rise of multi-unit corporations killed off the owner-managers of a simpler era and created a demand for salaried, "scientific" management. Soon after the turn of the century professional management societies and scientific-management journals sprouted up everywhere. The early generation of professionally trained managers was mainly from engineering schools like MIT. "You needed an engineering background to know what was going on inside the factories," Chandler told me. "But when the merger movement began and you needed skills for more than just production, you had the first wave of business schools. At that point, they were indeed meeting a need."

By 1910 graduate business schools had been founded at Dartmouth and Harvard, as had undergraduate schools of business at New York University and the universities of Chicago, California, and Pennsylvania (the Wharton School). Still, until the eve of the Second World War specialized training in business was the exception. According to a national survey conducted in 1937-1938, only about half of all employers required that prospective managers have even a high school diploma, and only one eighth required a college degree. Thirty years later a regional study found that nearly half of all managerial jobs formally required either a B.A. or a graduate degree.

The first cultural change, then, was the evolution of distinct professions, requiring proof of academic training from those who hoped to join. In part the rise of credentialed professions reflected the greater precision of scientific knowledge and the greater complexity of modern business operations, but it also arose from a social choice. When it came to determining professional status, the trial and error of the marketplace would not suffice. Objective standards must be found. Shortly after the Civil War, Charles William Eliot, newly installed as the president of Harvard, had complained in his inaugural address that "as a people we have but a halting faith in special training for high professional employments." There was "national danger" in "the vulgar conceit that a Yankee can turn his hand to anything," which "we insensibly carry into high places, where it is preposterous and criminal. We are accustomed to seeing men leap from farm or shop to courtroom or pulpit, and we half believe that common men can safely use the seven-league boots of genius." The new ethic of self-regulating professions was the answer to this vulgar Yankee conceit.

Because meeting "objective" standards so often meant getting an academic degree, professional competence soon was measured by "input," not "output." That isM anyone who brought the right educational credentials and could pass the entry test was certified and from that point on was shielded from further formal tests of competence. Once a professional, always a professional, barring felony conviction or grotesque error. As part of the movement for professionalization, the U.S. Civil Service was converted from a high-turnover political-spoils system to a "merit" system, based on objective entry tests. In the old days practically anyone could be hired for a government job, but no one could count on staying very long. After the civil Service was reformed, only those who met the standards could sign on--and once hired, they couldpractically never be dislodged. The corruption of the spoils system symbolized the social chaos that the professional guilds hoped to combat, not only in the government but also in business and theprofessions. The rigidity of the modern Civil Service illustrates how far the idea of professional tenure has gone. In five years in office Ronald Reagan has managed to replace fewer federal employees than Abraham Lincoln did in four, and in Lincoln's day the government was one seventieth its current size.

THE SECOND HISTORIC STEP TOWARD A MERITOCRACY occurred at about the same time as the wave of professionalization. It was the invention of IQ tests and the dawning of the idea that "intelligence" was a single, real, measurable, and unchanging trait that severely limited each person's occupational choice.

To the creator of the first intelligence test, the French psychologist Alfred Binet, IQ meant something very different from what it has come to imply. As has often been told, Binet was commissioned by the French Ministry of Instruction to develop a test to identify children in need of remedial schooling. He came up with a list of simple tasks that would illustrate the child's "mental age" -- a normal three-year-old should be able to point to his nose, eyes, and mouth, a normal ten-year-old should be able to make a sentence with the words Paris, fortune, and gutter, and so forth. The ratio between mental age and chronological age, of course, yielded the "intelligence quotient," or IQ, with 100 defined as normal.
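In modern notation the quotient amounts to the following (a gloss on the ratio just described, using the later convention of multiplying by 100; it is not Binet's own notation):

\[
\mathrm{IQ} \;=\; 100 \times \frac{\text{mental age}}{\text{chronological age}}
\]

By this arithmetic a ten-year-old who managed the tasks of a typical twelve-year-old would score \(100 \times 12/10 = 120\), and a child performing exactly at his age level would score the "normal" 100.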

Binet never viewed "normal" children as appropriate subjects for his test, which, like the white-blood-cell count, was designed to indicate the presence of disease, not to rank degrees of health. He went out of his way to denounce the idea that IQ could be thought of as a fixed, innate value. As he saw it, an IQ test was, to use another analogy, something like a physical-fitness exam given before a conditioning program, which would indicate areas of weakness and serve as a benchmark for future progress. He prescribed a course of "mental therapeutics" to build mental strength and raise IQ. He began his chapter "The Training of Intelligence" by saying, "After the illness, the remedy." As a young student, Binet himself had been told he would never have a truly philosophical spirit.

Never! What a momentous word. Some recent thinkers seem to have given their moral support to these deplorable verdicts by affirming that an individual's intelligence is a fixed quantity, a quantity that cannot be increased; we must protest and react against this brutal pessimism; we must try to demonstrate that it is founded upon nothing.

Something happened to Binet's concept of IQ when it was translated into English. In both England and the United States, IQ was seized upon as a way of quantifying the long-suspected mental differences among individuals and races. In response to the seemingly unstoppable flow of immigrants, American theorists had developed elaborate schema of the mental standing of different ethnic groups -- "Nordics" highest, Eastern Europeans and blacks lowest--but for proof they had had to get by with comparative cranium measurements and photographs of deviant physiognomies. The IQ tests gave the new science of psychometrics--mental measurement--the kind of objective, hard data it had so sorely lacked.

By the beginning of the First World War psychometrics had come so far that millions of American recruits were screened for IQ with the famous Army Alpha and Army Beta tests. When the results were correlated with the recruits' social and ethnic backgrounds, they confirmed what everyone had suspected: the immigrants and blacks were overwhelmingly subnormal, with the most recent arrivals proving to be the most defective. The only unforeseen and unsettling wrinkle was that most people were subnormal: the average mental age for white draftees was thirteen. The chief administrator of the tests, Robert Yerkes, noted that if the results were taken seriously, 47 percent of white draftees must be classified as morons. He concluded, "Thus it appears that feeble-mindedness . . . is of much greater frequency of occurrence than had been originally supposed."

The 900-page analysis of the Army exams was made public in 1921. Ever since then arguments about intelligence tests have centered on whether the tests are "fair." If the IQ test and all its progeny, from the Army Alpha to the Scholastic Aptitude Test, really did seek out raw talent "fairly," regardless of social setting, why have all of them, from the beginning, shown that the people with the best jobs, the most money, and the best bloodlines also have the highest IQs? Had American (and English) society become so perfectly meritocratic by the early 1900s that the smartest people had already reached the occupational summit, despite nativist passions, Jim Crow laws, and the brutalized condition of the urban working class?

But beneath the drawn-out arguments about fair and unfair measures of IQ a more powerful concept has often lain unchallenged. Everyone seems to agree that if only we could find a way to determine IQ "objectively," we would be more than halfway to determining where people should end up in life. Even the harshest critics of the tests don't question the current structure of the professions. Their concern is giving everyone a "fair" shot at an M.B.A.

Forging a link between intelligence and occupation was explicitly the goal of the early psychometricians, even though it was not a goal of Alfred Binet's. Lewis Terman, one of the movement's leaders, wrote in 1923 that

preliminary investigations indicate that an IQ below 70 rarely permits anything better than unskilled labor; that the range from 70 to 80 is preeminently that of semi-skilled labor, from 80 to 100 that of the skilled or ordinary clerical labor, from 100 to 110 or 115 that of the semi-professional pursuits; and that above all these are the grades of intelligence which permit one to enter the professions or the larger fields of business. Intelligence tests can tell us to which group a child's native brightness corresponds...

The most important word here is permits. If the first major social change, the rise of professions based on advanced educational degrees, dramatically increased the importance of higher education, the second change implied that only a few people would be recognized as having the raw intelligence to handle long years in school and the careers that would follow. The results of this perception were spelled out by Richard Herrnstein, in his book on the meritocracy. "The ties among I.Q., occupation, and social standing make practical sense," he wrote. "If virtually anyone is smart enough to be a ditch digger, and only half the people are smart enough to be engineers, then society is, in effect, husbanding its intellectual resources by holding engineers in greater esteem, and on the average, paying them more."

Surely some people are more talented than others, and some are not fit to be doctors or artists or musicians. Still, there are reasons to be skeptical of the idea that IQ is usually the limit on occupational ascent. For example, one of sociology's longest-running and most thorough surveys, known as the "Kalamazoo Brothers" study, followed thousands of boys from their childhood in Kalamazoo well into adulthood. A recent analysis of its results revealed that of the men who ended up as professionals, 10 percent had as children been considered "high-grade morons." (That is, their IQs were 85 or below, placing them in the bottom sixth of the population. During the first half century of intelligence testing, people with scores below 85 were known, in descending order of intelligence, as morons, imbeciles, and idiots. Now scores below 70 are associated with degrees of retardation, from "mild" to "profound.") Michael Olneck and James Crouse, who analyzed the Kalamazoo data, found that a third of all the professionals and 42 percent of the managers had childhood IQs below 100, which is by definition subnormal. As a group the managers had above-average IQs, but a large number of individual managers did not. According to pure meritocratic theory, Olneck and Crouse observed, the greatest diversity of IQ scores should be found at the bottom of the occupational pyramid (since some people have the brains but not the gumption or the opportunity to move) and the least diversity at the top (where everyone would have to be smart to make the grade). When Richard Herrnstein compared the IQ scores of Second World War recruits with their occupations before induction, he discovered just such a pattern. But Herrnstein's subjects were young, starting out in their careers; the Kalamazoo study, which traced its subjects until much later in life, found that the IQ-and-occupation pattern was in fact the reverse. The greatest diversity of IQ scores was found not among unskilled laborers but among professionals. "It appears that the capacity to succeed in professional and managerial jobs is rather widespread, and is not confined to men who score well on tests," Olneck and Crouse concluded.

Another illustration that people can often do more than their IQ "limits" suggest: After the Second World War the GI Bill financed a college education for 2.3 million men, including half a million whose backgrounds suggested that they were not "college material" and who said they would not otherwise have gone to school. James B. Conant, the president of Harvard, called the bill "distressing," because it did not "distinguish between those who can profit most by advanced education and those who cannot." In the same spirit Robert Hutchins, of the University of Chicago, warned that when the GIs came home, "colleges and universities will find themselves converted into educational hobo jungles." In other words, this was a scheme to push people beyond what their intelligence would permit. Of course, when the returning GIs enrolled, they confounded all predictions and proved to be famously mature and successful in class. Researchers found that those who would not have gone to college without help from the GI Bill did slightly better in course work than other equally able veterans.

If the linkage between jobs and IQ were as strong and automatic as the meritocratic theories proposed, how could the Kalamazoo morons have succeeded in business and the professions? How could the population of Europe have switched from an overwhelmingly agricultural to an industrial society, with its more demanding skill level, within three or four generations? Where could the United States have found the extra talent to manage an even more rapid transition--the proportion of professional and managerial jobs has quadrupled just since 1900--at precisely the time that its gene pool was "deteriorating" because of dysgenic flows from overseas? Obviously, during the agricultural era the limit on human performance was not the stockpile of native intelligence but the primitive level of technology and social organization. Through most of history most people have been capable of far more than economic organization has permitted them to do. It would be remarkable indeed if in the 1980s we had reached the precise point of equilibrium at which the supply of human talent exactly matched the high-skill jobs that exist to be done.

Nonetheless, the lasting effect of this second social change was the belief that an individual's IQ placed firm limits on how extensively he could be educated--and, because of the emerging link between education and work, on the jobs to which he could aspire. Since a person's intellectual ability was generally fixed, predictions about his specific limits could be made early in life, as soon as he reached school. The third change began as a logical consequence of the first two: the conversion of the schools into a "channeling" mechanism.

UNTIL THE EARLY TWENTIETH CENTURY "REFORMING" America's schools meant persuading more people to attend. Through the mid-nineteenth century compulsory-school-attendance laws were all but unknown, and only about two percent of the high-school-age population was enrolled in high school. By the turn of the century more than half the states had passed school-attendance laws, and the long nineteenth-century crusade for publicly financed "common schools" serving the general public had been victorious. But the very success of this crusade created new complications. What was to be done about the "plain people" who were being given the once-rare gift of a diploma but would find that it took them no further than to factory or field?

The resolution of this conflict involved the creation of different classroom "tracks" and vocational, as well as academic, schools. But, like IQ testing, manpower channeling took sudden leaps, because of the demands of war. During the First World War, which the United States entered late, mass mobilization did more for the psychometricians than they did for the war effort, since it gave them their first opportunity to collect data on a grand scale. Twenty-five years later, as the United States girded up for total war, its strategic planners knew they had to use human resources as efficiently as rubber or tin. Their principal tool for deploying manpower was the power to draft or defer, and for thirty years, from 1940 to 1970, the Selective Service system played a crucial role in, and offered a window on, the evolution of the meritocracy.

General Lewis B. Hershey, whose military career had begun not at West Point but with a National Guard unit in Angola, Indiana, had as the first Selective Service director vigorously advanced a "no deferments" policy during the Second World War. He was especially hostile to student deferments, arguing that they would turn into a collaboration between colleges (which wanted to keep their enrollments up) and privileged students (who preferred to stay away from the front lines). But in such sentiments Hershey soon proved to be on the wrong side of social history. In the Cold War era the prevailing view was that the United States could not afford to misallocate its intelligence and talent if it hoped to prevail against the Soviet Union.

In 1948 an advisory group assembled by Hershey recommended the creation of a new draft classification, covering any young man "whose educational aptitude suggests he is of potential special value." Men could qualify for the deferment on the basis of their grades in school and their score on what was essentially an IQ test. The plan represented everything that Hershey detested, but he accepted it, apparently out of his bureaucratic desire to keep the Selective Service system alive. He contracted with the Educational Testing Service to write the test, and when he began calling men for service in Korea in 1951, anyone who scored above 70 (out of a possible 100) on the test could remain in college and be sheltered from the draft. Eventually the IQ-test deferment evolved into the 2-S deferment that proved so catastrophically divisive during the Vietnam War.

In a way, the IQ deferment plan was merely symbolic. The number of deferments for married men and fathers, members of ROTC, and those classified 4-F vastly exceeded deferments granted through the IQ test. Still, as symbolism it was potent indeed. By the middle of the twentieth century differences in legal standing based on wealth and skin color were on their way out. The time was long past when a slave was legally three fifths of a man or only property owners could vote. Such distinctions had come to seem unacceptable--but not the idea that the state would scientifically seek out its most intelligent people and grant them extra rights.

This third change, then, instituted the idea that the state, through its school system and its ability to compel military service, would put the science of mental measurement to work, by helping to steer people toward their proper level of education and most appropriate jobs. By the 1950s the evolution of manpower channeling had, along with the two other changes, given us the modern meritocracy.

WITH ITS EMPHASIS ON THE EARLY DETECTION OF intelligence and on extended education as the route to professional success, the meritocratic order has produced such familiar symptoms as the frenzied competition for places in private nursery schools (presumably to improve the odds of admission to Harvard Law School twenty years hence), the bleak prospects that laid-off and uncredentialed industrial workers face when the mills close down, and even the proliferation of consultants and M.B.A.s. But has it in any fundamental way affected America's prospects as a functioning economy or a cohesive democratic state?

If anything in David McClelland's model makes sense--if an earlier national folklore of wide-open opportunities persuaded Americans to take risks that sheer logic might have ruled out--then the rise of the meritocracy has to have had an impact. As the definition of success has been altered to give more encouragement to the professional and less to the rough-and-ready entrepreneur, the achievement motive has also changed. If talent is unchangeable and genetic endowment so precisely limits what each person can do, then why fight the inevitable? The logical response to a low IQ score would be resignation to fate.

In measurable economic terms the rate of social mobility in the United States has changed very little in at least a hundred years: people still rise out of poverty and fall from affluence about as frequently as in the days when no one had heard of IQ or tracking or M.B.A.s. American society is more open than most others, but it still requires the wisely chosen birth. Researchers who dug through estate records in Cleveland in the 1960s, for example, found that if a man was born into the wealthiest five percent of families, the odds were two out of three that his own adult annual earnings would exceed $47,000 (in 1985 dollars). If he was born into the poorest 10 percent, the odds were one in a hundred. As far as economic historians can determine, at most points in American history actual mobility rates have been about the same as they are now.

What has changed with the coming of the meritocracy is the air of scientific inevitability that surrounds the results. If only one man in a hundred makes it out of the lowest rank, is it because the other ninety-nine just aren't smart enough? Even while angrily denying that a college degree is necessarily a sign of intelligence, or that executives and members of the clean-hands class deserve their privileges, many working people harbor the fear that they really aren't good enough to make it anymore. If the "famous self-confidence" of the businessman, as David McClelland put it, made a tangible difference in the growth of American industry, might not this induced self-doubt do equivalent harm?

"I used to go past John Hopkins all the time, practically everyday." Robert Ward told me earlier this year. Ward is a gruff, wisecracking novelist in his early forties who had recently published Red Baker, a book about the travails of a laid-off steelworker. Ward himself grew up in a working-class Baltimore neighborhood similar to the one he described in the novel.

"I went past there probably a thousand times, and it just never occurred to me that somebody like me could go there. It wasn't like, Gee, I wish I could go there and isn't it too bad I can't. It never entered my mind! I wasn't ever bitter about it, because I just understood deep down in my soul that of course I'd never go to a place like that." In the end World applied at the last minute to Towson State, "only because my mom asked at the end of the summer what I'd think about going to college." He moved on to teaching English at a variety of private schools, wrote his novels, and this year became a story editor in Los Angeles for Hill Street Blues.

"When I'd seen a little more of the world, I started thinking. Hey, I could've gone there! I'm as smart as these people! But it wasn't till years later that I saw how you're tracked unless somebody happens to push you in a different direction. One of my teachers used to tell me, 'You're smart, and the only person we've got to convince of that is you."

"You're taught never to be certain about what you know," Peggy Miller had told me in Baltimore several years earlier. Miller was a slight woman in her early thirties, with dark hair, round dark-rimmed glasses, and a garve air. She had grown up in a working-class area of Pennsylvania, ahd earned a doctorate in psychology, and was studying certain aspects of how parents raised children in the neighborhoods that surround Baltimore's steel mills.

"Myself, I feel compelled to be a hundred percent sure of something before I'll say that it's so, when many other people say it's the case if they're fifty-one percent convinced. One of the reasons, of course, is that a standard of success in the professional world is a kind of glibness and self-confidence. When you ask a worker about something he actually knows in detail, what he'll say is, 'I know a little about that' or 'I have a little bit of experience with it.'

Their own life stories might seem to contradict what Ward and Miller say--after all, each of them has risen in the world. But they offer testimony about an attitude to which most of their friends succumbed. Surrounded by indications that they just weren't good enough to earn a berth in the college-degree world, many were persuaded not even to try. There is a more powerful illustration of this destruction of human capital: the behavior of lower-class black teenagers, especially boys, who inspire in most of their fellow citizens a mixture of fear, despair, and a sense that they deserve to fail.

"When you watch these young men playing sports, you know the enthusiasm, the creativity, the competition, and the standards are all there," Irving Hamer told me one afternoon last year. Hamer is the headmaster of the Park Heights Street Academy, in Baltimore, a private school designed to give a second chance to students who seem bright but have run into trouble in the public schools. The Street Academy is located in a cleaned-up row house in the Baltimore ghetto. Hamer, who was raised in central Harlem by his mother, is a tall black man in his late thirties with broad shoulders and a slender and a slender waist.

"Sports is different, because it's the one place where adolescent black males believe there is an outlet for themselves. The determination and energy they show there doesn't translate itself into other areas, because they think they're unavailable. Apart from sports, there is nothing that brings them the message that an upbeat appraoch can pay off. The subtle message that leaps from their experience and reinforces a sense of self-hate is that they shouldn't even try. How do you get a handle on a scial pathology that makes people hate themselves?" Hamer ran own a list of his graduates and what had become of them in the few years since Park Heights had opened. About half have gone on to the nearby community colleges, and many have have joined the Army. "The military has become a convenient way out for a lot of them, and it kills me. The military simply doesn't demand the performance of level of achievement they should be capable of. And those CETA programs--what terrible, unkind assumptions they make about young people, that they can only make it if all standards are lowered for them. Those kids figured out fairly early that little was expected of them."

When I talked with Park Heights students and asked why they had quit or floundered in public school, I nearly always got the same reply. The teachers were robots; nobody cared about anything except the paychecks; it was a waste of time even to show up. With all proper allowances for teenagers' vast capacity to deflect responsibility away from themselves, by the twentieth time I heard such an account I was convinced--convinced not simply that the urban public schools, deserted by the middle class, have deteriorated, but also that when people are told they will fail, most of them do. Is it merely a coincidence that so many immigrants, whose potential has not been ascertained, rise as if they do not know where they are supposed to stop? At the time of my visit to Park Heights, in the spring of 1984, Jesse Jackson's campaign for the presidency was beginning to gather steam. One wall of Hamer's office was dominated by a super-life-size portrait, in which Jackson stood resplendent in a business suit. His dimensions and his beatific smile made him look like a happy god. "Why do you think he's getting all the black votes?" Hamer said. "He sends a message that you can succeed."

"People are always saying, 'Why don't these local blacks try harder, when so many of the black-skinned immigrants do so well?" Juan Williams said early this year. Williams, a young reporter for The Washington Post, is himself a black-skinned immigrant, born in Panama and brought by his mother to Bedford-Stuyvesant when he was two. "When people do well, it's because their parents gave them the feeling that great things were expected of them and were within their grasp. My older sister went off to this fancy college and came home all fine and uppity. You start thinking, I want that too. What mattered as having practical models of what you could achieve."

By persuading people on the bottom of the heap that they probably can't succeed, then, the educational meritocracy destroys talent on which we might otherwise draw. By teaching people that they are stuck where they deserve to be, it promotes the resentment that is so destructive to economic and democratic life. Within the past decade, as American businesses have looked with anxiety at Japan and with envious curiosity at successful domestic firms, the conventional business wisdom has emphasized the danger of creating a rigid class structure within a firm. From the Delta executives who handle baggage at Christmastime to the GM Saturn workers whose pay will depend on the plant's profitability, the anecdotes on which the new folk wisdom is based have had a Frank Capra-like democratic theme. Everyone has to feel important, has to think that his efforts are needed and will be rewarded. These days the "us-against-them" mentality of recalcitrant unions and thickheaded managers is widely denounced, but the caste system created by educational credentials has a similarly divisive effect.

For much of my adult life I have lived among those who have "had it good" on the meritocracy's terms. Because of their intellectual promise, they were better educated than most others, and given longer to explore options and make choices. What I find striking about this class is how few of its members are involved in the sort of creative economic efforts that nearly everyone now professes to admire. From college and graduate school I know lawyers, consultants, and analysts aplenty, but few people who have started their own businesses or created jobs for anyone besides themselves. There are exceptions, but most of the real entrepreneurs I know lack the track record of impeccable schooling and early academic success that is supposed to distinguish the meritocracy's most productive members. What kind of merit system is this, if it discounts the activity on which the collective wealth depends?

A few years ago it was fashionable to blame the distaste for enterprise on the anti-business attitudes of an over-educated "new class." I wonder whether such an explanation is necessary or sensible--especially since the behavior persists even while the well educated have become the main cheerleaders for America's entrepreneurs. Isn't there a more obvious reason, based on calculations of risk and reward? Despite all the pious encomiums that risk-takers now receive, few people seek risk when they can rely on a sure thing. To a degree only dreamed of by Mark Twain's river pilots, the professions now represent America's surest thing. Not many professionals become truly rich, but neither do many doctors, lawyers, consultants, and (today's business students hope) M.B.A.s fall out of the upper tier of income and status. An entrepreneurial society is like a game of draw poker; you take a lot of chances, because you're rarely dealt a pat hand and you never know exactly what you have to beat. A professionalized society is more like blackjack, and getting a degree is like being dealt nineteen. You could try for more, but why?

Thus, in addition to depressing the "unmeritorious," a meritocracy can corrupt its professionals, making them care more about keeping what they have than creating something new. For at least thirty years after the Depression families refused to borrow, socked away their extra dollars, dared not give up tedious but secure jobs, lived in dread that bad times might return. Such caution was based on a fear of ruination; the lack of entrepreneurial daring in today's professional class seems to come instead from a sense of entitlement. Nearly everyone admitted to a professional school graduates; most of those accredited live well. If an "achieving" society requires a balance between confidence and anxiety, can it afford a swelling class whose chief ambition is one day to "make partner"?

"ALL OF OUR WORK HAS GIVEN ME A VERY STRONG view," Richard Boyatzis told me one afternoon. The consulting firm Boyatzis heads, McBer and Company, was founded by David McClelland in 1963. Its specialty has been analyzing what people actually do in business jobs--not what their job descriptions say, but how they spend their time and which skills seem most important to their success. "I've come to see that whenever a group institutes a credentialing process, whether by licensing or insisting on advanced degrees, the espoused rhetoric is to enforce the standards of professionalism. This is true whether it's among accountants or plumbers or physicians. But the observed consequences always seem to be these two: the exclusion of certain groups, whether by intention or not, and the establishment of mediocre performance standards."

Mediocre performance is a grave charge, since the principal justification for a meritocracy is that it sends the right talent to the right jobs. The baleful consequences for working-class morale, the professionals' quest for tenure--these might seem to be the costs we inevitably pay for competence. But the implication of work done at McBer, along with other studies, is that the academic-credentialing system that has evolved over the past century is deficient by its own most basic standard, that of guaranteeing high performance. At every step of the way what is rewarded is excellence in school, which is related to excellence on the job only indirectly and sometimes not at all.
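How much performance does credential-based selection actually buy? A back-of-the-envelope simulation makes the point. The model below is entirely hypothetical--the weights are my invention, not McBer's findings--but it captures the structure of the claim: school performance and job performance share some underlying aptitude, and not much else.

    # A toy model of selection by credential. The numbers are invented:
    # school scores track aptitude closely, job performance only loosely.
    import random

    random.seed(1985)

    N = 10_000
    people = []
    for _ in range(N):
        aptitude = random.gauss(0, 1)
        school = 0.9 * aptitude + 0.4 * random.gauss(0, 1)  # what credentials measure
        job = 0.3 * aptitude + 1.0 * random.gauss(0, 1)     # what employers need
        people.append((school, job))

    def average_job(group):
        return sum(job for _, job in group) / len(group)

    decile = N // 10
    by_school = sorted(people, key=lambda p: p[0], reverse=True)[:decile]
    by_job = sorted(people, key=lambda p: p[1], reverse=True)[:decile]

    print(f"Everyone:                 {average_job(people):+.2f}")
    print(f"Top decile, by schooling: {average_job(by_school):+.2f}")
    print(f"Top decile, by the work:  {average_job(by_job):+.2f}")
    # With these invented weights, picking the top of the class captures
    # only about a quarter of the gain that judging the work directly
    # would capture--the credential screens out most of the people who
    # would actually have excelled at the job.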

"Because the credentialing and licensing process uses input measures, mainly years of schooling, to determine who gets into the field, we end up licensing people who are good at studying law or business, which is not necessarily the same thing as being good at the job," Boyatzis said. "Occasionally a licensing procedure will require a demonstration of relevant skills--craft unions or accountants, for example. But even in those cases they have no way of assessing whether the skills and knowledge have atrophied in all the years afterwards. The physicians are a perfect example. They've agreed to a system for continuing education--which they can satisfy not by passing a test again but by showing that they've done to a few courses each year."

Within the professions there are abundant illustrations that the skills on which credentials are granted are different from the performance that matters most. For example, in 1979 Daniel Hogan, a lawyer and social psychologist at Harvard, published a four-volume study called The Regulation of Psychotherapists. Its ambition was to examine the day-by-day workings of psychotherapy at every level, from social worker to licensed psychoanalyst.

Hogan devoted his first several hundred pages to an analysis of the traits and qualities that distinguish effective psychotherapists from ineffective ones. In judging effectiveness he concentrated on "output"--changes in the patient's condition--rather than "input," such as how much effort the therapist applies, how much he charges, or how long he spent in school. Then, in the second half of that volume, and with the same painstaking thoroughness, Hogan went through the qualities demanded of those who want to be certified as psychotherapists. There was little overlap between the two lists.

"Contrary to much professional opinion ...," he said, "the effectiveness of therapists is more determined by the presence or absence of certain personality characteristics and interpersonal skills than technical abilities and theoretical knowledge." The skills that make a superb psychotherapist are mainly common-sense human skills--warmth, empathy, reliability, a lack of pretentiousness or defensiveness, an alertness to human subtlety, an ability to draw people out. "The necessary qualities are very similar to those one looks for in a good friend." These are not traits that can be detected on a multiple-choice exam, but they are real, and can be measured in creative ways. In half of the "effectiveness" studies that Hogan surveyed, non-professional therapists did better than professionals in helping patients, despite their lack of formal education. In one study conducted in 1965, for example, five laymen (only one of whom had finished college) were given less than 100 hours of training in therapy skills. Then they were put in charge of patients who had been hospitalized, on average, for more than thirteen years. Under their treatment more than half the patients improved.

Hogan contrasted such subjective skills with the traits the profession considered essential before issuing a license, most of which were based on academic proficiency. "For traditional psychotherapy, psychiatrists stress an understanding of human biology, neurology, and psychopharmacology; psychologists stress personality dynamics and interpersonal behavior; and social workers believe that a theoretical understanding of environmental influences on behavior is essential." As Hogan pointed out, such "hard" scientific preparation was necessary in some cases, to be sure that the patient's complaint did not arise from chemical imbalance, from injury, or from a tumor. But once those possibilities had been eliminated, Hogan's findings showed, advanced technical training counted for nothing in restoring most mental patients to health.

If psychotherapy seems too "soft" a discipline to provide a fair test of meritocratic standards, what about air-traffic control? In 1970 Ivar Berg reported on a study conducted by the Federal Aviation Administration, which wanted to understand what made 507 highly competent air-traffic controllers good at their jobs. The question was whether advanced educational requirements would produce competent controllers; the answer was no. As Berg explained,

This complicated job ... might well require, not merely the details of engineering or management science or mathematics, but all the supposed "correlates" of education--a disciplined mind, for example--and the more personal qualities that education is supposed to produce--reliability, steadfastness, responsibility, ability to think quickly, motivation, etc.

Common sense might suggest that the better controllers would be more educated--but the FAA found that fully half the top-ranked controllers had no formal education beyond high school. Many of them had come directly to the FAA for rigorous technical training specifically related to the jobs they were expected to do. Berg said,

Because it was "stuck with" less educated men ... the FAA became a little laboratory in which the relevance of education for attainment of, and achievement in, important managerial and technical positions would be examined. Education proves not to be a factor in the daily performance of one of the most demanding decision-making jobs in America.

The implication of examples such as these is not that talent is equally distributed or that minds are limitlessly malleable or that advanced training is always destructive. A liberal education is good for its own sake, and schooling of any sort can impart a broad perspective that can help in any job. Rather, the charge against credential requirements is that they are simultaneously too restrictive and too lax. They are too restrictive in giving a huge advantage to those who booked early passage on the IQ train and too lax in their sloppy relation to the skills that truly make for competence. No nurse is allowed to hang out a shingle and collect professional fees for the many medical functions she can competently perform; any psychiatrist is legally entitled to perform open-heart surgery or read x-rays of your knee. If sports were run like the meritocracy, the Miami Dolphins would choose their starting lineup on the basis of high-school times in the forty-yard dash and analyses of the players' muscle tissues to see who had the highest proportion of "quick-twitch" fibers. If the Dolphins actually did this, they'd face a long losing season: the coach cares about speed but finally chooses the players who have proved they can catch the ball or stop the run.

Nearly fifteen years ago David McClelland wrote an article called "Testing for Competence Rather Than Intelligence." It said, in effect, that what Don Shula does for the Dolphins the testing and licensing system should do for the professions. While some people are brighter than others, and while the variations in their abilities matter in some jobs, differences in IQ scores should not be the central concern of professional licensing. The proper function of licenses is to ensure that when passengers enter an airplane, they can count on the pilot's knowing how to fly, and that anyone who offers to argue a case in court or prepare a tax return is competent in those tasks. Designing tests of these specific skills might be slightly harder than drawing up yet another IQ test, McClelland said, but the obstacles would hardly be insuperable. Social competition would be more open, the economy would be more flexible, and standards of performance would be higher if credential requirements gave way to tests of specific skills.

In business the companies that are growing and changing the fastest, and where flexibility and performance are presumably more crucial than anywhere else, already tend to overlook credentials and behave like armies in wartime, rewarding people for what they can do today, not for their background or what their theoretical potential might be. "We do a lot of college recruiting to find our new people," says Steven Ballmer, a twenty-nine-year-old vice-president of Microsoft, the phenomenally successful software firm that Ballmer's contemporary and college classmate, Bill Gates, founded after dropping out of Harvard. Ballmer dropped out of Stanford Business School to join him. "We go to colleges not so much because we give a damn about the credential but because it's hard to find other places where you have large concentrations of smart people and somebody will arrange the interviews for you. But we also have a lot of walk-on talent. We're looking for programming talent, and the degree is in no way, shape, or form very important. We ask them to send us a program they've written that they're proud of. One of our superstars here is a guy who literally walked in off the street. We talked him out of going to college and he's been here ever since."

Such established firms as General Electric and AT&T have long been known for recruiting college graduates and then offering management training, as necessary, inside the firm. Of the 4,500 entry-level professionals General Electric hires each year, only fifty are new M.B.A.s. Most of the others have technical backgrounds; as they move up, they are given brief courses inside the company rather than being formally sent back to school. "As far as we're concerned, there's no broad incentive for technical companies to go out and get M.B.A.s," says James Baughman, who formerly taught at Harvard Business School and now supervises management training at GE. "It's a heck of a lot easier to change a technical person into a businessman over the years than the other way around."

As an alternative or supplement to judging academic credentials, many firms have developed "assessment centers," in which employees handle simulated business problems, in a setting as close to real life as possible, to demonstrate their competence or indicate the need for training. Candidates for administrative jobs, for example, might work their way through a sample in-box. "Bosses find those promoted because of their assessment center scores to be competent, the candidates feel the system is fair, and assessors believe that the process has given them the chance to measure important characteristics," wrote Robert Klitgaard in his recent book, Choosing Elites.

A number of firms, from McDonnell Douglas to Mobil to Digital Equipment, have turned to McBer for its "competency" analyses of specific jobs. The results are sometimes surprising. To manage its new-product-development lab, for example, one firm had habitually looked for freewheeling, creative types; the lab's researchers were innovators, so naturally their boss should be too. "It turned out that those with the best performance were actually less creative and risk-taking than others," Richard Boyatzis says. "The most creative people held onto ideas way too long. What distinguished the superior performers were other traits, like being able to informally steer people and to get engineers, market researchers, and scientists to pull together."

Equipped with such knowledge, the company was able to select more-competent directors; more important, it was able to train a broader range of people to succeed. McBer's view of "competencies" is very similar to Binet's view of intelligence: after the illness, the remedy. Boyatzis says, "The most positive message we consistently get is that people do want to improve themselves, but usually they don't know exactly what to work on. When you can give them good feedback on specific goals, that releases the natural internal inclination to improve."

IS IT POSSIBLE TO COMBINE THAT BASIC DESIRE FOR IMPROVEMENT and upward mobility with standards that ensure high performance? Can a society be both efficient and open? One of the most successful, and least credentialed, assessment procedures suggests that it is.

Among lawyers, accountants, and M.B.A.s incompetence may be a nuisance, but in airline pilots it is a catastrophe. In the early days of commercial flight the airlines bore the responsibility for training and certifying their pilots, but they soon begged for government regulation, so as to spread the responsibility when crashes occurred. Like the licensing procedures for doctors, lawyers, and engineers, these standards were supposed to protect the public from incompetence, but they were of a very different nature from those of the professional guilds. The pilot-licensing system was built on the premise that competence was divisible: people can be good at one thing without being good at others, and they should be allowed to do only what they have mastered. As opposed to receiving a blanket license, the way members of other professions do, pilots must work their way up through four certificate levels, from student to air-transport pilot, and be specifically qualified on each kind of aircraft they want to fly. What's more, a pilot must demonstrate at regular intervals that he is still competent. To keep his license a pilot must take a review flight with an instructor every two years, and pilots for commercial airlines must pass a battery of requalification tests every six months. "A small but regular percentage is washed out each time," John Mazor, of the Air Line Pilots Association, says. It is reassuring to know they are gone, but what about their tenured counterparts in the other professions?
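The logic of that system is worth spelling out, because it differs so sharply from a blanket license. A minimal sketch--my own schematization of the idea, not the FAA's actual rulebook--of divisible, expiring competence might look like this:

    # A sketch of divisible, expiring competence--my schematization,
    # not the FAA's rules. A pilot may do only what he has mastered,
    # and every privilege lapses unless it is re-demonstrated.
    from dataclasses import dataclass, field
    from datetime import date, timedelta

    LEVELS = ["student", "private", "commercial", "air transport"]

    @dataclass
    class Pilot:
        level: str
        type_ratings: set = field(default_factory=set)  # aircraft mastered
        last_check: date = date.min                     # most recent check ride

    def may_fly(pilot, aircraft, level_needed, today,
                check_interval=timedelta(days=2 * 365)):
        """Qualified at this level, on this aircraft, and recently re-tested."""
        return (LEVELS.index(pilot.level) >= LEVELS.index(level_needed)
                and aircraft in pilot.type_ratings
                and today - pilot.last_check <= check_interval)

    p = Pilot("commercial", type_ratings={"B-727"}, last_check=date(1985, 1, 10))
    print(may_fly(p, "B-727", "commercial", date(1985, 12, 1)))  # True
    print(may_fly(p, "DC-10", "commercial", date(1985, 12, 1)))  # False: no rating
    # For airline pilots the check interval shrinks to six months. A
    # blanket professional license, by contrast, covers the whole field
    # at once and never comes up for renewal.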

The results of this licensing scheme are a high level of proficiency and a profession more open socially than the rest. Most pilots of big jets learned to fly in the military, since that is the least expensive way to put in the 1,500 hours of flight time necessary for an air-transport license. But the remainder slowly worked their way up, putting in flight time on their own or working for small air-taxi outfits until they could move to the next level of licensure. Imagine what other professions would be like if they operated this way. The sociologist Randall Collins's prescription for medical training follows a similar pattern:

All medical careers would begin with a position as orderly, which would be transformed into the first stage of a possible apprenticeship for physicians. After a given number of years, successful candidates could leave for a few years of medical school (2 years seems sufficient background for most practitioners ...) and then return to the hospital for advanced apprenticeship training of the sort now given in internship and residency programs. ... Advanced specialties could continue to be taught as they now are--through further on-the-job training; only medical researchers would be involved in lengthy schooling.

In theory business is better positioned than the professions to resist the worst effects of a meritocracy. The professions depended for their creation and growth on credential barriers that kept people out; business depends for its survival on making the best and most flexible use of all its resources, including talent. Even dominant firms must face the possibility that somebody who may not have gone to the right school and may not have the right degree might still come to market with a better, cheaper product.

Because successful business practice depends to some extent on appearances, business may never be as completely open as America's one true meritocracy--sports. (It didn't matter that Babe Ruth was fat, slovenly, and ungrammatical, so long as he could hit the ball.) But why shouldn't sports, rather than the professions, epitomize the meritocracy to which we aspire? American professional sports have their sins and excesses, to be sure. But with their relative openness to newcomers and disregard for background (most teams have hired no-name free agents and waived famous first-round draft choices) and their faith that ruthless and continuing judgments of performance will finally lead to equal opportunity, sports seem more admirably meritocratic than the system of early selection and later tenure that meritocracy has been perverted to mean.

Perhaps the cultural changes that have professionalized America are irreversible. The economist Mancur Olson has gloomily hypothesized that most societies tend to separate into inflexible castes, except when warfare or other cataclysms disrupt the social order and unleash new talent. The United States has renewed itself in less traumatic fashion--by continually populating new regions, by absorbing varied immigrant groups, and by taking deliberate steps, such as the GI Bill, to give more of its people a chance. As we drift toward a neater and more predictable social order, we might reflect on the rough-and-ready adaptation to experience that brought us this far, and ask ourselves whether we need it still.
