"What you earn," Bill Clinton said more than once when he was president, "is a function of what
you can learn." That had always been true, but Clinton's point was that at the close of the 20th
century it was becoming more true, because computers were transforming the marketplace. A
manufacturing-based economy was giving way to a knowledge-based economy that had an upper class and a lower class but not much of a middle class.
The top was occupied by a group that Clinton's first labor secretary, Robert Reich, labeled
"symbolic analysts": people who work with symbols and ideas that can be "rearranged, juggled,
or experimented with" using "mathematical algorithms, legal arguments, financial gimmicks,
scientific principles, psychological insights," and other tools seldom acquired without a college
or graduate degree. At the bottom were providers of "in-person services" like waitressing, home
health care, and security. The middle, once occupied by factory workers, stenographers, and
other moderately skilled laborers, was disappearing fast. (Reich's January 2008 Berkeley lecture,
"How Unequal Can America Get Before We Snap?," an excellent introduction to the topic at
hand, is available on YouTube.)
Did computerization create the Great Divergence?
Our story begins in the 1950s, at the dawn of the computer age, when Homo sapiens first began
to worry that automation would bring about mass unemployment. Economic theory dating back
to the 19th century said this couldn't happen, because the number of jobs isn't fixed; a new
machine might eliminate jobs in one part of the economy, but it would also create jobs in
another. For example, someone had to be employed to make these new machines. But as the
economists Frank Levy of MIT and Richard J. Murnane of Harvard have noted, computers
represented an entirely different sort of new machine. Previously, technology had performed
physical tasks. (Think of John Henry's nemesis, the steam-powered hammer.) Computers were
designed to perform cognitive tasks. (Think of Garry Kasparov's nemesis, IBM's Deep Blue.)
Theoretically, there was no limit to the kinds of work computers might eventually perform. In
1964 several eminent Americans, including past and future Nobel laureates Linus Pauling and
Gunnar Myrdal, wrote President Lyndon Johnson to warn him about "a system of almost
unlimited productive capacity which requires progressively less human labor."
Such a dystopia may yet one day emerge. But thus far traditional economic theory is holding up
reasonably well. Computers are eliminating jobs, but they're also creating jobs. The trouble,
Levy and Murnane argue, is that the kinds of jobs computers tend to eliminate are those that
require some thinking but not a lot—precisely the niche previously occupied by moderately
skilled middle-class laborers.
Consider the sad tale of the bank teller. When is the last time you saw one? In the 1970s, the
number of bank tellers grew by more than 85 percent. It was one of the nation's fastest-growing
occupations, and it required only a high school degree. In 1970, bank tellers averaged about $90
a week, which in 2010 dollars translates into an annual wage of about $26,000. But over the last
30 years, people pretty much stopped stepping into the lobby of their bank; instead, they
started using the automated teller machine outside and eventually learned to manage their
accounts from their personal computers or mobile phones.
Today, the job category "bank teller" is one of the nation's slowest-growing occupations. The
Bureau of Labor Statistics projects a paltry 6 percent growth rate during the next decade. The job
now pays slightly less than it did in 1970, averaging about $25,000 a year.
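As a rough back-of-the-envelope check on that inflation adjustment, here is a minimal sketch in Python; the CPI-U values are approximate annual averages I'm supplying for illustration, not figures from the article.

```python
# Convert the 1970 bank-teller wage ($90/week) into 2010 dollars
# using approximate CPI-U annual averages (assumed, not from the article).
CPI_1970 = 38.8
CPI_2010 = 218.1

weekly_wage_1970 = 90                      # dollars per week, 1970
annual_wage_1970 = weekly_wage_1970 * 52   # ~ $4,680 in 1970 dollars

# Scale by the ratio of price levels to express it in 2010 dollars.
annual_wage_2010_dollars = annual_wage_1970 * (CPI_2010 / CPI_1970)

print(f"1970 teller pay in 2010 dollars: ~${annual_wage_2010_dollars:,.0f}")
# Prints roughly $26,300 -- close to the article's "about $26,000,"
# versus roughly $25,000 for the same job today.
```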
As this story plays out in similar occupations—cashiers, typists, welders, farmers, appliance
repairmen (this last already so obsolete that no one bothers to substitute a plausible ungendered
noun)—the moderately skilled workforce is hollowing out. This trend isn't unique to the United
States. The Japanese have a word for it: kudoka. David Autor, an MIT economist, calls it "job
polarization," and he has demonstrated that it's happening to roughly the same extent within the
European Union as it is in the United States. But Autor readily concedes that computer-driven
job polarization can't possibly explain the entire trend toward income inequality in the United
States, because income inequality is much greater in the United States than it is in Europe.
Another problem that arises when you try to attribute the income-inequality trend to computers is that the Great Divergence began in the late 1970s, well before most people had ever seen a
personal computer. By the late 1990s, as businesses stampeded to the Internet, inequality
slackened a bit. If computers were the only factor driving inequality, or even the main factor, the opposite should have happened.9 A final problem is that the income premium for college or
graduate-level education gradually slackens off at higher incomes, even as income inequality
intensifies. If computers required ever-higher levels of education to manipulate ever-growing
quantities of information in ever-more rococo ways, then we'd expect the very richest people to
be the biggest nerds. They aren't.
Here, then, is a dilemma. We know that computers put a premium on more highly educated
workers, but we can't really demonstrate that computers caused the Great Divergence. What is it that's so special about computers? Harvard economists Claudia Goldin and Lawrence Katz offer
an interesting answer: Nothing!
Yes, Goldin and Katz argue, computer technology had a big impact on the economy. But that
impact was no larger than that of other technologies introduced throughout the 20th century,
starting in 1900 with the dynamo that Henry Adams famously swooned over at the Paris
Exposition. Between 1909 and 1929, Katz and Goldin report in their 2008 book, The Race
Between Education and Technology, the percentage of manufacturing horsepower acquired
through the purchase of electricity rose sixfold. From 1917 to 1930, the proportion of U.S.
homes with electricity increased from 24 percent to 80 percent. By contrast, from 1984 to 2003,
the proportion of U.S. workers using computers increased from 25 percent to 57 percent.
Computer use has spread quickly, but not as quickly as electric power did during the early part of
the 20th century. "Skill-biased technological change is not new," Katz and Goldin wrote in a
2009 paper, "and it did not greatly accelerate toward the end of the twentieth century."
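To see the comparison Katz and Goldin are drawing in concrete terms, here is a quick sketch that turns the adoption figures quoted above into average percentage-point gains per year; the per-year framing is my own illustration, not a calculation from their book or paper.

```python
# Average percentage-point gain in adoption per year, using the figures in the text.
def diffusion_rate(start_share, end_share, start_year, end_year):
    """Percentage points of adoption gained per year, on average."""
    return (end_share - start_share) / (end_year - start_year)

# U.S. homes with electricity: 24% (1917) -> 80% (1930)
electricity = diffusion_rate(24, 80, 1917, 1930)
# U.S. workers using computers: 25% (1984) -> 57% (2003)
computers = diffusion_rate(25, 57, 1984, 2003)

print(f"Home electrification: ~{electricity:.1f} points per year")   # ~4.3
print(f"Workplace computer use: ~{computers:.1f} points per year")   # ~1.7
```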
Contemporary culture is so fixated on the computer revolution that the very word "technology"
has become an informal synonym for "computers." But before computers we witnessed
technological revolutions brought on by the advent of the automobile, the airplane, radio,
television, the washing machine, the Xerox machine, and too many other devices to name. Most
of these earlier inventions had much the same effect as the computer—that is, they increased
demand for progressively higher-skilled workers. But (with the possible exception of radio) none
of these consumer innovations coincided with an increase in inequality. Why not? Katz and
Goldin have a persuasive answer that we'll consider later in this series.
Correction, Sept. 9, 2010: An earlier version of this story misstated Kasparov's first name as
"Boris."