Author: Kurt Andersen
Along the way, a people who defined themselves as seekers and personifications of novelty, rushing toward a blindingly bright future, also sought occasional respite from the accelerating shocks of the new. Almost as soon as the new United States was established, Americans developed an occasional cultural tic, indulging in nostalgic entertainments of a whole new kind, most notably for American times and places they themselves hadn’t experienced. Our forefathers created a nostalgia industry that fictionalized our recent past, turning Daniel Boone and Buffalo Bill Cody into living celebrity-hero artifacts and reenactors of the frontier days, in the nineteenth century devouring James Fenimore Cooper’s romantic stories of the untamed eighteenth century.
Walden was driven by Henry David Thoreau’s nostalgia for the era of his childhood in the 1820s and ’30s, before railroads and the telegraph, and his dreamy wish “not to live in this restless, nervous, bustling, trivial Nineteenth Century.” As that century turned into the even more restless, nervous, bustling twentieth, the United States relentlessly modernized and urbanized. Then came our biggest act of national nostalgia, creating and moving en masse to suburbs, brand-new green and peaceful simulations of old-time American towns and villages and homesteads.
But the suburbs of the twentieth century also embodied an ugly nostalgia, of course, because they were places to which native-born white Americans could escape the city to live almost exclusively among their own kind. America’s tragic flaw is our systemic racism, and it’s a residue of a terrible decision our founders made to resist the new and perpetuate the old: the enslavement of black people. Slavery had ended in most of Europe by the 1500s, but not in its colonies in the New World and elsewhere. France and Spain and Britain outlawed their slave trades and slavery itself decades before the United States did, and they found it unnecessary to fight civil wars over the issue. Tsarist Russia emancipated its serfs before democratic America emancipated its slaves. On abolition we were not early adopters.
The most important foundations for slavery, of course, were racism and economics. In the antebellum South, enslaved blacks constituted half of all wealth and generated a quarter of white people’s income. But white Southerners also clung to slavery due to nostalgia for the time when slavery wasn’t under serious challenge by their fellow citizens, and anticipatory nostalgia for the idealized present they feared losing—as a governor of Tennessee wrote after the Civil War, for “the cotton fields, alive with the toiling slaves, who, without a single care to burden their hearts, sang as they toiled from early morn till close of day.” White Southern nostalgia was also for the fictional feudal pasts depicted in the novels of Walter Scott, set in ye olde England and Scotland, published in the 1820s and ’30s, and particularly, phenomenally popular in the American South because the fictions served to romanticize their own slave-based neofeudalism. Mark Twain blamed secession and the Civil War on such Southern “love [of] sham chivalries of a brainless and worthless long-vanished society.”
The brand-new party of the North was the most progressive party—Republicans were not just antislavery but in favor of building railroads and public universities and giving free land to homesteaders. Lincoln and his party of the new won the war that killed 4 percent of American men in four years, the equivalent of 7 million today. The country at large looked to be more or less back on an Enlightenment path and catching up with the rest of the civilized world—letting go of the hopelessly obsolete and irrational, changing dramatically when necessary. Despite powerful backlashes along the way (the South’s replacement of Reconstruction with Jim Crow, the fundamentalist Christian upsurge of the early 1900s, the resistance to civil rights a half-century later, the insidious and ongoing systemic racism), over the long run the country was more open to the new than fixated on the past. Progress fitfully prevailed. For most of the twentieth century, most Americans seemed to have permanently learned lessons about the mortal dangers of pathological nostalgia and resistance to change, so that a large governing majority remained committed to making America perpetually new.
While not everything new is good or desirable, an exceptional openness to the new was America’s factory setting. We call the desirable new progress.
From the late 1700s to the late 1900s, economic growth helped make the United States perpetually new, thus enabling social progress.
Economic growth comes overwhelmingly from incorporating new technologies into the way work is done.
I want to return to that last point, and to what I said earlier about how, just as the United States was giving birth to itself 250 years ago, the steam-powered industrial revolution began turning a society of farmers and artisans into one of workers tending machines in factories and elsewhere, thereby permitting the U.S. economy to grow and grow.
That’s all true, but it’s not the whole truth. Because as I’ve said, and as I’ll now explain in more detail, every economy, including ours, is a political economy. So here’s a crucial fifth point:
One way or another, we decide as a society how to make economic use of new technologies and how to share the new wealth that results from economic growth.
Of course, when I say “as a society,” I mean the people who exercise effective power over the choices a society makes. For instance, even before the game-changing steam-powered contraptions of the industrial age came along after 1800 and transformed the nature of more and more jobs, bosses for a century or two had been redesigning work to suit themselves. During this so-called industrious revolution, manufacturers began gathering their artisanal workers each day into buildings that they began calling factories. Thus gathered, a Cornell University historian of work explains, “the labor could be divided and supervised. For the first time on a large scale, home life and work life were separated. People no longer controlled how they worked, and they received a wage instead of sharing directly in the profits.”
The industrial revolution’s great American celebrity promoter Eli Whitney became famous in the 1790s, just out of Yale, for inventing a machine that was at the center of America’s industrial revolution—a new, improved cotton gin that mechanized the process of removing the dozens of seeds from each cotton boll. At the time, growing cotton was a new and very small part of American agriculture. One worker using just his or her hands spent a whole day to produce one pound of clean cotton. But using Whitney’s gin, that output suddenly leaped to twenty or twenty-five pounds per day. In the early 1800s, cotton replaced tobacco as the biggest U.S. export, and production increased 3,000 percent over the next half-century.
The cotton gin is a perfect illustration of the economic fact to which I keep returning: a new technology makes workers more efficient, increased productivity results in more profits, and the economy grows. But as I’ve also said, every economy is a political economy, and the story of the cotton gin is also an extreme illustration of that fundamental truth. Cotton growing happened in the South, of course, and the people whose productivity dramatically improved were enslaved African Americans, at least two-thirds of whom worked producing cotton. And the profits that dramatically improved as a result, of course, all went to the plantation owners. So this remarkable piece of new technology, in addition to driving overall U.S. economic growth, was responsible as well for making slavery a foundation of the U.S. economy.
Eli Whitney also happens to be a fabulous case study of the overenthusiasm and overpromising that surround technological progress—especially at moments of revolutionary economic change, such as the early nineteenth and early twenty-first centuries. We learn in school that Whitney came up with another, more foundational piece of the industrial revolution: manufacturing things out of standardized bits and pieces,
from gears and levers then to Ethernet plugs and semiconductor chips now.
Coming off the success of the cotton gin, young Whitney convinced the new U.S. government that he was their man to mass-produce ten thousand muskets, even though he knew nothing about making guns. Two years later, after failing to meet his contractual deadline, he went to Washington to keep his remorseful buyers on the hook. His state-of-the-art musket was taking a bit longer than expected to get right, he told President Adams and President-elect Jefferson and the secretary of war, because it would consist entirely of fantastic new interchangeable parts, meaning that manufacture would be cheaper and faster, and repair easier. He spread a hundred metal pieces on a tabletop.
“Sirs, here before you are all the ordinary parts from ten of my new gunlocks. Hand me one of each, any you wish, at random, and from those, using only a screwdriver, I shall assemble a working apparatus!”
Which he proceeded to do, wowing everyone, getting his deadline extended, more money, and a contract for still more muskets.
Whitney’s demonstration in Washington, however, had been almost all show. According to the MIT technology historian who wrote the definitive account of that episode, “Whitney must have staged his famous 1801 demonstration with specimens specially prepared for the occasion. It appears that Whitney purposely duped government authorities” into believing “that he had successfully developed a system for producing uniform parts.” (Vaporware would be the word coined two centuries later.)
The inventor and nail manufacturer Jefferson, who’d been excited for years about the prospect of interchangeable parts, was wildly enthusiastic. Not long after the faked demo, he wrote a letter of introduction on Whitney’s behalf to the governor of Virginia, his protégé James Monroe. Whitney “has invented moulds & machines for making all the pieces of his locks so exactly equal,” the president wrote, and thus “furnishes the US. with muskets, undoubtedly the best they receive.” None of that was true. In fact, Whitney wouldn’t deliver any muskets to the government until 1809, nine years later, and interchangeable parts weren’t perfected until after his death.
Whitney was absolutely honest when he admitted in 1812 that the whole point of using identical, interchangeable parts to make things in factories would be to render old-fashioned craftspeople obsolete—that is, “to substitute correct and effective operations of machinery for the skill of the artist.” This new way of organizing production, and using technology to replace skilled workers with cheap unskilled workers, was known at the time as the American System.
I also told not-the-whole-truth in the last chapter when I wrote that the industrial revolution made the average citizen’s share of the economy start growing. That’s true, but it’s also misleading, as are many statements involving mathematical averages.
For the first half of the 1800s, workers in the industrializing economy didn’t actually get a fair share of the new bounty. In fact, their overall incomes probably stagnated or shrank, while the capitalists’ share doubled, dramatically increasing inequality. Which led many people starting in the 1840s in America and Europe to imagine that these booming new capitalist systems were unsustainably unfair and might presently bust.
In Britain, a twenty-four-year-old executive at a cotton mill near Manchester, the son of one of the company’s founders, wrote in an 1845 book that “the most important social issue in England [is] the condition of the working classes, who form the vast majority of the English people. What is to become of these propertyless millions who own nothing and consume today what they earned yesterday?” The “industrialists,” this young industrialist wrote, “grow rich on the misery of the mass of wage earners,” but “prefer to ignore the distress of the workers.”
The writer was Friedrich Engels, about to become Karl Marx’s lifelong friend, collaborator, and patron. Instead of collapsing, the new capitalist system adapted. America’s has always been a free-market economy, but as a political economy, the society constantly redesigns and tweaks the system and enforces its operating rules and norms. And so from the mid-1800s onward, as new technologies kept making workers more productive—by 1840 the productivity of U.S. factory workers exceeded Britain’s—and as the economy kept growing, workers started getting a proportionately larger share of the enlarging pie, and then they kept doing so. To use the other standard metaphor again, for the next century and a half, all boats rose together.
That principle of economic fairness was at the heart of our American social contract as we evolved from rough-and-ready start-up nation to successful global superpower. It didn’t happen because the new capitalists of the 1840s suddenly became kind and began sharing their wealth, like Dickens’s Ebenezer Scrooge.
Encouraging virtue among the well-off—that is, creating good, strong social norms and shaming violators—does have a place in this story. But the big shift toward economic fairness that began in the later 1800s was the result of a new system of controls that we democratically built into our political economy.
“Private economic power is held in check by the countervailing power of those who are subject to it,” the supremely lucid economist John Kenneth Galbraith wrote in American Capitalism in 1952. “The first begets the second.”
Countervailing power is a political economic concept that Galbraith nailed down. From the late 1800s through the 1900s, that power was organized and built throughout U.S. society by citizens and workers and customers to check and balance the new and rapidly growing power of big business. We usually talk about “checks and balances” only in connection with Washington politics, presidents versus Congress versus the federal courts. But economies—especially modern free-market economies, loosely supervised day to day, operating mostly without government commanders-and-controllers—also need systems of checks and balances.
It can be useful to think of an economy as a game, the highest-stakes game there is. As in D&D or backgammon or Chutes and Ladders, some players are extremely avid and others have better things to do, some players are very skilled and others just…keep tossing the dice. But unlike other games, in the economy everyone is a player. And we all need one another to keep playing forever. Because the whole point is never-ending play, without overwhelmingly decisive, permanent winners or so many losers that the game doesn’t work as well as it can and should. All games have rules, but unlike other games, the rules by which an economy operates only seem like they’re handed down by a godlike game designer—whereas in fact, they’re amended and sometimes dramatically rewritten by the players over time.
Meanwhile, back in real-life American economic history, starting in the 1800s the industrial revolution changed the game, modern corporations formed, and one player, big business, began acquiring unparalleled new economic and political power. A major inherent advantage that business has over other players in the economic game is its centralized, undemocratic nature: no matter how sprawling companies and financial institutions may be, they have headquarters, strategies, and bosses who give orders that are obeyed—whereas individual workers and customers and small businesses and citizens in this vast country are…individuals, spread out, disparate, disorganized, relatively powerless.
So individual Americans got together and organized new checks and balances in the new economy. Workers formed unions. “The operation of countervailing power is to be seen with the greatest clarity in the labor market where it is also most fully developed,” Galbraith wrote in American Capitalism.
Just thirty years earlier, in the 1920s, “the steel industry worked a twelve-hour day and seventy-two-hour week with an incredible twenty-four-hour stint every fortnight when the shift changed. No such power is exercised today,” by bosses in the 1950s, only because the overworked workers had risen up and “brought it to an end.” That is, during the 1930s, in addition to enacting a minimum wage and child labor laws, citizens through their government created a system that let workers’ unions organize and negotiate fairly, and instantly union membership more than tripled. By the 1950s, a third of all jobs at private companies had been unionized.
But the new countervailing power wasn’t just about employees forcing businesses to share more of their profits. Citizens elected legislators and governors and presidents to enact new rules concerning how big business could and couldn’t conduct itself in other ways, and created new regulatory agencies to enforce those rules. Some of the agencies and missions were very specifically purpose-built to discourage businesses from harming and killing people. In 1906 we created the Food and Drug Administration to prevent the sale of dangerous food and phony medicines, just as in 1970 we created the Environmental Protection Agency to prevent businesses from putting too many toxins into the air and water and land.
Before and after the turn of the twentieth century, America decided that even if businesses weren’t physically harming people, a few companies had simply become too large and too powerful for the good of our economy and our democracy—the so-called corporate trusts. In 1910 President Theodore Roosevelt, a rich Republican, said that “corporate funds” used “for political purposes” were “one of the principal sources of corruption” and had “tended to create a small class of enormously wealthy and economically powerful men whose chief object is to hold and increase their power.” Antitrust laws were passed to prevent corporations from using their economic power to keep prices unfairly high or wages unfairly low. For a century, those laws saw to it that if one or a few big companies controlled the supply of basic products or services in a region or the whole country—railroads, gasoline and steel, electrical and telephone service, banking, TV and radio—those companies had to be more closely regulated. Or broken up into smaller, less powerful enterprises that would compete with one another—the way we did in the early 1900s to the new companies that controlled 90 percent of the suddenly humongous markets for petroleum (Standard Oil) and cigarettes (American Tobacco). The government’s enforcement of these laws kicked into even higher gear under the New Deal in the 1930s; in a decade, the Justice Department’s antitrust division grew from 15 lawyers to 583.