Author: Kurt Andersen
From the end of World War II through the New York World’s Fair, as I’ve described, the future looked pretty fabulous to most Americans, but then during the roiling late 1960s people got confused and scared of the new. Specifically concerning what computers implied for the future of work and jobs, however, the consensus suddenly did the reverse: for two decades, experts had worried about where automation was leading our economy, but starting in the late 1960s the smart set couldn’t wait to get to superautomated Tomorrowland.
A significant early worrier had been the mathematician Norbert Wiener—college graduate at fourteen, Harvard Ph.D. at eighteen, at MIT the godfather of artificial intelligence—who back in 1948 published Cybernetics,
a groundbreaking book that gave a new technological field a name. It was remarkably popular, and talking about it to a reporter back then, Wiener succinctly and accurately foresaw the future of work—that is, our present. Just as “the first industrial revolution devalued human labor” such that “no pick-and-shovel ditch-digger can sell his services at any price in competition with a steamshovel,” before too long the second industrial revolution would completely automate a factory
without a human operator…Such machines will make it very difficult for the human being to sell a service that consists of making routine, stereotyped decisions. The electronic brain will make these logical decisions more cheaply, more reliably, and, of course, more quickly.
In an article he wrote right afterward in 1949 for The New York Times (it wasn’t published because the editor demanded too many rewrites), Wiener elaborated. “These new machines have a great capacity for…reducing the economic value of the routine factory employee to a point at which he is not worth hiring at any price.” So unless we change “our present factory system…we are in for an industrial revolution of unmitigated cruelty….We can be humble and live a good life with the aid of the machines, or we can be arrogant and die.”
Reading Wiener then, young Kurt Vonnegut was inspired to write his first novel, Player Piano,
which depicted that cruel, arrogant, dystopian American future. In the early 1960s plenty of Washington studies and reports appeared warning of the existential challenges of automation. In 1964 the Oxford mathematician and AI expert who was about to advise Stanley Kubrick on creating HAL for
2001: A Space Odyssey
wrote that although “this point is made…seldom outside of science fiction,” in fact “the first ultraintelligent machine is the last invention that man need ever make,” raising the “possibility that the human race will become redundant.”
But then in 1967 Herman Kahn, an inspiration for the title character in Kubrick’s previous movie, Dr. Strangelove,
said not to worry, keep loving the new—automation would lead in no time to four-day workweeks and three months of vacation for everybody. And to the utopian youth of the late 1960s, computer-generated ultra-prosperity looked sweet: if work would soon become unnecessary, conventional ambition could be abandoned. The New Left’s favorite living Marxist, Herbert Marcuse, wrote that automation was “the first prerequisite for freedom” to give every individual “his time, his consciousness, his dreams.” In fact, Marx himself, a century earlier in notebooks first unearthed and published in the 1960s, foresaw a pleasant future with “an automatic system of machinery…itself the virtuoso, with a soul of its own,” to which “the human being comes to relate more as watchman and regulator to the production process itself.” And in the bestselling 1969 book that coined the term counterculture,
the author, a young Bay Area professor, wrote that “economic security” was “something [young Americans] can take for granted” now because “we have an economy of cybernated abundance that does not need their labor.” In general the utopians at that giddy moment didn’t very carefully address how capitalism in the United States and other countries would have to change to avoid Wiener’s economic future of unmitigated cruelty.
As the 1970s began, the cultural and political Sixties were still going full tilt, accelerating. Single-sex colleges were all rushing to go co-ed—Princeton, Yale, Bennington, and Kenyon in 1969, Johns Hopkins, Colgate, the University of Virginia, and Williams in 1970. A year earlier a half-million young people had assembled for Woodstock, a new species of American event, and another, record-breaking half-million had assembled in Washington, D.C., to protest the Vietnam War. The New Left spun off a terrorist faction that was setting off an average of ten bombs a week in government buildings and banks around America. A constitutional amendment to lower the voting age from twenty-one to eighteen was about to be passed by Congress (unanimously in the Senate, 401–19 in the House), then ratified by the states in a hundred days, faster than any amendment before or since. Congress promptly passed another constitutional amendment, one to guarantee equal rights for women, by margins almost as large.
Given how much had changed during just the last few years, if that tidal wave of new continued through the 1970s—and why wouldn’t it?—what additional shocking changes might lie just ahead?
In fact, in the early 1970s, we had reached Peak New.
The Internet itself was invented in the 1960s—its fundamental technologies were developed starting in 1963, and the prototype system ARPANET transmitted its first computer-to-computer message in 1969.
“We set sail on this new sea,” Kennedy said in that famous speech, “because there is new knowledge to be gained, and new rights to be won,” because we’d “be enriched by new knowledge of our universe and environment, by new techniques of learning and mapping and observation, by new tools and computers.”
After all, Apollo (rational, reasonable, orderly) and Dionysus (instinctual, emotional, sensual) were brothers, both arrogant sons of Zeus.
“Everything happened during the sixties,” the dystopian fiction writer J. G. Ballard said after they ended. He’d turned thirty in 1961, the optimal age to be a trustworthy real-time chronicler of that decade.
“Thanks to TV, you got strange overlaps between the assassinations and Vietnam and the space race and the youth pop explosion and psychedelia and the drug culture. It was like a huge amusement park going out of control.”
I was only fifteen at the end of the 1960s, but it really was like that, even though not the whole park was haywire, and some astoundingly great new attractions (civil rights, expanded social welfare, feminism, and environmentalism) were being built at the same time.
Future Shock,
published in the summer of 1970, became one of the bestselling books of the decade. “This is a book about what happens to people when they are overwhelmed by change,” wrote the authors, whose lecture in Omaha I excitedly attended at fifteen, “the shattering stress and disorientation that we induce in individuals by submitting them to too much change in too short a time,” the “roaring current of change…so powerful today that it overturns institutions [and] shifts our values.”
At that same moment, as the besotted forty-two-year-old Professor Reich at Yale published
The Greening of America,
the more typical reaction to the tumult was that of Harvard’s fifty-one-year-old professor Daniel Bell, definitely not feeling groovy. “No one in our post-modern culture is on the side of order or tradition,” he wrote in a famous essay called “The Cultural Contradictions of Capitalism.” He despaired that the “traditional bourgeois organization of life—its rationalism and sobriety—no longer has any defenders in the culture.”
As with all zeitgeists, not everybody and probably not even most people were entirely on board with the spirit of the time. Frightened and angry reactions to the culturally and politically new had germinated instantly. For many people during the 1960s, the perpetual novelty that had been at the heart of modern American capitalism and modern American culture changed from amazing and grand to disconcerting and traumatic. What Marx and Engels had written 120 years earlier about capitalism’s collateral impacts was coming true, too true—
all fixed relations swept away, all new-formed ones antiquated before they can ossify, all that is holy profaned, all that is solid melts into air.
In culture, Bell wrote in that 1970 essay, there was now an overriding
impulse towards the new and the original, a self-conscious search for future forms and sensations….Society now…has provided a market which eagerly gobbles up the new, because it believes it to be superior in value to all older forms. Thus, our culture has an unprecedented mission: it is an official, ceaseless searching for a new sensibility….A society given over entirely to innovation, in the joyful acceptance of change, has in fact institutionalized an avant-garde and charged it—perhaps to its own eventual dismay—with constantly turning up something new….There exists only a desire for the new.
As people get older, they do tend to lose interest in the new. And what I call Peak New has a statistical demographic underpinning: Americans’ median age had been in the teens and twenties for our whole history, and it was dropping again in the 1950s and ’60s—but then after 1970 it began increasing, quickly, the average American getting two or three years older each decade. By 1990 it reached thirty-three, higher than it had ever been, and it has continued going up toward middle age.
During the 1970s, just coming off the ’60s and their relentless avant-gardism, people really did feel exhausted, ready to relax and be reassured. Even lots of people who were delighted by the 1960s, by the new laws intended to increase equality and fairness and by the loosey-goosier new laissez-faire cultural sensibilities and norms, were in a kind of bewildered morning-after slough. In response, more and more Americans began looking back fondly to times before the late 1960s, times that seemed by comparison so reassuringly familiar and calm and coherent. In other words, that curious old American nostalgia tic expressed itself as it hadn’t for decades—in fact, it took over with an intensity and longevity it never had before. The multiple shocks of the new triggered a wide-ranging reversion to the old. It turned out Isaac Newton’s third law of motion operates in the social universe as well as physics: the 1960s actions had been sudden and powerful, and the reactions starting in the 1970s were equal and opposite, with follow-on effects that lasted much, much longer.
Some of the origins of this 1970s plunge into nostalgia, in fact, had showed themselves a bit earlier. Paradoxically, as America was approaching Peak New during the 1950s and ’60s, some members of the cultural avant-garde led the way in making the past seem stylish, embracing certain bits and pieces of the old days in order to be unorthodox, countercultural, cooler. It was selective stylistic nostalgia as a way of going against the grain, rejecting earnest upbeat spic-and-span corporate suburban midcentury America. Back in the 1950s, when vintage applied only to wine and automobiles, the Beats and beatniks had bought and proudly worn used clothes from the 1920s and ’30s. Jack Kerouac’s
On the Road,
the classic cutting-edge Beat novel, is actually an exercise in nostalgia, as the critic Louis Menand says, published and set in 1957 but actually “a book about the nineteen-forties,” the “dying…world of hoboes and migrant workers and cowboys and crazy joyriders.” His cool 1950s characters, Kerouac wrote, all shared “a sentimental streak about the old days in America,…when the country was wild and brawling and free, with abundance and any kind of freedom for everyone,” and the character Old Bull Lee’s “chief hate was Washington bureaucracy; second to that, liberals; then cops.” The simultaneous folk-music revival, from which Bob Dylan emerged, also consisted of cool kids scratching the same nostalgic American itch ahead of everyone else. College students and hepcats in the early 1960s also rediscovered and worshiped 1940s movies like
The Maltese Falcon
at smoky revival movie theaters.
In 1964 Kerouac’s road-trip buddy Neal Cassady joined young Ken Kesey and his band of protohippies, driving them across America from the Santa Cruz Mountains to New York City to visit, yes, the World’s Fair. They were pioneering inventors of the counterculture—which presently became a mass phenomenon and inherited some of the Beats’ sentimental streaks concerning the American old days. Even as youth circa 1970 thought of themselves as shock troops of a new age, part of their shocking newness was nostalgic cosplay. Dressed in reproduction nineteenth-century artifacts—blue jeans, fringed leather jackets, boots, bandanas, hats, men mustachioed and bearded—they fancied themselves hoboes and cowboys and joyriders and agrarian anarchists as they got high and listened to “Maggie’s Farm” (Bob Dylan), “Up on Cripple Creek” (the Band), and “Uncle John’s Band” (the Grateful Dead). Overnight they made the uncool old Victorian houses in San Francisco cool. The vision of the future sold starting in 1968 by the
Whole Earth Catalog,
the counterculture’s obligatory omnibus almanac, was agrarian and handmade as well as—
ahead of the curve—computerized and video-recorded.
In 1969, at the Woodstock Festival, the music of the final performer, Jimi Hendrix, was absolute late ’60s, disconcertingly and deliciously freaky and vain. Playing right before him, however, had been a group almost nobody knew. Sha Na Na, led by a Columbia University graduate student, sang cover versions of a dozen rock and doo-wop songs from 1956 to 1963, wearing 1950s-style costumes and doing 1950s-style choreography. To the crowd and to the Woodstock
movie audiences in 1970, this was spectacularly surprising and amusing. It was intense
nostalgia, a measure of just how much and how quickly everything had changed. Songs only six or twelve years old, the music of their childhoods and earlier adolescence—“Jailhouse Rock,” “The Book of Love,” “At the Hop,” “Teen Angel,” “Duke of Earl”—already seemed
so ridiculously dated.
Even at the event that remains a defining peak moment of a revolutionary new age that had only just gotten started—the phrase Woodstock Nation was coined immediately—Americans began turning backward for a reassuring, unchallenging gaze at a past that wouldn’t change or surprise or shock.
Nostalgia was the charming sanctuary to which people retreated to feel better during their post-1960s hangover—and then never really left. They were encouraged by a culture industry that immediately created a wide-ranging nostalgia division of a kind that hadn’t existed before.
The Last Picture Show,
set in 1951, came out in 1971, made tons of money, and won Oscars. The musical Grease,
set in 1959, appeared in 1971, became the most popular movie of 1978 (featuring Sha Na Na, who by then had their own popular TV variety show), and ran on Broadway for the whole decade.
The Way We Were,
the fifth most popular movie of 1973, was set mainly in the 1950s. George Lucas’s American Graffiti, set in 1962, was the third most popular movie of 1973 and softened the ground for the premiere a few months later of its TV doppelgänger Happy Days,
which in 1976 spun off
Laverne & Shirley,
set in the late 1950s and early ’60s. Animal House, also set in 1962, came out in the late 1970s and was one of the most successful movies of the decade.
“I saw rock and roll future and its name is Bruce Springsteen,” an influential young rock critic wrote in a review of a live performance in 1974, then helped make it so by becoming his producer for two decades. Hearing the seventy-year-old Springsteen singing his songs today, rhapsodizing about characters and tales of his youth, the nostalgia seems earned and real. But back in the early 1970s, as a twenty-four-year-old, he came across as a superior nostalgia act, an earnest higher-IQ Fonzie.
He “seems somewhat anachronistic to many—black leather jacket, street-poet, kids-on-the-run, guitar as switchblade,” another influential young rock critic wrote in his positive review of
Born to Run
in 1975. “Springsteen is not an innovator—his outlook is rooted in the Fifties; his music comes out of early rock ’n’ roll, his lyrics from 1950s teenage rebellion movies and beat poetry.”
It wasn’t just the American 1950s on which American pop culture suddenly, lovingly gorged in the 1970s.
Even the just-ended 1960s became a nostalgic fetish object. During the 1970s, fans of the Grateful Dead began bathing in nostalgia for the late 1960s, “obsessively stockpiling audio documentation of the live Dead,” as the cultural historian Simon Reynolds explains, indulging their “deepest impulse: to freeze-frame History and artificially keep alive an entire era.” And that has continued into the twenty-first century—“the gentle frenzy of Deadheads is a ghost dance: an endangered, out-of-time people willing a lost world back into existence.”
“Everything Old Is New Again” became a pop hit in 1974 for a reason.
The Godfather
(1972) fetishized the look and feel of the 1940s,
The Great Gatsby
(1974) of the 1920s—and at the heart of both were notions central to the emerging American economic zeitgeist: “It’s not personal, it’s strictly business,” as Michael Corleone said, and greed and ostentatious wealth and gangsterism were all hereby cool. Most of the earnest bits in Woody Allen’s work consist of nostalgia, starting in 1972 with
Play It Again, Sam.
Several of the most popular movies released in 1973 trafficked in twentieth-century nostalgia, including the gorgeous Depression of
Paper Moon. The Waltons,
a sentimental TV drama set during the Depression and World War II in a small Virginia town, premiered in 1972 and ran until 1981. Even the one enduring new
Hollywood genre that arose in the mid-1970s and early ’80s, what Lucas and Steven Spielberg created with Star Wars and
Raiders of the Lost Ark,
was actually just a big-budget revival of an old genre, forgettable action-adventure B movies and serials from the 1930s and ’40s and ’50s.
In the 1970s I was too young to perceive this sudden total national immersion in nostalgia as unprecedented and meaningful, so I’ve wondered since if it only looks like that in retrospect. I was therefore delighted, as I was almost finished with this book, to discover a somewhat shocked contemporaneous account of the phenomenon. It’s a remarkable Rosetta Stone.
Robert Brustein, the dean of the Yale School of Drama at the time, published a magazine essay in 1975 called “Retread Culture.” Back then, by today’s standards, revivals and remakes and multiple sequels were still extremely rare. The first modern superhero movie (Superman, 1978) hadn’t yet been made. But Brustein was struck by the strangeness of “the current nostalgia boom,” the “revivals of old stage hits,” “retrospectives of films from the thirties and forties by auteur directors, authentic looking reconstructions of period styles in new films,” “revived musical forms,” and so on. “Much of contemporary American entertainment,” he wrote, “is not so much being created as re-created,” each “recycled commodity” presented in the place of something actually new.