The Surprising Origins of Our Obsession with Creativity

By the time I graduated college, the conventional wisdom among those who loved to reject conventional wisdom was not only that being creative was a good thing but that creative people were going to inherit the earth.

A recent crop of cocktail-party nonfiction about Right-Brainers and Bourgeois Bohemians and something called the Creative Class said the age of rule-loving “organization men” had passed, leaving the field to those who rebelled against the status quo. As factories left American shores and computers automated more and more white-collar brainwork, the raw material of our increasingly “weightless” postindustrial economy was no longer steel and coal, but rather ideas.

In this “weightless” and “increasingly complex world,” the creatives, long confined to garrets and bohemian cafes, would finally take their places as the leaders of a bright new future. The MFA was the new MBA.

These turn-of-the-millennium prophecies soon became dogma. In 2010, a poll of 1,500 CEOs ranked creativity as “the most important leadership quality for success in business,” beating out such virtues as integrity and global thinking. The Partnership for 21st Century Skills, a committee representing the National Education Association, the Department of Education, and a handful of top tech companies including Apple, Microsoft, and Cisco, listed creativity as one of the “4 Cs” along with communication, collaboration, and critical thinking.

And in 2020 the World Economic Forum declared creativity “the one skill that will future-proof you for the jobs market.” We apparently already knew that: back in 2011, “creative” was the most common adjective job applicants used to describe themselves on LinkedIn.

Creativity’s Big Bang

We tend to assume creativity is a timeless human value. But in fact, creativity has only been a regular part of our vocabulary since the middle of the twentieth century. Its first known written occurrence was in 1875, making it an infant as far as words go.

“Creativeness” goes back a bit further, and was more common than creativity until about 1940, but both were used rarely and ad hoc. Strikingly, before about 1950 there were approximately zero articles, books, essays, treatises, odes, classes, encyclopedia entries, or anything of the sort dealing explicitly with the subject of “creativity.” (The earliest dictionary entry I found was from 1966.)

The word is not, it turns out, in Plato or Aristotle (even in translation). It’s not in Kant (ditto). It’s not in Wordsworth or Shelley, or in the Americans Emerson, William James, or John Dewey. As the intellectual historian Paul Oskar Kristeller finds, creativity is a term with “poor philosophical and historical credentials.”

Yet, just around the end of World War II, the use of “creativity” shot upward—the big bang of creativity.


The relative frequency of the term “creativity” in books in English between 1800 and 2019. Google Books.

How did we so quickly go from never talking about creativity to constantly talking about it, and even anointing it as the reigning value of our time? What was it about the dawn of the postwar era that demanded this new Swiss Army knife of a term? And whom did it serve?

When I began researching creativity’s big bang, I imagined finding its roots in the burgeoning youthful counterculture—the spirit of self-expression, experimentation, and rebelliousness that would crest in the 1960s and give that decade its distinctive mark. Instead, I found that the creativity craze began somewhat earlier, in the 1950s, an era we associate with conformity, bureaucracy, and suburbanization.

And it didn’t come from artists or the bohemian fringe in that era, either. As Dorothy Parker quipped in 1958, the high-water mark of the postwar creativity craze, the more a writer actually sits down to write, “the less he will get into small clusters bandying about the word ‘creativity.’”

Though many in the postwar American art world embraced self-expression and experimentation, it turns out the efforts to really get under the hood of something called creativity—which also encompassed ideas like “creative ability,” “the creative personality,” and “the creative process”—were primarily driven by a concern not for art per se but for inventiveness in science, technology, consumer products, and advertising.

At the same time, the artsy connotations were not incidental: the postwar cult of creativity was driven by a desire to impart to science, technology, and consumer culture some of the qualities widely seen to be possessed by artists, such as nonconformity, passion for work, and a humane, even moral sensibility, in addition to, of course, a penchant for the new.

As America emerged from World War II, the needs of industry were changing. With a productive capacity far beyond what was necessary to meet everyone’s basic needs, management was suddenly less worried about efficiency than about marketing, innovation, and adaptability. As Peter Drucker said, not manufacturing but “innovation and marketing” were the new managerial priorities.

Particularly as computers began to assume some of the more menial office tasks, managers began to fret that a whole half century of inculcating in workers the values of rationality and order, and encouraging specialization, was leaving the workforce ill equipped for this new reality. Just as the wartime factories had to be retooled to meet the demands of a consumer economy, so did the white-collar worker need to be retooled.

Creativity as a Cure

The concept of creativity, typically defined as a kind of trait or process vaguely associated with artists and geniuses but theoretically possessed by anyone and applicable to any field, emerged as a psychological cure for these structural and political contradictions of the postwar order. Psychologists developed tests to identify “creative people” based largely on the needs of military and corporate R & D, but they were also motivated by a larger desire to save individuals from the psychic oppression of modernity.

Likewise, in industry, the first creative thinking methods, such as brainstorming, were initially geared toward industrial improvement and new product development, but they also promised to address alienation on the job. Advertising professionals touted “creative advertising” as both a cure for lagging sales and a way to bring individual vision back into their field. And many corporations embraced creativity not only to help spur innovation but also because it made them look more humane amid backlash against the military-industrial complex.

In all of these cases, the practical matters of staffing R & D labs, coming up with new product ideas, or selling said products coexisted with larger concerns about conformity, alienation, and the morality of work.

Creativity could, for one thing, ease a nagging tension between the utilitarian and the humane or transcendent. In 1962, the distinguished psychologist Jerome Bruner noted “a shrillness to our contemporary concern with creativity.” Psychologists were being asked to dissect the nature of “the creative” not purely as scientists “but as adjutants to the moralist.”

Bruner suspected that what really lay behind the sudden interest in research on creativity was an anxiety about the nature of white-collar work, particularly among scientists and technicians. These workers had been inculcated with the dogma of professionalism and efficiency, seeing themselves as part of a great social machine.

For an engineer or an advertising professional, to be creative was not simply to be productive, though it was that, but also to model oneself not on the machine but on the artist or poet. It was to pursue work with an intrinsic motivation, a passion for the act of creation. It was to be more human. Though this didn’t necessarily change anything about the actual products these workers were employed to invent, design, and sell, it did implicitly add a moral aura to their work by shifting the emphasis from the product to the process itself: creativity.

Hope and Tension

The development of the concept of creativity by psychologists and creative thinking experts allowed for the emergence of a new form of subjectivity, the creative person. The creative person was a producer in a world of consumers. No impotent Babbitt or parasitic pencil-pusher, the creative person was a rebel and freethinker. They lived to create. This person, generally assumed to be male, was nonetheless construed as more emotionally sensitive than the old model. As reductive as they were, these tweaks to the middle-class self did broaden the view of whose mental labor might be valuable.

It should be no surprise that in the liberation movements of the 1960s, arguments for the right to participate in national life were sometimes made in the language of creativity. For instance, when Betty Friedan wrote in 1963 that women could achieve self-realization only through “creative work,” she meant work that was traditionally male, such as journalism, the sort recognized as worthy of remuneration and renown.

Friedan embodies an additional key theme—a tension between optimism and pessimism. She was deeply critical of the way the world was, but also hopeful for what it could be. Focusing on creativity, which spoke of excellence, excitement, even joy, was for many an act of hope.

Many psychologists, for example, contrasted their creativity research with the focus on mental illness and dysfunction that had preoccupied their field, while creativity management consultants imagined they were leading the charge in forming a more humane workplace. These people hoped that automation and affluence would provide more opportunities for human flourishing, even a transcendence of traditional capitalist relations. Could we be headed, as Thomas Watson of IBM put it, for “a new age of Pericles,” our material needs met and our minds free to partake in higher artistic and intellectual pursuits? Or would it all lead to opulence and stagnation, dooming America, as the historian Arnold Toynbee had warned, to the fate of fallen civilizations past?

For all the optimism that gathered around creativity, the very need to name this thing—some individual font of dynamism—and so begin to understand and master it betrayed a deep fear that it was already sorely lacking.

Finally, besides the overarching tension between the individual and mass society, and between optimism and pessimism, creativity also mediated between elitist and egalitarian tendencies. On one hand, the postwar era was a profoundly democratic age, characterized by a strong welfare state, expanding minority rights, and widely shared prosperity. Americans were constantly reminded that it was in the name of democracy that they fought wars and now aggressively patrolled the globe, and the figure of the “common man” still had some heroic charm from the Depression years.

On the other hand, particularly after the Russian launch of Sputnik, the fear of “mediocrity” brought on a new, often reactionary focus on “excellence.” Toynbee lamented that America was neglecting its “creative minority” and so risking the kind of decline that befell every great empire before it. The question was, as the title of the 1961 John W. Gardner book put it, “can we be equal and excellent too?” It’s no coincidence that Gardner, as an officer of the Carnegie Corporation, funded some of the earliest and most influential psychological research on creativity.

Creativity was a topic capacious enough to apply to Great Men as well as elementary school children and rank-and-file engineers. Unlike genius, creativity could be said to exist in everyone, and in that sense was both more democratic and (more importantly, perhaps) more useful for managers overseeing scores or hundreds or thousands of employees. It satisfied a nostalgia for an earlier age of the genius inventor and entrepreneur, but in a form befitting the ideological and pragmatic realities of the mass society.

Creativity Today

Excavating this history overturns many assumptions we have about creativity, including that it’s always been with us, or that it was once a term reserved for gods, artists, and geniuses. To understand how recently creativity arrived, and the messy, practical world from which it arose, is to understand how we got to our own situation.

Today’s dogged determination to “do what you love” and our disdain for the nine-to-five; the very fact that we now have a class of people known simply as “creatives” or even “creators”; and the persistence of optimism despite so many cruel realities of modern capitalism—all of these can in some way trace their origins to the immediate postwar cult of creativity.

The fact that we are still in many ways dealing with the same contradictions helps explain why we continue to be so enamored of the idea, and why so many of us want so desperately to be creative.


Reprinted with permission from The Cult of Creativity: A Surprisingly Recent History by Samuel W. Franklin, published by the University of Chicago Press. © 2023 by Samuel W. Franklin. All rights reserved.
