http://mosaicmagazine.com/essay/2014/04/does-america-still-have-what-it-takes/
Why the American spirit of innovation is in trouble, and what culture has to do with it.
By Charles Murray
Some years ago, I conducted an ambitious research project to document and explain patterns of human accomplishment across time and cultures. My research took me from 800 BCE, when Homo sapiens’ first great surviving works of thought appeared, to 1950, my cut-off date for assessing lasting influence. I assembled world-wide inventories of achievements in physics, biology, chemistry, geology, astronomy, mathematics, medicine, and technology, plus separate inventories of Western, Chinese, and Indian philosophy; Western, Chinese, and Japanese art; Western, Arabic, Chinese, Indian, and Japanese literature; and Western music. These inventories were analyzed using quantitative techniques alongside standard qualitative historical analysis. The result was Human Accomplishment: The Pursuit of Excellence in the Arts and Sciences (2003).
My study confirmed important patterns. Foremost among them is that human achievement has clustered at particular times and places, including Periclean Athens, Renaissance Florence, Sung China, and Western Europe of the Enlightenment and the Industrial Revolution. But why? What was special about those times and places? In the book’s final chapters, I laid out my best understanding of the environment within which great accomplishment occurs.
In what follows, I want to conduct an inquiry into the ways in which the environment of achievement in early 21st-century America corresponds or fails to correspond to the patterns of the past. As against pivotal moments in the story of human accomplishment, does today’s America, for instance, look more like Britain blooming at the end of the 18th century or like France fading at the end of the 19th century? If the latter, are there idiosyncratic features of the American situation that can override what seem to be longer-run tendencies?
To guide the discussion, I’ll provide a running synopsis, in language drawn from Human Accomplishment, of the core conditions that prevailed during the glorious periods of past achievement. I’ll focus in particular on science and technology, since these are the fields that preoccupy our contemporary debates over the present course and future prospects of American innovation.
1. Wealth, Cities, Politics
I begin with enabling conditions. They don’t explain how the fires of innovative periods are ignited—we’ll come to that later—but they help explain how those fires are sustained.
Accomplishment in the sciences and technology is facilitated by growing national wealth, both through the additional resources that can support those endeavors and through the indirect, spillover effect of economic vitality on cultural vitality.
What is the relation between innovation and economic growth? The standard account assumes that the former is a cause and the latter is an effect. To judge from past accomplishment in fields other than technology, however, the causal arrow points in the other direction as well. Growing wealth encouraged a competitive art market in Renaissance Florence, providing incentives for the young and talented to enter the field. Growing wealth in 18th-century Europe enabled patrons to support the work of the great Baroque and classical composers. Similarly with technological innovation: growing wealth is not only caused by it but helps to finance the pure and applied research that leads to it.
Growing national wealth also appears to have a more diffuse but important effect: encouraging the cultural optimism and vibrancy that accompany significant achievement. With only one conspicuous exception—Athens in the fourth century BCE, which endured a variety of catastrophes as it produced great philosophy and literature—accomplishment of all sorts flourishes in a context of prosperity.
In assessing contemporary America’s situation from this angle, the big unanswered question is whether the upward growth curve that has characterized the nation’s history will continue or whether our present low-growth mode is a sign of creeping economic senescence. It is too soon to say, but if the latter proves to be the case, innovation can be expected to diminish. No society has ever been economically sluggish and remained at the forefront of technological innovation.
Streams of accomplishment become self-reinforcing as new scientists and innovators build on the models before them.
Statistically, one of the strongest predictors of creativity in a given generation is the number of important creative figures in the two preceding generations. By itself, the correlation tells us only that periods of creativity tend to last longer than two generations. The reasons are unknown, but one specific causal factor has been noted by writers going all the way back to the Roman historian Velleius Paterculus in the first century CE. Explaining the improbable concentration of great accomplishment in Periclean Athens, Paterculus observed that “genius is fostered by emulation, and it is now envy, now admiration, which enkindles imitation.” In the modern era, the psychologist Dean Simonton has documented the reality underlying Paterculus’s assertion: a Titian is more likely to appear in the 1520s if Michelangelo and Leonardo were being lionized in the 1500s; a James Clerk Maxwell is more likely to turn his mathematical abilities to physics in the 1850s if Michael Faraday was a national hero in the 1840s.
By this standard, American culture would seem to be going downhill. It’s likely that individuals within most technological industries still have heroes, unknown to the public at large, who serve as models. People within the microchip industry know about Jack Kilby, Robert Noyce, and Gordon Moore; people within the energy-development industry know about George Mitchell. But such local fame is not what inspires members of one generation to emulate members of the preceding generation or generations.
In part, the declining visibility of outsized individuals reflects the increasingly corporate nature of technological innovation itself. Insiders may be aware of the steps that led to the creation of the modern microchip or the development of slickwater fracturing, but those steps have no counterpart to the moments when Samuel Morse telegraphed “What hath God wrought” and Alexander Graham Bell said “Mr. Watson, come here,” or to the day when Thomas Edison watched an incandescent bulb with a carbon filament burn for 13.5 hours after hundreds of other filaments had failed. Even Steve Jobs and Bill Gates, the most famous people involved in the development of the personal computer, didn’t actually invent anything themselves.
In part, too, the decline I’m tracing here reflects a larger cultural shift. In America, inventors once loomed large in the popular imagination. In the classroom, schoolchildren throughout the 19th and early 20th centuries grew up on the stories of Bell and Morse and Edison, of Eli Whitney, Robert Fulton, the Wright brothers, Henry Ford, and more—as well as on stories of awe-inspiring technological achievements like the building of the transcontinental railroad and the Panama Canal. Popular fiction celebrated inventors and scientists—Sinclair Lewis’s Arrowsmith provoked a surge of interest among young people in becoming medical researchers—and Hollywood made movies about them. There are still occasional exceptions (the movies Apollo 13 and The Social Network come to mind), but they are rare. The genre is out of fashion, as is the ethos that supported it.