The Charlie Hebdo attack and its aftermath in the streets and in the press tempt one to dust off Samuel Huntington's 1996 book, The Clash of Civilizations and the Remaking of World Order. Despite the criticisms he provoked with that book and his earlier 1993 article in Foreign Affairs, recent events would seem to be proving him prescient.
Or was he?
While I am not about to deny the importance of religion and culture as drivers of geopolitical dynamics, I will argue that, more important than the clashes among the great civilizations, there is a clash within each of the great civilizations. This is the clash between those who have “made it” (in a sense yet to be defined) and those who have been “left behind” — a phrase that is rich with ironic resonance.
Before I make my argument, I warn that the point I’m trying to make is fairly subtle. So, in the interest of clarity, let me lay out what I’m not saying before I make that point. I am not saying that Islam as a whole is somehow retrograde. I am not agreeing with author Sam Harris’ October 2014 remark on “Real Time with Bill Maher” that “Islam is the mother lode of bad ideas.” Nor am I saying that all religions are somehow equal, or that culture is unimportant. The essays in the book Culture Matters, which Huntington helped edit, argue that different cultures have different comparative advantages when it comes to economic competitiveness. These essays build on the foundation laid down by Max Weber’s 1905 work, The Protestant Ethic and the Spirit of Capitalism. It is only the “sulfuric odor of race,” as Harvard historian David Landes writes on the first page of the first essay in Culture Matters, that has kept scholars from exploring the under-researched linkages between culture and economic performance.
Making It in the Modern World
The issue of the comparative advantages or disadvantages of different cultures is complicated and getting more so, because modernity and globalization are making our lives more complicated. We are all in each other's faces today in a way that was simply not the case in earlier centuries. Whether through travel, telecommunications or increasingly ubiquitous and inexpensive media, each and every one of us is more aware of the cultural other than in times past. This is obvious. What is not so obvious are the social and psychological consequences of the inevitable comparisons this awareness invites us to make: How are we measuring up, as individuals and as civilizations?
In the modern world, the development of the individual human, which is tied in part to culture, has become more and more important. If you think of a single human life as a kind of footrace — as if the developmental path from infancy to maturity spanned a certain distance — then progress over the last several millennia has moved out the goal posts of maturity. It simply takes longer to learn the skills needed to "make it" as an adult. Surely there were skills our Stone Age ancestors had to acquire that we moderns lack, but they did not have to file income taxes or shop for insurance. Postmodern thinkers have critiqued the idea of progress, and perhaps we do need a concept that is forgivingly pluralistic. Still, there have been indisputable improvements in many basic measures of human progress, borne out by demographic statistics such as birth weight, height and longevity, as well as declining poverty and illiteracy. To put it very simply, we humans have come a long way.
But these historic achievements have come at a price. It is not simple for individuals to master this elaborate structure we call modern civilization with its buildings and institutions and culture and history and science and law. A child can’t do it. Babies born into this world are biologically very similar to babies born 10,000 years ago; biological evolution is simply too slow and cannot equip us to manage this structure. And childhood has gotten ever longer. “Neoteny” is the technical term for the prolongation of the period during which an offspring remains dependent on its parent. In some species, such as fish or spiders, newborns can fend for themselves immediately. In other species — ducks, deer, dogs and cats — the young remain dependent on their mothers for a period of weeks. In humans, the period of dependency extends for years. And as the generations and centuries pass, especially recently, that period of dependency keeps getting longer.
As French historian Philippe Ariès informed us in Centuries of Childhood, "in medieval society, the idea of childhood did not exist." Prior to modernity, young people were adults in miniature, trying to fit in wherever they could. But then childhood got invented. Child labor laws kept children out of the factories, and truancy laws kept them in public schools. For a recent example of the statutory extension of childhood, consider U.S. President Barack Obama's announcement that he intends to make community college available for free to any high school graduate, thus extending studenthood by two years.
The care, feeding and training of your average human cub now take far longer than the single season that bear cubs require. And the process seems to be getting ever longer as more 20-somethings and even 30-somethings find it cheaper to live with mom and dad, whether or not they are enrolled in school or college. The curriculum required to flourish as an adult keeps growing, the goal posts of meaningful maturity receding ever further from the "starting line," which has not moved. Our biology has not changed at anywhere near the rate of our history. And this growing gap between infancy and modern maturity holds for every civilization, not just Islamic civilization.
The picture gets complicated, though, because the vexed history of the relationships among the world’s great civilizations leaves little doubt about different levels of development along any number of different scales of achievement. Christian democracies have outperformed the economies and cultures of the rest of the world. Is this an accident? Or is there something in the cultural software of the West that renders it better able to serve the needs of its people than does the cultural software called Islam?
Those Left Behind
Clearly there is a feeling among many in the Islamic world that they, as a civilization, have been “left behind” by history. Consider this passage from Snow, the novel by Nobel Prize-winning Turkish author Orhan Pamuk:
“We’re poor and insignificant,” said Fazul, with a strange fury in his voice. “Our wretched lives have no place in human history. One day all of us living now in Kars will be dead and gone. No one will remember us; no one will care what happened to us. We’ll spend the rest of our days arguing about what sort of scarf women should wrap around their heads, and no one will care in the slightest because we’re eaten up by our own petty, idiotic quarrels. When I see so many people around me leading such stupid lives and then vanishing without a trace, an anger runs through me…”
Earlier I mentioned the ironic resonance of this phrase, "left behind." I think of two other recent uses: first, the education reform legislation in the United States known as the No Child Left Behind Act; second, the best-selling series of 13 novels by Tim LaHaye and Jerry Jenkins in which true believers are taken up by the Rapture while the sinners are "left behind." In both uses, it is clearly a bad thing to be left behind.
This growing divide between those who have made it and those who are being left behind is happening globally, in each of the great civilizations, not just Islam. To quote my fellow Stratfor columnist, Ian Morris, from just last week:
"Culture is something we can change in response to circumstances rather than waiting, as other animals must, for our genes to evolve under the pressures of natural selection. As a result, though we are still basically the same animals that we were when we invented agriculture at the end of the ice age, our societies have evolved faster and faster and will continue to do so at an ever-increasing rate in the 21st century."
And because the fundamental dynamics of this divide are rooted in the mismatch between the pace of change of biological evolution on the one hand (very slow) and historical or technological change on the other (ever faster), it is hard to see how this gap can be closed. We don’t want to stop progress, and yet the more progress we make, the further out the goal posts of modern maturity recede and the more significant culture becomes.
There is a link between the "left behind" phenomenon and the rise of the ultra-right in Europe. As the number of unemployed, disaffected, hopeless youth grows, so also does the appeal of extremist rhetoric — on both sides. On the Muslim side, more talk from the Islamic State about slaying the infidels; on the ultra-right, more talk about Islamic extremists. As in a crowded restaurant, the louder the voices get, the louder the voices get.
I use this expression, those who have "made it," because the gap in question is not simply between the rich and the poor. Accomplished intellectuals such as Pamuk feel it as well. The writer Pankaj Mishra, born in Uttar Pradesh, India, in 1969, is another rising star from the East who writes about the dilemma of Asian intellectuals: the choice they face between recoiling into the embrace of their ancient cultures and adopting Western ways precisely to gain the strength to resist the West. This is their paradox: Either accept the Trojan horse of Western culture to master its "secrets" — technology, organization, bureaucracy and the power that accrues to a nation-state — or accept the role of underpaid extras in a movie, a very partial "universal" history, that stars the West. In my next column, I'll explore more of Mishra's insights from several of his books.
"Mind the Gap" is republished with permission of Stratfor.