I’ve started in on writing a course for next winter called “A History of Everything,” which will cover, well, everything: from the Big Bang to our current 21C crisis. This process is forcing me to rethink a lot of assumptions about the human condition and my understanding of history.
One aspect, tangential to my stance on modernity (i.e., the last 250 years), is the degree to which we have a bias for change. It’s not entirely new to my generation; indeed, Alvin Toffler wrote about “Future Shock” over a half-century ago. My grandfather was born before the age of flight and lived to see women in space. Pretty much everyone born in the 20C (certainly in the West, and in much of the rest of the world as well) has lived through more change (social, economic, technological, cultural) than their grandparents could likely have imagined. Certainly those born in the 21C (if I may eschew the “Gen X, Gen Y…Gen Alpha” nomenclature) know nothing but. And social media is nothing but change, hyped by fashion and a proverbial short attention span. These days, this encompasses virtually everyone in the world, with the exception of a few tiny groups isolated from global modernity.
By a “bias for change,” I mean an assumption of impermanence, an expectation of evolution, and a moderate degree of surprise at encountering stability (stasis). It’s so ordinary that we’re like fish in water. It is, if I may mix environmental metaphors, part of the air we breathe. But beyond this normative sense, change is seen as an inherent good, at least insofar as the dominant culture is concerned. We tend to look down on those societies that continue more or less unabated, untroubled by disruption.
Our models for such societies are drawn from history, and from their relatively poor socio-economic-technological condition compared with our own (exalted, high-tech) state. But by focusing on this angle of comparison, we lose sight of the potential benefits of stability and continuity. After all, such groups lived in an environment where change was not normal; they were fish swimming in different water. They had no conception of what would eventuate (ditto for us, but that’s a different story) and so did not suffer by comparison with their own situation. Few likely rued the absence of the latest iPhone operating system or the chance to live in cities among millions of others (as most of us do). They might have been impressed with our ability to manage disease and live longer, but they didn’t engage in such (to them) hypotheticals. We may champion those historical figures who sought and implemented change, but there weren’t very many of them, and we claim them as the forefathers of our own modernity. We also tend to quickly forget the real-time costs of change (epistemological disruption, migration, inequality); indeed, the very phrase “transitional costs” almost invites dismissal once the immediate pain has passed.
Historians, of course, have an innate bias for change. Indeed, one definition of History is “the study of change over time.” As a discipline, we love to write about what’s new and its ramifications; if the past weren’t dynamic, there wouldn’t be much to write about. One could even argue that the modern idea of History emerged in the 19C in response to the acceleration of change in the 18C (e.g., the French and Industrial “Revolutions”).
This happened about the time that the idea of political conservatism crystallized (I mean, of course, actual conservatism, not what passes for the “right wing” these days). It’s a bit too crude to characterize conservatives as the “anti-change” party, but it’s not far from the mark. Certainly, their premise is that change should be incremental and organic rather than dramatic, exogenous, or “revolutionary.” A more radical conservatism earned the title “reactionary,” arguing (at least implicitly) for a return to traditional political, economic, and social modes. The current version of this group seems to stand astride the twitching body of the Republican Party. The GOP used to have a “normal” conservative stance, albeit with intermittent reactionary elements. It’s now a zombie political entity, mouthing some conservative bromides but increasingly reactionary, and in a highly selective way. As I noted recently, it’s more about mythology and bad History than any cogent engagement with the past.
I do wonder to what degree this represents not just an outlook that doesn’t like the current state of things, but an inability to cope with change. In making this point, I have to be careful. It would be easy to fall into the trap of treating an embrace of change as “normal” and implying a moral deficiency in those whose psychology doesn’t work like mine does. Nor do I want to create a model in which those who are not forward-looking are archaic or somehow “deplorable.” To the contrary, I’m suggesting that some aversion to change is actually normal and ordinary, even if it doesn’t rise to the level of a political stance. Change is hard and not always an improvement. To equate change with moral progress is precisely the trap with which I’m concerned.
Indeed, I would argue that those who are inured to the current rate of change suffer from a different distortion of perspective. The cult of progress, grounded in the remarkable improvement in technologies of all sorts over the past 250 years, has made it difficult to recognize just how extraordinary such “progress” is. The resulting optimism (of which the tech bros’ gushing enthusiasm for AI is the most recent example) seems similar to the blindness of the financially well-off to the nature and sources of their cultural and economic advantages. Being born into such a world (of wealth, or of progress, or of race) can be distorting, and terms such as “merit” or “fairness” need to be closely scrutinized.
The pace of change has accelerated, and it may continue to do so, or careen out of control. This is no time to make blithe assumptions about what is ultimately beneficial.