Steve Harris
Condemned to Repeat It

Ancien Regime

1/16/2026

0 Comments

 

In December 1783, the signed confirmation of British recognition of American independence arrived in Philadelphia to great excitement. A month later, it was ratified by the Continental Congress, acting under the Articles of Confederation. The Articles, which set the framework for joint action by the thirteen colonies, had been in place for three years, but many were already unhappy with how they worked.

By 1786, that unhappiness had grown, and delegates gathered in Annapolis to see if they could propose some improvements. Only five states showed up, however, and most had authorized their delegates to discuss only a limited range of issues, so the Annapolis group advised that a more extensive set of reforms needed to be considered. The Continental Congress agreed that a revision was appropriate, and delegates from twelve states arrived in the spring of 1787 to discuss those changes.

What (we call) the Constitutional Convention proposed instead was a wholesale rewrite of the relationship between the States, including a detailed structure for a new national government. It was far beyond their mandate, but they were able to persuade Congress and the country that more radical action was necessary. Even back then, the concept of the rule of law was part of British political culture and, from that perspective, the route to the Constitution for a large part of the former British North America was problematic, not to say (literally) revolutionary.

Soon after the new Constitution was ratified, the Bill of Rights was adopted, and we have been living under this arrangement (with only a few significant formal amendments) ever since. Hundreds of proposals for updating have died somewhere in the multi-stage process of amendment established under Article V. Along the way, the (unelected) Supreme Court has (usually to much controversy) interpreted the document in novel ways. But, that’s it.

While the great powers of Europe and the ancient cultures of Japan and China can claim greater duration, the US has the oldest continuous constitutional system of any country in the world. Perhaps in competition with their venerability, we have long been proud of our longevity as a nation and the stability of our system of government.

That’s no longer the case, or, stated differently, our society is stuck with a governing structure that is archaic, virtually static, and unfit for purpose in the 21C.

It’s as if we were trying to run an AI system on MS-DOS (although, I guess, early AI attempts were exactly that!).

From another perspective, when the Constitution was adopted there were fewer than four million people in the US, of whom (excluding women, slaves, and children) well under one million were eligible voters. (In fact, only 28,000 people actually voted for George Washington for President a year later.) So, a group roughly the size of Connecticut or Utah wrote the rules under which we still live. By the same token, our current total population of about 1/3 billion represents about 60% of all the people who have ever lived in these United States. The historical tail is wagging the contemporary dog.

It's an interesting question of culture, history, (and anthropology?) as to why we subject ourselves to our predecessors’ rules and practices. The point goes far beyond the constitutional context framed here. Culture is, pretty much by definition, the product of history; after all, we don’t know anything else. It can also be seen as a deal between the past, the present, and the future. We (in the present) accept the judgment of our predecessors as to how we should run ourselves and our society. We also represent to our progeny that our way of doing so is the best way for those to come to run themselves and their society; even knowing—in each case—that changes have been and will be made. No group at any particular point has the time/bandwidth to rewrite everything; so, most change is sporadic and incidental (however chaotic it might seem at the time).

In her recent history of the Constitutional amendment process, “We the People,” Jill Lepore reiterates that we are out of practice in terms of Constitutional change. The last time we made more than minor tweaks in the document through the formal amendment process was well over a hundred years ago. Since then, the substance of our constitutional order has changed solely through de facto practice and Supreme Court “interpretations.” That each of these modes is ephemeral and reversible has become only too clear in the last couple of years. Lepore points out that progressive forces despaired of the (extremely difficult) formal amendment process and pursued socially necessary changes through these other mechanisms. This includes both the considerable expansion of the scope of governmental activities (regulatory and social welfare) and civil rights for a range of groups. Now, the sauce for the goose is being served for the gander, and the progressive goose is (sorry to mix befowled metaphors) cooked.

Even a dramatic political reversal and the overruling of several recent Court decisions would only get us back to where we were twenty-five years ago. It wouldn’t solve the underlying structural problems or reflect a 21C society.

A few weeks after Washington was inaugurated, a group of newly chosen representatives gathered outside Paris. Within just a few months, they launched what we call the French Revolution and, over the course of a few years, overturned the local variant of the long-established political and cultural order of Europe: the Ancien Regime. Images of the très fabulique lives of Marie Antoinette et al. make it easy to consign this concept to history (even if the practice of “royal” elites dominating and exploiting a country continued well into the 20C). It would seem to have no resonance in our modern republican mentalité, but the Constitution is our Ancien Regime. By now, we are so deeply imbued with concepts like the “rule of law” that it’s hard to imagine that anything truly disruptive or radical could happen here (January 6 notwithstanding). Incumbents in such power structures have denied the possibility of change up to (sometimes past) the last minute, but it comes, sometimes suddenly and violently, sometimes in other painful forms. And, as the Stuarts, Bourbons, Romanovs, Pahlavis, and others can attest, no one knows what will emerge.


Venezuela

1/9/2026

0 Comments

 

Three years ago (120222, “Sauce for the Gander”), I wrote about geopolitical spheres of influence with particular focus on China and Taiwan. I compared that situation with the two-hundred-year-old predecessor to the “Don-roe” doctrine recently revived in Caracas. I noted that, based on our own historical practice, we didn’t have much basis for criticizing the Chinese for effectively claiming a sphere of influence encompassing Taiwan and the South China Sea. I don’t have much to add to the general point made there, other than to note that we have a long history of military interventions in Latin America, often (as here) from commercial motives (on top of some convenient distraction from domestic economic challenges). 

Beyond the obvious immediate problems of morality and international and domestic law arising from our kidnapping of Maduro and threats of coercion and control, our actions must make the Chinese feel smug about the implications for their freedom of action in their own backyard (even if we effectively pushed them out of our backyard); not to mention the Russians in their Ukrainian backyard.

More fundamentally, coupled with the latest saber-rattling over Greenland (see also 011025, “Baby, It’s Cold Outside”), Cuba, and Mexico, the Administration is actively proclaiming the return of realpolitik as the basis of US foreign policy. Unfortunately, their timing is all wrong and their other foreign policy actions seem to undermine this latest thrust. The timing is wrong because China is a rising power and the US is relatively falling (see 121820, “Rising and Falling Powers”). Rather than doubling down on military might, we should be seeking other modes of constraining China, not least of which is building stronger alliances with others similarly situated (Europe, India, Japan). The practice of realpolitik, however, also requires clear-headed thinking about our strength, and clear-headed thinking is rare enough, especially with the JV foreign policy team currently running the show (see 120525, “You Can’t Go Home Again”).

You read it here first.

[and now back to our regularly-scheduled program.]


Invisible Hand

1/9/2026

0 Comments

 

Wherever one might be on the fan/critic spectrum of capitalism, we can all recognize its important impacts on how modern life works. Markets and money are central to the meaning and operations of a capitalist system and their pervasiveness has many ramifications. 

For millennia, money was tangible (e.g. coins, cowrie shells) and transactions, even if mundane, were usually clearly noticeable. Mary Poppins, for instance, bought bird feed at tuppence a bag. She handed over some coins and got the goods in return. The shift to paper money (banknotes date back more than a thousand years) was an important stage in the abstraction of value. The notes were a representation of value—often silver or gold—mediated by the promise of a bank or merchant to pay, but the notes had no inherent value themselves. As governments increasingly got into the act (e.g. China in the 7C), they, too, promised to redeem the paper for a chunk of precious metal.

This link was broken in the aftermath of WWI, when reconstruction demands outstripped the piles of gold in government vaults. Those of a certain age remember when US paper money expressed a promise to redeem the note for some of that gold.

Until it didn’t. Nixon broke the formal link between money and gold in the US in 1971; silver had already been eliminated from US coinage in 1965. Since then, we all rely solely on the social convention (backed by statute) that our coins and paper are actually worth something more than the paper (or copper/zinc) they’re made of.

As part of this broader shift, credit cards became widespread starting in the 1950s, further distancing value from transactions. Electronic transactions followed in the 1970s, and now phones are banks and myriad apps (e.g. Venmo, PayPal, Apple Pay, Zelle) make money instant and frictionless. Cryptocurrencies (with their own set of complications) are of a similar ilk.

We’ve come a long way from “tuppence a bag.”

One of the important implications of this historical process is that it’s become increasingly easy to forget that you’re spending money. As transactional friction has gone down, the psychological hurdle of handing over your hard-earned cash is lowered. It’s therefore easier to spend more. I doubt this is a coincidence.

The latest stage is the semi-invisible charge account. You give Uber your credit card and, every time you take a ride, the money (indirectly and in stages) moves from your account to theirs. Subscriptions (e.g. Netflix, cell phones) accelerate this process. The transaction is so frictionless and distant from actual usage that it virtually disappears, and the funds automatically transfer to the service provider regardless of whether you actually use the service. Why worry about the cost of heating if your utility bill gets folded into your monthly credit card statement (along with dozens of other charges), which then gets paid by automatically dipping into your bank account?

The perennial bugaboo of “hidden charges” is part of the same story. Most of the time legal requirements mandate their disclosure, but they’re usually buried in a plethora of fine print and legalese. Here in San Francisco, restaurants have taken this to a new level by adding a surcharge of 4-7% nominally for meeting local mandates for benefits for employees. How these additional costs are different from the other normal costs of doing business is not clear, but instead of bumping up the price of a $24 plate of pasta by $1.50, the restaurant puts a line in small print at the bottom of the page indicating the charge but leaving the “list price” the same. 

Karl Marx and other critics of capitalism have pointed out that the insertion of markets ensures the alienation of both laborers and consumers from each other and from the products and services created and utilized. Now that everything seems to pass through “the Cloud,” this distancing and alienation are increasing and likely contributing to the general sense of disconnection we all face in the modern world.

When Adam Smith talked about the invisible hand of the marketplace (another metaphor that has been seriously abused by subsequent interpreters), he likely didn’t imagine that invisible hand reaching down into our pockets and scooping up the cash without so much as a “by your leave.”


Intent in Language

1/2/2026

0 Comments

 
Perhaps it is my legal training, but I have long been focused on semantics. For a lawyer, words have meaning, usually a quite specific meaning—and we are attuned to the problems that arise when two people have differing thoughts in mind when they use the same word. You can’t, for example, have an agreement to share profits in a business without having a pretty specific idea of what “profit” means. So, a well-drafted legal agreement will always include a fair number of express definitions.

One aspect of this attention to meaning has come up repeatedly for me in my study of history and related fields. It is fundamental and, unsurprisingly, a bit obscure; but it sheds considerable light on how language works and, more importantly, how humans craft Histories.

Many terms carry an implication of consciousness, awareness, or intent. Sometimes they refer to a human, sometimes to a society, sometimes to some aspect of nature. Here are a few examples:

I was recently reading a study of evolution which discussed at great length the “struggle for existence.” To me, the term “struggle” connotes something more than just “making an effort in difficult circumstances.” Yes, work is involved; sometimes hard work, sometimes work that is unsuccessful; sometimes work that—when unsuccessful—ends in death. That work, however, is not inherently a “struggle for existence,” because when you’re referring to a plant, an animal, or even a group of humans, that effort is not made for the purpose of existing. For plants, most animals, and many groups of humans, there is no consciousness being exercised, so there can be no “purpose.” The goal is instinctual, or at least unconscious—to seek water or avoid pain. They have—literally—no idea what “existence” is, nor are they conscious of competition with others for whatever they’re seeking. In a similar vein, genes or species are often described as choosing to evolve in one manner or another. But, again, there’s no choice going on here. A mutation that gives a frog stronger legs will (likely) enable it to out-jump some predators and so survive and procreate more such strong-legged frogs. That’s it.

A recent well-regarded study of economic history provides another common example: “capitalism,” we are told, “was dogmatic only about profits.” Leaving aside concerns about overgeneralization, this characterization imputes intent to an amorphous and disembodied “capitalism.” But capitalism (however defined (see Das Kapital 102122)) is no “thing.” Individual “capitalists” might have had particular goals (and profit would likely have led the list), but there was no group manifesto and, indeed, individuals in different eras and cultures had different mixes of motivations. Being “dogmatic,” in any event, requires some reasonably coherent focus. Similarly, events have no intent. The French Revolution didn’t “foreshadow” anything; subsequent historical echoes and rhymes are framed by Historians as being interpretable in the same light as the original event, but this is a far cry from the shorthand language being used. These are traps into which many who look backwards—professional or ordinary—fall.

If you read and listen carefully, you will likely find plenty of other examples; but a little self-reflection quickly shows that even in our own actions, clear and conscious intent is not as frequent as we might like to think. Much less when we get to organizations and other groups, and certainly not when referring to amorphous constructs. Hell, despite vast scholarly attention, we can’t say clearly what the “intent” of the Founding Fathers was when they drafted the Constitution, the current fad for “originalism” notwithstanding.

There are several reasons for this practice. First, using this kind of language is a variety of anthropomorphizing; that is, imputing human traits to other species or inanimate objects (the most common is “Mother Nature”). We do this to make the world more familiar, to cast it in terms WE understand. Second, it is also a variety of the psychological concept of “projecting,” i.e. making others seem more like ourselves. If we don’t have to account for our differences with others, we can live in a simpler and more familiar world. So, let’s just use language that makes others seem like us. This (thirdly) not only normalizes our own behavior and motivations, but also creates a world in which a higher percentage of phenomena and activities appear to be intentionally motivated. In other words, the world makes more sense because we can attribute these actions to someone’s/something’s plan to make them happen.

This is especially important in History, where we are inherently hungry for coherent narrative; but it applies to our everyday lives today as well. One of the most common modes of imputing intent is to ascribe intent to “God.” The rationalization of the contingencies and variations of the world under the rubric of “god’s will” has been a ubiquitous theme of human societies for thousands of years. 

We similarly impute intent to the actions—whether unknown or merely inexplicable—of human actors. It’s more comforting to think, as in many “conspiracy theories,” that there is a “deep state” planning to accomplish some nefarious deed, or that a country’s decision to go to war reflects some strategic vision. In fact, such decisions can as often be attributed to much more mundane occurrences or emotions. Immense effort went into blaming Germany for starting WWI, for example, rather than dealing with the fact that the assassination of the Austrian Archduke was a fluke of timing and circumstance.

It could be argued that all this practice is merely metaphor; but metaphor used so frequently and without comment quickly loses its referential anchor for both writer and reader and passes into unconsciousness. In other words, the metaphor is forgotten and the meaning referred to—the intentionality of the action—stands on its own. In this case we construct a view of the world and its history that places human intent at the center of things. It may be more comfortable to think there’s a plan (human or divine), but most of the time—just like the stars that appear to make up Orion’s Belt, but are really many light years apart—it’s merely a comforting story.


Bang

12/19/2025

0 Comments

 
My upcoming course on the “History of Everything” starts with the Big Bang, about 13.8 billion years ago. We’re still living off the energy that was explosively deposited in our universe at that time. Being more interested in social phenomena, however, I was curious about how much “bang” we humans have produced. I utilized a (somewhat) trusty AI research assistant (in this case, Google’s Gemini) for some of the statistics used here, with all the caveats associated therewith.

This course looks at long-term trends and significant developments, and I’ve long thought that the technology of gunpowder was one of the most influential. It emerged out of China and made its way to Europe late in the medieval era. The interaction of this technology—both in terms of artillery and individual firearms—with other forces of change led to a multi-stage “military revolution” in the way wars were fought and states were organized, lasting well into the 20C, when atomic weaponry took things to an entirely new level. Fortunately, only two such bombs have ever been used, so that technology really hasn’t affected the total very much.

Democratization and technology are well recognized as important aspects of modernity, but I suspect that the spread of gunpowder/firearm-aided coercive power to the masses has not been seen as a central part of that story. Our demo-triumphant history much prefers Madison to muzzle-loaders; even if that revered political thinker Mao Zedong deftly captured the concept when he noted that “political power grows out of the barrel of a gun.” As a practical matter, however, the European revolutionary tradition is unthinkable without popular forces for political change being able to access and deploy firearms. Revolutions elsewhere, especially across the 20C, seem even more reliant on shifting the balance of military power away from the incumbent regime. Similarly, due in no small part to its unique gun culture, the role of firearms in the US, particularly in the last 75 or so years, has been profound.

We can see this by going back to the global historical view, which shows that, as with most technologies, guns spread as technical advances made them easier to use and cheaper to produce.

Rough estimates of gun availability per thousand people are:

1500 – less than one gun
1800 – about 10 guns
1900 – about 50 guns
2000 – about 125 guns
Today – about 140 guns

(Btw, a significant part of the recent growth has been concentrated in the US, which currently has over 1,200 guns per thousand people (that’s right, over ten times the average for the rest of the world!!).)

Of course, the cause/effect relationship between the distribution of firepower and that of political power will have run both ways and in a wide range of distinct national circumstances, but the basic connection seems important.

From a geopolitical perspective, I was curious about the dominance of European/Western countries over the centuries. Europe was the principal manufacturing center until the 20C and Western countries controlled over 75% of firearms in both 1800 and 1900. This dropped radically in the later 20C to just under 50%, reflecting the growth of militaries in former European colonies, the Communist Bloc, and, especially lately, the rise of China. Of course, most of that firepower was used outside of Europe, by both European armies/police and local purchasers of European arms.

In parallel, the military use of explosives shows a slightly different pattern, reflecting the configurations of major geopolitical conflicts. The estimated quantity of ordnance used in the 18C was in the low millions of SHELLS, growing to the tens of millions of shells across the 19C. WWI and WWII drove the usage in the first half of the 20C to about 250 million TONS, which dropped back over 90% to about 8 million tons for 1950-2000 (mostly the US in Vietnam) and “only” 1-2 million tons so far in the 21C.

I’m sorry to throw a lot of rough data of uncertain reliability at you, but the important take-away is qualitative, not the specific numbers. Even discounting the early 20C spike for WWI/WWII, we would seem to live in a much more violence-feasible world. How can we square this with the spread of democratic values (at least the “power to the people” part, if not the rights and due process part)? How much is cause and how much is effect?

From another perspective, however, this apparent tsunami of firepower at multiple levels stands in sharp contrast to the analysis done by Steven Pinker in “The Better Angels of Our Nature” (2011). Pinker shows, with a fair amount of support and rationale, that our sense of living in a highly violent world has been largely a product of media sensationalism and the growth of overall populations, even though our chance of dying a violent death is substantially less than in centuries past.

Pinker argues that the relative stability of everyday life brought about by socialization and much stronger government-driven public order has vastly reduced domestic homicides, and that even the massive death-tolls of WWI and WWII leave the bloody 20C with a lower per-capita military death rate than earlier eras (and especially since the end of WWII). 

It’s a conundrum. There’s a lot more bang out there, and most people—whether in Santa Monica or Johannesburg—feel less secure. Yet the numbers are pretty strong that there’s less violence (again: per capita) than before. We seem to have forgotten the ordinariness of the casual “civil” violence that marked almost all human societies until pretty recently. We also put the impact of modern wars into some easily excludable categories of “far away” or “world wars don’t count.” Down deep, it’s more likely that our psychic sense of security is being measured against some “Ozzie and Harriet” suburban standard than against the cumulative data.

Pinker notwithstanding, therefore, even if the total volume of violence is down, the spread of the sources of that violence away from elites and the governments they control may correlate with the spread of democratic political power.  We will have to see how this plays out with the global crisis of democracy in the 21C.


History of a Species

12/12/2025

0 Comments

 
Two years ago (120823), I referred to Thomas Carlyle’s famous quote (1841) about how all history is about “Great Men.” I juxtaposed two books I had just recently read which highlighted the unseen forces (mosquitos and logistical supply chains) that had a greater impact than the individuals we so often focus on. In the process of writing my upcoming “History of Everything” course, I’ve developed an even stronger sense of the limits of Carlyle’s framing. From this “Big History” perspective, it’s not only famous men that fade into the background, but all humanity. 

Indeed, if we say that our “show” has been running for 13.8 billion years, there were no actors until a hundred thousand years ago or so, and no speaking parts until about 50,000 years ago. Even then, due to relatively small populations (~4-5 million 10,000 years ago) and only incremental impacts on the world, humans weren’t notable causes of change until the “agricultural revolution” of 10,000 to 8,000 years ago. If we manage to off ourselves through any number of potential apocalypses, then we will be a mere blip in the history of life on earth (depending on who is around to write such a history).

Even if we take a less dire scenario, however, there is still much to be gained by “decentering” individuals from the story. John Brooke’s “Climate Change and the Course of Global History” (2014) does a fine job of putting the Earth—in its full range of geological and environmental activities—on center stage. Even on current (awful) trends, our present path of overheating the planet and eliminating thousands of species is still relatively minor compared to the various eras of glaciation and extinction that have preceded us. Much of the awfulness of what we’re now doing comes from 1) our moral responsibility as the cause of these deaths, and 2) the fact that, unlike the much more dramatic Late Heavy Bombardment (~4B years ago) or the separation of Pangaea (~200M years ago), we humans are around to see it and suffer from it. Systemically speaking, it’s more about the rapidity of the change (on a geologic scale) than about the absolute physical changes being wrought.

Historians are finally catching up with this repositioning of the human angle on History. Of course, most history is still written in a Carlylian vein, even if it takes account of Great Women and the ordinary folks of any gender. There has been a long-running historiographical parlor game as to whether History is a “science,” but “Big History” and its climate-driven siblings are inserting ‘real’ science into History. Michel Foucault, the radical French thinker/historian of the late 20C, would have been pleased. He called for and made some efforts to pursue an “archeology” of human societies, urging us to ‘get outside’ our cultural frameworks/prejudices to see how we really roll.

In this way, we Historians are continuing the work of Copernicus, Galileo, Darwin, and others who have shown us that our construction of creation stories about the universe, solar system, planet, and plants and animals has been driven more by solipsism than by a holistic and objective view of how the cosmos is and how life works. The insertion of a human-imitative God into the story—the premise of the Abrahamic faiths and some other belief systems—does little to change this; which is why religious leaders (Christian ones in particular) have gone to great lengths to suppress those epistemologically revolutionary interpretations. In other words, there’s not much room for God in the Kuiper Belt, plate tectonics, or DNA mutations. He/She may be making things happen behind the scenes; but there’s no way to tell.

The revival of European humanism in the late medieval period made “Man…the measure of all things.” This new history marches firmly in the other direction. Glaciation cycles and volcanic explosions that darken global skies for years on end (1816, e.g., was known as the “year without a summer” due to fallout from a series of eruptions between 1808 and 1815) don’t really care whether there are people around as witnesses/victims. There is thus likely some irony in the fact that this humanism contributed to the epistemological climate that fostered the “Scientific Revolution” of the 17-19C, with all the resulting “objectification” (one might say dehumanizing) of experience that followed in its wake.

Even if we keep human societies in the picture, the impact of individuals (“Great” or otherwise) still fades. The longer and grander framings of human development still leave little room for specific personalities. The number of folks who still matter after a century or two is minute; most survive as exemplars of their eras and as the basis of interesting and illustrative stories that Historians tell. Even broader cultures have relatively short half-lives of impact; although, interestingly, most of the longer-lived ones (e.g., Han China, Egypt) date to well before the modern era. These days, we have too much change going on to allow particular countries/cultures to last too long (e.g., Assyria has a claim to a run of 1400 years, more than five times that of these United States). 

So, my “History of Everything” project has quite set my mind spinning in new directions (and is, therefore, a success even before I get into the classroom). I won’t be pushing my class in all the directions touched on here; after all, there’s a lot of “substance” to talk about, too. Still, it’s a provocative step for a “modern” Europeanist to take; particularly at this point in our bewildering, ephemeral culture of the early 21C. This story, even with a lengthy and “objective” (i.e., non-self/culture-centered) perspective will be different fifty years from now. Indeed, it’s hard to imagine that our culture would have produced such a thing fifty years ago. Who knows what the AI/Borg will come up with for the history of a certain species at that time?


You Can't Go Home Again

12/5/2025

1 Comment

 
A few weeks ago (102425), I talked about the current administration’s attack on the discipline of history. Besides the fact that Historians are evidence-based thinkers, we are, as a group, apparently overly “woke” and irredeemably “lib.” One manifestation of those characteristics is that we tend to point out when someone makes an error about historical events. This is a problem when many of the policies being bandied about these days are based on stories about the past that are dubious, naïve, and sometimes blatant nonsense. That they have no sense of the longer-term implications of their policies sets a nice bookend to their short-sightedness.

Recasting the Defense Department as the “Department of War” in order to promote a “warrior” mentality reveals a mindset that is stuck in an image of mid-20C American invincibility (if not some medieval tale of chivalry and derring-do). It’s as if the post-WWII period was simple and grand. We definitely basked in the glow of our triumph over fascism. After all, our foes had done us the service of acting in so brutal a manner as to make our Manichean self-righteousness all too easy. Never mind the not inconsiderable contribution of the Red Army in defeating Hitler, nor the fundamental futility of the Axis grand strategy. Let’s conveniently forget that we bestrode the world in no small part due to the War’s extensive destruction of the economies of all of our possible post-War economic competitors. And, let’s squint so our vision doesn’t encompass the problematic “police actions” in Korea and Vietnam, or the “loss” of China which followed our wartime apotheosis. It would be comforting to construct a mythology of a simple time when America was “great,” and then reengineer our way back to it. (Of course, I’ve only touched on the complexities of that era.)

Besides ensuring the restoration of order in our havoc-strewn cities, our military’s principal activities have been down in the Caribbean, revitalizing the generally dormant tradition of American intervention/imperialism. One needn’t go back to the Spanish-American War (1898), or the Mexican-American War (1846), much less the Monroe Doctrine (1823) to see this region as a playground where we would blithely tell other folks how to run their countries. The list of military deployments since I was born includes six invasions (Guatemala 1954, Cuba 1961, Dominican Republic 1965, Grenada 1983, Panama 1989, and Haiti 1994), not to mention the Nicaraguan “Contras” exploits of the 1980s or numerous incidents in the first half of the 20C. These days the target, depending on who you talk to, is either the drug gangs or the Maduro regime in Venezuela. It must feel good to send a spare aircraft carrier down there and blow up/shoot up some bad guys so we can promote our own outstanding version of democracy and freedom. Featuring most recently Afghanistan and Iraq, our record of “nation-building” and democracy restoration is pretty much bereft of successes, but let’s not trouble ourselves with a few data points amid all the glory of war.

Over at the Transportation Department, Secretary Duffy has called for a return to sartorial decorum on our aviation system. Apparently, the current trends in casual dress are a problem significant enough for his attention. Safety, a limping air traffic control system, passenger discomfort, and airline extraction of every possible revenue stream all must fall into the queue behind redressing slackers who fly in pajamas. Another would-be time traveler, it seems, to the “Golden Age” of aviation, untroubled by the differences in safety, cost, and extent of governmental regulation of that earlier era. Can “Coffee, tea, or me” be far behind?

Naturally, the principal avatar of atavism (a killer alliteration if I do say so!) is the orange-haired one himself, aka HWSNBN. It’s hard—between the long list of misogynistic comments, quasi-racist “dog whistles,” and the pervasive atmosphere of anger and hatred—to know where to start. The most recent example arose in the aftermath of the killing of the National Guardswoman in DC by an Afghan man who had sought refuge here after we trashed his country. A pretty young white woman killed by a violent man of color; it was a trope not to be missed. And it’s not that he needed any particular prompt to move against immigrants. So, it shouldn’t have been surprising that he referred to immigrants from “third-world countries” as the main group to be excluded. “Third-world” is a phrase that has been outdated since the demise of the Cold War over 30 years ago. After all, if the globe wasn’t any longer defined by the battle between the (liberal Western) “First World,” and the (evil Communist) “Second World,” then there wasn’t really any reason to dump everyone else into the “Third World” pot. But there he is, back in his formative years of the 1950s and 1960s; and, apparently, we’re along for the ride.

There are many and considerable moral issues about the nature of US society in the middle of the 20C that might deter one from seeing this as an idyllic period to be emulated in the 21C. But, even putting those to one side (along with whatever similar critiques one might have of the current administration in general), we should still recognize that the idea of return to some golden age does seem to be animating this gang. 

It’s futile, of course. You can’t pick and choose some parts of the past that you like and pretend that there was no baggage to be dragged along, too. I had a pretty nice upbringing, but picturing my mom solely as the one who made me warm chocolate chip cookies when I came home from school doesn’t respect her or help me. The only real lesson of history is that life is complex and hard and we have to pay attention to those realities and not pretend that situations or people fit into neat categories with over-generalized characteristics. Any attempt to portray the past as simple (much less Elysian) should put us on alert that we are being led astray. The Historian’s job is to sound those alarms.

1 Comment

Political Tectonics

11/28/2025

0 Comments

 

Our understanding of a central issue in geology—how the continents were formed and located—wasn’t clearly settled until the 1960s (I still am amazed that such an important aspect of science remained unclear until so recently!). The (now) standard theory is called plate tectonics and it posits the existence of a group of (quite large) “plates” (seven or eight major ones, plus a number of minor ones) that float on top of Earth’s mantle. They bump up against each other and—from time to time—move, usually causing earthquakes and tsunamis. Most of the time there’s no action, just a build-up of pressure, until the tension becomes too great and the plates jerk into a new configuration.

Shortly thereafter, a similar theory emerged which was applicable to the story of the evolution of plants and animals: things alter minutely and incrementally until some shock to the system causes major and widespread changes. It’s called “punctuated equilibrium,” proposed by Niles Eldredge and Stephen Jay Gould in the early 1970s and most notably championed by Gould thereafter.

Now, many aspects of social studies (sciences?) can be analogized to physics and other sectors of the physical (“hard”) sciences; such concepts as entropy, gravity, and inertia can each be applied to the creation and function of human societies, their politics and economics. So, I’m espousing a theory of “political tectonics” which suggests that human societies—and, in particular, their power structures—move in similar ways. As with any analogy, we must allow for some ‘slosh’ room and not look for precise matching. This is especially true where we move from measurable and quantifiable sciences to the “softer” realm of people and history. 

In terms of domestic political change, we might analogize from terrestrial “plates” to socio-political groupings, movements, parties, organizations, and other institutions. These groupings evolve, to be sure, in the ordinary course of things; changing membership, shifting ideologies, and accreting or shedding political power vis-à-vis other groupings. Much of this is not so visible unless closely studied (usually in retrospect). Often, however, the tensions build without much change…until they do. Historically, we can look at the status of slavery in the US in the early 19C, or the powers of the British House of Lords in the late 19C and early 20C. The status of women and Blacks in the US made only incremental progress until the 1960s. In each case, a crisis forced things to a head and remarkable and significant changes resulted.

The great revolutions in France and Russia can be seen in the same light. Ditto for China in the early 20C and Iran in the 1970s. 

A similar perspective applies internationally. The collapse of the Soviet empire came pretty much out of the blue, despite some rumblings in the 1970s and 1980s. The start of WWI can be seen as the cataclysmic spasm of realignment of the Great Powers in Europe early in the 20C. Lately China has made modest strides at projecting global power, but we may well look back on the 2020s as an inflection point in global geopolitics.

In each case, surface appearances and political institutions remain stable (until they don’t); but underneath they mask a shift in political power. It is the accumulated tension of this mismatch that—due to some butterfly effect—can break out into dramatic realignment. There is an interpretation of British political history in the 19C that saw its gradual accommodation of the emerging power of the working class into the political system as a great accomplishment in the avoidance of the punctuated revolutions which characterized European continental developments in the same period.

If geologists have a hard time predicting earthquakes or volcanic eruptions (usually even moments before), much the same can be said of social scientists and generic pundits’ efforts to do the same for political and social realignments. Instead, we are deluged with this crowd revealing all sorts of scenarios as to what might happen soon; in part to fill their words-per-month output quotas, in part so they can (when lucky) say “I told you so.” The vagaries of political polling (from “Dewey Defeats Truman” to a pair of Trumpian triumphs) are notable in this regard. The media—from lame/mainstream to micro-social—are replete with this sort of blather.

It may be frustrating to many that despite the immense strides in sciences of all sorts, much of the time we simply don’t know what’s happening to our world/society until things actually happen. Our disappointment is partially due to the high expectations we have developed around the predictability achieved in many areas of the hard sciences. It is also due to our difficulty in tolerating an awareness that our world and social structures are precarious and could be tipped over in an instant. We’ve managed (so far) not to blow ourselves up in a nuclear war. The financial meltdown of 2007-09 could easily have done far more damage to our banking/insurance/credit systems. Our luck, however, is no cause for comfort regarding the next “big one.” 

Even if we make progress in mapping geologic phenomena to a degree that our understanding of plate tectonics approaches that of electric grids or telecom networks, people are a couple of orders of magnitude more ephemeral, contingent, and flaky. The operation of social systems therefore has to be more speculative and the fundamental analogy of this argument is limited. Even AI is unlikely to be able to predict things any better than we do now (although its mechanistic appearance may seem more reassuring).

In sum, as Donald Rumsfeld famously said: there are the “known unknowns” and the “unknown unknowns.” We might track geologic plates in the hope of figuring out the next earthquake, or speculate about the impact of working-class social disaffection on the current political culture, or even on the impact of massive AI-driven investment on labor or energy markets. Even if we can only suss out directional indications rather than any specific implication, it’s good to bear in mind that we won’t know much until it happens.

0 Comments

A Bias for Change

11/21/2025

0 Comments

 


I’ve started in on writing a course for next winter called “A History of Everything,” which will cover, well, everything: from the Big Bang to our current 21C crisis. This process is forcing me to rethink a lot of assumptions about the human condition and my understanding of history.

One aspect which is a tangent on my stance on modernity (i.e., the last 250 years) is the degree to which we have a bias for change. It’s not entirely new to my generation. Indeed, Alvin Toffler wrote about “Future Shock” over a half-century ago. My grandfather was born before the age of flight and lived to see women in space. Pretty much everyone born in the 20C (certainly in the West and much of the rest of the world as well) has lived through more change—social, economic, technological, cultural—than their grandparents could have likely imagined. Certainly those born in the 21C (if I may eschew the “Gen X, Gen Y…Gen Alpha” nomenclature) know nothing but. And social media is nothing but change, hyped by fashion and a proverbial short attention span. These days, this encompasses virtually everyone in the world with the exception of a few tiny groups isolated from global modernity.

By a “bias for change,” I mean an assumption of impermanence, an expectation of evolution, and a moderate degree of surprise at encountering stability (stasis). It’s so ordinary that we’re like a fish in water. It is, if I may mix environmental metaphors, part of the air we breathe. But beyond this normative sense, change is seen as an inherent good; at least insofar as the dominant culture is concerned. We tend to look down on those societies which continue more-or-less unabated and untroubled by disruption. 

Our models for such societies are drawn from history and their relatively poor socio-economic-technological state compared with our own (exalted, high-tech) state. But by focusing on this angle of comparison, we lose sight of the potential benefits of stability and continuation. After all, such groups lived in an environment where change was not normal; they were fish swimming in different water. They had no conception of what would eventuate (ditto for us, but that’s a different story) and so did not suffer by comparison with their own situation. Few likely rued the absence of the latest iPhone operating system or the chance to live in cities with millions of others (as most of us do). They might have been impressed with our ability to manage diseases and live longer, but they didn’t engage in such (to them) hypotheticals. We may champion those historical figures who sought and implemented change, but there weren’t very many of them and we claim them as the forefathers of our own modernity. We also tend to quickly forget the real-time costs of change—epistemological disruption, migration, inequality—indeed, the very phrase “transitional costs” almost invites dismissal once the immediate pain has passed.

Historians, of course, have an innate bias for change. Indeed, one definition of History is the “study of change over time.” As a discipline, we love to write about what’s new and its ramifications. If the past wasn’t dynamic, there wouldn’t be much to write about. Indeed, we could argue that the modern idea of History emerged in the 19C in response to the acceleration of change in the 18C (e.g., French and Industrial “Revolutions”).

This happened about the time that the idea of political conservatism crystallized (I mean, of course, actual conservatism, not what passes for the “right wing” these days). It’s a bit too crude to characterize conservatives as the “anti-change” party, but it’s not far from the mark. Certainly, their premise is that change should be incremental and organic; rather than dramatic, exogenous, or “revolutionary.” A more radical conservatism earned the title “reactionary,” arguing (at least implicitly) for a return to traditional political, economic, and social modes. The current version of this group seems to stand astride the twitching body of the Republican Party. The GOP used to have a “normal” conservative stance, albeit with intermittent reactionary elements. It’s now a zombie political entity, mouthing some conservative bromides, but increasingly reactionary and in a highly selective way. As I noted recently, it’s more about mythology and bad History than any cogent engagement with the past.

I do wonder as to the degree this represents not just an outlook that doesn’t like the current state of things, but an inability to cope with change. In making this point, I have to be careful. It would be easy to fall into a trap of equating an embrace of change with being “normal,” and implying a moral deficiency to those whose psychology doesn’t work like mine does. Nor do I want to create a model in which those who are not forward-looking are archaic or somehow “deplorable.” To the contrary, I’m suggesting that some aversion to change is actually normal and ordinary, even if it doesn’t rise to the level of a political stance. Change is hard and not always an improvement. To equate change with moral progress is precisely the trap with which I’m concerned.

Indeed, I can argue that those who are inured to the current rate of change suffer from a different distortion of perspective. The cult of progress, grounded in the remarkable improvement in technologies of all sorts over the past 250 years, has made it difficult to recognize that such “progress” is extraordinary. The resulting optimism (of which tech bros’ gushing enthusiasm for AI is the most recent example) seems similar to the blindness of those who are financially well-off to the nature and sources of their cultural and economic advantages. Being born into such a world (of wealth or of progress or of race) can be distorting and terms such as “merit” or “fairness” need to be closely scrutinized. 

The pace of change has accelerated and may continue or careen out of control. It’s no time to make blithe assumptions about what is ultimately beneficial.

0 Comments

Going for Broke

11/14/2025

1 Comment

 
The recent US government shutdown and the ongoing political crisis in France over its Parliament’s inability to agree on a budget are just the latest manifestations of a fundamental problem with government, governability, and the endemic short-termism of modern western culture. Virtually every major Western government has been facing similar crises over the past decade. Our own situation here in the US is worse, even if we’re more devious in our means of ducking the problem.
[Chart: public debt levels in major Western countries since 1900]
This chart gives a rough picture of how public debt levels in the major Western countries (US, Japan, Germany, France, Britain, Italy, Canada) have changed since the turn of the 20C. If you “normalize” out the spikes caused by WWI and WWII, you can see that debt levels increased only marginally across the 20C. This is remarkable in itself given the significant changes in the nature and scope of governmental activities (esp. the rise of the “welfare state”) during that period. 

Since then, things have gotten much worse. It’s not just the demands of the particular challenges of the Great Recession (fifteen years ago) or of COVID (five years ago), although they have certainly contributed to the problem. Rather, it reflects some real problems in terms of how governments raise and spend money. There are several (overlapping) contributing factors:
* Political short-termism – Politicians have rarely met a bullet they were willing to bite. No one wants to raise taxes, everybody wants to feed at the government trough. The implications of not investing today will be the problems of the next generation (by which time the current gang will be retired and forgotten, but in the meantime, they will have sipped and supped with power).
* Intergenerational theft – This is often unintentional, but no less damaging. The responsibility (blame?) lies most heavily on the Boomer generation who grew up in an era of expanding governmental support programs (health care and pensions in particular) which had been funded by the expanding economies of the late 20C. The burden falls on the younger, less powerful (and less likely to vote) generations. Now, with demographic changes, there are relatively fewer younger folks and the older recipients won’t let go of their entitlements.
* Bad accounting – Dodgy calculations (retirement, health care, inflation) and a refusal to save up for infrastructure depreciation will eventually come due.
* Oligarchical domination of governments – This shows up principally in innumerable tax-reduction schemes and regressive structures that—while rationalized by all sorts of (well-lobbied) arguments about investment and property “rights”—mainly benefit the top few percent of the wealthy in each country who have disproportionate political power. 
* Bourgeois entitlements – This is another way of characterizing the benefit programs and tax schemes (“middle class” tax breaks) that are embedded in the system and are accordingly difficult to retrieve without a great “hue-and-cry” about embedded expectations. As Senator S.I. Hayakawa said of the Panama Canal: “We stole it fair and square.”

Of course, the problem of government overspending is hardly a new one. History is littered with the bankruptcies of great powers, and the French inability to come to terms with their debt accumulation was a significant cause of the crisis that led to the Great Revolution of 1789. Spain and France (the leading powers of the 16-18C) each defaulted more than a half-dozen times during that period. Indeed, one of the key aspects of the rise of British power to replace them during the 18C was its ability to manage its debt (both in terms of spending and financing). The pattern waned in the 19C, although lesser powers (e.g., Portugal and especially Argentina, much in the news lately) became notoriously unreliable. In the 20C, there were fewer formal defaults (if you leave aside the wholesale flushing out of the finances of Russia by the new Soviet regime in 1918 and of Germany, Italy, and Japan in the aftermath of WWII).

Despite the pie-in-the-sky claims of an AI-generated spurt of productivity, the situation going forward is actually pretty bleak. Aging populations will suck up lots of cash for retirement and health care. Roads and bridges will need to be repaired. A less stable/secure world will demand more expenditures for military modernization. Not to mention climate repair/mitigation. Fewer workers per retiree (especially as we slash immigration) means that we’re running out of places to find cash for the government costs.

The likely outcome will be a combination of politically unpalatable steps. There will be much screaming and hand-wringing, but (even in this age of mythical thinking) numbers need to add up. Capping or cutting social benefits, increasing taxes, and, seemingly inevitably, an extended bout of non-trivial inflation, lie ahead. Much of this will cut along class lines, leading to a significant economic “populism” movement. We’re seeing the first steps along those lines now; the recent tax and expenditure moves will soon (within the next two years) start to bite. So, we will have an early indicator of whether those adversely affected will mobilize to protect the social safety net by demanding an increase in taxes. Everyone will be affected by inflation, although those most well off will face the biggest impact on the relative value of their accumulated wealth.

Politically, this won’t be pretty. The desultory skirmishing over the government shutdown will seem trivial. Lots of folks will be getting pretty angry and increasing numbers will radicalize. The hollowing out of the middle class will make the premises of democracy less tenable. The combination will make the fears of political philosophers from Plato and Aristotle onwards—the perennial tension between the masses and the well-off—more tangible. It will be remarkable if the body politic can come through this with a new balance of power, money, and justice.

Maybe we should require all elected officials to pass mathematics and accounting exams and cut their pay and benefits if their numbers don’t add up.


1 Comment

    Condemned to Repeat It --
    Musings on history, society, and the world.

    I don't actually agree with Santayana's famous quote, but this is my contribution to my version of it: "Anyone who hears Santayana's quote is condemned to repeat it."
