Steve Harris
Condemned to Repeat It

History of a Species

12/12/2025

Two years ago (120823), I referred to Thomas Carlyle’s famous quote (1841) about how all history is about “Great Men.” I juxtaposed two books I had just recently read which highlighted the unseen forces (mosquitos and logistical supply chains) that had a greater impact than the individuals we so often focus on. In the process of writing my upcoming “History of Everything” course, I’ve developed an even stronger sense of the limits of Carlyle’s framing. From this “Big History” perspective, it’s not only famous men that fade into the background, but all humanity. 

Indeed, if we say that our “show” has been running for 13.8 billion years, there were no actors until a hundred thousand years ago or so, and no speaking parts until about 50,000 years ago. Even then, due to relatively small populations (~4-5 M 10,000 years ago) and only incremental impacts on the world, humans haven’t been notable causes of change until the “agricultural revolution” of 10,000 to 8,000 years ago. If we manage to off ourselves through any number of potential apocalypses, then we will be a mere blip in the history of life on earth (depending on who is around to write such a history).

Even if we take a less dire scenario, however, there is still much to be gained by “decentering” individuals from the story. John Brooke’s “Climate Change and the Course of Global History” (2014) does a fine job of putting the Earth—in its full range of geological and environmental activities—on center stage. Even on current (awful) trends, our present path of overheating the planet and eliminating thousands of species is still relatively minor compared to the various eras of glaciation and extinction that have preceded us. Much of the awfulness of what we’re now doing comes from 1) our moral responsibility as the cause of these deaths, and 2) the fact that, unlike the much more dramatic Late Heavy Bombardment (~4B years ago) or the breakup of Pangaea (~200M years ago), we humans are around to see it and suffer from it. Systemically speaking, it’s more about the rapidity of the change (on a geologic scale) than about the absolute physical changes being wrought.

Historians are finally catching up with this repositioning of the human angle on History. Of course, most history is still written in a Carlylian vein, even if it takes account of Great Women and the ordinary folks of any gender. There has been a long-running historiographical parlor game as to whether History is a “science,” but “Big History” and its climate-driven siblings are inserting ‘real’ science into History. Michel Foucault, the radical French thinker/historian of the late 20C, would have been pleased. He called for and made some efforts to pursue an “archeology” of human societies, urging us to ‘get outside’ our cultural frameworks/prejudices to see how we really roll.

In this way, we Historians are continuing the work of Copernicus, Galileo, Darwin and others who have shown us that our construction of creation stories about the universe, solar system, planet, and plants and animals has been driven more by solipsism than by a holistic and objective view of how the cosmos is and how life works. The insertion of a human-imitative God into the story—the premise of the Abrahamic faiths and some other belief systems—does little to change this; which is why religious leaders (Christianity in particular) have gone to great lengths to suppress those epistemologically revolutionary interpretations. In other words, there’s not much room for God in the Kuiper Belt, plate tectonics, or DNA mutations. He/She may be making things happen behind the scenes; but there’s no way to tell.

The revival of European humanism in the late medieval period made “Man…the measure of all things.” This new history marches firmly in the other direction. Glaciation cycles and volcanic explosions that darken global skies for years on end (1816, e.g., was known as the “year without a summer” due to fallout from a series of eruptions between 1808 and 1815, culminating in Mount Tambora’s massive 1815 blast) don’t really care whether there are people around as witnesses/victims. There is thus likely some irony in the fact that this humanism contributed to the epistemological climate that fostered the “Scientific Revolution” of the 17-19C, with all the resulting “objectification” (one might say dehumanizing) of experience that followed in its wake.

Even if we keep human societies in the picture, the impact of individuals (“Great” or otherwise) still fades. The longer and grander framings of human development still leave little room for specific personalities. The number of folks who still matter after a century or two is minute; most survive as exemplars of their eras and as the basis of interesting and illustrative stories that Historians tell. Even broader cultures have relatively short half-lives of impact; although, interestingly, most of the longer-lived ones (e.g., Han China, Egypt) date to well before the modern era. These days, we have too much change going on to allow particular countries/cultures to last too long (e.g., Assyria has a claim to a run of 1400 years, more than five times that of these United States). 

So, my “History of Everything” project has quite set my mind spinning in new directions (and is, therefore, a success even before I get into the classroom). I won’t be pushing my class in all the directions touched on here; after all, there’s a lot of “substance” to talk about, too. Still, it’s a provocative step for a “modern” Europeanist to take; particularly at this point in our bewildering, ephemeral culture of the early 21C. This story, even with a lengthy and “objective” (i.e., non-self/culture-centered) perspective will be different fifty years from now. Indeed, it’s hard to imagine that our culture would have produced such a thing fifty years ago. Who knows what the AI/Borg will come up with for the history of a certain species at that time?


You Can't Go Home Again

12/5/2025

A few weeks ago (102425), I talked about the current administration’s attack on the discipline of history. Besides the fact that Historians are evidence-based thinkers, we are as a group, apparently, overly “woke,” and irredeemably “lib.” One manifestation of those characteristics is that we tend to point out when someone makes an error about historical events. This is a problem when many of the policies being bandied about these days are based on stories about the past that are dubious, naïve, and sometimes blatant nonsense. That they have no sense of the longer-term implications of their policies sets a nice bookend to their short-sightedness.

Recasting the Defense Department as the “Department of War” in order to promote a “warrior” mentality reveals a mindset that is stuck in an image of mid-20C American invincibility (if not some medieval tale of chivalry and derring-do). It’s as if the post-WWII period was simple and grand. We definitely basked in the glow of our triumph over fascism. After all, our foes had done us the service of acting in so brutal a manner as to make our Manichean self-righteousness all too easy. Never mind the not inconsiderable contribution of the Red Army in defeating Hitler, nor the fundamental futility of the Axis grand strategy. Let’s conveniently forget that we bestrode the world in no small part due to the War’s extensive destruction of the economies of any of our possible post-War economic competitors. And, let’s squint so our vision doesn’t encompass the problematic “police actions” in Korea and Vietnam, or the “loss” of China which followed our wartime apotheosis.  It would be comforting to construct a mythology of a simple time when America was “great,” and then reengineer our way back to it. (Of course, I’ve only touched on the complexities of that era.)

Besides ensuring the restoration of order in our havoc-strewn cities, our military’s principal activities have been down in the Caribbean, revitalizing the generally dormant tradition of American intervention/imperialism. One needn’t go back to the Spanish-American War (1898), or the Mexican-American War (1846), much less the Monroe Doctrine (1823), to see this region as a playground where we would blithely tell other folks how to run their countries. The list of military deployments since I was born includes six invasions (Guatemala 1954, Cuba 1961, Dominican Republic 1965, Grenada 1983, Panama 1989, and Haiti 1994), not to mention the Nicaraguan “Contras” exploits of the 1980s or numerous incidents in the first half of the 20C. These days the target, depending on who you talk to, is either the drug gangs or the Maduro regime in Venezuela. It must feel good to send a spare aircraft carrier down there and blow up/shoot up some bad guys so we can promote our own outstanding version of democracy and freedom. Featuring most recently Afghanistan and Iraq, our record of “nation-building” and democracy restoration is pretty much bereft of successes, but let’s not trouble ourselves with a few data points amid all the glory of war.

Over at the Transportation Department, Secretary Duffy has called for a return to sartorial decorum in our aviation system. Apparently, the current trends in casual dress are a problem significant enough for his attention. Safety, a limping air traffic control system, passenger discomfort, and airline extraction of every possible revenue stream all must fall into the queue behind redressing slackers who fly in pajamas. Another would-be time traveler, it seems, to the “Golden Age” of aviation, untroubled by the differences in safety, cost, and extent of governmental regulation of that earlier era. Can “Coffee, tea, or me” be far behind?

Naturally, the principal avatar of atavism (a killer alliteration if I do say so!) is the orange-haired one himself, aka HWSNBN. It’s hard—between the long list of misogynistic comments, quasi-racist “dog whistles,” and the pervasive atmosphere of anger and hatred—to know where to start. The most recent example arose in the aftermath of the killing of the National Guardswoman in DC by an Afghan man who had sought refuge here after we trashed his country. A pretty young white woman killed by a violent man of color; it was a trope not to be missed. And it’s not that he needed any particular prompt to move against immigrants. So, it shouldn’t have been surprising that he referred to immigrants from “third-world countries” as the main group to be excluded. “Third-world” is a phrase that has been outdated since the demise of the Cold War over 30 years ago. After all, if the globe wasn’t any longer defined by the battle between the (liberal Western) “First World,” and the (evil Communist) “Second World,” then there wasn’t really any reason to dump everyone else into the “Third World” pot. But there he is, back in his formative years of the 1950s and 1960s; and, apparently, we’re along for the ride.

There are many and considerable moral issues about the nature of US society in the middle of the 20C that might deter one from seeing this as an idyllic period to be emulated in the 21C. But, even putting those to one side (along with whatever similar critiques one might have of the current administration in general), we should still recognize that the idea of return to some golden age does seem to be animating this gang. 

It’s futile, of course. You can’t pick and choose some parts of the past that you like and pretend that there was no baggage to be dragged along, too. I had a pretty nice upbringing, but picturing my mom solely as the one who made me warm chocolate chip cookies when I came home from school doesn’t respect her or help me. The only real lesson of history is that life is complex and hard and we have to pay attention to those realities and not pretend that situations or people fit into neat categories with over-generalized characteristics. Any attempt to portray the past as simple (much less Elysian) should put us on alert that we are being led astray. The Historian’s job is to sound those alarms.


Political Tectonics

11/28/2025


Our understanding of a central issue in geology—how the continents were formed and located—wasn’t clearly settled until the 1960s (I still am amazed that such an important aspect of science remained unclear until so recently!). The (now) standard theory is called plate tectonics, and it posits the existence of a group of (quite large) “plates” (seven or eight major ones, plus numerous minor ones) that float on top of Earth’s mantle. They bump up against each other and—from time to time—move, usually causing earthquakes and tsunamis. Most of the time there’s no action, just a build-up of pressure, until the tension becomes too great and the plates jerk into a new configuration.

Shortly thereafter, a similar theory emerged which was applicable to the story of the evolution of plants and animals: things alter minutely and incrementally until some shock to the system causes major and widespread changes. It’s called “punctuated equilibrium,” most notably championed by Stephen Jay Gould (with Niles Eldredge) in the 1970s.

Now, many aspects of social studies (sciences?) can be analogized to physics and other sectors of the physical (“hard”) sciences; such concepts as entropy, gravity, and inertia can each be applied to the creation and function of human societies, their politics and economics. So, I’m espousing a theory of “political tectonics” which suggests that human societies—and, in particular, their power structures—move in similar ways. As with any analogy, we must allow for some ‘slosh’ room and not look for precise matching. This is especially true where we move from measurable and quantifiable sciences to the “softer” realm of people and history. 

In terms of domestic political change, we might analogize from terrestrial “plates” to socio-political groupings, movements, parties, organizations, and other institutions. These groupings evolve, to be sure, in the ordinary course of things; changing membership, shifting ideologies, and accreting or shedding political power vis-à-vis other groupings. Much of this is not so visible unless closely studied (usually in retrospect). Often, however, the tensions build without much change…until they do. Historically, we can look at the status of slavery in the US in the early 19C, or the powers of the British House of Lords in the late 19 and early 20C. The status of women and Blacks in the US made only incremental progress until the 1960s. In each case, a crisis forced things to a head and remarkable and significant changes resulted.

The great revolutions in France and Russia can be seen in the same light. Ditto for China in the early 20C and Iran in the 1970s. 

A similar perspective applies internationally. The collapse of the Soviet empire came pretty much out of the blue, despite some rumblings in the 1970s and 1980s. The start of WWI can be seen as the cataclysmic spasm of realignment of the Great Powers in Europe early in the 20C. Lately China has made modest strides at projecting global power, but we may well look back on the 2020s as an inflection point in global geopolitics.

In each case, surface appearances and political institutions remain stable (until they don’t); but underneath they mask a shift in political power. It is the accumulated tension of this mismatch that—due to some butterfly effect—can break out into dramatic realignment. There is an interpretation of British political history in the 19C that sees its gradual accommodation of the emerging power of the working class into the political system as a great accomplishment in the avoidance of the punctuated revolutions which characterized European continental developments in the same period.

If geologists have a hard time predicting earthquakes or volcanic eruptions (usually even moments before), much the same can be said of social scientists’ and generic pundits’ efforts to do the same for political and social realignments. Instead, we are deluged with this crowd revealing all sorts of scenarios as to what might happen soon; in part to fill their words-per-month output quotas, in part so they can (when lucky) say “I told you so.” The vagaries of political polling (from “Dewey Defeats Truman” to a pair of Trumpian triumphs) are notable in this regard. The media—from lame/mainstream to micro-social—are replete with this sort of blather.

It may be frustrating to many that despite the immense strides in sciences of all sorts, much of the time we simply don’t know what’s happening to our world/society until things actually happen. Our disappointment is partially due to the high expectations we have developed around the predictability achieved in many areas of the hard sciences. It is also due to our difficulty in tolerating an awareness that our world and social structures are precarious and could be tipped over in an instant. We’ve managed (so far) not to blow ourselves up in a nuclear war. The financial meltdown of 2007-09 could easily have done far more damage to our banking/insurance/credit systems. Our luck, however, is no cause for comfort regarding the next “big one.” 

Even if we make progress in mapping geologic phenomena to a degree that our understanding of plate tectonics approaches that of electric grids or telecom networks, people are a couple of orders of magnitude more ephemeral, contingent, and flaky. The operation of social systems therefore has to be more speculative and the fundamental analogy of this argument is limited. Even AI is unlikely to be able to predict things any better than we do now (although its mechanistic appearance may seem more reassuring).

In sum, as Donald Rumsfeld famously said: there are the “known unknowns” and the “unknown unknowns.” We might track geologic plates in the hope of figuring out the next earthquake, or speculate about the impact of working-class social disaffection on the current political culture, or even on the impact of massive AI-driven investment on labor or energy markets. Even if we can only suss out directional indications rather than any specific implication, it’s good to bear in mind that we won’t know much until it happens.


A Bias for Change

11/21/2025

I’ve started in on writing a course for next winter called “A History of Everything,” which will cover—well—everything: from the Big Bang to our current 21C crisis. This process is forcing me to rethink a lot of assumptions about the human condition and my understanding of history.

One aspect, tangential to my stance on modernity (i.e., the last 250 years), is the degree to which we have a bias for change. It’s not entirely new to my generation. Indeed, Alvin Toffler wrote about “Future Shock” over a half-century ago. My grandfather was born before the age of flight and lived to see women in space. Pretty much everyone born in the 20C (certainly in the West and much of the rest of the world as well) has lived through more change—social, economic, technological, cultural—than their grandparents could likely have imagined. Certainly those born in the 21C (if I may eschew the “Gen X, Gen Y…Gen Alpha” nomenclature) know nothing but. And social media is nothing but change, hyped by fashion and a proverbial short attention span. These days, this encompasses virtually everyone in the world with the exception of a few tiny groups isolated from global modernity.

By a “bias for change,” I mean an assumption of impermanence, an expectation of evolution, and a moderate degree of surprise at encountering stability (stasis). It’s so ordinary that we’re like a fish in water. It is, if I may mix environmental metaphors, part of the air we breathe. But beyond this normative sense, change is seen as an inherent good; at least insofar as the dominant culture is concerned. We tend to look down on those societies which continue more-or-less unabated and untroubled by disruption. 

Our models for such societies are drawn from history and their relatively poor socio-economic-technological state compared with our own (exalted, high-tech) state. But by focusing on this angle of comparison, we lose sight of the potential benefits of stability and continuation. After all, such groups lived in an environment where change was not normal; they were fish swimming in different water. They had no conception of what would eventuate (ditto for us, but that’s a different story) and so did not suffer by comparison with their own situation. Few likely rued the absence of the latest iPhone operating system or the chance to live in cities with millions of others (as most of us do). They might have been impressed with our ability to manage diseases and live longer, but they didn’t engage in such (to them) hypotheticals. We may champion those historical figures who sought and implemented change, but there weren’t very many of them and we claim them as the forefathers of our own modernity. We also tend to quickly forget the real-time costs of change—epistemological disruption, migration, inequality—indeed, the very phrase “transitional costs” almost invites dismissal once the immediate pain has passed.

Historians, of course, have an innate bias for change. Indeed, one definition of History is the “study of change over time.” As a discipline, we love to write about what’s new and its ramifications. If the past wasn’t dynamic, there wouldn’t be much to write about. Indeed, we could argue that the modern idea of History emerged in the 19C in response to the acceleration of change in the 18C (e.g., French and Industrial “Revolutions”).

This happened about the time that the idea of political conservatism crystallized (I mean, of course, actual conservatism, not what passes for the “right wing” these days). It’s a bit too crude to characterize conservatives as the “anti-change” party, but it’s not far from the mark. Certainly, their premise is that change should be incremental and organic; rather than dramatic, exogenous, or “revolutionary.” A more radical conservatism earned the title “reactionary,” arguing (at least implicitly) for a return to traditional political, economic, and social modes. The current version of this group seems to stand astride the twitching body of the Republican Party. The GOP used to have a “normal” conservative stance, albeit with intermittent reactionary elements. It’s now a zombie political entity, mouthing some conservative bromides, but increasingly reactionary and in a highly selective way. As I noted recently, it’s more about mythology and bad History than any cogent engagement with the past.

I do wonder as to the degree this represents not just an outlook that doesn’t like the current state of things, but an inability to cope with change. In making this point, I have to be careful. It would be easy to fall into a trap of equating an embracing of change as “normal,” and implying a moral deficiency to those whose psychology doesn’t work like mine does. Nor do I want to create a model in which those who are not forward looking are archaic or somehow “deplorable.” To the contrary, I’m suggesting that some aversion to change is actually normal and ordinary, even if it doesn’t rise to the level of a political stance. Change is hard and not always an improvement. To equate change with moral progress is precisely the trap with which I’m concerned.

Indeed, I can argue that those who are inured to the current rate of change suffer from a different distortion of perspective. The cult of progress, grounded in the remarkable improvement in technologies of all sorts over the past 250 years, has made it difficult to recognize that such “progress” is extraordinary. The resulting optimism (of which tech bros’ gushing enthusiasm for AI is the most recent example) seems similar to the blindness of those who are financially well-off to the nature and sources of their cultural and economic advantages. Being born into such a world (of wealth or of progress or of race) can be distorting and terms such as “merit” or “fairness” need to be closely scrutinized. 

The pace of change has accelerated and may continue or careen out of control. It’s no time to make blithe assumptions about what is ultimately beneficial.


Going for Broke

11/14/2025

The recent US government shutdown and the ongoing political crisis in France over its Parliament’s inability to agree on a budget are just the latest manifestations of a fundamental problem with government, governability, and the endemic short-termism of modern western culture. Virtually every major Western government has been facing similar crises over the past decade. Our own situation here in the US is worse, even if we’re more devious in our means of ducking the problem.


This chart gives a rough picture of how public debt levels in the major Western countries (US, Japan, Germany, France, Britain, Italy, Canada) have changed since the turn of the 20C. If you “normalize” out the spikes caused by WWI and WWII, you can see that debt levels increased only marginally across the 20C. This is remarkable in itself given the significant changes in the nature and scope of governmental activities (esp. the rise of the “welfare state”) during that period. 

Since then, things have gotten much worse. It’s not just the demands of the particular challenges of the Great Recession (fifteen years ago) or of COVID (five years ago), although they have certainly contributed to the problem. Rather, it reflects some real problems in terms of how governments raise and spend money. There are several (overlapping) contributing factors:
* Political short-termism – Politicians have rarely met a bullet they were willing to bite. No one wants to raise taxes, everybody wants to feed at the government trough. The implications of not investing today will be the problems of the next generation (by which time the current gang will be retired and forgotten, but in the meantime, they will have sipped and supped with power).
* Intergenerational theft – This is often unintentional, but no less damaging. The responsibility (blame?) lies most heavily on the Boomer generation who grew up in an era of expanding governmental support programs (health care and pensions in particular) which had been funded by the expanding economies of the late 20C. The burden falls on the younger, less powerful (and less likely to vote) generations. Now, with demographic changes, there are relatively fewer younger folks and the older recipients won’t let go of their entitlements.
* Bad accounting – Dodgy calculations (retirement, health care, inflation), and a refusal to save up for infrastructure depreciation, will eventually come due.
* Oligarchical domination of governments – This shows up principally in innumerable tax-reduction schemes and regressive structures that—while rationalized by all sorts of (well-lobbied) arguments about investment and property “rights”—mainly benefit the top few percent of the wealthy in each country who have disproportionate political power. 
* Bourgeois entitlements – This is another way of characterizing the benefit programs and tax schemes (“middle class” tax breaks) that are embedded in the system and are accordingly difficult to retrieve without a great “hue-and-cry” about embedded expectations. As Senator S.I. Hayakawa said of the Panama Canal: “We stole it fair and square.”

Of course, the problem of government overspending is hardly a new one. History is littered with the bankruptcies of great powers, and the French inability to come to terms with their debt accumulation was a significant cause of the crisis that led to the Great Revolution of 1789. Spain and France (the leading powers of the 16-18C) each defaulted more than a half-dozen times during that period. Indeed, one of the key aspects of the rise of British power to replace them during the 18C was its ability to manage its debt (both in terms of spending and financing). The pattern waned in the 19C, although lesser powers (e.g., Portugal and especially Argentina, much in the news lately) became notoriously unreliable. In the 20C, there were fewer formal defaults (if you leave aside the wholesale flushing out of the finances of Russia by the new Soviet regime in 1918, and of Germany, Italy, and Japan in the aftermath of WWII).

Despite the pie-in-the-sky claims of an AI-generated spurt of productivity, the situation going forward is actually pretty bleak. Aging populations will suck up lots of cash for retirement and health care. Roads and bridges will need to be repaired. A less stable/secure world will demand more expenditures for military modernization. Not to mention climate repair/mitigation. Fewer workers per retiree (especially as we slash immigration) means that we’re running out of places to find cash for the government costs.

The likely outcome will be a combination of politically unpalatable steps. There will be much screaming and hand-wringing, but (even in this age of mythical thinking) numbers need to add up. Capping or cutting social benefits, increasing taxes, and, seemingly inevitably, an extended bout of non-trivial inflation lie ahead. Much of this will cut along class lines, leading to a significant economic “populism” movement. We’re seeing the first steps along those lines now; the recent tax and expenditure moves will soon (within the next two years) start to bite. So, we will have an early indicator of whether those adversely affected will mobilize to protect the social safety net by demanding an increase in taxes. Everyone will be affected by inflation, although those most well off will face the biggest impact on the relative value of their accumulated wealth.

Politically, this won’t be pretty. The desultory skirmishing over the government shutdown will seem trivial. Lots of folks will be getting pretty angry, and increasing numbers will radicalize. The hollowing out of the middle class will make the premises of democracy less tenable. The combination will make the fears of political philosophers from Plato and Aristotle onwards—the perennial tension between the masses and the well-off—more tangible. It will be remarkable if the body politic can come through this with a new balance of power, money, and justice.

Maybe we should require all elected officials to pass mathematics and accounting exams and cut their pay and benefits if their numbers don’t add up.



On Gates and Climate

11/7/2025


Last week, Bill Gates published a statement on climate change and the broad direction of national and global policies aimed at improving the lives of people around the world. It was timed to affect the imminent annual global meeting on our dire environment; a fraught process that already faces fresh headwinds due to political retrenchment by the US and others. While I only have a small fraction of Gates' readership (and a much, much smaller fraction of his resources to apply towards these concerns), I think it’s important to address what he said (and not what politicians and the mass media said about what he said). Here are his three punchlines:
  • “Climate change is a serious problem, but it will not be the end of civilization.” [Well, that’s nice to know. Nuclear war, biological agents (intentional or accidental) haven’t ended civilization either …yet. By the same token, malaria and starvation in poor tropical countries won’t end civilization either. Climate change will, however, likely kill millions in the meantime and vastly disrupt the lives of virtually everyone else. For the former, their civilization WILL end; and for those whose lives will only be marginally affected (i.e. the well-off), what kind of “civilization,” morally, will be left?]
  • “Temperature is not the best way to measure our progress on climate.” [Even if this is true, it’s a simple and easy-to-understand standard for the millions of folks who will need to pay attention. Public support and engagement are an essential part of addressing the problem.]
  • “Health and prosperity are the best defense against climate change.” [Dead corals and other collapsing ecosystems may not find much protection here. Even if this were just about humans, many climate impacts don’t care about net worth and nutrition. The need for improved global health stands on its own. Figuring out what prosperity means on a global basis is another issue altogether; getting everyone in the Global South up to OECD levels of living standards would be equitable, but would vastly increase the drain on the planet.]


I cannot, of course, challenge any of the facts that he cites. But there’s a lot of opinion mixed in with his facts and no small amount of selectivity in the facts he chooses to highlight. My concerns are more with his assumptions and the way he frames the issues than his particular positions.

Concern #1—Short-term perspective: It takes a long time to alter planetary climate. We humans have been at it aggressively for several hundred years, particularly in the last century. It’s virtually impossible to see how we could get back to 1950 climate levels within a century, and ecosystems around the world (e.g. icecaps or coral reefs) may never recover even if temperature increases are reversed. 

Gates is concerned about people living now, especially those in poverty or subsistence lifestyles; but they, and the rest of us, will be living for a while and will do so increasingly in a world adversely affected by climate change. We also need to consider those who might live on this planet later in this century or the next, with climates running 2-3º (best case) above “normal.” In other words, if we don’t start fixing it now, it will never get better. Deferring action, as Gates implicitly argues for, will mean more folks will have problems for longer.

Concern #2—Naïveté about how public awareness and public policy change over time: The modern environmental movement is about 50-70 years old. It was pretty much on the fringe for the first several decades and has only been a serious force domestically and globally for 30-40 years. As the science has gotten better, the public debates have gotten more serious and meaningful, but despite the efforts of the more outré groups (e.g. Greenpeace, Extinction Rebellion), most folks don’t pay attention and the ranks of climate deniers (not least the current Administration) remain robust. Until we start to do enough, we need to continue to make noise and push changes in policy and behavior.

Gates is aware that his statement will be used against climate activists (the orange-haired tweeter said the next day that Gates’ statement means: “I won the war against the climate change hoax.”) I have to ask whether it was useful to fortify an administration that has little interest in either climate issues or improving the health and well-being of the millions of folks that Gates claims are his priority.

In addition, as a practical matter, climate impacts on those individuals and countries which have more resources to do something about it are more likely to garner attention and resources that benefit everyone, even in the face of inadequate focus on improving global health and welfare generally.

Concern #3—A false dichotomy about the choices we face: Gates says we should measure our policies and allocate resources against a broad standard of improving “human welfare,” rather than “partitioning it off for particular causes.” Well, of course. But that raises the question of how best to achieve improved human welfare and then how to allocate resources to each effort. Global health and welfare are underfunded. Climate is underfunded. Pitting them against each other isn’t helpful, nor does it reflect the underlying problem of economic inequality (at both the individual and global levels). 

Concern #4—Techno/capitalistic-optimism: Improvements in technology, whether in health, agriculture, or energy production and usage, will make things better across the board, to be sure. Real progress is being made. But changing attitudes and behaviors, especially about humans exploiting the planet and pretending that there are no consequences, remain the necessary root source of real solutions. Eliminating malaria and generating food surpluses will increase that exploitation, especially given the population projections for Africa. Smart investors will certainly make money along the way, but this is a different kind of challenge for humanity and doubling down on modernity will end poorly.

In sum, however accurate his facts and well-meaning his intent, Gates’ memo is politically ham-handed, unlikely to increase the resource allocation to the good causes he supports, and likely to make it easy for climate deniers and climate ignorers to blithely carry on. Not good.



A Party Incubator

10/31/2025

Almost five years ago (111320), I pointed out the charade of our two political parties (the Dems being only marginally better off than the GOP) and called for a reformulation of the structure of our partisan system. The Biden interregnum—focused as it was on being more anti-Trump than something innovative—did little to stop the deflation and distraction of the Democrats. Their reaction to the loss last year has been aimless and incoherent and is reflected in dire drops in popular support (both voter registrations and fund-raising). The Republicans, while organizationally stronger, have wholly gone over to the cult of personality and nihilism. 

So, I observed with considerable interest the announcement last July of the establishment of the America Party. Now, the principal sponsor of this new organization is none other than the inimitable Elon Musk, whose outsized personality ensures attention to any of his proposals. His remarkable business success (PayPal, Tesla, Starlink, SpaceX, X (nee Twitter), Boring Company, etc.) has lately been overshadowed by his bizarre dabbling in government reorganization (DOGE). Still, he has been the country’s most successful serial entrepreneur and even if his political nous is spotty, he has a track record for start-ups. The America Party may, however, be stillborn. In August, Musk indicated that he was refocusing on the ailments at Tesla (he’s hardly been in the press for months), and it’s not clear whether the party will ever get off the ground. Given Musk’s substantive policy predilections, however, this may not be a great loss to our political culture.

At the time I wondered whether any number of more center-left billionaires would similarly sponsor a new party (Bloomberg? Soros?). Fundraising would not be a problem, and expertise can be quickly hired (of course, substantive political leadership is much harder to find, and any plausible standard-bearer would likely be wary of being seen as a hired shill). 

Perhaps a more promising approach would be getting a bunch of such folks to chip in (say, $100M each) to a party incubator. We would need 6-10 folks with differing political views, or with no particular political profile, to ensure that the funds did not come with any particular agenda attached. 

The project would be overseen by a set of perceptive worthies from across the political spectrum who would be charged with spinning off three or four new partisan entities. Instead of our current donkey and elephant symbolism, we could have an eagle, whale, bison, and salmon (or some such). Groups of politicos would pitch their ideologies/platforms/organizational capabilities to this panel, which would choose the most coherent and capable to contest future elections. Given our current selection, we need to repopulate civil society and reboot our political culture.

This approach would accelerate the development of a new political culture in this country. Multiple entities would reframe the issues facing the country, provide platforms for new potential leaders, and garner lots of attention. They would be well funded and not tied to any current organization or ideology or personality. 

It would be a radical re-set for the country. There is a long history of popular interest in independent candidates and “third” parties in this country. However, efforts over the past few decades (John Anderson (1980), Ross Perot/Reform Party (1992/96/2000), Ralph Nader (2000/04)) have mostly been vehicles for individuals. Most fringe parties (e.g. Libertarian, Peace and Freedom) have been based on marginal ideologies and have thus had to struggle for funding. 

Incubators have popped up all over as places to spur the development of technology projects, sometimes in academe, sometimes as vehicles for venture capitalists. Philanthropies use them, too, to help nascent non-profits get off the ground. In this sense, politics may not be so different as a field of activity. Bringing together money, mentors, dreamers, and doers without a sanctioned or pre-conceived set of policies or philosophies is a different way to think about moving past our current decimated political landscape.


Kill all the Historians

10/24/2025

In Henry VI (Part II), one of those planning to overthrow the incumbent royal dynasty says: “The first thing we do, let’s kill all the lawyers.” It’s a line we attorneys have frequently heard since we were in law school. Usually offered as a derogatory remark, it misconstrues Shakespeare’s sense that lawyers were, in fact, laudable champions of order, whose disposal would pave the way for revolution.

It’s hard to know what the Bard would write about our own parlous times. The current administration clearly doesn’t want to get rid of all the lawyers. After several years of foaming about the Dems “weaponizing” the law, the current incumbents are offering a master class in how to do so. They clearly want at least some lawyers to validate their own interpretations of all sorts of laws. And, as some law firms have demonstrated, there are plenty who (even if not fully fallen to the dark side) have been willing to make Faustian bargains. It seems likely, after all, that we will have plenty of lawyers for years to come.

Nonetheless, the sentiment of Dick the Butcher (Shakespeare’s character) would seem readily applicable to my second profession. Whether in terms of attacks on universities in general, or planning to promote only “patriotic history” (according to a recent proposal from the rump Department of Education), or willfully ignoring inconvenient historical facts (foreign or domestic), or trimming the booklists at your local public library, History is under attack as never before. 

Regular readers of this blog are familiar with my critiques of the History discipline, so I won’t rehearse them here. After all, those shortcomings all fall within the category of venial sins and the critiques were intended to promote a more robust engagement with the complexity of our pasts. The current onslaught is more severe and Orwellian. 

History, as the saying goes, is “written by the victors” and has been a tool of state power for millennia. The rise of a professional discipline in the 19C, working toward producing “accurate” descriptions of the past, has thus been a countervailing trend, which flourished in the 20C—an important branch of the “speak truth to power” school of public engagement. History has also been part of the modern project of recognizing and dealing with the complexity of life. Indeed, one of my grad school professors constantly challenged us to “complicate” the stories we were exploring and explaining.

There are virtues in simplicity and the versions of history that are told in elementary schools are necessarily more cursory than those in high school (a fortiori, in college). It is a sign of intellectual maturity to be able to hold conflicting interpretations and perspectives in one’s head simultaneously. It’s difficult to square the call to only offer “patriotic” history with such aspirations, but perhaps it’s only fitting for an Administration whose members often seem to be stuck in sixth grade and who generally seem to prefer mythology. As Jack Nicholson’s character says in “A Few Good Men” they “can’t handle the truth.” 

We can see this not only in terms of the specific policies and budget moves towards History (slashing NEH and public broadcasting budgets, rewriting the signage in National Parks), but also in the renaming of the Defense Department, recalling the halcyon days of the “War” Department when America won all its wars (and was right (dammit!) to do so), soldiers were soldiers, and men were men. In its eagerness to be anti-“woke”, the Administration is planning a broad return to the “good old days” when only white male lives mattered. Military bases are once again named in honor of Confederate Generals (whose patriotism did not fully extend to the United States of America), not to mention the likely plans to haul various Confederate statues out of storage. The “hit list” (so far) of senior government officials includes the Archivist of the United States, most of the leadership of the National Endowment for the Humanities, and the Librarian of Congress. 

Beyond the political level, the discipline is in disarray and ill-prepared to defend itself. Universities caving in to Administration pressures provide little cover for those who wish to “speak truth to power” and stand up for their version of history and, more importantly, for the possibility of multiple interpretations. Drops in enrollment generally and evaporating endowments at small liberal-arts colleges (on top of the disorientation of dealing with AI) have demoralized academic Historians across the country. If there’s relatively little protest about the gutting of health care, you can be sure that “History Forever” will not be the battle cry of a mass-movement Resistance.

Almost five years ago, in the context of the first impeachment trial, I noted that the final appeal was not to the legal courts, but to History. This assessment is especially important if (at least) some of the lawyers are “killed” (at least metaphorically). It also highlights the importance of a diverse and vibrant community of Historians to argue over the interpretations of the past and engage youth and citizens in considering their own place in history. In contrast to the (mercifully) relatively incompetent first term, this time around, they seem to have learned to move against this fallback line of defense as well. 

In doing so, the Administration and its fellow-travelers demonstrate their own short-sightedness. They think theirs is the final act and that the pendulum will not swing back in due course (even if not soon enough for many). If I were to overstep my role as a Historian and enter into a prophecy, I might say that “History will come back.” Instead, I will merely point out that it always has so far.


Baltic States

10/17/2025


No, not the Balkans. I’m talking Estonia, Latvia, and Lithuania (with honorable mention for Finland). As a traveler and as a Historian of Modern Europe, I had a big gap for this corner of the Afro-Eurasian continent that’s now (after a 12-day sojourn) patched (if not fully explored).

Big countries get all the press—in both History and current affairs. The political crisis in France the past few weeks has gotten far more coverage than the conflicts within the government in Lithuania, for example. Little countries (and the three Baltic States all qualify, with a combined population of about six million) get squeezed in between big powers, a situation often to their geopolitical detriment. The Baltics have been caught between Russians and Germans (and the Swedes in the 16-18C) ever since they were dragged into Christendom in the 13-14C. It hasn’t been pleasant and since the rise of European nationalism in the 19C, they each have bristled under the domination of their bigger brethren. 

When the Russian Empire collapsed under the twin blows of WWI and Lenin’s Revolution, they bolted for independence, only to be swept up by Stalin in 1940, Hitler in 1941, and Stalin again in 1944. It’s no wonder they were first out the door when the USSR collapsed in 1991. Each asserted that they were reviving their independence from early in the 20C. The Museum of the Occupation in Riga (Latvia) stacks these eras together in a powerful story of modern independence in the face of oppression.

It's also no wonder that they jumped into the arms of the European Union and NATO in 2004 as the best protector both of modern democratic norms and of military defense. The number of NATO flags flown in all corners of the three countries was remarkable and shows that these folks know what’s really important. Even more than the NATO and EU flags, the Ukrainian flag was everywhere. As fellow-sufferers of Russian aggressiveness, the Baltics are strongly committed to support Ukraine morally as well as financially. In both Latvia and Lithuania, they have renamed the street on which the Russian Embassy sits as the “Street of Ukrainian Freedom” and anti-Russian protests are an ongoing feature. Significant Russian-speaking populations in Estonia and Latvia make them uncomfortably similar to Ukraine as targets of Putin’s efforts to reconstruct the Russian Empire.

Beyond their geopolitical situation, it's important to recall that while we may lump the three countries together, there are sharp differences between them in history, culture, and language, although local animosities and rivalries seem to be suppressed. No two of the three languages are mutually intelligible, and none has any significant connection to their European neighbors. Estonian is closest to Finnish and Hungarian, but Latvian and Lithuanian are their own branch of the linguistic tree. Fortunately (and, again, typical of small countries) everyone in the big cities speaks multiple foreign languages, with English being ubiquitous. Lithuania is predominantly Catholic (tied to its close relationship with Poland for several hundred years (16-18C)). Latvia is mostly Lutheran, and Estonia is a mixed bag of post-religious, Lutheran, and Eastern Orthodox.

Lithuania was a major power in its own right in the Middle Ages. It was the largest country in Europe in the 15C, famously stretching from the Baltic to the Black Sea (far more important than the contemporaneous Henry VIII of England who gets vastly more press). It is now, of course, just a shadow of its former expanse (only about 7%). The Grand Dukes were remarkably religiously tolerant then and, among other groups, Jews flourished, becoming a significant portion of the population before Russia took over in the 18C. Russia confined Jews mostly to that territory (called the “Pale of Settlement”) and, of course, the population of Jews was almost eliminated during the Holocaust. There are moving memorials and museums to this lost culture in each country. Latvia and Estonia never grew beyond their own neighborhood. Instead, their principal ports, Tallinn and Riga, were key parts of the Hanseatic League, a Baltic-focused association of trading cities from the era (13-17C) before modern nation-states became the dominant form of political organization.

As former Soviet territories, they emerged in the early 1990s far behind Western Europe in terms of economic development, but have made considerable strides since then, especially Lithuania and Estonia, which are now not far from Spain in GDP per capita, leaving Russia far in their wake. They cling to their traditional culture even as they increasingly shift towards comprehensive European integration. While Wi-Fi is everywhere, Estonia is notable for its commitment to digitalization, especially of governmental processes. Each has its own sets of castles (mostly from the Middle Ages) and palaces (a couple of Versailles wannabes). Tallinn has a well-preserved medieval old city, Riga has a lovely Art Nouveau quarter, and Vilnius has some older buildings as well, but comes across much more as a bustling modern European capital. 

History, as I have repeatedly said, doesn’t hold many specific lessons for current events; there are too many differences in contexts. Nonetheless, given the current febrile political environment and the dark outlook we all face on several fronts, there is inspiration to be found in the example of the Baltics. They’ve been occupied for most of the last five hundred years. They have suffered—culturally and economically—far more than what we in the modern West are used to. Still, they persevered and, eventually, triumphed. In each of the three countries, they look back with pride not only on their uniqueness, but also on how they banded together in 1989 to create a human chain of more than 2 million people stretching from Tallinn to Riga to Vilnius to demonstrate a shared commitment to their independence and hope for freedom. They were likely as surprised as most that it came to fruition so quickly, when the Soviet empire collapsed in 1991, but they showed that dark times are not all end times. The event is called “The Baltic Way.” 

Overall, I had a successful trip. It was a good reminder that one of the benefits of travel is seeing the diversity of people in places, even (especially?) in parts of the world that we think are broadly familiar. There is diversity and pride in culture and history in every corner of the world and, in an important sense, everywhere is a corner.


The Number of Nines

10/11/2025

There is an ongoing (and seemingly perpetual) debate between historians and social scientists over the purpose, feasibility, and value of each other’s work and discipline. As a historian, I am “true to my school,” and can easily poke holes in the premises and theories of social scientists. At the same time, I am alive to the conflicts, limitations, and overreaching of my fellows.

Social scientists critique historians by saying that the latter fail to advance human understanding: the mere chronicling of human events might well be interesting and their stories might be engrossing, but without theories and models, we have no way of evaluating behavior and coming to an understanding of how and why people do what they did (and do). Historians might respond that their work often (usually) incorporates social science theories (e.g., the will to power/domination, the importance of community compliance and acceptance, or the economic value of personal fashion). Historians would also stress 1) that such theories only go so far in explaining historical actions, 2) that there is so much that is unknown about human motivation, and 3) that there is so much contingency in human events, that theories aren’t, in the end, worth very much.

Historians’ critique of social scientists builds off these points. They emphasize that theories and models may have all sorts of mathematical elegance, but for all their complex calculations, they are still crude approximations of behavior. For example, extensive studies have been made of the correlates of war as social scientists seek to understand what causes such conflicts. Historians remain dubious (to put it mildly) of such efforts, pointing out the significant difficulties of defining what is meant by war (and its various theoretical causes), not to mention the limited number of cases of war in human history, each of which has so much distinctive context as to make them incommensurable. 

A central component of social scientists’ defense of their approach is that they are at least trying to derive meaning from the sprawl of human experience. They emphasize that—as social scientists—they are pursuing an ever-greater approximation of an accurate depiction of the nature and practice of humanity. It is no accident (and here both groups agree) that “science” is part of the rubric of this set of disciplines. The methodologies and language used by social scientists—including experimentation, testing hypotheses, and statistical analyses—are all grounded in the “hard” sciences. Their aspiration is to replicate the amazing success of physics, chemistry, and biology over the past half-millennium only, this time, in terms of understanding economic behavior, voting patterns, and social structures. 

The value of science lies in its production of knowledge that enables us to predict how nature will work in the future. By “nature” here, I mean all manner of ‘things:’ blood vessels, comets, and the addition of a liter of nitric acid to five kilograms of granite. Science, however theoretical or “pure,” is not merely about understanding stuff; historically, it has had to lead—eventually—to our ability to interact with nature in the future. The success of science historically can be seen in the extent to which we as a species have gained confidence that we can predict this aspect of the future: the apple that fell from a tree onto Newton’s head gives me confidence that I could determine (with appropriate testing of the tree and stem, combined with an evaluation of the wind) when the apple that sits over my head will fall on me or how, with appropriate design, the ceiling that sits over my head will NOT fall on it. The “scientific revolution” of the 16-18C is a short-hand description of not only the process of gathering such knowledge, but, more importantly, the dramatic boost in human confidence in the understanding, management, and predictability of nature.

Science has leveraged mathematics to foster this sense of confidence and to accommodate its incompleteness (science inherently being a “work in progress”). We can’t be entirely sure that some rogue asteroid won’t hit our planet, but we can speak of a 99.9999% degree of confidence. This is close enough to “true” that we can thereby get on with our lives. This (apparent) precision is an inestimable part of creating the appearance of accuracy as a source of trust.
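The arithmetic behind "counting nines" is simple enough to sketch. As an illustration only (the function name is my own, not anything from the post), a confidence level can be converted into its count of nines:

```python
import math

def count_of_nines(confidence: float) -> float:
    """Number of leading nines in a confidence level, e.g. 0.999999 -> ~6."""
    if not 0 < confidence < 1:
        raise ValueError("confidence must be strictly between 0 and 1")
    # Each additional nine shrinks the residual doubt by a factor of ten,
    # so the count of nines is the (base-10) order of magnitude of that doubt.
    return -math.log10(1 - confidence)

print(round(count_of_nines(0.999999), 1))  # the asteroid example: 6.0 nines
print(round(count_of_nines(0.9), 1))       # a social-science-grade claim: 1.0
```

The logarithmic scale makes the essay's point concrete: going from 90% to 99.9999% confidence is not a 10% improvement but a hundred-thousand-fold reduction in residual doubt, which is why the "hard" sciences' extra nines are so hard-won.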

Few parts of the social sciences can aspire to this kind of precision, much less this degree of approximation of ultimate accuracy. There are too many variants and vagaries in human psychology to support it and there are too many contingencies in the future course of events to enable predictability. Stated simply, social scientists necessarily produce fewer “9s” than their “hard” science counterparts (I will, of course, grant that there are no bright lines here (for example, between physiology and psychology).) What social scientists can produce at best are plausibilities, likelihoods, and indications as to whether Amazon Prime Day prices will result in larger sales volumes, or recent diplomatic gestures regarding Palestine will cause Israel to moderate its attacks. 

Academic social scientists at least insert some number of caveats into their work, even if these often don’t make it into the popular press reports or into what these same academics say when they become talking heads on TV or the Net. Applied social scientists (by which I mean the vast majority of white-collar jobs such as business managers or other bureaucrats, salespeople, educators (including history teachers), or therapists) are either unaware that they don’t know exactly what they’re talking about or have forgotten it.

As I have repeatedly remarked, historians are also at risk of forgetting the necessary caveats in their work or, worse, of forgetting that our knowledge of the past (or as close as we can get to it) provides no more basis for predicting future behavior than for any other profession. Generally, however, we avoid even talking about such precision or accuracy. In other words, we don’t even claim any “9s.” 



    Condemned to Repeat It --
    Musings on history, society, and the world.

    I don't actually agree with Santayana's famous quote, but this is my contribution to my version of it: "Anyone who hears Santayana's quote is condemned to repeat it."
