Steve Harris

Change, Transition, and Crisis

2/27/2026

In our current bleak days, it’s easy to forget that throughout history there’s always a crisis going on. Those in the middle of a specific crisis (that would be us!) often lose perspective amid the pressures and urgencies of the moment. Indeed, I might suggest that crises are often (usually?) the product of overfocusing on those urgencies and putting off dealing with the important underlying issues until the pressures build up and crater into their own urgent crisis down the road.

Even if I am wrong in my causal analysis, it’s hard to argue against the notion that crises arise due to the accumulation of change, often sprinkled with some dramatic event which manifests those changes. Since History can be characterized as the study of change over time and since Historians are not above (over-)dramatizing their work, it’s no wonder that “crisis” appears frequently in History book titles.

There’s the “July Crisis” (summer of 1914 leading up to WWI), the “General Crisis of the 17th Century,” Marx’s ongoing “crisis of capitalism,” just to name a few and not to mention the innumerable localized or brief crises scattered about, usually of a geopolitical or economic nature. We have “constitutional crises” (Dred Scott, Nixon, Trump) and the Brits have “cabinet crises.” It’s hard to tell the difference between a “problem” and a “crisis”; so much so that I suspect using the term “crisis” is just a way to claim attention, either contemporaneously or historically.

Most History books that aren’t “crisis” centered focus rather on “transitions.” How many times have I read: “It was a time of transition from [A] to [B]”? As if. As if there’s not always some transition going on. The nature of history (small h) is that change is always happening: from the Mughals to the Raj, from sail to steam, from foragers to growers, from search engines to AI. As Johan Goudsblom suggested, pretty much all of history can be summed up as: first, nobody has xx, then some folks have xx, then everybody has xx. Despite our idealization of the past captured in some historical “snapshot,” in fact (with the possible exception of “revolutions”) there is no stability from which any particular change is a remarkable difference. And, of course, few of these transitions have much in the way of a clearly demarcated starting point or ending.

Perhaps crises are merely the crux points of transitions.

It’s also worth noting that much of this crisis/transition sensibility comes from elites who have the ability to observe this level of change. There are many (most?) whose lives are precarious and in a constant state of crisis; but they don’t write books or blogs.

In any event, characterizing some change as either a crisis or a transition is, likely as not, merely a rhetorical device. Sometimes these are useful, as when a historian puts a new frame of interpretation on events, such as the shift from plain old cell phones to “smart” phones or the shift in political alignment among Southern whites from the Democrats to the GOP in the aftermath of the mid-20C Civil Rights movement. Sometimes, however, putting “crisis” in the title is just a way to sell books.

The insightful historian Adam Tooze (2022) characterizes our current era as being in a “polycrisis,” highlighting the multiple overlapping issues that seem to be coming to a head in the 2020s. It’s not a terrible word (and certainly preferable to the overused “perfect storm” metaphor), but as with the 17C or the 1930s (just to pick two examples), any good crisis worth its name always includes multiple components and angles.

One important perspective that arises from looking at the history of crises is that whoever comes out the other end figures out how to make do, and eventually we get to us in the present day. In other words, however much contemporaries bewail their particular circumstances as the end of the world, it isn’t. And, as noted from the beginning of this blog series, there are few lessons to be taken from these events/developments and fewer occasions for judgmentalism. If the Russians had deployed a bit more sophisticated military planning in 1914, the “July Crisis” would likely have ended up as a relatively forgettable third Balkan War rather than a continental conflagration. While we know the damage that resulted in the event, we don’t know how the world would have been along the lines of that alternate history; so we can’t say which was worse or start blaming anyone for what we ended up with. Crises that get resolved get forgotten and subsumed into the flow of history.

Perhaps it’s a fundamental human addiction to adrenalin that makes it so attractive/effective/necessary to hype things up and overdramatize the mundane. Perhaps parts of us secretly want to (as the Chinese curse has it) “live in interesting times.” On the other hand, it’s hard to blame those (myself included on some scores) who try to bang the drum loudly when they see peril coming and unattended to. We revert to whatever devices lie at hand, including rhetoric, to rouse the sleeping populace, even at the risk of overusing and devaluing the language to the point of moral exhaustion. Then again, Paul Revere would not likely have cried out to Lexington and Concord that “the British crisis is coming.”

The real test of a crisis is what we do with it. As Winston Churchill (perhaps apocryphally) and Rahm Emanuel (certainly) said: “Never let a good crisis go to waste.” Still, historically, most crises do; folks muddle through until a bunch more changes pile up, we go through a transition or two, and then walk into the next crisis. Those who then try to seize the day are called “revolutionaries.” They, too, almost always end up in a crisis of their own soon enough.

Degrees of Indistinction

2/20/2026

The State of California once led the country in establishing a vision for higher education and deploying resources that made college education available to millions of its residents and fostered world-class research across dozens of disciplines. Its latest move shows that this mid-20C spirit has faded, eviscerating the meaning of the degrees granted at both the high school and college levels. The institutions which were charged with realizing the original vision have fallen prey to self-preservation and sclerosis.

Specifically, a new law requires most California State University (“CSU”) campuses to offer admission to any California high school student graduating with at least a 2.5 grade point average (i.e., C+/B-). Now (according to ChatGPT), the average high school GPA has inflated dramatically over the past 40 years: in the 1980s it was under 2.4; it is now well over 3.0. This means that about 70% of graduates have at least a 2.5 GPA; i.e., below-average students are now encouraged to go to college.

[I have to acknowledge that as a graduate in the 1970s from an upper-middle class environment with lots of academic support and resources, I am part of the incumbent elite in this story.]

The combination of these facts and the new law raises several questions:

1) What is the purpose of a college education?
2) How does this law help students?
3) How does this law help legislators and the CSU system?
4) Why would most students who didn’t do all that well in high school want to go to college?

The post-WWII expansion of higher education was seen as an important means of building a strong middle class in the US, along with facilitating the growth of the economy (especially in the service and corporate sectors). A college degree was a mark of distinction and was well compensated in the employment histories of those who achieved it. While less than 2% of the population had a college degree in 1900, by 1960 this grew to over 7% and to over 16% by 1980. Since then, the rate has again more than doubled to about 38%.

The marginal benefit of a college degree, therefore, had to fade over time. The mantra of “to get a good job, get a good education,” was tremendously effective in engaging students (and their parents) to steer their focus in this direction. But the surge in degrees necessarily means that they’re no longer so distinctive and economically valuable.

At the same time, broad social changes across the country throughout the second half of the 20C expanded the likely pool of college students and the social necessity of expanding college access. I don’t think it’s accurate to characterize the continued push for degrees to those aspiring to increased socio-economic status as some sort of deception or manipulation by elites, but there is some irony in the fact that this “democratization” of college degrees has coincided with their relative reduction in economic value.

For decades, colleges have deployed resources, techniques, and programs to enhance and accelerate graduation rates for their students. Much of this effort responds to the relative decline in students’ capabilities (for a variety of reasons) at the time they enter college. Overall, these efforts were a modest success—at least on their own terms. The new California law continues this trend. It builds on an embedded belief that the purpose of a college is to produce college graduates. Clear thinking about the meaning and purpose of becoming one of those graduates is, however, harder to find.

I’ve talked before about the vocationalization of college education: the focus on job training and the diminution of the liberal arts. This process has developed in tandem with the “industrialization” of college education: turning colleges into assembly lines for the production of graduates with degrees. 

The new law now encourages high school graduates with below-average performance to go to college, where they will likely struggle even more than they did in high school, spending five years or so and about $100,000 (plus living expenses) to get a degree that will land them in the middle of the mass of job seekers.

Wow! Sounds like a great deal. 

That would be tough enough in ordinary times, but the AI-induced incipient upheaval in the entry-level white collar job market undermines any confidence that this traditional scenario will continue. (See 012425, Morlocks, for a comment on the long-term societal consequences).

Besides providing more “opportunity” for young Californians, the new law also seeks to help those college campuses which have faced enrollment declines in the last several years. [This includes SF State, where enrollment drops led to my “retirement” in 2024 (so one might think I would benefit from this law).] In other words, let’s artificially prop up demand for a service and institution that can’t cut it in the marketplace.

This is the sort of legislation that generates a nice-sounding press release and campaign PR for its supporters (it passed unanimously in both houses). It’s no wonder that most voters have little enthusiasm for our democratic institutions. Elected officials are more concerned with sound bites and demonstrating “action” than with actually thinking about what would be useful to young people and candidly re-assessing the nature and purpose of the public university systems and the extensive expenditures we all make to support them.

I’ve spoken before about the limited utility of analogizing from history, mostly in geopolitical contexts. It’s especially true here; we have no good sense of how this will play out, but the changes are likely to be deep and wide.


The End of the World

2/13/2026

In my recent comment on Bill Gates’ piece on climate (110725), I criticized his dismissive remark that the climate crisis was not the “end of civilization.” I pointed out how, for some individual victims and societies, it—literally—is. There are important civilizational questions about how we and those to come will choose to memorialize those whose lives or cultures were cut off in this way. The Museum of Climatically-Extinguished Cultures and Creatures is likely to be pretty crowded by the 22C.

Still, while my life is not likely to come to an end due to climate change, nor is civilization in general likely to collapse by 2039 (my planning horizon according to the Social Security Administration’s life expectancy tables), this prospective doom got me wrestling with why and how I might think about those whose lives will continue into 2040 and beyond and how I should conduct myself in the meantime. In other words, since civilization will have come to an end as far as I’m concerned, why should I care about those left behind?

Philosophy has proposed and history has demonstrated several answers to this question. Just to put a bit of structure on this issue, we might divvy up our impact into these categories: the personal memories of those who will continue on, our biological progeny, physical manifestations of our existence, and cultural traces of our impact on the world. We might also distinguish between the impact/legacy of the few prominent people in the world and the vast remainder of us. Finally, we should note that many folks have lived based on their expected status/treatment in some shape of an afterlife (aka Heaven, Hell, Nirvana, Valhalla). However, since I don’t believe in an afterlife, doing stuff now to get credit in the hereafter seems futile (as well as a bit tawdry).

Based on how people behave, many folks want to be remembered by those still around or to come years and centuries post-mortem. And most are, at least by family and friends. But memory is a fickle thing: not only does it fade over the years, but by the time memories are passed down to succeeding generations, distortions are inevitable. Very, very few are likely to be remembered by anyone in any meaningful way more than 50-70 years after their own passing. Maybe that’s as far ahead as we can imagine, so we don’t care about our more distant legacies. The memories of the 5 billion (+/-) people who died over the course of the 20C are going quickly, and in 30-50 years what will be left? My niece and nephew were in their teens when my mother passed. In fifty years, they will be in their 80s with only faint wisps of her (either direct or filtered through their father) remaining. I have a friend who is into genealogy and has reconstructed some of his family lines back for several centuries. It seems personally satisfying to him, but those long past have left only a name and a few traces of themselves. With all due respect to Ancestry.com et al., I don’t aspire to be an entry on a long list compiled by a greatx8 niece in the later 22C.

In terms of cultural remembrance, very few leave something behind. Wikipedia includes just over 2M biographical entries, and maybe ten times that number have some sort of bio bit somewhere. Out of 100B people who ever lived, that’s not very good odds at being remembered in this way. Donating money will get you a building or a pew, but as with the above examples, the names of those who do will become only words with no meaning behind them all too soon (ditto for street names). A few folks will be memorialized in published articles or have their archives dug up by Historians or click-bait chasers. Of course, with big bucks or some luck, you can play in the big leagues: sainthood and multiple churches, universities (unless your money came from enslavement), or cities (Charlestown), states (Penn-sylvania), countries (the Philippines, Bolivia); but that’s likely not more than a few thousand all in. Scientists and doctors have a nice racket going in naming diseases and natural phenomena after themselves (the Humboldt Current, the Higgs boson, and Alzheimer’s disease all come to mind). The pinnacle is likely the Taj Mahal: prominence at the less than one-in-a-billion level. All in all, for ordinary folks, we will likely get swallowed up in the maw of time. Even if some electronic record remains, who will look for it or do anything with it? (In this regard, having some record of mine accessed by an AI/Borg as it hoovers up items from the past for some college essay in 2076 doesn’t really seem to count!)

In sum, if we can’t count on being remembered in any meaningful way beyond a generation or two, the purpose of memorialization is more likely the self-satisfaction of those who seek to be remembered; in other words: ego. If you think you’re an exception, see Shelley’s “Ozymandias” (1818).

So, if personal, name-and-likeness legacy is a racket, is there anything I can leave behind? I think so, but it’s not likely to be identifiable. I used to say, when I was teaching in college, that my impact on my students was more likely to be in things they remembered or ways of thinking they learned, even if they couldn’t remember my name or how they might have learned them originally. I suspect the same is true elsewhere, in terms of family, social, and cultural influence (or even charitable donations).

If the end of the world (objectively) coincides with the end of the world (subjectively), then it won’t really matter. If the objective world carries on, who knows what direction things might take? Preserving the possibilities for future discovery is about as far as I can see; that may be enough, and my belief that I helped it to do so is valuable to me and sufficient as a motivation.


A Poisoned Chalice

2/6/2026

If it weren’t for all the mayhem and pain likely to come in the meantime, it would be a blessing that we have 2¾ years until the sturm-und-drang of the next Presidential election. Of course, in our perverse media/money-driven system, the major potential candidates are already getting organized and positioning themselves. Even without the particularly dysfunctional nature of our political parties, it’s not likely that we will see a candidate who is really ready to tackle the fundamental issues facing the country. Indeed, whoever wins is likely to fail badly.

Of course, I have zero confidence in the likely GOP standard bearers. If you multiply the imagination, compassion, and integrity of any of Vance, Rubio, Ron DeSantis, or Ted Cruz, you might well get a negative number. Anyone else on that side with the capability of being President has either bowed out of our Trumpian-dominated public life or has a sufficiently low profile as to have no chance of serious consideration. As I have noted recently, the entire party seems wrapped up in issues and images of the past which, combined with the remnants of its “small government” philosophy, ensures a passive approach to the dire challenges ahead.

There are half a dozen Dem Governors in the mix (Newsom, Shapiro, Beshear, Pritzker, Wes Moore, Whitmer) plus Pete Buttigieg. They all score much higher on the imagination, compassion, and integrity combination, but face a comparable internecine (“progressives” vs. “moderates”) drag within their “party.” Once elected, they would also have to deal with reconstructing the federal government, in terms of both personnel and policy, in the aftermath of the current evisceration. Even with my suggestion of an accelerated remediation (see my proposed EAGER Act, 052325), much of their first term would be spent getting systems and programs back to the ground floor from the current sub-basement level and dealing with the “normal” range of issues and crises. 

Getting Congress to act, even if it had modest Dem majorities in both houses, presents another ubiquitous hurdle to meaningful action. After all, at the end of the day, the Dems are only marginally more cohesive and effective than the GOP. They have their own share of personality squabbles, infighting, and inertia. They will also be distracted by the shiny toys of power and the opportunity to go after Trump and his many corruptions, as well as “preventing” his abuses from recurring. These are worthy targets in the abstract, but when establishing the priorities for taking care of the country, they have to be relatively low on the list.

The prospect of making fundamental changes will also run into the electorate’s unwillingness to recognize root causes and bite the short-term bullets necessary for long-term improvement. Indeed, the one-word summary of today’s popular concern is “affordability.” However, the economic data show that this isn’t really a problem for most of the (middle class to well-off) folks currently complaining. There is something much deeper going on and it’s not susceptible of quick fixes. This includes the loss of the “American Dream” (some version of “Ozzie and Harriet”), uncertainty about our place in the world (aka “globalization”), and a loss of confidence in society’s and government’s ability to maintain coherence and progress.

Even if addressing those historical concerns were feasible, it wouldn’t reach the underlying problems that demand prompt and radical action: climate, inequality, housing, and AI’s imminent disruption of our workforce and demographics.

There are, of course, well-articulated proposals to deal with this list (except for AI where no one has any idea what to do). They require, however, a degree of radicalness that is alien to our self-satisfied and incremental political culture. New tax structures can generate much of the necessary revenue for comprehensive health care, housing, and basic income. Climate changes can be addressed. It is far more a matter of political will than of developing solutions.

The best model for breaking out of these doldrums in the US is the famous “100 days” of the first term of FDR’s administration in 1933. Huge Congressional majorities and a widely-recognized major economic crisis enabled some radical thinking to take hold. There is caution in this tale, however. A conservative Supreme Court struck down many components of FDR’s program, and it’s not at all clear how effective those moves were in ultimately providing an exit ramp from the Great Depression.

All in all, the chances that the US will be in better shape in 2032 than in 2028 are, therefore, not so great. Indeed, things might well be worse given the amount of damage that is currently being done (and I haven’t even touched on international complications yet). So, if a moderately progressive administration comes in, they’re not likely to look very successful four years on. Even if there is great success on fixing the current damage, rebuilding institutions, and laying the foundations for the solutions to long-term problems, that administration is not likely to be able to give much of an answer to the perennial question of electoral politics: “Are you better off now than you were four years ago?” 

Given the friable nature of the electorate, a further zig-zag is quite plausible. Indeed, the volatility of this zigzagging is part of what makes more extreme parties and leaders increasingly popular. This is especially visible in Europe. There are structural problems, to be sure: the difficulty of enacting programs with demonstrable effects within a single term. Added to this is the fundamental nature of the problems facing the country and the difficulty of devising solutions. The electorate, however, has—even in the best and most deliberate times—little patience for considering these constraints. The upshot is that whoever wins risks the specter of failure and subsequent rejection.



    Condemned to Repeat It --
    Musings on history, society, and the world.

    I don't actually agree with Santayana's famous quote, but here is my contribution, my own version of it: "Anyone who hears Santayana's quote is condemned to repeat it."
