Steve Harris

Democracy and Complexity

3/13/2026

We’ve come a long way from the citizen assemblies of ancient Athens. Some towns in New England still make decisions by getting all the local citizenry together, but that doesn’t work with larger groups of people. The solution—more-or-less standardized since Madison and the gang in Philadelphia (1787)—is “representative democracy.” Having the masses choose which members of the elite would make decisions for the whole society was a way to keep power in the (relatively trustworthy and reliable) hands of those with a mix of education and wealth, while providing a means for the demos to express itself and retain nominal oversight of the process and tools of government.

Many things have changed in the last 250 years in the US and around the world. In terms of the condition of the mass electorate, we have industrialization, an information tsunami, mass education and culture, and a whole lot more people. While the precise spread between elites and masses has varied over time and across different cultures, the fundamental differential remains. (It’s one of the discrepancies which our current idealization of democracy buries.) In terms of the world, these same factors, particularly the nature and extent of technologies (high and low, material and cultural), have made for a much more complex environment. On top of these phenomenological changes, our knowledge levels (or, at least, our beliefs about what we know) about causation and effects have drastically increased the difficulty of making decisions. Science and experience have made us more aware of the broad implications and longer-term effects of any nominally straightforward policy option. Whether we’re dealing with tariffs, vaccines, immigration, or AI regulation, toting up the pluses and minuses of any potential policy decision now involves a lot more entries. Keeping track of them, weighing them, and sorting through the trade-offs has become a much more difficult process. We can see this in the construction of bureaucracies and administrative regimes (e.g., taxes, Medicare, education policy, and various schemes of discrimination and preference).

What challenges does this increased complexity of life raise for the practice of democracy?

The theory of modern democracy is that informed citizens deliberate and select representatives who devote sufficient time and intellect to comprehend the issues and resolve the political questions inherent in the differing beliefs, interests, and priorities of any large group of people. Elected representatives should, in theory, become experts in sorting policy choices as well as in interpreting the values of their constituency and applying them to those choices. That’s “normal” politics. However, there are problems with this theory in terms of both the represented and the representatives.

First, the levels of complexity and detail are so great that even most elected representatives (especially at the state and federal levels) can’t begin to cope and, de facto, delegate their decisions to either their party leaders, their colleagues (vote-swapping), or their staff members or the implementing bureaucracy. Each presents its own particular problems and risks of corruption and distortion, but in essence, our representatives are working through representatives, most of whom aren’t elected. This was not so much of an issue in the 18C or early 19C, but has swamped the process since the mid-20C.

However, the more fundamental concern is that the electorate, for all its increased education and access to information, can’t make more than a broadly directional choice when it comes to electing representatives. We’re all different, but there’s only a limited set of possible representatives to choose from. Politics and policy intersect in different mixes; political parties help sort the alignments, but present their own complications and corruptions. The result, in our over-saturated media age, is decision by sound-bite, charisma, and money.

Representative democracy thus looks inherently problematic and even more so these days. Unfortunately, neither of the two most popular solutions is a real improvement. The more well-established alternative is the popular referendum, allowed in about half the states. As we in California know all too well, this method of popular participation is messy and subject to the same money/media distortions as other modes. One principal problem is that ordinary voters are called on to read through and understand an extensive statutory implementation of some policy scheme. By the time you winnow the electorate down to those who have the capability to work through these challenging questions (e.g., tax codes, environmental regulation, social benefit eligibility), are interested enough in public policy, and have the economic wherewithal to devote the necessary study hours, you’d end up with a fairly skewed (and self-selected) group of citizens; hardly a representative body.

The same problem undercuts the various proposals to have “ordinary” citizens, usually selected at random, participate either as part of regularly-established legislatures or as a stand-alone “citizens’ assembly” with a role in the legislative process. It’s nice symbolism, but I suspect this innovation would merely transfer power to the staffs who explain things and the bureaucracies which implement them.

I have been thinking about this issue in the context of an upcoming lecture marking the tenth anniversary of Brexit, the 2016 decision which propelled the United Kingdom out of the European Union (pretty much a disaster on all fronts, of which more in a coming posting). British voters were presented with a nice short “yea-or-nay” question for an immensely complicated situation about which there was no clear understanding of what was likely to ensue.

Still, I think there is a role for referenda, as long as they are relatively straightforward and high-level AND subject to either implementation by the legislature or (unlike Brexit) a further ratification of a final detailed plan. Capping referenda questions at 100 words would likely provide broad policy direction without pretending that the electorate reads the third subclause of the fourteenth section of the proposed law.

Not a great overall solution to an increasing problem of democratic governance in the 21C, but worth a shot. We need other ideas about how to balance the complexity of the real world with ensuring that ordinary folks can help steer the ship.


One Perspective on Capitalism

3/6/2026

I recently bought a book (no news there!) and faced a panoply of choices as to price, delivery speed, format, and book quality. Multiple websites provided a great demonstration of the ubiquity of capitalism at work in modern life. Consumers’ preferences are dissected and products designed to meet those particularities. So, it’s no small irony that the book in question was Sven Beckert’s new doorstop (1000+ pages) providing a comprehensive history of “Capitalism.” As we said when I was in the law biz: “res ipsa loquitur” (“the thing speaks for itself”).

As a work of history, “Capitalism” is well thought-through and remarkable in its research, if rather too heavy for the casual reader. Beckert shows that capitalism was a global phenomenon, drawing on practices and experiences far beyond the usual “it’s all about Europeans” (including the US) framework. He also shows that it has deep roots, extending far earlier than the usual early-modern/industrial revolution/robber barons/globalization storyline. After all, as I have noted elsewhere, greed and profit are hardly modern inventions. He does a good job, as well, in blowing up the myth of “laissez-faire,” the idea that large businesses have developed apart from and in spite of governmental activity.

Beckert applies a phenomenological focus; i.e., he concentrates on the actual practice of “capitalists.” Marx merits fewer than 30 mentions, and other theorists (pro and con) are similarly sidelined. I would have preferred a more inclusive approach, but his is a legitimate choice and his story benefits from its grounding in the real world. My bigger concerns are that 1) he doesn’t pin down the definition of the concept he’s writing about, and 2) he doesn’t wrestle with how the capitalist mentality spread and swamped other values-based cultural systems. There is, to be sure, a reference to capitalists’ focus on markets/commodities/money, but there is something in modern commercial practice that is different from the mindset of traders a thousand-or-two years ago, and he doesn’t grab on to it.

This, to me, is the central issue. Capitalism has been a principal strand in the story of the modern world, whether economic, political, or ideological. Historians in general, however, are loath to take on psychological changes, however fundamental they might be. There’s good reason for this, since the evidence is sparse and largely inferential and there’s always a risk of self-projection. And yet, without understanding or at least suggesting some ways in which historical actors were motivated, we can’t come close to understanding how history came about.

As I said a few weeks ago (Capitalism and Me, 012326), I see capitalism as a culture (i.e., a socio-economic-epistemic system) in which we define ourselves, evaluate others, and determine how to act across our lives principally from an economic perspective: morals are secondary to money. The practices and institutions which manifest this mentality, in contrast, are what Beckert is talking about.

It is important not to make moral judgments about these institutions per se. Lord Acton famously observed that “power corrupts and absolute power corrupts absolutely.” In a similar vein, St. Paul found money to be the “root of all evil.” They were, however, both wrong. A critic of Acton saw that power itself was not the corrupting source, but rather the vehicle by which human corruption was revealed. Similarly, we can see that money, too, is just the means by which evil is exercised. The fault, in other (Shakespeare’s) words, is not “in our stars” (or our wallets or our ability to affect others), but “in ourselves.” The economic power which the practice of capitalism concentrates in “capitalists” merely allows them to demonstrate their moral (dis-)abilities and their unwillingness to face the complexities of their choices. (These fundamentals of human nature are essential to parsing the semantic soup that surrounds the term “capitalism.” I will have more to say in a later posting about the different ways in which that term and its cognates, “capital” and “capitalist,” are used.)

It’s not difficult to see the practice of “capitalism” as the dominant economic system of the modern era. As with other human activities, its significance is the product of the confluence of means, motive, and opportunity. The motivation of capitalists, in my framing, is rooted in the deep nature of humans—a desire for security (both physical and psychological) and the many ways in which that desire is overextended, as most pithily captured in the traditional deadly sins of greed, envy, gluttony, and pride. The laissez-faire mythos may have some nuggets of truth, but it’s mostly about the desire of “capitalists” to claim all the credit for the work of many; in other words: ego (also not a new story).

So, from a historical perspective, the rise of practical capitalism is more a function of means and opportunity, which are largely exogenous factors and which we can parse from three angles. First, institutions and practices: banks, corporations, trading networks, advertising, governmental actions, etc. Second, new technologies which have created new productivity and economies of scale and scope (not least in terms of transportation and communications). Third, population growth and density, which have created large numbers of consumers, thereby providing the demand which pays prices well above the (new, lower) marginal cost of goods (i.e., more profitability). These factors are the normal materials of historical analysis, and we can trace their manifestation at both the personal and societal levels. But we can’t forget that without the psychological urge, there would be no capitalism at all.

Beckert focuses on these exogenous factors, principally the first and second. I’ve got a bunch more reading lined up on that score and related topics; perhaps someone has taken on this moral/psychological angle in historical perspective. I’ll be back with more on the semantics, the significance, and the solutions—in due course.


Change, Transition, and Crisis

2/27/2026

In our current bleak days, it’s easy to forget that throughout history there’s always a crisis going on. Those in the middle of a specific crisis (that would be us!) often lose perspective amid the pressures and urgencies of the moment. Indeed, I might suggest that crises are often (usually?) the product of overfocusing on those urgencies and putting off dealing with the important underlying issues until the pressures build up and crater into their own urgent crisis down the road.

Even if I am wrong in my causal analysis, it’s hard to argue against the notion that crises arise due to the accumulation of change, often sprinkled with some dramatic event which manifests those changes. Since History can be characterized as the study of change over time and since Historians are not above (over-)dramatizing their work, it’s no wonder that “crisis” appears frequently in History book titles.

There’s the “July Crisis” (summer of 1914, leading up to WWI), the “General Crisis of the 17th Century,” Marx’s ongoing “crisis of capitalism,” just to name a few; not to mention the innumerable localized or brief crises scattered about, usually of a geopolitical or economic nature. We have “constitutional crises” (Dred Scott, Nixon, Trump) and the Brits have “cabinet crises.” It’s hard to tell the difference between a “problem” and a “crisis”; so much so that I suspect using the term “crisis” is just a way to claim attention, either contemporaneously or historically.

Most History books that aren’t “crisis”-centered focus instead on “transitions.” How many times have I read: “It was a time of transition from [A] to [B]”? As if. As if there’s not always some transition going on. The nature of history (small h) is that change is always happening: from the Mughals to the Raj, from sail to steam, from foragers to growers, from search engines to AI. As Johan Goudsblom suggested, pretty much all of history can be summed up as: first, nobody has xx; then some folks have xx; then everybody has xx. Despite our idealization of the past captured in some historical “snapshot,” in fact (with the possible exception of “revolutions”) there is no stability from which any particular change is a remarkable difference. And, of course, few of these transitions have much in the way of a clearly demarcated starting point or ending.

Perhaps crises are merely the crux points of transitions.

It’s also worth noting that much of this crisis/transition sensibility comes from elites who have the ability to observe this level of change. There are many (most?) whose lives are precarious and in a constant state of crisis; but they don’t write books or blogs.

In any event, characterizing some change as either a crisis or a transition is, likely as not, merely a rhetorical device. Sometimes these are useful, as when a historian puts a new frame of interpretation on events, such as the shift from plain old cell phones to “smart” phones or the shift in political alignment among Southern whites from the Democrats to the GOP in the aftermath of the mid-20C Civil Rights movement. Sometimes, however, putting “crisis” in the title is just a way to sell books.

The insightful historian Adam Tooze (2022) characterizes our current era as being in a “polycrisis,” highlighting the multiple overlapping issues that seem to be coming to a head in the 2020s. It’s not a terrible word (and certainly preferable to the overused “perfect storm” metaphor), but as with the 17C or the 1930s (just to pick two examples), any good crisis worth its name always includes multiple components and angles.

One important perspective that arises from looking at the history of crises is that whoever comes out the other end figures out how to make do, and eventually we get to us in the present day. In other words, however much contemporaries bewail their particular circumstances as the end of the world, it isn’t. And, as noted from the beginning of this blog series, there are few lessons to be taken from these events/developments and fewer occasions for judgmentalism. If the Russians had deployed a bit more sophisticated military planning in 1914, the “July Crisis” would likely have ended up as a relatively forgettable third Balkan War rather than a continental conflagration. While we know the damage that resulted in the event, we don’t know how the world would have fared along the line of that alternate history; so we can’t say which was worse or start blaming anyone for what we ended up with. Crises that get resolved get forgotten and subsumed into the flow of history.

Perhaps it’s a fundamental human addiction to adrenalin that makes it so attractive/effective/necessary to hype things up and overdramatize the mundane. Perhaps parts of us secretly want to (as the Chinese curse goes) “live in interesting times.” On the other hand, it’s hard to blame those (myself included on some scores) who try to bang the drum loudly when they see peril coming and unattended to. We revert to whatever devices lie at hand, including rhetoric, to rouse the sleeping populace, even at the risk of overusing and devaluing the language to the point of moral exhaustion. Then again, Paul Revere would not likely have cried out to Lexington and Concord that “the British crisis is coming.”

The real test of a crisis is what we do with it. As Winston Churchill (perhaps apocryphally) and Rahm Emanuel (certainly) said: “Never let a good crisis go to waste.” Still, historically, most crises do go to waste; folks muddle through until a bunch more changes pile up, we go through a transition or two, and then walk into the next crisis. Those who then try to seize the day are called “revolutionaries.” They, too, almost always end up in a crisis of their own soon enough.

Degrees of Indistinction

2/20/2026

The State of California once led the country in establishing a vision for higher education, deploying resources that made college education available to millions of its residents and fostered world-class research across dozens of disciplines. Its latest move shows that this mid-20C spirit has faded, eviscerating the meaning of the degrees granted at both the high school and college levels. The institutions charged with realizing the original vision have fallen prey to self-preservation and sclerosis.

Specifically, a new law requires most California State University (“CSU”) campuses to offer admission to any California high school student graduating with at least a 2.5 grade point average (i.e., C+/B-). Now (according to ChatGPT), the average high school GPA has inflated dramatically over the past 40 years: in the 1980s it was under 2.4; it is now well over 3.0. This means that about 70% of graduates have at least a 2.5 GPA; i.e., below-average students are now encouraged to go to college.

[I have to acknowledge that as a graduate in the 1970s from an upper-middle class environment with lots of academic support and resources, I am part of the incumbent elite in this story.]

The combination of these facts and the new law raises several questions:

1) What is the purpose of a college education?
2) How does this law help students?
3) How does this law help legislators and the CSU system?
4) Why would most students who didn’t do all that well in high school want to go to college?

The post-WWII expansion of higher education was seen as an important means of building a strong middle class in the US, along with facilitating the growth of the economy (especially in the service and corporate sectors). A college degree was a mark of distinction and was well compensated in the employment histories of those who achieved it. While less than 2% of the population had a college degree in 1900, by 1960 this grew to over 7%, and to over 16% by 1980. Since then, the rate has again more than doubled, to about 38%.

The marginal benefit of a college degree, therefore, had to fade over time. The mantra of “to get a good job, get a good education,” was tremendously effective in engaging students (and their parents) to steer their focus in this direction. But the surge in degrees necessarily means that they’re no longer so distinctive and economically valuable.

At the same time, broad social changes across the country throughout the second half of the 20C expanded the likely pool of college students and the social necessity of expanding college access. I don’t think it’s accurate to characterize the continued push for degrees to those aspiring to increased socio-economic status as some sort of deception or manipulation by elites, but there is some irony in the fact that this “democratization” of college degrees has coincided with their relative reduction in economic value.

For decades, colleges have deployed resources, techniques, and programs to enhance and accelerate graduation rates for their students. Much of this effort responds to the relative decline in students’ capabilities (for a variety of reasons) at the time they enter college. Overall, these efforts were a modest success—at least on their own terms. The new California law continues this trend. It builds on an embedded belief that the purpose of a college is to produce college graduates. Clear thinking about the meaning and purpose of becoming one of those graduates is, however, harder to find.

I’ve talked before about the vocationalization of college education: the focus on job training and the diminution of the liberal arts. This process has developed in tandem with the “industrialization” of college education: turning colleges into assembly lines for the production of graduates with degrees. 

The new law now encourages high school graduates of below average performance to go to college where they will likely struggle even more than they did in high school, to spend five years or so and about $100,000 (plus living expenses) to get a degree that will get them into the middle of the mass of job seekers. 

Wow! Sounds like a great deal. 

That would be tough enough in ordinary times, but the AI-induced incipient upheaval in the entry-level white collar job market undermines any confidence that this traditional scenario will continue. (See 012425, Morlocks, for a comment on the long-term societal consequences).

Besides providing more “opportunity” for young Californians, the new law also seeks to help those college campuses which have faced enrollment declines in the last several years. [This includes SF State, where enrollment drops led to my “retirement” in 2024 (so one might think I would benefit from this law).] In other words, let’s artificially prop up demand for a service and institution that can’t cut it in the marketplace.

This is the sort of legislation that generates a nice-sounding press release and campaign PR for its supporters (it passed unanimously in both houses). It’s no wonder that most voters have little enthusiasm for our democratic institutions. Elected officials are more concerned with sound bites and demonstrating “action” than with actually thinking about what would be useful to young people and candidly re-assessing the nature and purpose of the public university systems and the extensive expenditures we all make to support them.

I’ve spoken before about the limited utility of analogizing from history, mostly in geopolitical contexts. It’s especially true here; we have no good sense of how this will play out, but the changes are likely to be deep and wide.


The End of the World

2/13/2026

In my recent comment on Bill Gates’ piece on climate (110725), I criticized his dismissive remark that the climate crisis was not the “end of civilization.” I pointed out how, for some individual victims and societies, it—literally—is. There are important civilizational questions about how we and those to come will choose to memorialize those whose lives or cultures were cut off in this way. The Museum of Climatically-Extinguished Cultures and Creatures is likely to be pretty crowded by the 22C.

Still, while my life is not likely to come to an end due to climate change, and civilization in general is not likely to collapse by 2039 (my planning horizon according to the Social Security Administration’s life expectancy tables), this prospective doom got me wrestling with why and how I might think about those whose lives will continue into 2040 and beyond, and how I should conduct myself in the meantime. In other words, since civilization will have come to an end as far as I’m concerned, why should I care about those left behind?

Philosophy has proposed, and history has demonstrated, several answers to this question. Just to put a bit of structure on the issue, we might divvy up our impact into these categories: the personal memories of those who will continue on, our biological progeny, physical manifestations of our existence, and cultural traces of our impact on the world. We might also distinguish between the impact/legacy of the few prominent people in the world and the vast remainder of us. Finally, we should note that many folks have lived based on their expected status/treatment in some shape of an afterlife (aka Heaven, Hell, Nirvana, Valhalla). However, since I don’t believe in an afterlife, doing stuff now to get credit in the hereafter seems futile (as well as a bit tawdry).

Based on how people behave, many folks want to be remembered by those still around or to come, years and centuries post-mortem. And most are, at least by family and friends. But memory is a fickle thing: not only does it fade over the years, but by the time memories are passed down to succeeding generations, distortions are inevitable. Very, very few are likely to be remembered by anyone in any meaningful way more than 50-70 years after their own passing. Maybe that’s as far ahead as we can imagine, so we don’t care about our more distant legacies. The memories of the 5 billion (+/-) people who died over the course of the 20C are going quickly; in 30-50 years, what will be left? My niece and nephew were in their teens when my mother passed. In fifty years, they will be in their 80s, with faint wisps of her (either direct or through their father) remaining. I have a friend who is into genealogy and has reconstructed some of his family lines back for several centuries. It seems personally satisfying to him, but those long past have left only a name and a few traces of themselves. With all due respect to Ancestry.com et al., I don’t aspire to be an entry on a long list compiled by a greatx8 niece in the later 22C.

In terms of cultural remembrance, very few leave something behind. Wikipedia includes just over 2M biographical entries, and maybe ten times that number have some sort of bio bit somewhere. Out of the 100B people who have ever lived, those aren’t very good odds of being remembered in this way. Donating money will get you a building or a pew, but as with the examples above, the names of those who do will become only words with no meaning behind them all too soon (ditto for street names). A few folks will be memorialized in published articles or have their archives dug up by Historians or click-bait chasers. Of course, with big bucks or some luck, you can play in the big leagues: sainthood and multiple churches, universities (unless your money came from enslavement), or cities (Charlestown), states (Penn-sylvania), countries (the Philippines, Bolivia); but that’s likely not more than a few thousand all in. Scientists and doctors have a nice racket going in having diseases and natural phenomena named after them (the Humboldt Current, the Higgs boson, and Alzheimer’s disease all come to mind). The pinnacle is likely the Taj Mahal: prominence at the less-than-one-in-a-billion level. All in all, ordinary folks will likely get swallowed up in the maw of time. Even if some electronic record remains, who will look for it or do anything with it? (In this regard, the access of some record of mine by an AI/Borg as it hoovers up items from the past for some college essay in 2076 doesn’t really seem to count!)
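(To put numbers on that, using the figures above: 2 million entries out of roughly 100 billion people who have ever lived works out to about 1 in 50,000, and even the ten-times-larger pool of “bio bits” only improves the odds to about 1 in 5,000.)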

In sum, if we can’t count on being remembered in any meaningful way beyond a generation or two, the purpose of memorialization is more likely for the self-satisfaction of those who seek to be remembered; in other words: ego. If you think you’re an exception, see Shelley’s poem: Ozymandias (1818). 

So, if personal, name-and-likeness legacy is a racket, is there anything I can leave behind? I think so, but it’s not likely to be identifiable. I used to say, when I was teaching in college, that my impact on my students was more likely to lie in things they remembered or ways of thinking they learned, even if they couldn’t remember my name or how they might have learned them originally. I suspect the same is true elsewhere, in terms of family, social, or cultural influence (or even charitable donations).

If the end of the world (objectively) coincides with the end of the world (subjectively), then it won’t really matter. If the objective world carries on, who knows what direction things might take? Preserving the possibilities for future discovery is about as far as I can see; that may be enough, and my belief that I helped it along is valuable to me and sufficient as a motivation.


A Poisoned Chalice

2/6/2026

If it weren’t for all the mayhem and pain likely to come in the meantime, it would be a blessing that we have 2¾ years until the sturm-und-drang of the next Presidential election. Of course, in our perverse media/money-driven system, the major potential candidates are already getting organized and positioning themselves. Even without the particularly dysfunctional nature of our political parties, it’s not likely that we will see a candidate who is really ready to tackle the fundamental issues facing the country. Indeed, whoever wins is likely to fail badly.

Of course, I have zero confidence in the likely GOP standard bearers. If you multiplied together the imagination, compassion, and integrity of Vance, Rubio, Ron DeSantis, and Ted Cruz, you might well get a negative number. Anyone else on that side with the capability of being President has either bowed out of our Trumpian-dominated public life or has a profile too low for serious consideration. As I have noted recently, the entire party seems wrapped up in issues and images of the past which, combined with the remnants of its “small government” philosophy, ensures a passive approach to the dire challenges ahead.

There are half a dozen Dem Governors in the mix (Newsom, Shapiro, Beshear, Pritzker, Wes Moore, Whitmer) plus Pete Buttigieg. They all score much higher on the imagination, compassion, and integrity combination, but face a comparable internecine (“progressives” vs. “moderates”) drag within their “party.” Once elected, they would also have to deal with reconstructing the federal government, in terms of both personnel and policy, in the aftermath of the current evisceration. Even with my suggestion of an accelerated remediation (see my proposed EAGER Act, 052325), much of their first term would be spent getting systems and programs back to the ground floor from the current sub-basement level and dealing with the “normal” range of issues and crises. 

Getting Congress to act, even if it had modest Dem majorities in both houses, presents another ubiquitous hurdle to meaningful action. After all, at the end of the day, the Dems are only marginally more cohesive and effective than the GOP. They have their own share of personality squabbles, infighting, and inertia. They will also be distracted by the shiny toys of power and the opportunity to go after Trump and his many corruptions, as well as “preventing” his abuses from recurring. These are worthy targets in the abstract, but when establishing the priorities for taking care of the country, they have to be relatively low on the list.

The prospect of making fundamental changes will also run into the electorate’s unwillingness to recognize root causes and bite the short-term bullets necessary for long-term improvement. Indeed, the one-word summary of today’s popular concern is “affordability.” However, the economic data show that this isn’t really a problem for most of the (middle class to well-off) folks currently complaining. There is something much deeper going on and it’s not susceptible of quick fixes. This includes the loss of the “American Dream” (some version of “Ozzie and Harriet”), uncertainty about our place in the world (aka “globalization”), and a loss of confidence in society’s and government’s ability to maintain coherence and progress.

Even if addressing those historical concerns were feasible, doing so wouldn’t reach the underlying problems that demand prompt and radical action: climate, inequality, housing, and the imminent disruption of our workforce and demographics by AI.

There are, of course, well-articulated proposals to deal with this list (except for AI where no one has any idea what to do). They require, however, a degree of radicalness that is alien to our self-satisfied and incremental political culture. New tax structures can generate much of the necessary revenue for comprehensive health care, housing, and basic income. Climate changes can be addressed. It is far more a matter of political will than of developing solutions.

The best model for breaking out of these doldrums in the US is the famous “100 days” of the first term of FDR’s administration in 1933. Huge Congressional majorities and a widely-recognized major economic crisis enabled some radical thinking to take hold. There is caution in this tale, however. A conservative Supreme Court struck down many components of FDR’s program, and it’s not at all clear how effective those moves were in ultimately providing an exit ramp from the Great Depression.

All in all, the chances that the US will be in better shape in 2032 than in 2028 are, therefore, not so great. Indeed, things might well be worse given the amount of damage that is currently being done (and I haven’t even touched on international complications yet). So, if a moderately progressive administration comes in, they’re not likely to look very successful four years on. Even if there is great success on fixing the current damage, rebuilding institutions, and laying the foundations for the solutions to long-term problems, that administration is not likely to be able to give much of an answer to the perennial question of electoral politics: “Are you better off now than you were four years ago?” 

Given the friable nature of the electorate, a further zig-zag is quite plausible. Indeed, the volatility of this zigzagging is part of what makes more extreme parties and leaders increasingly popular. This is especially visible in Europe. There are structural problems, to be sure: the difficulty of enacting programs with demonstrable effects within a single term. Added to this is the fundamental nature of the problems facing the country and the difficulty of devising solutions. The electorate, however, has—even in the best and most deliberate times—little patience for considering these constraints. The upshot is that whoever wins risks the specter of failure and subsequent rejection.



An Old College Try

1/30/2026

Everybody loves to beat up on the Electoral College. It’s anti-democratic in multiple dimensions, it’s subject to abuse, and it seems archaic. Here’s a way to make an Electoral College relevant, useful, and constructive in the 21C.

We have to start with the original purpose of the Electoral College, which was both sensible and intentionally anti-democratic. In the 18C world of limited communications and partial literacy, with a predominantly agrarian population, it’s hard to see how most voters (even if they were heads-of-household) could have a sense of those capable of being the leader of the country. All the parts of the process with which we are familiar—declarations of candidacy, position statements and platforms, live campaigning—had yet to be invented. How could a farmer in the Virginia Tidewater region be expected to know much beyond the name of a John Adams or George Clinton? In such a world, only the political elites in each state could be expected to have a knowledge of the individuals and the issues which the country would face. That this accorded with Madison’s aversion to pure democracy and his preference for keeping decision-making in the hands of “men of affairs” is no surprise either; but this was not merely blind elitism.

A lot has happened since then, including the emergence of political parties, the vast (if gradual) extension of the franchise, an increase in literacy, and the omnipresence of the media, now in its unfettered “social media” phase.

We can’t deny the fundamentally dysfunctional nature of our current Presidential election process. Even if a precise parsing of the connections between the historical developments and the current state of things is a fool’s errand, it’s arguable that the shift to Presidential candidate selection through popular-vote primaries was a significant step in this sad evolution. In our media-saturated and polarized political culture, we have prioritized fund-raising, sound-bites, and the ridiculous spectacle of states jockeying over who gets to vote early in the process. There’s a laundry list of problems, and I won’t rehearse them all here. They’re neatly captured in the point that the talents and capabilities necessary to govern are rather different from those necessary to get elected; as evidenced by the many politicians who have withdrawn from public life and the quality of those who remain.

If we credit the College’s original goals of balancing untrammeled democracy and producing a President (now far more important and powerful than Washington and his immediate heirs) who is capable of intelligent leadership of the country, then we may need to consider some radical approaches. It would not sit well in a political culture with an essential democratic premise to have a few folks make the final and unreviewable choice. Still, I suggest that we change both the way we choose the College and its role in the overall Presidential process.

The second part first: the role of the new College would NOT be to select the final winner of the Presidential election. Rather, they would select four finalists from which list the general electorate would select the winner. 

They would do so through a process designed to offer a set of sensible choices to the public. The College would be named at the beginning of July. It would meet for a week at the end of August and announce its choices on September 1. I would seal the Electors off in some resort for that week, something like a Papal Conclave. Each Elector would place four names on the ballot. A series of preliminary votes would winnow that large list down to sixteen. Each of these folks would then be interviewed by three (randomly selected) Electors for half an hour, and the videos of these interviews shared with the entire group. Then, through a ranked-choice final ballot, the four public candidates would emerge. Those four names would be put on the public’s General Election ballot and (after a mercifully short two-month campaign period) voters would rank their choices among the four in order. Standard ranked-choice methods would then determine the winner.
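For those who like to see the gears turn, here is a minimal sketch of how that final ranked-choice (instant-runoff) tally might work, written in Python. The candidate names, the lowest-vote elimination rule, and the tie-handling are illustrative assumptions on my part, not part of the proposal:

    from collections import Counter

    def ranked_choice_winner(ballots):
        # Each ballot lists the finalists in order of preference.
        candidates = {name for ballot in ballots for name in ballot}
        while True:
            # Count each ballot for its highest-ranked surviving candidate.
            tally = Counter()
            for ballot in ballots:
                for name in ballot:
                    if name in candidates:
                        tally[name] += 1
                        break
            leader, votes = tally.most_common(1)[0]
            if votes * 2 > sum(tally.values()) or len(candidates) == 1:
                return leader  # majority support (or last one standing)
            # No majority yet: drop the weakest candidate and re-tally.
            candidates.remove(min(tally, key=tally.get))

    # Purely hypothetical ballots from five voters ranking four finalists:
    ballots = [
        ["Poet", "Professor", "Executive", "Governor"],
        ["Professor", "Poet", "Governor", "Executive"],
        ["Executive", "Governor", "Poet", "Professor"],
        ["Governor", "Executive", "Professor", "Poet"],
        ["Poet", "Governor", "Professor", "Executive"],
    ]
    print(ranked_choice_winner(ballots))  # -> Poet

The key property, whatever the exact rules, is that a voter’s lower rankings only come into play once their higher choices are eliminated, so no vote is “wasted” on a candidate who can’t win.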

While we’re at it, let’s choose the Electoral College in a different way. Let’s split the country into fifty equally-sized districts (each containing 6-7 million people), with each district electing one Elector. No one who has held elective state or federal office within the prior five years could run. That way, no one in the middle of the political process, with axes to grind, horses to trade, and immediate IOUs to cash, would be involved in the selection. Then I would add the Secretaries of State, Defense, and Treasury and the Attorney-General from each of the two prior administrations, four retired Senators chosen by the existing Senate, four retired Representatives chosen by the existing House, and four retired judges chosen by the Supreme Court.

That totals seventy people (fifty elected Electors plus twenty appointees), which seems a large enough group. It would include a range of political, administrative, and judicial experience, but not enough of it to dominate the process. This is a far cry from the “back-room,” cigar-chomping caricature of the pre-primary-era method of candidate selection. The voice of the electorate would retain its central role, but it would be tempered by the voices of experience and judgment. The requirement to nominate multiple candidates would ensure that a range of capable persons would be considered. Who knows, we might get a poet, professor, or seasoned executive in the mix.

Only a few Presidential candidates over the past fifty years have inspired actual passion. Instead, most Presidents have been chosen primarily because they’re marginally better than the “other guy.”  They have been products of a process that favors campaign skills over the ability to govern and lead. Perhaps we can do better.



Capitalism and Me

1/23/2026


I’ve been doing a bunch of reading on the history and nature of capitalism lately and will have more to say about the books and the underlying phenomena in due course. However, since I believe that most history (& other) writers’ personal connections to the issues they’re writing about color what and how they write, it would be good if they articulated those connections. As a result, I have spent some time thinking about capitalism and me.

By capitalism, I mean a culture (i.e., a socio-economic-epistemic system) in which we define ourselves, evaluate others, and determine how to act across our lives principally from an economic perspective: morals are secondary to money (see Das Kapital, 102122). So, one need not have a net worth in excess of US$1M to be a capitalist (although it helps). Capitalism is global, even if it engages with different cultures (Chinese, French, Nigerian) in different ways. The technologies of capitalism (e.g., corporations, global supply chains, leveraged buy-outs, 401(k) accounts, banks, the alienation of labor from products) are the superstructure or manifestations of this culture.

So, where do I fit in? I grew up in mid-20C America, where the culture of capitalism/progress/success/growth was rife, in an upper-middle class suburban environment where business concepts and financial mindsets were endemic. Long before I could spell “capitalism,” much less comprehend it, I was immersed in it. I was imbued with a (non-Protestant) work-ethic and was given to understand that success meant replicating the bourgeois lifestyle (comforts, choices, and standard of living) which I knew quite well. I was taught to manage my weekly allowance and had a savings account and a couple of shares of stock before I was ten. In this sense, I have been a “capitalist” ever since; I invest my assets (both earned and inherited) with an eye on their financial return (i.e., I wanted to get richer).

I soon enough ran into the moral quandaries which capitalism generates, including racism, classism, and the place of charity. Still, I didn’t have a clue as to how to reconcile them with the comfortable life I was living. I saw the situations of others, but even through college, I didn’t recognize their place in society as a necessary creation of the power structure which I seemed destined to inherit.

In college, I spent a summer as an intern at a big accounting firm. I learned a bunch about office life and business book-keeping. I didn’t need to learn the importance of keeping track of numbers and measuring things. I had already internalized them. I found a sense of security in being able to comprehend the world in tabular form. Its neatness made it easy not to pay attention to the complexities of life and feelings and moral trade-offs.  For a while, I went down the rabbit-hole of calculating my life in dollars and cents (it was an era in which cents still mattered). I dug myself out of the worst of that hole and have been climbing out since, but the underlying mentalité remains. 

As a lawyer in the corporate world of telecommunications and internet companies, I grew frustrated with the absence of morality in business dealings, the maximization of profit, and many individuals who, in the words of one colleague, conflated their net worth and their self-worth. My dislike of this environment was a significant factor in my decision to leave the corporate/legal world and go back to the academic world and history in particular.

As part of my studies, I wrestled with the large role capitalism and related topics played in modern European history. I also found a resonance between my own experience and what I came to understand as the psychological benefits to capitalists of having their world in order: a world that faced increasing complexity and opportunity across the middle of the last millennium, a world—particularly in Europe—amid the epistemological chaos of reformation and secularization that was coming untethered from its former certainties. Profit and greed are hardly new factors in human affairs, but there is something more going on than just the quantity of trade or the technologies of markets and money-making that marks the shift to the modern mentalité, of which capitalism—both the mindset and the practices—is a major part.

There’s a psychological concept called the “ladder of development.” It’s a way to consider how well I tolerate complexity in other people and the world in general; or, in other words, how well I can maintain my own balance amid a world of difference and change. It’s an endless ladder, i.e., one can always do better; but it’s not a matter of judgmentalism, more one of personal awareness and self-assessment. I suspect that my own experience, and that of a broad swath of “capitalists,” has to do with where I place myself on this ladder and my ability (and theirs) to cope with those challenges.

In other words, it’s easy for capitalists to comprehend the world because they simplify it, pushing moral complexities and self-assessment to the side. Facing decisions about whether to buy a car or fire an employee becomes much less stressful if I just ‘run the numbers’ and leave it at that. Adding in unquantifiable factors and externalities (e.g., the environment or the employee’s rent obligations) makes the decision harder.

There is a fundamental human psychological risk of projecting my own experience and mindset onto others. It comes up all the time, and it is a trap into which I often fall. Just because I think I recognize (often subconsciously) some situation or attitude in someone else, it’s all too easy to think they’re just like me. It’s an especially common and dangerous risk in writing History. Not only is the past “a foreign country” whose people are, in effect, from another culture, but the urge for story-telling, consistency, and familiarity can override the necessity for cogent and critical assessment of difference, and respect for that difference. So, it’s important to remember that my own self-assessment might (or might not) color my historical characterizations.

Ancien Regime

1/16/2026


In December 1783, the signed confirmation of British recognition of American independence arrived in Philadelphia to great excitement. A month later, it was ratified by the Continental Congress, acting under the Articles of Confederation. The Articles, which set the framework for joint action by the thirteen former colonies, had been in place for three years, but many were already unhappy with how they worked.

By 1786, that unhappiness had increased, and delegates gathered in Annapolis to see if they could propose some improvements. But only five states showed up, and most states had authorized their delegates to discuss only a limited range of issues, so the Annapolis group advised that a more extensive set of reforms be considered. The Continental Congress agreed that a revision was appropriate, and delegates from twelve states arrived in the Spring of 1787 to discuss those changes.

What (we call) the Constitutional Convention proposed instead was a wholesale rewrite of the relationship between the States, including a detailed structure for a new national government. It was far beyond their mandate, but they were able to persuade Congress and the country that more radical action was necessary. Even back then, the concept of the rule of law was part of  British political culture and, from that perspective, the route to the Constitution for a large part of the former British North America was problematic, not to say (literally) revolutionary.

Immediately after the new Constitution was ratified, the Bill of Rights was adopted and we have been living under this arrangement (with only a few significant formal amendments) ever since. Hundreds of proposals for updating have died somewhere in the multi-stage process of amendment established under Article V. Along the way, the (unelected) Supreme Court has (usually to much controversy) interpreted the document in novel ways. But, that’s it.

While the great powers of Europe and the ancient cultures of Japan and China can claim greater duration, the US has the oldest continuous Constitutional system of any country in the world. Perhaps in competition with their venerability, we have, for a long time, been proud of our longevity as a nation and the stability of our system of government.

That’s no longer the case, or, stated differently, our society is stuck with a governing structure that is archaic, virtually static, and unfit for purpose in the 21C.

It’s as if we were trying to run an AI system on MS-DOS (although, I guess, early AI attempts were exactly that!).

From another perspective, when the Constitution was adopted there were fewer than four million people in the US, of whom (excluding women, slaves, and children) well under one million were eligible voters. (In fact, only 28,000 people actually voted for George Washington for President a year later.) So, a group roughly the size of Connecticut or Utah wrote the rules under which we still live. By the same token, our current total population of about 1/3 billion represents about 60% of all the people who have ever lived in these United States. The historical tail is wagging the contemporary dog.
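(The arithmetic behind that last claim: if today’s roughly 333 million is 60% of everyone who has ever lived here, the implied all-time total is about 333M ÷ 0.6 ≈ 555 million, which makes the founding generation’s four million well under 1% of all the Americans its handiwork would govern.)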

It’s an interesting question of culture, history (and anthropology?) as to why we subject ourselves to our predecessors’ rules and practices. The point goes far beyond the constitutional context framed here. Culture is, pretty much by definition, the product of history; after all, we don’t know anything else. It can also be seen as a deal between the past, the present, and the future. We (in the present) accept the judgment of our predecessors as to how we should run ourselves and our society. We also represent to our progeny that how we are doing so is the best way for those to come to run themselves and their society; even knowing—in each case—that changes have been and will be made. No group at any particular point has the time/bandwidth to rewrite everything; so, most change is sporadic and incidental (however chaotic it might seem at the time).

In her recent history of the Constitutional amendment process “We the People,” Jill Lepore reiterates that we are out of practice in terms of Constitutional change. The last time we made more than minor tweaks in the document through the formal amendment process was well over a hundred years ago. Since then, the substance of our constitutional order has changed solely through de facto practice and Supreme Court “interpretations.” That each of these modes is ephemeral and reversible has become only too clear in the last couple of years. Lepore points out that progressive forces despaired of the (extremely difficult) formal amendment process and pursued socially-necessary changes through these other mechanisms. This includes both the considerable expansion of the scope of governmental activities (regulatory and social welfare), as well as civil rights for a range of groups. Now, the sauce for the goose is being served for the gander and the progressive goose is (sorry to mix befowled metaphors) cooked.

Even a dramatic political reversal and overruling of several recent Court decisions will only get us back to where we were twenty-five years ago. They won’t solve the underlying structural problems or reflect a 21C society.

A few weeks after Washington was inaugurated, a group of newly chosen representatives gathered outside Paris. Within just a few months, they launched what we call the French Revolution and, over the course of a few years, overturned the local variant of the long-established political and cultural society of Europe: the Ancien Regime. Images of the très fabulique lives of Marie Antoinette et al. make it easy to consign this concept to history (even if the practice of “royal” elites dominating and exploiting a country continued well into the 20C). It would seem to have no resonance in our modern republican mentalité, but the Constitution is our Ancien Regime. By now, we are so deeply imbued with concepts like the “rule of law” that it’s hard to imagine that anything truly disruptive or radical could happen here (January 6 notwithstanding). Incumbents in such power structures have denied the possibility of change up to (sometimes past) the last minute, but it comes, sometimes suddenly and violently, sometimes in other painful ways. And, as the Stuarts, Bourbons, Romanovs, Pahlavis, and others can attest, no one knows what will emerge.


Venezuela

1/9/2026


Three years ago (120222, “Sauce for the Gander”), I wrote about geopolitical spheres of influence with particular focus on China and Taiwan. I compared that situation with the two-hundred-year-old predecessor to the “Don-roe” doctrine recently revived in Caracas. I noted that, based on our own historical practice, we didn’t have much basis for criticizing the Chinese for effectively claiming a sphere of influence encompassing Taiwan and the South China Sea. I don’t have much to add to the general point made there, other than to note that we have a long history of military interventions in Latin America, often (as here) from commercial motives (on top of some convenient distraction from domestic economic challenges). 

Beyond the obvious immediate problems of morality and international and domestic law arising from our kidnapping of Maduro and threats of coercion and control, our actions must make the Chinese feel smug about the implications for their freedom of action in their own backyard (even if we effectively pushed them out of our backyard); not to mention the Russians in their Ukrainian backyard.

More fundamentally, coupled with the latest sword-rattling over Greenland (see also 011025, “Baby, It’s Cold Outside”), Cuba, and Mexico, the Administration is actively proclaiming the return of realpolitik as the basis of US foreign policy. Unfortunately, its timing is all wrong and its other foreign policy actions seem to undermine this latest thrust. The timing is wrong because China is a rising power and the US is relatively falling (see 121820, “Rising and Falling Powers”). Rather than doubling down on military might, we should be seeking other modes of constraining China, not least of which is building stronger alliances with others similarly situated (Europe, India, Japan). The practice of realpolitik, however, also requires clear-headed thinking about our strength, and clear-headed thinking is rare enough, especially with the JV foreign policy team currently running the show (see 120525, “You Can’t Go Home Again”).

You read it here first.

[and now back to our regularly-scheduled program.]


    Condemned to Repeat It --
    Musings on history, society, and the world.

    I don't actually agree with Santayana's famous quote, but here is my contribution, a version of my own: "Anyone who hears Santayana's quote is condemned to repeat it."
