Steve Harris

Denial

5/20/2022

If you get caught in flagrante delicto, the late great comedian Lenny Bruce has some advice for you: “Deny it. Flat out - deny it! If you really love your wife, deny it. If they got pictures, deny it. … If they walk in on you, deny it. Just say this strange chick came into the apartment shivering with a sign around her neck that said, ‘I have malaria. Lie on top of me and keep me physically active or I'll die.’” (a schtick recreated in the movie Lenny, 1974).

We all deny stuff (I know my own list is … robust). We have been doing it most of our lives, since the days of the broken window and pointing at my little brother. As Lenny Bruce implied, even if the facts are clear, there’s some small chance you might get away with it. Why? Because, as Bruce said: “They want to believe it!” Or, as Jack Nicholson’s character said in A Few Good Men: “You can’t handle the truth!”

We all (or at least some part of virtually all of us) want to believe a nice story. A simple understanding of the world seems vastly preferable to the stresses of dealing with its complexities; and “truth” takes a back seat to sanity. Sometimes, of course, there’s no self-deception involved; denial is a cynical/dishonest ploy to avoid blame/responsibility (“Tobacco doesn’t cause cancer” worked for some folks for a while). But, sometimes we do it because we can’t tolerate living in a world in which the (denied) fact is true.

This explains a lot about climate deniers and Covid deniers. A world in which the world (i.e., nature) is actually running the show is scary. Things were easier when just about everybody believed in God. All the weird stuff and problems could be written off to Him and made psychologically manageable via faith in His goodness or His plan/providence, from which we would all (sooner or later) benefit.

Science, however, has shrunk the scope of God’s domain. He’s only around the fringes now; faith is harder to come by and seems to have less to do with how the world works than it used to.

Left to our own devices (so to speak) we fabricate coherence.

I’d like to think that this inability to cope is behind some of the well-known phenomenon of Holocaust denial. Certainly there were those who were excessive apologists for Nazi Germany. Certainly there were those who sought the fame/notoriety of controversy. Certainly there were those who had plenty of reason to distrust conventional and governmental information and then ran a bit amok. But, some folks couldn’t handle the truth of man’s inhumanity to man (or, more particularly, the evil of their own country/people/allies). Their weltanschauung (“worldview”) was shattered.

As I have pointed out in previous postings, there are a lot of folks here in the US and elsewhere whose weltanschauung has been pretty well hammered, and so things that don’t fit are labelled “fake news.” There are a bunch of folks who “can’t imagine” that our democracy is at risk / Russia would invade / Japan would attack the US Fleet at Pearl Harbor / Britain would leave the EU / … (you get the idea).

The rantings and machinations of the “Stop the Steal” gang following Biden’s victory in November 2020 are a textbook example. Trump couldn’t contemplate a world in which he lost; so he created one in which he didn’t. Millions followed (still follow) this delusion. Perhaps some will wake up and admit to temporary insanity; or they will just hope this incident fades into history and they won’t be asked to take a stance on the question. But, I suspect, too many drank too much Kool-Aid and will never recover. At this stage, it’s hard to imagine that Rudy Giuliani was a respected/feared US prosecutor and (not entirely terrible) Mayor of NYC. What’s left is a sorry knock-off of Batman’s arch-foe “the Penguin” who got suckered into self-parody by Borat.

Whether the imaginings are recent or more dated, evidence and rational analysis don’t work so well at offsetting them, because they don’t reach the most ancient parts of the human brain. Those in “fight or flight” mode don’t stop to read statistical tables.

It’s an interesting question whether this psychostress is uniquely, or even particularly, a “modern” phenomenon. I suspect that core bio-psychological human capabilities have been placed, over the past few centuries, in an environment of far more complexity and rapid change than in most of our first 70,000 +/- years. The bling of electronic living has not helped, nor have the hormone-stimulating activities of the media and advertising industries. Indeed, it’s ironic that the same drivers of rationalistic modernity, the “Scientific Revolution” and the Enlightenment, have also led to these anti-rationalist pressures, and many brains can’t stand the strain.

Regardless of its historical origins, however, denial remains an apparently useful tool for many. Lenny Bruce would be proud.

Revival of the Fittest

5/6/2022

The vast majority of folks I know are deeply concerned about the likely path of humanity’s interaction with our planet. There are, to be sure, plenty of issues to be worried about, both immediate and long-term, and they are sufficiently well-known not to require rehearsal here.

Each of these folks carries some combination of despair and doggedness (and still a bit of legacy enjoyment of current creature comforts). Moods fluctuate: water is saved, birds are counted, even while eyes/ears glaze over at the news reports and webinars detailing the latest dire report or development. Amid this, I have noticed a streak of resignation in which the expectation of some kind of slow-motion-train-crash is relieved by a sense that we (of a certain age) will be “gone” by then and won’t see the worst parts of it. Even the well-off and (otherwise) pretty sophisticated blithely seem to assume that their progeny will be spared through some sort of “gated-community” salvation.

Of course, there’s no telling how far down the path of global distress our species will take us. Again, I won’t parse through the various dystopias and scenarios that have been sketched out. Suffice it to say that there is a significant chance that civilization will crumble and some successor will have to rebuild it. (I will posit for this purpose that the rebuilding will be done by humans, not cockroaches or dolphins.) This scenario is a playground for utopians, with soaring opportunities for harmonious relationships between peoples, genders, and the rest of nature.

I won’t dive into that normative debate (i.e., what kind of world would we want?), nor the related predictive question of what kind of world is likely. There are plenty of current political views out there already which will serve (equally well) as the basis for both desiderata and prognostication. Instead, I’d like to pose some other questions: 1) Should we tell them how we got here and, if so, what should we say? and 2) How would we go about sending such a message into the future?

Regular readers of this blog have heard me warn of the perils of divining and applying the “lessons of history.” Nonetheless, the demand for such apparent comforts as a coherent human history with “actionable” lessons remains strong. Whether future historians/anthropologists/archeologists will be curious or future politicians will be looking for someone to blame, there’s no reason to think that this won’t be a fortiori true for the post-enviro-calamity world.

Indeed, three renowned SciFi books each wrestle with how the past survives in such a future. Isaac Asimov’s Foundation trilogy (1950s) takes a “hard science” perspective on a galactic-scale renaissance. Walter Miller’s A Canticle for Leibowitz (1959) goes down a more religious path. More recently, Neal Stephenson’s Seveneves (2015) includes a group that survives a (non-manmade) apocalypse by preserving the Encyclopedia Britannica in memorizable bites, one of whose members is named for her portion of scripture: “Sonar-TaxLaw.”

What should we leave behind? Should we start with the “Great Books” series from the mid 20C which sought to capture the finest thinking of human history (albeit with a strong White/European/Male bent)? Even a more diverse bibliography of ideas and literature might well be indigestible without some concordance/guidance/framework.

How might we account for the state of the planet and our civilization? Who will write the histories of how we got here? Which of the stories of empires, genders, wars, ideas, demographics, technology, and everyday life would be worth preserving? The possibilities are endless and the arguments among historians would be too (as if no “hard stop” were imminent!). Will we (as we are doing in the more substantive vector of actual climate change prevention) talk and hypothesize, or will someone put pen to paper (to speak in 19C metaphors)?

But then, what use are philosophy and literature (or history, for that matter) in a world that is rebuilding itself from remnants? The great Encyclopédie of Diderot and D’Alembert in the mid-18C addressed not only ideas but practicalities. It included hundreds of pictures of ordinary machinery because it sought to enable people to change how they lived, not just how they thought. Perhaps we should commission hundreds of “how-to” manuals, ranging from irrigation and simple pumps to solar panels (and also how to make the materials out of which all of this is built)? If so, how far down the technological road should we go before we are (implicitly) urging our successor societies to replicate our own (problematic) path? We might include all of it and let them decide. (But what critical histories of technology and society would we include in the package?)

Of course, it’s not at all clear who would make all these decisions. UNESCO? A committee of Nobel Laureates? The Texas School Book Commission? I do know some historians; maybe I should ask them? More likely, it would be a small group of smart folks chosen by whoever raised the money to launch such an endeavor.

This brings us to the last stop on this hypothetical inquiry: once you have the “stuff” chosen, how do you preserve it—for several hundred or a thousand years—until some group comes along who can handle this compendium of knowledge/wisdom? There are serious technical problems around data preservation and compression into a manageable size. What language(s) should be used? How do you design an educational path (including languages, math, sciences) so that people could (progressively) comprehend this material? Where do you store it so it’s both safe and discoverable?

I’ll stop with the questions now. It’s an interesting thought experiment. But I can’t help thinking that if we leave a mess, we should help clean it up—somehow (and apologize, too!).


Social Darwinism

4/29/2022

"It’s not likely that Charles Darwin had any idea that his novel understanding of the range of life forms on this planet (1859) would work such a profound change in popular epistemology, much less mutate and spawn a new framework for looking at human societies which would be called “social Darwinism.” Indeed, he died in  1882 and the phrase didn’t gain much currency until well into the 20C, even if its essential concepts were developed by Herbert Spencer and other European thinkers late in the 19C.

The gist of the idea starts with Darwin’s theory that species compete for resources and those that best adapt to their environment (via mutation, procreation, and expansion) will fare better than those that don’t fit. “Social Darwinism” then applies this model to human societies (tribes/races/nations) rather than biological species. Spencer’s phrase, “the survival of the fittest,” captures both the original and the adapted theories.

There are a bunch of problems with this conceptual sleight-of-hand, but two stand out. First, groups of people are not different species. Even “races” are more of a social construct than the difference between, e.g., a two-toed and a three-toed sloth. In other words, differences and divergences within our species have remained just that: within our species. In the (more-or-less) 10,000 years since humans started settlements, societies, and agriculture, we haven’t had enough genetic time to change very much. And between migration and interbreeding (sexual and cultural), the differences between “nations” are both recent and transient: they just don’t have much meaning. Second, and most significantly, actual (natural) Darwinism acts without consciousness and the adverse effects of evolution on the ‘losing’ species carry no moral weight. In contrast, “social Darwinism” invokes a conscious decision by a society to act in its own interests, knowing that other humans will suffer. In other words, if the essence of humanity is consciousness and moral judgment, then “social Darwinism” is a negation of that humanness.

Still, during its heyday, this approach to life and international relations gained a lot of support, and it still makes its appearance via a nationalistic perspective that seeks to control/condemn other groups/nations/races/species. What is significant about “Social Darwinism” is not that countries all of a sudden started to see themselves in competition with each other, but that the spread of “scientific” thinking in Europe in the 19C led some elites to invoke Darwin’s ideas as justification for long-standing aggressiveness and animosity.

Another aspect of his ideas that Darwin (likely) didn’t foresee was the establishment (1993) of the “Darwin Awards" (https://darwinawards.com) as a forum to commemorate “those who improve our gene pool--by removing themselves from it in the most spectacular way possible.” The site contains some remarkable and often amusing stories of individual human stupidity.

I think it’s time to develop a comparable award for countries and leaders who, whether through bull-headedness, ego, or a desire to be memorialized for Götterdämmerung-like behavior, put themselves in no-win situations, often leading to the demise of their country, regime, or economy.

Of course, Mr. Putin’s foray into Ukraine is the leading candidate from current affairs. There are many scenarios in which this retro-imperial revival could lead to a fundamental change in the structure of Russia (although there are also plenty of scenarios in which not much happens). We’ll have to check back in a year or two and see what eventuates.

For both Germany and Japan, World War II was a long-shot attempt to revise the international order. In each case, the economic power of the aggressor was measurably less than that of the countries it attacked. In each case, questions were raised internally (albeit not too loudly) about the country’s ability to succeed. In each case, ideology and ego (including a good dose of “Social Darwinism”) trumped (so to speak) common sense and economic analysis. In each case, the aggressor was crushed and its government and society were reconstructed following the model of the victors. Both seem like good candidates for the “Social Darwin” Awards.

Similar cases can be made for the German Empire, the Austro-Hungarian Empire, and (in its own way) the Russian Empire in the context of 1914. Each volunteered; each went down in flames. Napoleon, too, killed his Empire and many thousands of his men by marching all the way to Moscow and coming up empty-handed. Three years later, he was stuck on a tiny speck in the middle of the South Atlantic and the Bourbons had retaken the throne. Of course, the French monarchy had virtually bankrupted itself by supporting the upstart Americans revolting against the British. It was so fixated on its perennial foe that it forgot to check its bank account. Six years after American independence was finally won, the French monarchy went down in flames.

There are undoubtedly many more examples we could draw upon. (I personally would go with Kaiser Wilhelm and the Germans of 1914).

One of the interesting things about the whole “Social Darwinism” thing is the intellectual dexterity of its adherents. Should we, based on the examples given, declare the ineradicable inferiority of the German “race” (or of the other “losers”)? All sorts of excuses can be (and were) made (blaming the Jews was always popular). The leader who led his country over the cliff is often blamed, but not the country that followed him. There are few patriots ready to stand up, acknowledge their country’s stupidity, and suggest that it should be fully dissolved or taken over by another country/culture.

All of which just illustrates that those who advocated for “Social Darwinism” not only don’t know much about evolutionary theory or basic sociology, but don’t actually believe it themselves.

In the Shadow of History

4/22/2022

As any magician or advertising producer can tell you, we humans are easily distracted by bright, shiny objects (rather like our late cat, Samantha, chasing after a laser pointer). Putting us in front of adrenalin or other brain chemistry-stimulating activities is pretty likely to suck up our attention. Fighting, chasing, and melodrama all fulfill this role in our popular culture and entertainment media.

The same is true for the far more sober-seeming practice of history. We pay attention to the big, bright, shiny events and personalities far out of proportion to their effect on the world and pay little attention to the dull stuff, however significant it might actually be. This is not to say that, e.g., the French Revolution or World War I (from a European history perspective) or the US Civil War were not important, but each gets thousands of books devoted to it, not to mention any number of movies, operas, etc.

However, they do tend to crowd out other developments, particularly those in close proximity. As a result, we tend to lose these fainter stars in our historical firmament.

This is part of the reason I wrote a set of world history lessons called “1905.” It connects seemingly disparate events and developments of that year: the Russo-Japanese War, the (first) Russian Revolution, the British partition of Bengal province in India, the British Parliament’s refusal to vote on women’s suffrage, and Einstein’s incredible four papers that revolutionized modern physics. No, they’re not as dramatic as 1914 and the start of WWI, but they get lost in what I call the shadow of history.

Indeed, 1914 itself provides a fine example. Not knowing that their world would plunge into war in August, Europeans early that year were going about their business.

This is why I like to spend time in my relevant European history courses talking about the summer of 1914. The assassination of the Austrian Archduke, the diplomatic ‘to-ings-and-fro-ings,’ and the downward spiral into a war of surprising length and destructiveness tend to push a bunch of other significant developments to the sidelines. Part of a historian’s joy in explicating the complexity of the past comes from giving these “secondary” developments the attention they deserve (and they give us some really good stories, too).

Just as in 2022 (when the Ukraine war pushed COVID off the headlines), so, too, did the assassination of Archduke Franz Ferdinand in Sarajevo on June 28 pull attention away from the “top stories” of the spring of 1914.

In Paris, everyone was enraptured by (their version of) the “trial of the century.” In March, Henriette Caillaux, the wife of a former Prime Minister, had shot the editor of a leading Paris newspaper who was blackmailing her husband. The murder trial started in July; it raised a host of political and legal issues, exposed the blackmail material, and provided a focal point for French society’s dealing with the “new” woman. At the end of July, Caillaux was acquitted on the grounds of women’s excitability.

Meanwhile, in London, the political leadership was dealing with the perennial problem of Ireland. While a “home rule” proposal was being debated in Parliament, Ulster Protestants took up arms against the British government’s plan. In March 1914, rather than actively suppress their countrymen, portions of the British Army threatened to resign (the “Curragh Mutiny”). The resulting turmoil brought the resignation of the Minister for War and several senior generals, set off a high-profile political debate, and undermined the chain of command within the British Army and its morale generally: a great politico-military crisis only a few months before Britain was to start its bloodiest campaign ever.

It's hard to say what is “normal” when a major dramatic event comes crashing through everyone’s everyday lives. Things that appeared ordinary at the time look strange in retrospect. Four days before the Archduke was shot, the Royal Navy made its annual friendly visit to the German Imperial Navy base in Kiel, with the Kaiser in attendance, while “German and British bluejackets made merry ashore.” Even after the assassination, on July 18, the German Fleet announced that it would make its traditional return visit to the Royal Navy base in Portsmouth. The visit, planned for August 8, never happened.

In July, the Kaiser went on his normal summer cruise off the Norwegian coast. Radomir Putnik, the Commander of the Serbian Army, still went to Hungary to “take the waters” and was there when the Austro-Hungarians declared war on Serbia (in a demonstration of bygone chivalry, the Austrian Emperor Franz Joseph allowed him to return to his command!).

Every historical event has comparable stories. The larger the event, the larger the shadow cast by our focus on the grand developments of the day. This phenomenon is one way in which we always look at the (relatively) distant past through the lens of more recent events. Curragh and Caillaux would have been the featured facets of a history of 1914 that stopped on June 27. If the Nazis hadn’t invaded Poland in September 1939, we might have remembered that summer for the premiere of The Wizard of Oz and the inaugural telecast of a baseball game the previous week.

When historians of the next century look back on the last two years, which issue will get top billing: the defeat of Trump, the pandemic, or Putin’s invasion of Ukraine? We can easily spin all sorts of scenarios in which each one frames a decisive moment in world history. Of course, we don’t know which it will be, but the other two risk falling into the shadows.


History of the Future

4/15/2022

What is the history of the future? By this, I don’t mean trying to predict (in 2022) what will happen in 2222 or even 2023; rather, how was the concept of the future seen in the past, and how have prior thinkers and writers attempted to describe it?

As with many studies in the history of ideas, the main thing to bear in mind is that concepts are a reflection of the culture/epistemology from which they arise, and they change as that culture changes. As a result, it’s important to remember that the meanings of words change, even if the words themselves don’t. This is true whether you look at the ideas of intellectuals or the beliefs and understandings of ordinary folks (usually harder to find, since most folks didn’t leave published works behind).

Much of how we think about the future these days is a product of the modern mentality, by which I mean a sense that the future is, if not definitively knowable, then at least plausibly conceivable through causal analysis and (often implicit) probabilistic thinking. Perhaps the most common demonstration of this is weather forecasting. We all get to complain about erroneous forecasts even as we enjoy the confidence of knowing what is (likely) to come. It’s no coincidence that this developed in the late 19C, shortly after statistical analysis emerged and just as some literary authors started to apply rational extrapolation to the technological developments of their age, producing the first recognizable science fiction. Jules Verne takes pride of place here, followed by H.G. Wells.

Another example of how this period was crucial to our modern sense of taking hold of and controlling the future can be seen in the rise of conscious urban planning in the mid-19C, although even then, it seems that “planning” was more focused on remedying existing problems than on anticipating how cities appeared to be evolving. For example, in designing the first comprehensive London sewer system in the 1850s (the largest public works project of its era), the engineers didn’t even project that the population to be served would grow much.

The 20C saw not only a burgeoning of future-oriented science fiction, but also the rise of planning, scenario construction, and contingency planning on an organized and rational basis. This has taken place primarily within larger organizations, usually businesses or governments, of which the current work around projecting and preventing the impacts of climate change is the most prevalent exemplar. We are now used to all sorts of projections about how the world will be in the future. Indeed, we could say that the future is now part of the present to a degree not known before the 20C. Much of this is built around technology and the extrapolation of recent trends.

In earlier times (although with some continuation even into the 21C), the future was spoken of in terms of either revelation or some mode of prophesying. These notions are embedded in our culture and are difficult to shake off, whether from the Bible or Nostradamus. Simply stated, the future was unknown, and faith in deities sufficed to explain the workings of the world. The hints from divine sources, whether in the Book of Revelation, Kabbalah, the visions of seers or sorcerers, or various mythological systems, would be studied as the basis for seeing what lay ahead.

By the same token, all manner of omens and signs were considered as guides. Astrology, inferring a terrestrial pattern from the alignment of the heavens, was the most developed. However, as late as the mid-18C, the great French Encyclopédie  discussed about 70 methods of divining the future (even as it was proclaiming the rationality of human thought which was the core of the Enlightenment).

All of these show that we look to what explains the present to also explain the future. Humans are hungry for psychological security: a sense of order and the potential for control over their lives. Feeling like we know what’s coming is comforting (and potentially profitable or otherwise beneficial) and we’re often willing to pay commentators and consultants to tell us “what the future holds” (and all too rarely actually check on whether their predictions prove accurate).

Of course, we can never know what’s coming. Nothing is inevitable. But, with the rise of probabilistic thinking and statistics in the 19C, the future began to seem more knowable, or, at least, something about which we could make a pretty good guess (better than reading chicken entrails) by extrapolating trends. In this way, we can see how the “Scientific Revolution” of the 17/18C slowly permeated the mentalités of modern men. Science promised stability, replication, and confidence in results through the application of reason. Partially through technological developments, partially through mathematical/statistical thinking, partially through repeated demonstration of its form of “knowledge”, we have come to expect things to happen in ways which we can plausibly and reliably (if not with certainty or utter faith) predict.

As we have come, over the past several centuries, to understand how the world works, there is less space for mysticism and religious cosmology. The future (i.e., how we see it) is a place about which the world has become more confident, and more secular.

Still, past practices take a long time to change; social habits die hard. Most of us still partake of mysticism (do you ever “knock on wood”? Or pray for a future other than that which seems likely?).

In the 21C, big data and AI promise even more certainty (or at least the appearance of certainty). However, as a historian, I will split my bets. I will read SciFi to stimulate imaginings of possible futures, read sophisticated scenarios for the near future, and check out the weather reports; but I will always leave a space for contingency and the butterfly effect because, you never know….

Failed States

4/8/2022

While all eyes are on Ukraine, let’s not forget the rest of the world.

The news from Afghanistan, for example, is pretty bleak: lawlessness, social collapse, “humanitarian crises.” If you add civil war as a descriptor, the list of countries with existential problems would include Burma/Myanmar, Cameroon, Chad, Congo, Ethiopia, Haiti, Sudan/South Sudan, Venezuela, and Yemen (and likely a bunch more). There is a concept being applied to all these countries: the “failed state.”

To be sure, there are any number of locally unique circumstances and events that have led these groups of people to their (respective) current situation.  It’s reminiscent of Tolstoy’s famous opening line from Anna Karenina: “All happy families are alike; each unhappy family is unhappy in its own way.”

Still, it’s worth taking a moment to ponder all this “failure” at a broader level.

The concept describes something different and more dire than the (well-justified) concern about the threats to democracy here in the US and a bunch of places elsewhere in the world. Neither Russia nor China may be described as the “home of the free,” but no one doubts that there is a state in place and in charge. History is strewn with monarchies (of varying degrees of absolutism) which constructed robust and bureaucratic organizations. Indeed, the very nature of historical fascism implies the existence of a strong state as the locus of group identity.

So, using the concept of the “failed state” means we have to consider what a state is, what it’s supposed to do, and why its “failure” could be a problem.

The “state” can be fundamentally defined as the crystallization of the power structure of a society, institutionalized to preserve order (domestic and international) and to keep itself in operation. Historically, the modern “state” emerged (over the last 500 years or so) marked by the separation—conceptually and operationally—of a monarchy from the bureaucratic operations necessary to achieve those functions. Traditionally, the state has claimed or at least aspired to a “monopoly of legitimate violence” in society (Max Weber’s classic early 20C definition), which enables it to create and preserve order.

The means by which a state maintains “order” in society have evolved over the centuries, reflecting changes in economic relationships, democratic power structures, technology, etc. Pre-modern states didn’t do much more than make war, host parties, and resolve disputes among their people. Now, we expect the state to provide health, education, roads, information, and a smoothly-functioning economy, in addition to maintaining internal and external security (oh, and collecting taxes to pay for the whole thing).

In other words, if a state can’t do at least its core job, then (pretty much by definition) it has failed as a state. The presence of an extended civil war (of which many configurations exist) or being invaded or otherwise taken over by another country (more typical before the 20C) are the principal pieces of evidence that failure has occurred.

In most countries that were created and stabilized before WWII, this all seems pretty straightforward; in large part because the state is embedded in a more-or-less established and coherent political society. There is some social ‘glue’ that holds each society together. For those of us in the “modern” West, this is the norm. When a society and state don’t hang together in this way, the eventual result is a “failed” state.

From this perspective, it’s not surprising that most of the places that have had civil wars or other violent changes in management since WWII had been run by Western empires. Their boundaries were often artificially drawn, and they had relatively little time to accomplish social integration (remember, it took France and Britain about 500 years, and no few wars, to get themselves together into the entities, with infrastructure and resilience, that we now take for granted as the standard for what a country looks like). As we repeatedly forget (e.g., in Iraq), nation building is not the work of a decade, but of a century. In this way, “failure” is another example of the way in which Western power has made it easy for us to look at others and judge them negatively for being “not like us”; it represents a culturally-blindered view of how societies “should” be organized. The implicit solution is that they should adopt our models, we’ll write them a check to help tide them over, and it will all work out.

Backing off from judging these other societies or expecting them to replicate ours is far from saying that we or anyone should aspire to the situation in Afghanistan, Venezuela, or Chad. Indeed, if we take the preservation of order as the essential function of a state, they’re not doing it.

So, what I’m really arguing for is not that the “failed state” characterization is wrong, but that the (usual) judgmentalism isn’t helping. The solution for such places may well be a different configuration or new borders. It may require a model of governance that isn’t hung up on sovereignty and independence or democracy. It’s difficult to contemplate, given the highly problematic history of empires and the many points I have made in previous postings about democracy.

Perhaps it’s time to revive the model of trust territories which the UN applied to a variety of post-imperial countries in the 2d half of the 20C? What if some groups within such a country were to vote to surrender certain aspects of their freedom/independence for a sufficient period (30-40 years?) to let someone build socio-economic stability and foster a national culture? Life is full of trade-offs and tough choices between key values. There’s a lot of “life, liberty, and the pursuit of happiness” being sacrificed on the altar of state sovereignty. We (which is to say the people in those places) may need something different.

Middle Kingdom

4/1/2022

Zhongguó (or Middle Kingdom) is a concept that has been deeply embedded in Chinese political consciousness for centuries. While originally only regional in scope, it has come to embody the sense that China is the center of the world. The revival of China as a global geopolitical and economic power during the past few decades, following a couple of centuries of decline, is one of the remarkable stories of our time. China is acutely aware of its power and its place in the world and has been leveraging that power to secure that place.

The US and others have been slower on the uptake. For decades in the middle of the last century, we were stuck in a Cold War mentality, compounded by a sense of “Western” superiority, reflected in the McCarthy-era question: “Who lost China?” (as if it were ours to lose). Nixon and Kissinger started the “normalization” process in 1972. But it took the economic revival spurred by Deng Xiaoping in the 1970s/80s to really change things. Still, it took more than twenty years following the fall of (European) Communism for President Obama to announce (2011) that US foreign policy would “pivot to Asia.”

That was a bit of a muddled policy, and (followed by the rudderless animosity of the Trump years) it didn’t bring any coherence to how we think about China (as long as they make good and cheap iPhones).

All of this is prefatory to a couple of observations about China’s place in the world in the 21C:

First, China was an empire for thousands of years and, just like Russia (whose empire was recast under 75 years of Communist/republican governmental structure), China continues as an empire even while spouting Marxist/Leninist/Maoist/Xi-ist theory about republicanism and the power of the “people.” While over 90% of the population remains Han Chinese, that still leaves well over 100 million people from other ethno-linguistic groups within its borders. As Tibetans and Uighurs (among others) can attest, the People’s Republic has been brutal in its efforts to marginalize these other peoples, cultures, and religions and force them into compliance. In this regard, China is following a pretty standard program of domestication and cultural assimilation (often forcible) practiced by other great land empires (e.g., Russia, the US, Nazi Germany).

Second, China is still sorting out the nature of its global “informal” empire. Classic European empires (e.g., Spain, Britain) profited by extracting raw materials from dominated areas and selling processed/manufactured goods and services back to the world. The Soviet Empire of the 20C reversed this model by shipping its own raw materials to Central Europe for processing and re-importing the finished products.

More generally, there is no question but that China has been constructing its own global informal empire, most notably through its “Belt-and-Road” initiative. They have been buying and building ports, factories and mines around the world in order to facilitate the flow of raw materials in and finished products out across Asia and extensively in Africa. They even started a project to build a new canal in Nicaragua to compete with the Panama Canal. It’s not clear that they studied and learned the lessons on how to do this from the (pretty successful) British experience, since many of these investments have proved to be dead ends or have engendered resentment from these new quasi-colonies.

However, one of the most interesting juxtapositions of recent geopolitical history has been the rise of China and the decline of Russia. Russia has lots of natural resources, a need to protect its Asian flank, and a shared desire with China to constrain US global hegemony. China has capital, booming technology, and a sense of energy, all of which Russia lacks. If “the enemy of my enemy is my friend,” then this is a relationship worth watching. After the arrogance of the Stalin/Khrushchev/Brezhnev years towards Mao’s China, it would be a delicious irony to see Russia as the “jewel in the crown” of a 21C Chinese empire.

More immediately, Chinese expansionism (Hong Kong, Taiwan, most of the South China Sea) reflects the reality of actual global military/economic power. In terms of its regional aspirations, there is really no one to counter Chinese desires. But things are more complicated further afield. Even with modern technologies, there are real limits on the ability of China to project its power. Even with a population four times that of the US and three times that of the EU (although likely to be passed by India in five years), and a (so far) dynamic economy, the likelihood of a multi-centric world for as far out as we can reasonably plan remains quite high. China’s Zhongguó vision works at the East Asian level, but not globally, and this limitation will be difficult for the Chinese to accept.

A final, historical note: the Chinese plausibly trace their lineage as a coherent political entity back for thousands of years, longer than any other group in human history. That’s not to say that there weren’t many interregna, invasions, take-overs by Mongols and Manchurians, wars, and revolutions along the way. In other words, there were long periods when China wasn’t a coherent entity, much less the center of the world. In addition, the track record of multi-national empires isn’t great from our perspective in the modern/techno/global 21C. As a result, there are no clear “lessons from history” here with regard to the path or success of the current Chinese Empire, ancient ideologies/myths notwithstanding.


Little Brother is Watching

3/25/2022

George Orwell crystallized the image of the omnipresent state with his 1948 creation of “Big Brother” in 1984. Since then, amazing advances in technology have made it clear that our privacy is more a function of the resources and priorities of those who would watch than of any legal/social constraints. In other words, if the NSA wanted to watch you, they could and they would.

Movies and video ads make clear that minute cameras and microphones on increasingly smaller drones can get to lots of places and see/hear/record events. Who can doubt that a large percentage of people who eagerly live much of their lives with ear buds virtually glued in place will also quickly move to technologies embedded in various nooks and crannies of their heads, for both input and output?

We’re already well along. Police body-cams and increasingly ordinary video cameras in public spaces  have exponentially increased the amount of “film” available to law enforcement. Google glasses and virtual/augmented reality headgear are bringing these kinds of capabilities even closer to our brains, seeing what we see and hearing what we hear. The “internet-of-things” will ride on ubiquitous Wi-Fi and other networks to send that information to any data warehouse instantly. AIs will enable sorting and retrieval of particular incidents.

Still, my concern today is not with the State crushing personal privacy aided by the onslaught of technology. After all, most of us have precious little that would be of interest to spymasters or even to Mark Zuckerberg and other would-be moguls of our techno-future. Rather, I’d like to consider the implications on a smaller scale: How will the presence of (seemingly) ubiquitous recording devices affect day-to-day interactions in stores and offices, comings-and-goings in homes and cars, and conversations with families and friends?

Historians are quite aware of the fallacies of memory. Indeed, anyone who has ever played a round of the “telephone game” knows that even instant repetition is fraught. Memories suffer not only from wholly unintentional omissions and limited perspectives (increasing over time), but also from unconscious desires to construct a friendlier, more self-supporting, and more coherent past.

Now (or in a not-very-distant future), we will be able to reach that defining goal of modern history: to find out what "actually happened.”
  • Did little Kim start the fight with their sibling Robin over the use of some critical toy or vice-versa?
  • Did a certain spouse wink at an ‘ex’ at a recent party as was later debated in the car on the ride home?
  • What was the expression on Rosemary Woods’ face when she famously erased 18 minutes of the Nixon tapes?

Will family life become more peaceful? Will law enforcement become more mundane?

I suspect that each of us often takes refuge in the ambiguity and irretrievability of the precise past. A little fudging has undoubtedly preserved many relationships, not least our relationships with ourselves. We may find out whether, as Jack Nicholson’s character posited in A Few Good Men, “you can’t handle the truth.”

Are we ready for the “truth”? How will the awareness of an “objective” record affect how we behave? Will little Kim stop picking fights with Robin? Will cocktail party flirting cease? If the case of George Floyd is any indication, it has taken some time for body cameras to change police habits and patterns. Likely, for ordinary folks in ordinary situations, it will take longer. But perhaps we will internalize the presence of this cyber Jiminy Cricket who, perched on our shoulder, regularly reminds us to “let your conscience be your guide.”

From another perspective, the ubiquity and constant nature of such surveillance might well further modify how we record and process information. Why take notes in class or meetings when you can easily call up an actual recording (or have an AI do it for you)? Fascinating work has been done on the impact of writing on the social practices of memory, which became less central to accessing the past. It’s easy to imagine that we won’t tax our organic memories nearly as much as techno-memories become more commonplace. The rap against older people, that their memories are failing, may acquire less resonance, since the quality of their internal neural network won’t matter as much.

For historians, one of the fundamental challenges of research has been finding enough original material to try to discern what “actually happened” in the situation under study. Did anyone actually argue with Napoleon about launching his ill-fated adventure into Russia? Who actually stabbed Caesar? Who was the first to make a salad named after him? The problems generally get worse the further back we go. In the electronic age, historians are already getting swamped with thousands of emails and, soon, hours of video. We will have to turn to AI historians to parse through all the material (there aren’t enough underpaid grad students to throw at this).

It may be that, as with many aspects of human social/psychological adaptation, making any of these fundamental behavioral adjustments will take generations. The transitions, especially inter-generational, promise to be somewhere between “interesting” and problematic. In any event, this revolution will be televised.



The Limits to Growth

3/18/2022

God tells us, according to Genesis 1:28, to “be fruitful and multiply; fill the earth and subdue it.”
Well, we’ve done a hell of a job ever since, especially lately. Indeed, one might say that growth is a deeply-rooted human epistemology. For a long time, it was a nice concept with few practical consequences. It’s now driving us off a cliff.

For millennia, families would have lots of kids (far more than the “replacement rate”). Land was cheap, many children died very early, and who didn’t want progeny to carry on the family name and perpetuate the family genes?

What we now call the “Anthropocene era,” the geological moniker for the period of time in which human intervention into natural processes became significant, began roughly 400 years ago. Since then, human population has soared (see my piece on “Pop Culture”).

Also about that time, Europeans began to recognize the relationship between the number of people in a country and its military prowess. Ditto for economic strength. And countries began to consciously foster both, in order to enhance their national/regal/imperial glory. [This awareness and policy may have happened elsewhere as well, but I haven’t done the research.] Growth thereby became a matter of inter-national competition.

By the 19C, the rise of the science of economics and the political pressures of democracy added further rationales for growth: profits and domestic peace. The profit part was pretty straightforward: bigger groups could mine and grow more stuff, and then make and sell more things; and the owners/investors could build nicer houses. Competition between countries spurred (European) imperialism and the take-over of most of the world in one way or another.

The democracy connection is a bit more subtle. As ordinary folks began to be more aware of their political power in the aftermath of the French and American revolutions, ruling classes (who (unamazingly) combined political and economic power) had all sorts of reasons for maintaining their position/privileges. Growth as a doctrine answered multiple needs. It provided a basis for more/bigger activities whose profitability was directly beneficial to those groups and provided a public theory of opportunity for those well down the socio-economic ladder.

Since (the implicit theory ran) no one would (should!) expect the rich to have less, the only other way for the “rest” to get more was for economic growth to produce more wealth and thereby provide a means by which the “rest” could advance their standards of living (even if those new comforts always trailed the luxuries of that era’s “1%”). It’s sort of a macro-economic version of “trickle-down” economics. Other than its disingenuity, its exploitation of most of humanity, and its on-going inequality at both the national and international levels, this “growth” thing looked like it was just the ticket to wealth, domestic peace, and a stable international structure.

But, as with most good things, they can be overdone: one glass of champagne is divine, the second is excellent, the third is fine, and after that, it’s a blur.

Growth, both in terms of population and economics, is running into a wall. The planet can’t support the amount of economic activity necessary for all 8 billion of us to have the standard of living “enjoyed” by those at the US poverty level (currently about $13k/person) without those at the top getting a haircut. For everyone to live at current average rich-country standards, we would need to “grow” the world economy to more than twice its current size. Needless to say, in the midst of a slight constraint on planetary output imposed by global warming and environmental disruption, this isn’t likely. [Just to do the math: 8B people × OECD average income of $30k = $240T; current global GDP = $94T.]
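
To make the bracketed arithmetic explicit, here is a minimal back-of-envelope sketch in Python (the figures are the post’s rough round numbers, assumptions rather than precise statistics):

    # Rough check of the numbers above (the post's assumptions, not precise data)
    population = 8_000_000_000        # ~8 billion people
    oecd_avg_income = 30_000          # ~$30k per person per year (rough OECD average)
    current_world_gdp = 94e12         # ~$94 trillion current global GDP

    required_gdp = population * oecd_avg_income           # $240 trillion
    multiple = required_gdp / current_world_gdp           # ~2.6x

    print(f"Required world GDP: ${required_gdp / 1e12:.0f}T")
    print(f"Current world GDP:  ${current_world_gdp / 1e12:.0f}T")
    print(f"Implied growth:     {multiple:.1f}x today's output")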

If our current trajectory isn’t tenable, then we need to figure out a new way of thinking about wealth, growth, and economics in general. Early in the 20C, the Bauhaus architect Mies van der Rohe said “less is more” as the justification for a minimalist aesthetic. Modern minimalism was the result. There’s something to be said for a simpler line in design (though it, too, can be taken too far). Various groups (the Amish, hippies, Marie Kondo) have urged simpler lifestyles over the years. Malthus (1798) warned that there were real limits on the population of human societies. Many of us remember Paul Ehrlich’s apocalyptic (and a bit overhyped) 1968 warning, “The Population Bomb.” In 1972, the Club of Rome famously published a more sober precatory report on the modern global economy entitled “The Limits to Growth.” All have been downplayed or “pooh-pooh’ed” by conventional thinking. The consequences of the necessary change are fearful to contemplate. It’s difficult to conceive of most people I know voluntarily living on $30k (the OECD average), much less the US poverty level ($13k). So, we have all implicitly bought into this “growth” and inequality model.

Stanford historian Walter Scheidel has persuasively argued that the only way economic inequality has been alleviated on a broad level is by disasters, especially wars. This makes sense, since the rich have the most to lose and, in extremis, their property is less protected. This historical perspective is not pretty, but it reinforces the attractiveness of the growth epistemology as well as hinting at the likely outcome of the upcoming climate disaster.

In sum, there are only a few levers available to balance the global equation: 1) fewer people, 2) lower standards of living, 3) reduced economic inequality, or 4) technological solutions that increase the economic capacity of our planet to support our species. The debate about what share of the burden each of these factors will bear will underpin the geopolitics of the 21C, as well as decisions at societal and personal levels.

I was once given the sage advice: “Never confuse your net worth with your self-worth.”  It’s good counsel. If we give our species a chance at a choice over the next few decades, we will have to sort this out.


This Means War

3/11/2022

The startling re-emergence of war in the consciousness of Western elites over the past few weeks has occasioned all sorts of sympathy and support for the Ukrainians, who are fighting and killing on behalf of ideas—freedom, independence—that the readers of this series presumably share.

This particular conflict has seen some well-established military tropes, including tank columns (prominent since WWII) and guerrilla conflict (active since the early 19C). It has also seen some emerging techno-tactics, such as drone strikes and cyber warfare, and the notable application of economic “sanctions.”

These last two are distinctive in the “annals of warfare” because they don’t involve killing/wounding/capturing the enemy, but are directed at the civilian population and the communications and logistical infrastructure of the military. Still, while the technologies involved might be quite 21C, the principles of leveraging something other than swords and guns to coerce/defeat some other country are hardly new.

One of the first major foreign policy crises of the young American republic in the 1790s and 1800s required the US to navigate between its former foe and ally as Britain and France engaged in a series of conflicts known as the French Revolutionary and Napoleonic Wars. President Jefferson imposed an embargo on trade with Britain in 1807. Napoleon had already worked to construct the “Continental System” under which mainland Europe (at least the parts that he controlled) would also cut off trade with Britain. All this didn’t work out great for the US, and our unpleasant War of 1812 resulted.

Of course, the use of blockades to prevent supplies from reaching embattled ports had been a well-established mechanism of war since the ancient Greeks, and blockades were a regular part of warfare from the 18C on. One of the reasons we know this is that the subject was regularly addressed in early modern treatises on international law (Hugo Grotius’s work of 1625 being the most famous). This was all part of an effort to think through the nature of war and peace and establish rules of “civilized” behavior for both situations.

In a world where all economic relations were physical, physical blockades (with ships intercepting other countries’ shipping) or military sieges were the sensible means of depriving the “enemy” of economic support. Now, in a world of services and data and digital money, we have “sanctions”: formal seizure of assets and denial of licenses to do business with the enemy country and its citizens, including access to shared systems of commerce and finance such as the SWIFT banking system. By one count, sanctions have been imposed over 1100 times by various countries since WWII.

All well and good, but the imposition of these sanctions and the concurrent (and usually secret) use of cyber attacks (again, non-physical disruptions of enemy property and systems) raises the question of whether they constitute “war.” As with most concepts, “war” can mean many things. Political scientists have compiled long lists of events of mass violence according to certain criteria, in order to demonstrate certain patterns of war in history. For most folks, looking back in history is more visceral, marked by big, deadly, or long-lasting military conflicts with familiar names; often landmarks in national, regional, or global history.

In this context, we can also note that the long-term effort of “international lawyers” to demarcate war from peace is rather artificial. The effort to impose rationality and humanity on states’ exercise of power is a testament to either a noble perseverance to improve the human condition or futility.

The same is true in the vague treatment of war in the US constitutional system, where Congress is given the sole power to “declare” war, but no one doubts the effective power of the President to do all sorts of things (physical, economic, electronic) that attack/harm an “enemy” or protect the country. This well-fudged line has plenty of precedents, including (just from recent US history): the “police action” in Korea,  the Gulf of Tonkin resolution (1964), and the authorizations for US military actions in Afghanistan and Iraq early in the 21C.

The Russian invasion of Ukraine certainly meets all the usual criteria. However, whether the actions of the US and its allies also constitute “war” remains to be seen. Putin warns the West that its current steps “threaten” war, raising the spectre of nuclear retaliation. NATO makes clear that its core principle of mutual self-defense (Article 5) has not been triggered since Ukraine is not a member; but since virtually all NATO/EU states have been shipping weapons to the Ukrainian Army, providing intelligence, and imposing a wide array of economic sanctions on Russia, it’s not clear exactly why we’re not already “at war.” So far, it hasn’t behooved Putin to invoke the term, but otherwise, “a rose is a rose is a rose.”

Looking back at the Afghanistan and Iraq “wars,” we can see the presence of US/NATO troops and the full range of overt military activities and structures. I guess we were at war, even without an official Congressional declaration. Both places also saw a wide array of non-official “war” activities (via the CIA, various contractors (mercenaries), etc.). The “war on terror,” of which the Afghan conflict was at least nominally a part, highlights that in the modern world, non-state actors can trigger or become the recipients of state military action.

This line has many antecedents, in the form of guerrilla actions over the centuries. Indeed, the official nature of “war” is closely tied to the rise of the “state” and its claim to monopolize the use of force in society (both domestic and international). War became an activity in which only states could engage; other military actions or, indeed, all non-military actions were defined (with help from the international lawyers) as outside the meaning of “war.”

We can only hope that, by whatever name, the killing stops and this situation gets no closer to whatever definition of “war” leads to more death and destruction.


    Condemned to Repeat It --
    Musings on history, society, and the world.

    I don't actually agree with Santayana's famous quote, but here is my contribution, my version of it: "Anyone who hears Santayana's quote is condemned to repeat it."
