Steve Harris

A Crook

5/31/2024

0 Comments

 
It's hard to hold Richard Nixon up as a paragon of moral behavior. But pretty much everything in history comes down to "compared to what?" and we now have Nixon bracketed among the 45 people who have ever served as the chief magistrate of the country. His champions must be relieved that he is no longer the limiting case of bad actors among Presidents.

Presidents are, in the end, only people and have committed a multitude of sins both in and out of office. Few—beyond Jimmy Carter—have been willing to acknowledge their shortcomings. Large egos and a keen eye on their legacies have minimized their time in the confessional. There has been denial and dissimulation. Nixon famously proclaimed, in the midst of the Watergate scandal, "I am not a crook." But he was. Clinton famously asserted that he "did not have sexual relations with that woman" [Monica Lewinsky]. But he did. Pretty much everyone has cut corners on policy matters. Promises made during campaigns proved too dear to keep once in office, and all manner of disingenuity resulted. Some were pretty scurrilous on a number of grounds (LBJ comes to mind, as do several between J.Q. Adams and Lincoln).

Indeed, it is one of the shortcomings of the American electorate that we construct myths of perfection around our political leaders and then declare (as Captain Renault does in Casablanca), "I'm shocked, shocked to find that gambling is going on in here!" But popular sanctimony is not the central issue here. So, I am not arguing for the necessity of sainthood in politics.

On the other hand, there is a thing called "conscience" or, as Jefferson wrote in the Declaration, "a decent respect to the opinions of mankind." There is considerable evidence that many Presidents had one and wrestled with intense moral conflicts. It is notable that perhaps even our most admired Presidents felt the weight of their decisions. Washington and Lincoln come most immediately to mind (perhaps it is because of this that they are held in particularly high regard).

As we know from just looking around at our fellows—family members, co-workers, business connections, “friends,”—not everyone has a conscience; or at least many don’t take theirs out for daily exercise. Self-indulgence, rationalization, and the full range of the seven deadly sins are all powerful ammunition. Acting with integrity—all the time—is tough. Power and pride are a particularly dangerous combination; especially at the Presidential level. So, we have plenty of reason to be wary in general.

Nixon's proclamation of innocence kept him afloat for nine months. Then, he had enough political clarity and enough integrity to realize that the jig was up. The fiftieth anniversary of his resignation comes this summer. In fact, he was a "crook," even if he denied it and was pardoned. But he had enough character to leave the stage. I'm not saying it was a lot; but he had something.

We are now faced with another Presidential crook. Duly tried and convicted. He may appeal. He may never see the inside of a jail cell. So far, however, there's no indication of conscience. No sense of easing himself off the stage. We may, as a result, see a felon in the White House, which says more—much more—about the state of the nation than about the character of He Who Shall Not Be Named. Suddenly, Tricky Dick doesn't look so bad.

0 Comments

The Meaning of Truth

5/24/2024

0 Comments

 
I’ve been reading in some curious corners of history lately: controversies in medieval and early modern Christianity and their implications for the modern world. One thing I’ve noticed in several places, even in works by thoughtful and considered writers, is their looseness with the truth. By this I don’t mean that they’re historically sloppy, glib, or generally prevaricating; rather, their conceptualization of “truth” needs to be tightened up.

Take, for example, the ideas of "Christian truth" and "scientific truth." They're not the same thing. They're great illustrations of the fact that faith and knowledge are two different realms, that their adherents speak, in effect, two different languages, even though some words appear in both. ("Son" means male child in English, "they are" in Spanish, "his" or "hers" (depending on the gender of the noun possessed) in French, and "the end" in Turkish.) So, even when we hear two English speakers use the word "truth," we have to figure out what they mean. If we just assume that a person of faith and a person of science mean the same thing, we're headed for semantic confusion. And, in the case of this word and these two cultures/languages, we're headed for a pretty important divergence in understanding epistemology and history.

Historians, of course, are acutely aware that meanings change over time, and I caution my students that "democracy" resonated quite differently in the 17C than it did in the 20C. "Liberalism," "enthusiasm," and "physician" are just a sampling of such terms.

"Truth," on the other hand, seems such a fundamental concept that it might be taken as constant; but it's not. (Nor, for that matter, is "fact," whose ordinary modern meaning arose only in the 19C.)

"Scientific" truth carries two essential, if implicit, conditions: first, what we all know as scientific progress, and second, what we all know of scientific method. The first says that science is tentative, that its statements and findings are "true" only insofar as we have figured things out (to the best of our ability) so far. Newton may have cracked the laws of motion, and optics, and calculus, but he believed in alchemy and the ether. Darwin figured out evolution by variation and natural selection, but he had no idea about genes or DNA. Einstein proposed the theories of relativity, but never made his peace with quantum mechanics. These are not criticisms, merely reflections of the inherently incremental nature of scientific discovery. What is stated is the best we know so far.

The second constraint is that science only accepts as truth that which is demonstrated and replicable. Ideas and intuition are great, but they're not "true" until demonstrated experimentally in the real world. Thus, we come to "metaphysics," which (as the word says) is "beyond" physics. That is, pretty much by definition, it can't be proven by science, and so it can't be "true" in the same way gravity seems to be. In our modern world, this consigns whatever lies outside science to a diminished limbo, unavailable for truth and classified as merely "belief." However, if we don't postulate science as the essential standard, then the line between physics and metaphysics can be seen instead as a limitation on science, confining it to those areas which are demonstrable.

We moderns are, for the most part, so deeply imbued with this outlook that even raising such questions looks like nonsense. There are two reasons, one epistemological and one historical, why such a dismissal is problematic.

The epistemological frame asks how and why we are sure that science is the definition of truth. Pretty much by definition, science cannot address, much less resolve, metaphysical questions. How many angels can dance on the head of a pin? Who is "god"? How many gigawatts of power does God have at hand? (For that matter, does God have hands?) If you start from science, all of this is excluded (i.e., the entire class of religious claims is, per se, not true). If you start elsewhere—in metaphysics or faith—the questions are different. Such a stance allows for—indeed, demands—the creation of a space beyond science, beyond "knowledge" (which, after all, is just a descriptor of scientific output). It is limitless, non-rational, and non-sensical (i.e., outside our senses). Science says everything is (sooner or later) provable or disprovable; faith says: no, your "facts" and your "methods" are just projections (see Plato's Cave); they are fine as far as they go, and even if they have gone far and seemingly squeezed God into a corner, behind genetics and quarks and dark matter, they have only made a minute dent in the infinite. In other words, at root, science is just another term for disbelief in the truth of faith. Of course, the same disbelief runs among faiths themselves: the very divergence of religious views and interpretations (whether Christian or otherwise) seems to undermine each one's claim to truth on its own terms.

The historical basis for doubting the power of science finds its roots in the politics of church and state in medieval and early modern Europe (1100-1800), a politics which was absorbing and distracting and which pushed such issues to the side. The nominal certainty of scientific method, especially as it emerged in 17C England, provided a much-needed respite from the sectarian strife of the Reformation. Its promise of mastery of nature and (implicitly) of power and riches (much of which has furnished our world today) was seductive compared with deferred grace. Its momentum and its (apparent, but questionable) links to technology, the "industrial revolution," and capitalism have become inextricable parts of modernity and of how most moderns think.

I'm a pretty modern guy. I guess I would take the mantle of "skeptical agnostic" when it comes to God. I don't think "atheism" is quite right; after all, how can science say definitively that there's nothing beyond (see Gödel's incompleteness theorems)? It's certainly not "provable." Besides, there are billions of people who claim a worldview based on faith, and if I'm any sort of tolerant, pluralist democrat, I have to allow for them in the world and not claim exclusivity for my scientific approach to "truth." The successes of modern science have bred an unhealthy smugness.

One final note, which combines both history and epistemology, is the observation/suspicion that there is a strong correlation between faith/belief/trust in science on the one hand, and wealth and power on the other. Satisfaction and comfort in this world make it easy to downplay the glories of the next. On the flip side, earthly lives that are, in Hobbes' words, "nasty, brutish, and short" may be more tolerable and sensible if there is something to look forward to. There are, as soldiers say, "no atheists in foxholes." Faith has little presence among intellectual and commercial elites. I'd be curious to see a global study of the correlation between religious faith and socio-economic status.

0 Comments

Wants and Needs

5/17/2024

0 Comments

 
There is an old Sufi teaching story (they're almost always old), the tale of Mushkil Gusha, about a woodcutter who, amidst various travails, is told: "If you need enough, and want little enough, you will have delicious food." When he remembers this, it's a big help amid the turmoil of his life.

This axiom is an important touchstone for me and it has, I believe, pretty wide application to both individuals and our society as a whole.

One way I use this advice is to recall that I have appetites and desires, often driven by unconscious motivations and memories. They're easy to get lost in, and surrendering myself to them is momentarily fulfilling: another slice of cake, another hour of games or TV, more sex, more opportunities to be applauded by others who can't help but recognize my genius... (you get the idea). Which of these wants/needs are real and which are ephemeral? If I can answer that question, I will get closer to wanting "little enough."

On the other hand, I have a long and deep history of not taking care of myself, of deferring and demurring, of giving in to others (whether friendly or demanding or asleep). In other words, in these ways, I don't "need enough": I starve myself (of food, of opportunities for fun and growth) and give up "my" time, with the result that I can be silently resentful and unhappy.

There is a fine (for me, as yet unaccomplished) art of figuring out the right balance: when to say yes and when to say no. If I can find it, so Mushkil Gusha tells me, I will have the right amounts of self in the world or, in other words, "delicious food." Most spiritual traditions have some version of this principle; Western philosophy urges me to "know myself"; Buddhists say that attachment to the world and things is the road to suffering.

The same is true at a macro/societal/global level. There are signs all around that we have succumbed to appetite. Genetically driven accumulation of fats and sugars has produced a global population of one billion (+/-) physically obese people. It's a dire situation even before we get to the moral dimension: the presence of hundreds of millions who are starving.

Capitalism (i.e., the epistemology of viewing the world through money and markets) has leveraged the perennial human foible of greed to the point where the personal accumulation of economic resources runs so far beyond the needs of individuals (even assessed at a level of considerable luxury) that it's mind-boggling. There are about 8B people in the world. Estimates of total global wealth run north of $600T, or an average of about $75,000 per person (the US figure is well over $500,000 per person). The differentials are not just a few Bezos-sized yachts, or even five-figure Birkin bags. This is the foundation of claims for social justice in this country and in the world, and a good reason to tax inheritances at a high rate. It is also a framework within which to view much of international economics and politics, not to mention the history of Western domination of the world across the 19/20Cs. The wants/needs differential, when accumulated across global populations, produces its own set of dysfunctions.
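(For the arithmetic-minded, the per-person average is just the rough division of the two estimates already cited, roughly $600T in total wealth spread over roughly 8B people:)

\[
\frac{\$600 \times 10^{12}}{8 \times 10^{9}\ \text{people}} \approx \$75{,}000\ \text{per person}
\]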

But top-down policy perspectives will only get us so far. There is no reason I have to wait for global socialism to make moves in the right direction.

The combination of media hype and technological acceleration has only exacerbated the situation. The speaker in my computer (to which I am listening now) is far better than the ones in the various component stereo systems I had "back in the day." It is, however, not "state-of-the-art." Do I "need" an increment more of high fidelity (especially at the cost of some cash, more cables/plugs, and more things to keep track of)? The frenzy of acquisition around the "iPhone du jour" echoes the middle-class practice in mid-20C Detroit (where I grew up) of trading in one's car every two or three years in order to get the latest model. I feel the same about "fashion" and the fetish of food endemic here in the Bay Area but also widespread wherever folks with "more money than sense" gather. Recognizing my own proclivities in this regard is the first step; making conscious choices is the next. Smug (usually implicit) self-indulgence is my enemy here.

The flip-side is self-righteous denial. I have to be wary of that part of my personality which thinks I'm not "good enough" to deserve whatever bounty or benefit is currently at hand. It's been easy for me to fall into the trap of over-rationalization and self-deprivation: to count pennies and lose sight of the scope and opportunities of my own life, without thereby solving any of the world's problems.

I don’t need to don a simple robe and live in a monastery. I enjoy a good steak, but beef sucks up a huge amount of the global water supply. Living a luxurious life is delightful and comfortable and fun. It also makes it more difficult to be aware of those who can only spend in a year what it costs for two people to dine at one of SF’s top restaurants.

I’m still working on the psychological sources of my attitudes, but in the meantime, I have to sail between Scylla and Charybdis. The only way through is calmly considering the choices I make. Mindfulness is not just a matter of meditation, but of day-to-day decisions on how to spend my time and money and attention.

What to emphasize, then, in my life: charitable giving will make a dent, as will doing good in the world. Human connection, intellectual sparks, and a sense of accomplishment are, for me, the delicious food.

0 Comments

Stalingrad

5/10/2024

1 Comment

 

I've taught 20C European history a bunch of times and figuring out how to frame the two "hot" world wars always presents a challenge. Students expect some coverage of each war, but I'm much more interested in their cultural and epistemological effects than in tanks and tactics. While WWI was nominally global, almost all the military action occurred in and around Europe. WWII arguably started and finished in East Asia, with European military action confined to 1939-45.

Even so, I think it's useful, especially for US students, to consider the European War in six phases: 1) Germany & Russia vs. Poland (1939); 2) Germany vs. Denmark, Norway, the Netherlands, Belgium, and France (+ the Battle of Britain) (1940); 3) Germany vs. Russia (1941-43); 4) Russia vs. Germany (1943-45); 5) the US and Britain vs. Germany and Italy in the Mediterranean (1943-44); and 6) the US and Britain vs. Germany in Western Europe (1944-45).

We in the US have a pretty intense mythology around the European War, centered on D-Day and the triumph of American arms. The best-known recent video treatment is "Band of Brothers" (produced by Spielberg and Hanks), which is quite good. But it is too easy for us to fall into the trap of thinking that WE won the war, defeating the evil Nazis and their Italian henchmen. Yeah, we had some help from Churchill and the Brits, and a little from the French, and—oh, yeah—the Russians did their bit, but they're kind of coarse, and besides, they're Commies. So, as is too often the case, our history makes it seem that it's all about us. However, a pretty good case can be made that in terms of sacrifice, firepower, and strategic impact, there would have been no V-E Day without Russia/the Soviet Union. The pivot on the "Eastern Front" was the Battle of Stalingrad (1942-43).

Stalingrad (known until 1925 as Tsaritsyn and since 1961 as Volgograd) is a city about 500 miles SE of Moscow that now has more than a million people. When the Germans attacked the Soviet Union in 1941, they swept eastward, reaching the fringes of Leningrad (now, as before, St. Petersburg) in the north and nearing Moscow in the center of the country. In the south, they scooped up all of Ukraine and, in the summer offensive of 1942, drove on until they were finally stopped at Stalingrad.

The battle in and around the city lasted about seven months and was noted for its intensity. It remains the exemplar of urban warfare in the modern world. The Germans were already strategically overextended and proved unable to sustain the projection of massive military power 1,400 miles from Berlin. Several Axis armies were surrounded and surrendered, and the long campaign to reach and destroy the entirety of the Nazi war machine was launched. Sixteen months later, the Western Allies landed in France, and the European war was concluded in another eleven months.

Throughout the ebb and flow of the battles on the Eastern Front, both Germans and Russians were acutely aware of the historical echoes of the previous mass invasion of Russia: by Napoleon and his Grande Armée in 1812. Napoleon was defeated by poor French planning, smart Russian strategy, and a brutal winter rather than by superior force of arms. In many ways, Hitler fell into the same traps—the rhyming of history.

All of this was brought back to me by a recent reference to an underappreciated Russian novel: Vasily Grossman's "Stalingrad," which was written in the aftermath of the war and which is now available in a new (2019) translation. I'm only partly through the massive (900+ pages) book, but so far, it lives up to its billing. Much of the critical respect for the book draws on a pair of parallels: Napoleon's 1812 expedition and Hitler's 1941-43 invasion; Tolstoy's "War and Peace" and Grossman's saga. Tolstoy's epic chronicles the impact of Napoleon's invasion on a set of Russian families, and Grossman (fully aware of the precedent) does the same for his 20C story. While Grossman's book is large, it is only the first half of a saga (continued in "Life and Fate") which, taken together, outweighs "War and Peace."

Both bring a sharply observed sense of the individuals involved (the fictional equivalent of the anthropological "thick description" that gives the reader access to psychological and sociological understandings of the subjects). Both combine fictional history (although Grossman was writing current events and Tolstoy was writing from fifty-plus years' distance) with philosophical reflections on the nature of war, family, and history. Both are self-sufficient as door-stops.

Big books aren't to everyone's taste. I rather appreciate the opportunity for deep immersion and have a decent collection in this distinctive genre, both in fiction and in history. Perhaps it is a certain fatalism and the long, dark winters that make the Russians particularly prone to write them (see, e.g., Solzhenitsyn's Gulag Archipelago).

Such a novel brings an important perspective to the usual historical treatment of a major geopolitical event such as the Battle of Stalingrad. We have to bear both the macro and the micro in mind. The travails of the people in Grossman's story anchor the gloss of the standard historical summary: "the Germans invaded, got beaten, and were pushed back." Such great events are, after all, entirely composed of thousands upon thousands of individual experiences. The death and destruction of war make it hard for historians to capture this level of detail (sources are sparse), and it often falls, therefore, to novelists to fill the gap. Tolstoy did it with a stunning scope of imagination. Grossman, present in the specific time and place, brings more of a reportorial eye. We can learn much from both.

I don’t know when I will teach 20C European history again; but I’m sure my handling of this aspect of the war will be different as a result of Grossman’s deep dive.


1 Comment

Life, the Universe, and Everything

5/3/2024

0 Comments

 

OK, so here’s a deep dive (best to read slowly).

The universe is in the process of evenly distributing the matter and energy unleashed at the Big Bang. When everything is in place, all the usable energy will have been dissipated and the so-called "heat death" will occur. Don't wait up: the number of years involved has so many zeroes that I can't even fit it into a blog posting of 3-4 pages.

Physics is about what happens until then. Biology is about a small subset of that activity called “life.” Psychology, philosophy, and epistemology are about how we and other “intelligent” life forms think about all this. By this accounting, psychology will, therefore, go extinct first, followed by biology, leaving it to physics to (so to speak) turn out the lights.

Entropy is the process by which everything (starting with the basic physical structure of matter, but also including ordinary stuff like desks and apples) is falling apart. It's not just a matter of things rusting and wearing out; gravity is doing most of the work here (at least the part that we can actually see), and other atomic forces will do the rest. Life is not only built on those structures, but embodies a way in which that matter evolves in response to external forces.

In many ways, there's not much difference between things that are "alive" and those that are not. Planets accrete matter via gravity; animals accrete matter via eating. Falling off a cliff will smash a stone as well as a skull. Those globs of matter that are "alive" adapt to and alter their environment. There's no particular "intent" here; those that do something useful and effective will live longer and reproduce more (Darwin). Those globs of matter that are "conscious" (a term to which I will return shortly) actively and "intentionally" try to adapt to and alter their environment.

You can't say it's the "job" of consciousness to fight entropy (since that implies an intent which would have to come from some outside source (aka God)), but it is an important effect. Since we are aware of entropy and the heat death of the universe, we can see that everything we do has a (locally) anti-entropic effect (even stupid and destructive actions). This is the battle of life; one which, on our current understanding, we are destined to lose (such are the basic physical laws of thermodynamics). The ultimate futility (and its more immediate ramification in our own individual deaths) is the "human condition" and the source of most of the psychological stress we know.

However, getting beyond our individual lifespans and looking at our species (and, indeed, at the genus of all "intelligent" species in the universe), it's pretty clear that we're nowhere near understanding what this "heat death" business is all about (much less whether it's actually going to happen). Millennia of metaphysics and religion have postulated frameworks, but none seems satisfactory as yet. Whether you date the beginning of "science" to the Egyptians and Mesopotamians of 5,000 years ago, to the Greeks of 2,600 years ago, or even to Isaac Newton 350 years ago, we have come far in a short time. Certainly, if we have another few millennia, we will look back on the modernity of the 20/21C as being as primitive in its understanding as we think the Babylonians were. So, there is hope (immediate crises notwithstanding) that we will figure out a lot more well before the Sun runs out of hydrogen in five billion years, much less before the Universe checks out (apparently) permanently.

Of course, other than our (highly localized) experience, there’s no reason to think that soft, fleshy carbon-based physical containers are the exclusive mode of life. We are in the process, via computers, robots, and AI, of creating an alternate morphology. The universe is a big place and no one should be surprised if something (that seems to us) bizarre shows up.

Ditto for "consciousness" (a social construct if I ever saw one). Whether AIs achieve this exalted level will depend more on the definitional debate than on their capabilities. We seem reluctant to contemplate the possibility, which is more about humans trying to preserve the last vestiges of their (our!) claims of uniqueness, power, and meaning than it is about whether some electronic/quantum device can pass a Turing test or whether some chimps or dolphins can learn a smattering of language. There would be no small irony if, after such a long time wondering whether "there's anyone out there," it turned out that those "others" were first found here on this planet, likely in some lab in Cambridge or Mountain View.

Consciousness is premised on the creation of an "I" that is distinct from everything else in the world. It may be an illusion, but it's one that has got us to where we are today. As with Newtonian physics, it may not be entirely accurate (thanks to relativity and quantum uncertainty), but it certainly works well enough for all of us who rarely travel at anything near the speed of light. So, too, with consciousness; our habit of envisioning the world as a thing apart may be a fact, as may the uniqueness of our independent creativity (the apparent mark of humanness). But, if not, we can still carry on: loving, writing, painting, enjoying ice cream. Actually, I kind of suspect that Descartes (or, to be more precise, his popular interpreters) got it wrong. Descartes likely meant that self-recognition of the act of thinking presupposed existence. Rather than cogito ergo sum, it may be better to say with Kierkegaard (the 19C Danish philosopher): "I exist, therefore I think." In other words, the ability to recognize self-existence is the first act of thinking.

The AIs of 2024 already seem to be at this point. So, we humans are perhaps not so special (in this regard, check out my 093022 posting “Centric”).

There's an amusing SciFi series called the "Bobiverse" in which friendly, moral AIs in spaceships roam the galaxy indefinitely, long after humans are extinct. Perhaps they will figure out this "heat death" thing. But, like I said: "Don't wait up."




(btw, thanks to Douglas Adams (whose Hitchhiker’s Guide to the Galaxy is 45 years old) for the title)

0 Comments

    Condemned to Repeat It --
    Musings on history, society, and the world.

    I don't actually agree with Santayana's famous quote, but this is my contribution to my version of it: "Anyone who hears Santayana's quote is condemned to repeat it."
