Steve Harris

Condemned to Repeat It

Rights and Wrongs, Roe and Wade

7/1/2022

The leak of the Alito draft opinion in May this year accelerated the inevitable strident debate on the legal treatment of abortion in this country. It was a dress rehearsal for what we are seeing now, but the leak didn’t change anything and the opinion finally released last week got some things right and some things wrong.

What it got right was process and structure. What it got wrong was the impact on millions of people.

The legal status of abortion in the US has been set for almost fifty years by a court decision that had to stretch to find constitutional justification for immunizing women for terminating a pregnancy. It preempted state regulation of abortion prior to fetal viability.

The substance of Roe—the line-drawing and moral balance it struck—makes a great deal of sense to me (and, apparently, to the majority of Americans). The thought that the government should regulate a person’s control of their own body (absent (a) other criminal activity or (b) harm to another person) seems a direct application of the core principle of personal liberty on which our country and most other modern liberal democracies are founded.

At the same time, the thought that the government should allow the death of a fetus is also deeply troubling. Of course there are important issues of viability and “personhood” here. They are further complicated by the advances of medical science, which have been bringing forward the date of fetal viability.

So, we have two sensible, plausible principles in the abstract that run directly into each other. It’s hard to say there is a plain, ethical “right answer.”

This clash of principles is fundamentally complicated by the sex-based difference in who writes and who is affected by such regulations. If both men and women got pregnant, then the tensions noted above would still be present; but sex differences inevitably skew everyone’s analysis. In fact, millennia of patriarchy have led to laws written almost entirely by men and broader social values also articulated almost entirely by men. While this has begun to change over the past 50-ish years, since social inertia is even more embedded than are legal provisions, our traditional worldview is still very much with us.

This is an intractable analytic problem since women will always be the only subject of abortion regulation. Simply stated, there is no objective (sex-neutral) stance on this issue. Men’s conception (so to speak) of the impact of governmental intrusion into an individual’s body can never be as organic as women’s, and it would require a concentrated empathetic effort to approximate it. Getting past the embedded social inertia of patriarchy to do so is difficult indeed.

In any event, ideally this issue of moral balancing should be a matter for legislative action rather than judicial. But even back in the 1970s, the politics of abortion were fraught (although placid by today’s standards). The Court’s intervention effectively took the issue off the political table for decades. This is not to ignore the many efforts at regulation that have been advanced, particularly in the past thirty years. But there was virtually no activity in Congress on this issue, and most people viewed the relevant governing rule to be judicial, not legislative.

I have written before about the problems that result from the disenfranchisement of the political process. It’s not healthy for the body politic. It alienates people from their democracy and makes the judiciary an easy target. It also facilitated reliance on the judiciary and made it easier for those who are “pro-choice” not to pay as much attention to legislative solutions (e.g., codifying Roe) as they might have. This decision has already incited a new activism, but it’s way too early to tell whether the response will have the necessary staying power or breadth of engagement with the wide range of challenges our society faces. Still, the other recent decisions—on the EPA’s powers (and therefore the climate disaster), various voting-related rulings, and gun control—will broaden the group who see a fundamental constitutional problem.

One problem is inertia. Existing rules—whether legislative or judicial in origin—continue along and they are difficult to change, regardless of the will of the “majority.” Our system has become sclerotic and change is difficult. The Roe Court preempted this inertia and jumped to a plausible, sensible compromise solution to the conflict of two sensible principles. “Pro-choice” forces have to come to terms with the fact that there are many in this country who would draw the line elsewhere and that overcoming the vehemence of the “pro-life” forces will be a difficult and perhaps impossible task, but it must be a political task, not a legal one: changing legislatures, changing constitutions are the work of the day (and the decade). Here, as in many issues, table-pounding is of limited use.

Another important angle is that a “conservative” court whose rationale speaks to historical traditions as the premise for law is also enamored of “originalism” in judicial interpretation. There are many problems with originalism, not least that it assumes a clarity of historical interpretation that almost all historians disavow. It is ironic that this decision and the “penumbra” it casts over dozens of heretofore seemingly well-established rights (as pointed out by the dissent) will likely be an important part of the spur for modernizing the country’s laws and constitutional principles. The more things get out of whack with reality (which is the essence of conservatism) the greater the opportunity for some sort of revolution. This decision increases the tension on the social cohesion implicit in a democratic society.

In the short term, I don’t have any simple solution to all this. I think a Roe-ish statute (a la the Collins-Murkowski bill) would make the most sense, but our politics is far too poisoned to get to a sensible compromise. But we’ve got to get to work to see what is possible. More broadly, we have to take on the fundamental test of democracy: getting enough people to care about the nature of their (our) society to write rules and build a culture that is willing to work together.


The Meaning of Fifty.2

6/24/2022

The great fiftieth reunion has come and gone. Back in ’72 we might have predicted that it was going to happen, but had no frigging idea what it meant; indeed, we had little sense of the concept of fifty years at all and no patience to contemplate it. Even incremental reunions along the way seemed to exist on their own, with no apparent connection to prior or subsequent events on the list. Indeed, the most common refrain was the absurdity of it all. For me, this reunion brought home that my self-mythologizing of my life is far more fragile and constructed than I might like to think.

So, before I go too far down that trail, congrats and thanks to those who, over the past 18 months or so, chatted up the idea and then did the unglamorous work of putting together what will likely be a model for other alumni classes in terms of events and esprit de corps. I was glad to be part of the team.

As in the past, we were—entirely justifiably—self-congratulatory. Back then, we knew we lived in unusual times and were given rare opportunities to learn and live with instructors and a campus of extraordinary beauty.

This time, we had about 80 folks present (almost half of those still around), a remarkable showing especially since only about 15-20 still live in the Metro Detroit area. We had gracious hosts from our class for some special evenings, a well-executed program from the School, musical talent, and an outstanding guide to the revival of the city of Detroit (an amazing story of history, demographics, collapse, and innovation).

As a historian, I found that the saga of Detroit’s mid-century power, late-century decline, and recent resurgence illustrates the cyclical nature of many historical phenomena and the futility of prediction based on blithe extension of current trends. In the end, I came away optimistic, having seen the effects of intelligence, inspiration, and effort that stand as a rebuttal to the easy despair of our current national/global situation.

We were fortunate to have one of our teachers with us for the weekend. He (too) was young then (fresh out of college) and was part of the English Department, which gave me great gifts of literature, criticism, creativity, and the discipline of writing. As a professor now, I was especially glad to be able to thank him (as did many others) for his work and to give him a glimpse of his effect on me and the world (which is the secret and rarely-found food of all teachers).

I had noticed, as early as my tenth reunion, that the campus, even if a bit more buffed up than in my day and graced with all manner of new facilities, seemed smaller than when I was bustling through as a teenager. The new buildings and refurbished interiors made clear the distance from then ’til now. We were there, in the words of one school song, “shorter in wind as in memory long,” but it wasn’t ours anymore (even if it had seemed so at the time). The school, the students, the styles were clearly of this time and era, and, in that way, ordinary. Our memories—of different clothing, different music, different sophistication, and different presence—stood out for their difference (not to say ancientness): faded ghosts running down the same hallways.

Of far greater richness was the time—three days—of hugs and back pats, winks and knowing smiles, some grimaces, some tears together. How can words capture the immersion in memory? All the cliches are rampant: in the mind’s eye hair is longer, fuller, darker; faces fresher; steps springier.

Even richer was the chance, if only for an hour, to leave those cliches to the side and have some serious talk and reflection about our state(s) of mind.  I was gratified that about 30 folks came together to push past the glad-handing and make an effort to see ourselves and each other. We have now all returned to our “normal” lives; but I, for one, feel a bit more anchored, both to my past and to my fellows. The time has come, as I said to the group, to “put down the baggage” of the competition and insecurities of that youth. Getting some understanding of the intervening years/lives of all those with whom I worked and played has helped to clear away those ancient burdens and see my self (both then and now) more clearly.

As our discussion ended, I challenged the group: The clock is ticking. There will be fewer of us at the 60th. When we re-gather, let’s each report back: How will I account for the intervening decade (beyond the inevitable physicalities)? How will I leverage what I learned long ago (and since)?

Back in 9th or 10th grade English class, we read and had to memorize Coleridge’s poem “Kubla Khan” (1816) [it was an English-inspired prep school, after all].

I can still recite it (mostly). The closing lines read:

 Could I revive within me
   Her symphony and song,
   To such a deep delight ’twould win me,
That with music loud and long,
I would build that dome in air,
That sunny dome! those caves of ice!
And all who heard should see them there,
And all should cry, Beware! Beware!
His flashing eyes, his floating hair!
Weave a circle round him thrice,
And close your eyes with holy dread
For he on honey-dew hath fed,
And drunk the milk of Paradise.

Part of my mission was to find the boy who was entranced by that language, to reconnect to the energy and the possibilities it splashed across my consciousness; to push the intervening years aside a bit and look forward as I did then.



Games Historians Play

6/17/2022

Just so you know, in the last few days, I’ve been a Jesuit, Cardinal, and Pope; then an English Viscount. IRL, I just got back from Boulder, where I attended a conference of history professors who play games. In class. And get paid for it!

Actually, we spent four days talking about how to teach using historically-embedded live-action role-playing games in history (and other) courses, mostly by practicing playing the games ourselves. This time, I played one game based on the Trial of Galileo (17C Italy) and another based on the Industrial Revolution (19C England).

In the first game, I was Christoph Grienberger, an Austrian Jesuit mathematician in the College of Rome. I played Vatican politics so that I was elected Pope (the first Jewish guy to do so since St. Peter!) and tried to protect Galileo, but the conservatives were unhappy and elected an antipope! In the second game, I was Viscount Melbourne, the aristocratic magistrate of Manchester in 1817-18. I tried to keep the peace between workers and merchants, all the while investing in the newfangled industrial factories. I made a pile of money and ended up with a fine country estate and a splendid house in the city as well. I had to “read the Riot Act” to the disgruntled workers and sentence one troublemaker to be banished to Australia.

For over 20 years, Reacting to the Past has used these games to teach history, critical thinking, and communications skills in classrooms in over 500 colleges across the country. The group now deploys more than 25 published games (including instructor’s manuals, student gamebooks, and individual character roles), and has over 100 games in development addressing situations from pre-history to the 21st century and engaging students with primary sources from dozens of cultures. Through structured debates and the motivating elements of collaboration and competition, students teach themselves and one another about conflicting ideas and motivations from political, social, strategic, and cultural inflection points in history. Check it out here.

It's not “re-enacting,” i.e., replicating history. Students are not obligated to do what “actually happened” in history. Sometimes, Socrates is not convicted by the Athenians. I’ve had Constitutional Conventions where they couldn’t agree on a document and the US stayed under the Articles of Confederation. Sometimes, slavery is abolished; sometimes not. Sometimes, World War I starts “on schedule,” sometimes peace is maintained. Students get to see that individuals matter and that if they don’t speak or vote, their side is more likely to lose.

I’ve been part of this group since 2017 when I stumbled across some of the game books at a publisher’s table at a regular historians’ conference. I was looking for some way to get my students more engaged in their own learning. What I found was a remarkable community of teachers and a way to combine fun with history. I’ve been to a bunch of conferences since and was glad when we re-emerged from COVID to gather again this past week. It was the first time I’ve been in a room with dozens of historians in over two years (oh, we also have teachers of philosophy, politics, economics, communications, rhetoric, literature, etc.).

I use several games regularly in my courses: Athens 403BC and South Africa 1993 feature in my freshman seminar on the history of democracy; Philadelphia 1787 is the focus of my course on US Constitutional history (along with a short game on the 13th Amendment); a game on the diplomatic crisis leading up to WWI was the culmination of my 19C International History course. I have also run games on the English Glorious Revolution in 1689, the French Revolution in 1789, Kentucky’s decision on secession in 1861, as well as some short introductory games.

I really love using this way of teaching (and not (obviously) just because I don’t like to give lectures). Students (most of them at least) read primary sources and argue about the issues of the day. They have to figure out the practical politics of teams and factions and negotiate deals, based on the beliefs and goals that their characters actually had. I sit in the back and watch, guide, and grade. My workload is actually about the same as for a “regular” class, but what I do is different. I’m more of a coach to their learning than the font of wisdom and knowledge that is the usual stance of university professors. More importantly, students come away with a deeper knowledge of some segments of history as compared with whatever they might retain (usually not too much) from lectures, especially in terms of the context and mentality of the time and place we’re studying. They have fun and they learn more (not a coincidence).

As for me, I get to work with authors of the games I am playing and trade tips and ideas with others who are running games. As a recovering lawyer/business guy, I’ve also been able to help the organization improve its structure, management, and strategy.

Overall, Reacting has been great for me as a teacher and as a colleague. Based on many of their comments, it’s been terrific for my students too!


Vox Populi

6/10/2022

It seems that much of what passes for ‘news’ these days is really just reporting on what people (ordinary folks, that is) think, whether via structured polling or person-in-the-street interviews that purport to represent the sensibilities of a broad group. From one perspective—as a tool to predict political or commercial activity—there might be some substance here, but more often, not. When compounded (one media story reporting on what other media are reporting, or on reactions to such reports), I usually have the sense that (per McLuhan) “the medium is the message;” i.e., that all this airtime/ink/electrons are being spilled to create and validate media power rather than actually tell us something. Particularly useless in this regard are the reports of what “the people” think about thus-and-such factual matter (e.g., whether COVID is a bigger threat than the flu or whether China will leverage the Russian debacle in Ukraine), as if public opinion could change (or create) the real world of epidemiology or geopolitics. Just as bad are reports which (in the interests of “fair and balanced” news reporting) offer opinions on “both” sides of some issue, as if the presence of some “controversy” justifies coverage when often it’s merely a tool for the media to hype up the adrenaline in order to get us to watch.

I suppose that this noise is, to some degree, the product of the modern global culture (fetish?) of democracy. It’s not enough that the masses should choose the President, but now they should also tell us what is true or good.

Looking across the world, it’s striking how much the mantra of “public opinion” seems to count, even in countries with considerably less “democracy” than the Western model. Since relatively few countries experience popular revolutions (as compared with military coups), the substantive weight of the “people” in such countries wouldn’t seem to matter much. Does anyone expect marches in Beijing, Moscow, or half of Africa to topple the existing power structure? Popular unrest might prompt insiders to launch a coup, but that ain’t democracy. So, what does it matter? In other words, does “public opinion” (as mediated/defined/invented? by the press) represent anything real, or is it just a construct for the chattering classes/politicos?

From a historical perspective, it is hard to find countries where democracy was more than an aspiration before the 20C. Thus, it has always been intriguing to me to see references to “public opinion” in the 19C, even in Britain or France (which had as much of a claim to democracy as any, even with only small fractions of the population having a vote). So, it’s important to read the phrase “public” as meaning “the powerful outside the formal governmental structure” rather than fall into the anachronistic trap of thinking of the subjects of a Gallup poll or the number of “re-tweets.”

Most modern historians would argue that there wasn’t even such a thing as the “public” until the late 17C (England, France, the Netherlands, the US). The 20C German writer Jürgen Habermas articulated the concept of the “public sphere” to describe the emergence of a group of people who developed and debated opinions on the political and cultural issues of the day, often in salons, newspapers, and newfangled coffeehouses. These discussions took place outside the scope of official deliberations within the royal court, but among an open-ended group of people, beyond a private dinner party; thus: “public.” It’s hard to imagine the birth of modern democracy without such a space, and so the “public” and its opinion became an essential part of the model.

Very often, pre-20C (indeed, pre-WWII) “public opinion” really meant whatever the leading media of the day said it did, sometimes based on their own views, sometimes based on (the very limited range of) the people they talked to. Sometimes, “public opinion” was a shorthand, used by leading public figures to project their own views and clothe them in the garb of popular support (when it really was just the views of three fellows at the club last night). All-in-all, not worth very much except to create the impression of democracy.

Now, we have polls—lots of polls—which tell us with apparent statistical validation what it is that “the people” think. For many reasons, they can be (at best) a coarse diagnostic tool for understanding our society and its politics. Certainly, they provide fodder for breathless news reports (e.g., “exit polls” for those who can’t wait a few hours for the actual election results) or for the apparently more considered question of whether “the people” have a “favorable opinion of______.” Such questions conflate people’s feelings (i.e., whether they are happy about the state of their lives) with something more objective and analytic (i.e., whether the President is effectively addressing the issues of the day or implementing his promised agenda). It’s all part of the commodification/marketing of politics.

The extent to which we now commonly take polls for politics is symptomatic of the superficiality of modern democracy (both a bug and a feature). The specialization/division of labor inherent in periodically-chosen representative democracy is based on the complexity of modern life and the inherently limited education and short attention span of most of the electorate. At best, polls can only provide a directional indication for policy (another reason why California’s law-by-referendum process is so cockamamie).

“Public opinion” is a great concept—in theory. But it is redolent, in Shakespeare’s phrasing, of “sound and fury, signifying nothing”; or, to paraphrase the famous line about the effectiveness of advertising: “half of it is meaningful; we just don’t know which half.” It provides news filler, rationalizations, and a substitute for sound thinking. Whenever someone cites it to persuade you of something, make sure your wallet is secure.


Sound Tracks

6/3/2022

Growing up in the era of Motown and the explosion of rock-n-roll, it’s hard for me to imagine young people having an affinity for the music of my parents’ era. Sinatra, Streisand, Mitch Miller, etc. seemed awfully ‘white bread’ compared to the rhythm of the Four Tops or the juice of The Who. While I now have some appreciation for the standards of the American Song Book (mostly from watching my wife perform much of it in the last twenty years), back then I found it b-o-r-i-n-g. And, while I’m not sure that (even now) I can articulate what “Stairway to Heaven” or “Sgt. Pepper” were all about, their meaning and their power were evident to me at the time (preferably at loud volume).

I pretty much disconnected from pop/rock music in the ’80s and have shifted to classical (especially baroque) and jazz, but I have observed that one of the ways in which late 20C/early 21C culture differs from ‘back-in-the-day’ is the taste of young people. Naturally, their tastes run primarily to the popular music of their day, especially various flavors of rap. It’s not my style, but what is remarkable is their affinity for the music of my era. What are my friends’ kids doing lining up for a Stones concert, or going to see the Eagles’ (umpteenth) reunion tour??? Why are they downloading tracks from Santana or Stevie Wonder onto their iPhones?

There is part of me that would like to think that the popular culture of my era was distinctively great, with universal appeal that transcends generations. The first era of rock-n-roll (i.e., from 1956 up to Disco) was awesome, but I’m sure much of my opinion is solipsistic. Actually, however, I suspect that there is something else going on here. It has to do with the infamous “generation gap” and the evolving nature of mass culture in an increasingly technology-driven media environment.

For one thing, culture (especially music) is much more easily available today than 60 years ago when getting a record player or an “8-track” was a big deal. This has been true since CDs (the fax-like music technology of the late 20C) and even cassettes made it simple to get your hands on and swap the latest tunes. Making the “back catalog” available was low-hanging fruit for the music companies (and now, with streaming, even more so).

Second, we “owned” our music in a way that differed from our parents’ relationship with their music. Part of this was due to the inherent rebellion of Rock and its tight generational affiliation. We were proud of our music in a way I don’t think our parents ever were. It’s not surprising, therefore, that we were prone to push it at the next generation. Plus, the economic and cultural power of us boomers made it harder to avoid our tastes and memories than those of earlier generations. We can see this in the innumerable (and often horribly-named) tribute bands reviving the “Golden Oldies” of yesteryear; not to mention hearing “Layla” on Muzak.

Third, the evolutionary nature of popular music since the ‘60s made our sales job easier; i.e., our music is closer to our kids’ music than our parents’ music was to ours (e.g., in terms of volume or rhythm or angst). So, it’s a much shorter bridge and we were more willing to cross it and to bring our kids over to our stuff (and they were more willing to listen than we were, too!).

I wonder how much of this transfers to other modes of popular culture. Remakes of movies and play revivals hearken back to their originals; but that’s not new: Shakespeare and Euripides have been performed for centuries. What is different is the continuing availability of the originals. The first two tries at Dune are best left unwatched; the 1951 version of A Christmas Carol is still the best. The remakes say more about the profitability models of media companies than about an aesthetic judgement of earlier versions (see also King Kong). Still, the continued availability of the earlier works makes it feasible for multiple generations to share the same cultural experience.

Overall, it seems that the presence of electronic media has fundamentally changed the nature of the inter-generational transfer of culture. Given the rapidity of technological change, it’s likely that the 20C phenomenon noted here will morph considerably on the rising tide of games, VR/AR, mash-ups, and other experimental/experiential genres. It will be interesting to see if grunge bands or rap artists carry the same longevity twenty to thirty years hence or if the current phenomenon was/is a “one-off.”

But for now, we can see that the nature of culture has changed. The past is still available in ways that weren’t previously possible. The ties between generations are different. Perhaps we have found one way to bridge the famous “gap”?


The Meaning of 50

5/27/2022

Next month, I will be going back to the Detroit area (where I grew up) to participate in my 50th high school reunion. I was given the extraordinary opportunity to go to the best private school in the state during a period of remarkable change in US society (1968-72) with a group of students who knew, even then, that we were a unique group.

While I have stayed connected as an alumnus (as part of the school’s official program, to our class in particular, and, in terms of personal connections, to some of my classmates), this anniversary of our graduation has opened a new perspective, leading me to realize, in a personal and profound way, the path of my life, the nature of memory, and the meaning of history. There are those who dismiss reunions and their accompanying memories as either nostalgia, a place for mental indolence and rampant historical revisionism, or a celebration of contingency, exalting the chance meeting of people for a few years out of an extended lifetime, a period which carries no more than its proportional weight (<10%). All are certainly risks, but they seem worth taking for the benefit of reflection (and not a few beers and laughs).

All sorts of influences burst into the consciousness of the era—political, aesthetic, technological, and herbal—on top of the usual strains of the hormonally-defined world of adolescence. It all seems impossibly distant now, but it’s equally impossible to know how our experience of this fifty-year distance squares with that of our grandparents’ generation or that of our grandchildren. I suspect that the pace of change has accelerated across the 20C; so, while we have seen more change than our forebears, we are more used to change and acceleration, which makes it, in a way, less disorienting; and the same is likely true for millennials compared to us.

Some members of our class recently had a Zoom call with some current graduating Seniors from the school. It was great to reconnect with their energy and sense of opportunity. It also struck me how many of the great cultural changes that were so urgent in our era are embedded and ordinary now. We “matured” in an era when “sex, drugs, and rock-n-roll” were not only our bywords, but were fairly new cultural descriptors. The cultural power of us “Boomers” has carried much of that forward. We saw (and often helped make) cracks in the social rigidities of gender, race, and religion that have, in the ordinary slow pace of historical development, brought us to the choices and confusions of the 21C.

One benefit of accumulated experience (a nine-syllable euphemism for “age”) is that we lived it and know it. Of course, we should always be wary of conflating memory with history. One of the interesting aspects of talking with people with whom I shared a lot of experiences back “in the day” will be to see which of my memories are corroborated by others and which events and interactions I have stored so far back in my internal filing cabinet (another incipient anachronism) that their rediscovery will be revelatory (and disorienting in their own way). Perhaps I will get to see what aspects of the persona I have constructed over this half-century are truly rooted in those years or since or what was just a mask. Perhaps I will see what aspects of the “Steve” I created at 16 were real or useful or costly. Perhaps it’s finally time to put down some of the baggage I decided was so crucial to pick up back then. If I am fortunate, perhaps some of my group will—with candor and kindness—tell me how I appeared to them.

Anniversaries are always an opportunity to look back; to press “pause” on the day-to-day stories of our lives and try to comprehend the big chunks of change in ourselves and our world. (Back then, we “pressed ‘pause’” on the coolest tech of the day: an 8-track cartridge, rather than just asking Alexa to do it.) We get to try to see what of all the things we were excited about (then, and since)—cars, sports, creativity, relationships, moon landings, the War, college choices—really mattered when assessed over the course of a lifetime (so far). We redraw a line from there to here. It lets us see the turns not taken; the flukes, the choices, the plans and the surprises: it’s a life.

For all the advantages I had (socio-economic and genetic), did I actually make better choices? How did I use this launch pad to benefit my life and maybe even to craft it? Was my learning curve towards what I currently conceive of as “wisdom” any steeper? Were the regrets over which I anguished worth the stress?

The other end of this particular rainbow is today. Of our class of ~180 (boys and girls (in separate schools when we started)), we have lost 11 (that we know of). Their memory will hang over us, but likely less for what I remember of them than as an insistent reminder that our 60th reunion will see a lot more of us gone. Can I relish our time with this shadow on the one hand and, on the other, an appreciation that a 90%+ survival rate into our late 60s would have seemed remarkable to our grandparents’ era? And, given the decade or two (on average) remaining to me, what will I do with this combination of memories and inspiration?

As we updated the contact information for our group, I was initially startled to see a lot more addresses in Florida, North Carolina, and southern California than I remembered when I was keeping active tabs on our class. Then it struck me: “geez, we’re ‘retiring’ now.” After all, we have already spent most of our lives. Fifty years on: careers increasingly completed, families raised and dispersed, so much to digest, so little to grasp onto.

As a historian, I know that it takes some attention to separate the signal from the noise; to construct a narrative and choose what is significant and what is the interval. There is one story that emphasizes the past fifty years and consigns the earlier time and the time remaining to the sidelines. At the same time, from another perspective, I can see the fifty years since “back then” (even if briefly interrupted by prior reunions with some of the group) as a gap across which hormonally-imprinted youth and the immediacy of age seem all the more real.

The musical we produced our Senior year was “West Side Story,” the current revival of which provides one of many inevitable historical ironies of any pair of dates. We had an extraordinarily talented group of singers, dancers, and producers. It gave us part of our lives’ soundtrack, along with Motown, the Moody Blues, Zeppelin, and Don McLean’s “American Pie” (top of the charts for 1972!).

The show concludes with a reprise of “Somewhere,” insisting that, for Tony and Maria, “there’s a time and place for us.” For our group, which I like to think was extraordinary (even if only because it was ours), we had an overlapping life in a particular and rare “time and place” for which I am immensely grateful. This group…this time…this place… helped launch me on a particular trajectory (skewed with gifts and baggage). This regathering, fifty years on, will help me take stock of that trajectory and help me plan the rest.

Denial

5/20/2022

If you get caught in flagrante delicto, the late great comedian Lenny Bruce has some advice for you: “Deny it. Flat out - deny it! If you really love your wife, deny it. If they got pictures, deny it. … If they walk in on you, deny it. Just say this strange chick came into the apartment shivering with a sign around her neck that said, ‘I have malaria. Lie on top of me and keep me physically active or I'll die.’” (from the movie Lenny (1974)).

We all deny stuff (I know my own list is … robust). Since the days of the broken window and pointing at my little brother, we have done so most of our lives.  As Lenny Bruce implied, even if the facts are clear, there’s some small chance you might get away with it. Why? Because, as Bruce said: “They want to believe it!” Or, as Jack Nicholson’s character said in A Few Good Men: “You can’t handle the truth!”

We all (or at least some part of virtually all of us) want to believe a nice story. A simple understanding of the world seems vastly preferable to the stresses of dealing with its complexities; and “truth” takes a back seat to sanity. Sometimes, of course, there’s no self-deception involved; denial is a cynical/dishonest ploy to avoid blame/responsibility (“Tobacco doesn’t cause cancer” worked for some folks for a while). But, sometimes we do it because we can’t tolerate living in a world in which the (denied) fact is true.

This explains a lot about climate deniers and Covid deniers. A world in which the world (i.e., nature) is actually running the show is scary. Things were easier when just about everybody believed in God. All the weird stuff and problems could be written off to Him and were psychologically manageable via faith in His goodness or His plan/providence from which we would all (sooner or later) benefit.

Science, however, has shrunk the scope of God’s domain. He’s only around the fringes now and faith is harder to come by and seems to have less to do with how the world works than it used to.

Left to our own devices (so to speak) we fabricate coherence.

I’d like to think that this inability to cope is behind some of the well-known phenomenon of Holocaust denial. Certainly there were those who were excessive apologists for Nazi Germany. Certainly there were those who sought the fame/notoriety of controversy. Certainly there were those who had plenty of reason to distrust conventional and governmental information and then ran a bit amok. But, some folks couldn’t handle the truth of man’s inhumanity to man (or, more particularly, the evil of their own country/people/allies). Their weltanschauung (“worldview”) was shattered.

As I have pointed out in previous postings, there are a lot of folks here in the US and elsewhere whose weltanschauung has been pretty well hammered, and so things that don’t fit are labelled “fake news.” There are a bunch of folks who “can’t imagine” that our democracy is at risk / Russia will invade / Japan would attack the US Fleet in Pearl Harbor / Britain would leave the EU / … (you get the idea).

The rantings and machinations of the “Stop the Steal” gang following Biden’s victory in November 2020 are a textbook example. Trump couldn’t contemplate a world in which he lost; so he created one in which he didn’t. Millions followed (still follow) this delusion. Perhaps some will wake up and admit to temporary insanity; or they will just hope this incident fades into history and they won’t be asked to take a stance on the question. But, I suspect, too many drank too much Kool-Aid and will never recover. At this stage, it’s hard to imagine that Rudy Giuliani was a respected/feared US prosecutor and (not entirely terrible) Mayor of NYC. What’s left is a sorry knock-off of Batman’s arch-foe “the Penguin” who got suckered into self-parody by Borat.

Whether the imaginings are recent or more dated, evidence and rational analysis don’t do much to offset them; they don’t work so well when dealing with the most ancient parts of the human brain. Those in “fight or flight” mode don’t stop to read statistical tables.

It’s an interesting question as to whether this psychostress is uniquely or even particularly a “modern” phenomenon. I suspect that core bio-psychological human capabilities have been placed, over the past few centuries, in an environment of far more complexity and rapid change than for most of our first 70,000+/- years. The bling of electronic living has not helped, nor have the hormone-stimulating activities of the media and advertising industries. Indeed, it’s ironic that the same drivers of rationalistic modernity (the “Scientific Revolution” and the Enlightenment) have also led to these anti-rationalist pressures, and many brains can’t stand the strain.

Regardless of its historical origins, however, denial remains an apparently useful tool for many. Lenny Bruce would be proud.

Revival of the Fittest

5/6/2022

The vast majority of folks I know are deeply concerned about the likely path of humanity’s interaction with our planet. There are, to be sure, plenty of issues to be worried about, both immediate and long-term, and they are sufficiently well-known not to require rehearsal here.

Each of these folks carries some combination of despair and doggedness (and still a bit of legacy enjoyment of current creature comforts). Moods fluctuate: water is saved, birds are counted, even while eyes/ears glaze over at the news reports and webinars detailing the latest dire report or development. Amid this, I have noticed a streak of resignation in which the expectation of some kind of slow-motion-train-crash is relieved by a sense that we (of a certain age) will be “gone” by then and won’t see the worst parts of it. Even the well-off and (otherwise) pretty sophisticated blithely seem to assume that their progeny will be spared through some sort of “gated-community” salvation.

Of course, there’s no telling how far down the path of global distress our species will take us. Again, I won’t parse through the various dystopias and scenarios that have been sketched out. Suffice it to say that there is a significant chance that civilization will crumble and some successor will have to be rebuilt. (I will posit for this purpose that it will be by humans, not cockroaches or dolphins.) This scenario is a playground for utopians, with soaring opportunities for harmonious relationships between peoples, genders, and the rest of nature.

I won’t dive into that normative debate (i.e., what kind of world would we want?), nor the related predictive question of what kind of world is likely. There are plenty of current political views out there already which will serve (equally well) as the basis for both desiderata and prognostication. Instead, I’d like to pose some other questions: 1) Should we tell them how we got here and, if so, what should we say? and 2) How would we go about sending such a message into the future?

Regular readers of this blog have heard me warn of the perils of divining and applying the “lessons of history.” Nonetheless, the demand for such apparent comforts as a coherent human history with “actionable” lessons remains strong. Whether future historians/anthropologists/archeologists are curious or future politicians are looking for someone to blame, there’s no reason to think that this won’t be a fortiori true for the post-enviro-calamity world.

Indeed, three renowned SciFi books each wrestle with how the past survives in such a future. Isaac Asimov’s Foundation trilogy (1950s) takes a “hard science” perspective on a galactic-scale renaissance. Walter Miller’s A Canticle for Leibowitz (1959) goes down a more religious path. More recently, Neal Stephenson’s Seveneves (2015) includes a group who survive a (non-manmade) apocalypse by preserving the Encyclopedia Britannica in memorizable bites, one of whose members is named for her portion of scripture: “Sonar-TaxLaw.”

What should we leave behind? Should we start with the “Great Books” series from the mid 20C which sought to capture the finest thinking of human history (albeit with a strong White/European/Male bent)? Even a more diverse bibliography of ideas and literature might well be indigestible without some concordance/guidance/framework.

How might we account for the state of the planet and our civilization? Who will write the histories of how we got here? What of the stories of empires, genders, wars, ideas, demographics, technology, and everyday life would be worth preserving? The possibilities are endless, and the arguments among historians would be too (as if no “hard stop” were imminent!). Will we (as we are doing in the more substantive vector of actual climate change prevention) talk and hypothesize, or will someone put pen to paper (to speak in 19C metaphors)?

But then, what use is philosophy or literature (or history, for that matter) in a world that is rebuilding itself from remnants? The great Encyclopédie of Diderot and D’Alembert in the mid-18C addressed not only ideas but practicalities. It included hundreds of pictures of ordinary machinery because it sought to enable people to change how they lived, not just how they thought. Perhaps we should commission hundreds of “how-to” manuals, ranging from irrigation and simple pumps to solar panels (and also how to make the materials out of which all of this is built)? If so, how far down the technological road should we go before we are (implicitly) urging our successor societies to replicate our own (problematic) path? We might include all of it and let them decide. (But what critical histories of technology and society would we include in the package?)

Of course, it’s not at all clear who would make all these decisions. UNESCO? A committee of Nobel Laureates? The Texas School Book Commission? I do know some historians; maybe I should ask them? More likely, it would be a small group of smart folks chosen by whoever raised the money to launch such an endeavor.

This brings us to the last stop on this hypothetical inquiry: Once you have the “stuff” chosen, how do you preserve it—for several hundred or a thousand years—until some group comes along who can handle this compendium of knowledge/wisdom? There are serious technical problems around data preservation and compression into a manageable size. What language(s) should be used? How do you design an educational path (including languages, math, sciences) so that people could (progressively) comprehend this material? Where do you store it so it’s both safe and discoverable?

I’ll stop with the questions now. It’s an interesting thought experiment. But I can’t help but think that if we leave a mess, we should help clean it up—somehow (and apologize, too!).


Social Darwinism

4/29/2022

"It’s not likely that Charles Darwin had any idea that his novel understanding of the range of life forms on this planet (1859) would work such a profound change in popular epistemology, much less mutate and spawn a new framework for looking at human societies which would be called “social Darwinism.” Indeed, he died in  1882 and the phrase didn’t gain much currency until well into the 20C, even if its essential concepts were developed by Herbert Spencer and other European thinkers late in the 19C.

The gist of the idea starts with Darwin’s theory that species compete for resources and those that best adapt to their environment (via mutation, procreation, and expansion) will fare better than those who don’t fit. “Social Darwinism” then applies this model to human societies (tribes/races/nations) rather than biological species. Spencer’s phrase, “the survival of the fittest,” captures both the original and the adapted theories.

There are a bunch of problems with this conceptual sleight-of-hand, but two stand out. First, groups of people are not different species. Even “races” are more of a social construct than the difference, e.g., between a two-toed and a three-toed sloth. In other words, differences and divergences within our species have remained just that: within our species. In the (more-or-less) 10,000 years since humans started settlements, societies, and agriculture, we haven’t had enough genetic time to change very much. And between migration and interbreeding (sexual and cultural), the differences between “nations” are both recent and transient: they just don’t have much meaning. Second, and most significantly, actual (natural) Darwinism acts without consciousness, and the adverse effects of evolution on the ‘losing’ species carry no moral weight. In contrast, “social Darwinism” invokes a conscious decision by a society to act in its own interests, knowing that other humans will suffer. In other words, if the essence of humanity is consciousness and moral judgment, then “social Darwinism” is a negation of that humanness.

Still, during its heyday, this approach to life and international relations gained a lot of support, and it still makes its appearance via a nationalistic perspective that seeks to control/condemn other groups/nations/races/species. What is significant about “Social Darwinism” is not that countries all of a sudden started to see themselves in competition with each other, but that the spread of “scientific” thinking in Europe in the 19C led some elites to invoke Darwin’s ideas as justification for long-standing aggressiveness and animosity.

Another aspect of his ideas that Darwin (likely) didn’t foresee was the establishment (1993) of the “Darwin Awards” (https://darwinawards.com) as a forum to commemorate “those who improve our gene pool—by removing themselves from it in the most spectacular way possible.” The site contains some remarkable and often amusing stories of individual human stupidity.

I think it’s time to develop a comparable award for countries and leaders who, either through bull-headedness, ego, or a desire to be memorialized for Gotterdammerung-like behavior, put themselves in no-win situations, often leading to the demise of their country, regime, or economy.

Of course, Mr. Putin’s foray into Ukraine is the leading candidate from current affairs. There are many scenarios in which this retro-imperial revival could lead to a fundamental change in the structure of Russia (although there are also plenty of scenarios in which not much happens). We’ll have to check back in a year or two and see what eventuates.

World War II was, from the perspectives of both Germany and Japan, a long-shot attempt to revise the international order. In each case, the economic power of the aggressor was measurably less than that of the countries it attacked. In each case, questions were raised internally (albeit not too loudly) about the country’s ability to succeed. In each case, ideology and ego (including a good dose of “Social Darwinism”) trumped (so to speak) common sense and economic analysis. In each case, the aggressor was crushed, and its government and society were reconstructed following the model of the victors. Both seem like good candidates for the “Social Darwin” Awards.

Similar cases can be made for the German Empire, the Austro-Hungarian Empire, and (in its own way) the Russian Empire in the context of 1914. Each volunteered; each went down in flames. Napoleon, too, killed his Empire and many thousands of his men by marching all the way to Moscow and coming up empty-handed. Three years later, he was stuck on a tiny speck in the middle of the South Atlantic and the Bourbons had retaken the throne. Of course, the French monarchy had virtually bankrupted itself by supporting the upstart Americans revolting against the British. It was so fixated on its perennial foe that it forgot to check its bank account. Six years after American independence was finally won, the French monarchy went down in flames.

There are undoubtedly many more examples we could draw upon. (I personally would go with Kaiser Wilhelm and the Germans of 1914).

One of the interesting things about the whole “Social Darwinism” thing is the intellectual dexterity of its adherents. Should we, based on the examples given, declare the ineradicable inferiority of the German “race” (or the other “losers”)? All sorts of excuses can be (and were) made (blaming the Jews was always popular). The leader who led his country over the cliff is often blamed, but not the country that followed him. There are few patriots ready to stand up, acknowledge their country’s stupidity, and suggest that it should be fully dissolved or taken over by another country/culture.

All of which just illustrates that those who advocated “Social Darwinism” not only didn’t know much about evolutionary theory or basic sociology, but didn’t actually believe it either.

In the Shadow of History

4/22/2022

As any magician or advertising producer can tell you, we humans are easily distracted by bright, shiny objects (rather like our late cat, Samantha, chasing after a laser pointer). Putting us in front of adrenalin or other brain chemistry-stimulating activities is pretty likely to suck up our attention. Fighting, chasing, and melodrama all fulfill this role in our popular culture and entertainment media.

The same is true for the far more sober-seeming practice of history. We pay attention to the big, bright, shiny events and personalities far out of proportion to their effect on the world and pay little attention to the dull stuff, however significant it might actually be. This is not to say that, e.g., the French Revolution or World War I (from a European history perspective) or the US Civil War was not important, but each gets thousands of books devoted to it, not to mention any number of movies, operas, etc.

However, they do tend to crowd out other developments, particularly those in close proximity. As a result, we tend to lose these fainter stars in our historical firmament.

This is part of the reason I wrote a set of world history lessons called “1905.” It connects seemingly disparate events and developments of that year: the Russo-Japanese War, the (first) Russian Revolution, the British partition of Bengal province in India, the British Parliament’s refusal to vote on women’s suffrage, and Einstein’s incredible four papers that revolutionized modern physics. No, they’re not as dramatic as 1914 and the start of WWI, but they get lost in what I call the shadow of history.

Indeed, 1914 itself provides a fine example. Not knowing that their world would plunge into war in August, Europeans early that year were going about their business.

This is why I like to spend time in my relevant European history courses talking about the summer of 1914. The assassination of the Austrian Archduke, the diplomatic ‘to-ings-and-fro-ings,’ the downward spiral into a war of surprising length and destructiveness tend to push a bunch of other significant developments to the sidelines. Part of a historian’s joy in explicating the complexity of the past comes from the fact that these “secondary” developments don’t get the attention they deserve (and they give us some really good stories, too).

Just as in 2022 (when the Ukraine war pushed COVID off the headlines), so, too, did the assassination of Archduke Ferdinand in Sarajevo on June 28 pull attention away from the “top stories” of the spring of 1914.

In Paris, everyone was enraptured by (their version of) the “trial of the century.” In March, Henriette Caillaux, the wife of a former Prime Minister, had shot the editor of a leading Paris newspaper who was blackmailing her husband. The murder trial started in July; it raised a host of political and legal issues, threatened exposure of the blackmail material, and provided a focal point for French society’s dealing with the “new” woman. At the end of July, Caillaux was acquitted on the grounds of women’s excitability.

Meanwhile, in London, the political leadership was dealing with the perennial problem of Ireland. While a “home rule” proposal was being debated in Parliament, Ulster Protestants took up arms against the British government plan. In March, 1914, rather than actively suppress their countrymen, portions of the British Army threatened to resign (the “Curragh Mutiny”). The resulting turmoil brought the resignation of the Minister for War and several senior generals, sparked a high-profile political debate, and undermined the chain of command within the British Army and its morale generally—a great politico-military crisis only a few months before Britain was to start its bloodiest campaign ever.

It's hard to say what is “normal” when a major dramatic event comes crashing through everyone’s everyday lives. Things that appeared ordinary at the time look strange in retrospect. Four days before the Archduke was shot, the Royal Navy made its annual friendly visit to the German Imperial Navy base in Kiel, with the Kaiser in attendance, while “German and British bluejackets made merry ashore.” Even after the assassination, on July 18, the German Fleet announced that it would make its traditional return visit to the Royal Navy base in Portsmouth. The visit, planned for August 8, never happened.

In July, the Kaiser went on his normal summer cruise off the Norwegian coast. Radomir Putnik, the Commander of the Serbian Army, still went to Hungary to “take the waters” and was there when the Austro-Hungarians declared war on Serbia (in a demonstration of bygone chivalry, the Austrian Emperor Franz Joseph allowed him to return to his command!).

Every historical event has comparable stories. The larger the event, the larger the shadow cast by our focus on the grand developments of the day. This phenomenon is one way in which we always look at the (relatively) distant past through the lens of more recent events. Curragh and Caillaux would have been the featured facets of a history of 1914 that stopped on June 27. If the Nazis hadn’t invaded Poland in September, 1939, we might have remembered that summer for the premiere of The Wizard of Oz and the inaugural telecast of a baseball game the previous week.

When historians of the next century look back on the last two years, which issue will get top billing: the defeat of Trump, the pandemic, or Putin’s invasion of Ukraine? We can easily spin all sorts of scenarios in which each one frames a decisive moment in world history. Of course, we don’t know which it will be, but the other two risk falling into the shadows.


    Condemned to Repeat It --
    Musings on history, society, and the world.

    I don't actually agree with Santayana's famous quote, but this is my version of it: "Anyone who hears Santayana's quote is condemned to repeat it."
