I am always amused when watching some video/movie to see a brief claim of verisimilitude: “this story was inspired by actual events.” As if. As if every piece of fiction wasn’t inspired by actual events and people and authors’/directors’ experiences. How are we supposed to parse the difference between facts and pretense? At least these media creations have the courtesy to alert us that they’re not actually claiming to be true. “Fiction” announcing its fictionality is a nice counterpart to news/non-fiction claiming some adherence to reality. There are exceptions, of course: Orson Welles’ “War of the Worlds” radio broadcast of 1938 famously incited panic among listeners who missed the disclaimer at the start of the show. It all seems so old-fashioned now. The difference between a memoir and a first novel is less its degree of accuracy than a function of which editorial department the publisher assigned to the manuscript when it walked in the door.
In the 21C, “deepfakes” are among us. The use of sophisticated audio and video editing software has spawned a slew of faux interviews, statements, and cultural performances. “Compromising” pictures of Senator X with some semi-clad bed-partner, hijacked images or voices of some “A-list” star being sold for real bucks. A Presidential statement embracing a North Korean dictator (Gee, sorry, that actually happened!). Notable figures and ordinary folks will be subjected to this misappropriation of their likenesses and voices. It will get worse before it gets better, but “reality” will fight back in several ways.

Lying, of course, isn’t new. Ever since Cain denied knowing what had happened to Abel (“Am I my brother’s keeper?”), dissimulation has been a standard human practice. In WWII, Churchill (defending the use of misinformation about Allied war plans) said: “In wartime, truth is so precious that she should always be attended by a bodyguard of lies.” Indeed, there is a long history of military deception. In the political world, misquoting or taking statements out of context also has a robust past. The rise of photography and electronic media (the original stuff: radio and TV) held the promise of verifiability as to what people said and did; after all, we had it “on tape.” But now, we’re all used to computer-generated graphics as the background in many movies and the mix of live (“real”) and fake characters (“Who Framed Roger Rabbit,” a film mash-up of live and cartoon characters, dates from 1988). Technology has now advanced to the point that photos, recordings, and videos can all easily (with a modest amount of technical savvy) be wholly created. Now, everything is up for grabs; the lines between fact and fiction have been rubbed out and we’re left a bit bewildered. Most of the time (with things that proclaim or at least signal their “fictionality”) it doesn’t matter, but deepfakes present the risk of undermining our confidence in understanding what is “real” in a new and problematic way.

One of the complications arises because of the highly accelerated speed of social media. Items are re-“Xed” (formerly retweeted) or forwarded in chat or email in a moment, almost always with an implicit endorsement (at least of interest if not veracity). Retractions, corrections, clarifications, etc. aren’t so newsworthy and don’t get passed along, so the initial “news” stays in our social media consciousness far more than any digested assessment of the situation. Did Biden really say that he loves China? Did Putin announce Russian withdrawal from Ukraine? Did some prosecutor’s announcement of an investigation get transmogrified into an indictment? Apparent slurs (racial, ethnic, personal) by media personalities are easy to produce. We might as well expect a remastered statement by Walter Cronkite (the personification of trustworthiness) announcing that the moon landing was a fraud. In each case, the damage is quickly done and never fully remedied.

A second complication comes from the broad undermining of truth in modern society. Propaganda isn’t new, but the degree to which notable people make false statements or statements without caring whether they’re speaking the truth has become far too ordinary. When we don’t care (as a society) if we’re hearing the truth, then we shouldn’t be surprised when we get all manner of gobbledygook. In a sense, the “supply” of truth is a function of the “demand” for it; it’s basic economics.
Great efforts are being made to train young people in “media literacy” and “critical thinking,” but there is an awfully long way to go in this direction. Another problem (less dire) arises from deepfake videos that undermine artists’ performances. With a little work, I can look like Olivier as Hamlet or do a cover of Taylor Swift’s “Cruel Summer” (ouch!). I can even create “influencers” that tout my dramatic or musical work.

But back to the “news” and the remnants of civic culture. It’s ironic that after social media has eviscerated the traditional vehicles of journalism, we now need someone to verify who actually said what. It won’t be enough merely to report on a video of Mike Pence endorsing a chain of marijuana dispensaries; tomorrow’s journalist will need the technical chops to evaluate the provenance of the digital file. Of course, this verification function will likely face its own fakers: a deepfake of Tucker Carlson undermining a deepfaked statement by He-Who-Shall-Not-Be-Named. The possibilities are endless and will make Alice in Wonderland seem downright homey. Alternatively, the crescendo of falsity may build to such a point that no one knows or cares what is published and completely tunes out. Or maybe some folks will only subscribe to media outlets/channels/streams with some verifiable reliability; sort of like TRUSTe for certain websites or the Good Housekeeping Seal of Approval. Unfortunately, too many folks already listen only to those channels that push their pabulum of choice, and much of modern journalism has already been eviscerated, so that the “reliable” choices are few; as are the number of listeners/watchers who summon enough attention and focus to watch/listen critically. If Jack Webb were still asking for “Just the facts, Ma’am” on Dragnet, he would have his work cut out for him.

The acceleration of the pace of change in modern society has been recognized for over 150 years. Whether, as a broad matter, this has been an improvement in the human condition is certainly debatable. In our own era, this acceleration has been particularly noticeable in the area of communications and information flows. Again, while there are many benefits, there are also profound and less visible costs to be paid.
In particular, the process of democratic deliberation and decision-making has been disrupted by a combination of technologies and distorting information flows, often abetted/created by mass media; a process that was underway well before the advent of the internet, and which has been shifted into overdrive by technological capabilities driven by market forces with only glancing consideration of the fabric and aspirational values of our society. I have taken plenty of shots at the media generally in previous postings, so I want to target one particular angle here: public opinion polling. Thus, not “Vox Populi” (the “voice of the people”), a traditional formulation of the basis of democratic culture, but “Vox Polluli,” a Latin-abusing neologism for looking to the polls as the basis of democratic culture.

Modern polling dates from the 1930s (the Literary Digest’s massive straw poll famously picked Alf Landon over FDR in 1936, while George Gallup’s new “scientific” sample called the election correctly), connected to the rise of modern advertising/marketing/consumerism of the early 20C. Today it’s a whole little industry of its own, with academic studies, dozens of polling organizations, and extensive media coverage of policy and political issues. Technology has advanced from “please return the postcard with your opinions” to live, real-time assessments of Presidential debates and speeches. “Public Opinion” (as apparently discovered and authoritatively articulated by such polling) is regularly reported on and seems to be relied upon as a basis for public policy decision-making by elected officials.

There are several problems with this: First, polls are simplistic and life is complicated. Generic expressions of broad philosophical principles are of little use in diagnosing problems or the real-world crafting of policy. Second, few members of the public spend much time understanding even the first-level specifics of tax policy or education expenditures, much less the extensive complexities that each area entails. In a world of eight-second sound bites, the thought of more than one percent of the population taking half an hour to understand the mechanisms of trade relations with China is, to be mild, highly speculative. Third, poll responses are often/mostly driven by ‘feelings,’ not facts. Presidential approval (or disapproval) ratings, for example, are usually more a function of economic sentiment and psychological security than an assessment of what the “Leader of the Free World” du jour is actually doing or is capable of doing. Indeed, there is a good argument that pollees (i.e., the people being polled) more-or-less consciously use polls for this purpose (i.e., as a “venting” mechanism rather than as a substantive expression of preferences for policy or candidates). Fourth, things change—events, negotiations, compromises all happen too fast for most of us to keep up with.

All of these are, in effect, arguments for intelligent representative government, with policy decisions made by folks who are chosen to spend the bulk of their time sorting through options and coming to conclusions about desired outcomes. In other words, to whatever extent direct democracy might have worked in Athens 2400 years ago, or in a New Hampshire Town Meeting today, it’s wholly inadequate for the modern world and groups of more than a few thousand. This is the same rationale for avoiding plebiscites on policy (e.g., referenda and public votes on detailed legislative initiatives).
Our current polling culture short-circuits the process of democratic representation by providing instantaneous answers which are then supposed to “guide” policy makers. Bad questions, bad answers, bad information; even if we had good legislators/officials, what could go wrong…? Of course, the media is less concerned with quality policy than with “news,” even if it’s meaningless; the “blah-blah” mouthings of innumerable candidates about how we must “trust the judgment of the American People” notwithstanding. The media’s counterpart to the public opinion survey (wearing its coat of statistical validation) is the apparently non-scientific spectrum-coverage article, which takes quotes and views from a full range of opinion. It’s of no more value than “some people like green and others prefer pink,” but it does enable the news outlet to assure the public that it is listening and presenting everybody’s point of view, without apparent bias or spin (or value). The upshot of this aspect of our political culture is to reinforce the corrupting influence of money by ensuring that those elected can claim to be “representing the people” by listening to the polls, rather than doing their more fundamental job of leadership and public education: explaining why they (and not their PACs) have taken the stances that they have, why the complexities and compromises inherent in any democratic political system have worked in practice, and why simplistic thinking doesn’t help anyone. On top of this is the obsession with speed and “breaking news,” best exemplified in the reporting on “exit polls” so that the apparent winner of an election can be designated a full 12 hours or so ahead of when results might otherwise be available. More media filler, more non-news; Pavlov would be proud.

As with most of modernity, there’s not much use in seeking to put the polling genie back in the bottle. It would be great if there were fewer and slower results. I’m not sure what public purpose is served by pre-election information on where the candidates stand, nor by post-election information on what “the people” “think.” Is it too much to hope that media outlets stop feeding the adrenalin junkies and give due (i.e., less) attention to such matters? I suspect it is. Every country, as Joseph de Maistre said (1811), “has the government it deserves.” We have the media and political culture we deserve, alas.

A recent piece in the NYT reported that a combination of astronomical, geological, and atmospheric developments will render the planet uninhabitable by mammals—in about 250 million years. Ah well, it’s been fun….
Looking further out, scientists have predicted the burning out of the Sun (~7.6 billion years) and, of course, the ultimate heat death of the universe in 100,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 years (more or less). On the more immediate side, the familiar apocalyptic scenarios (nuclear war, climate change, massive pandemic, and Trumpism) are regularly cited as likely causes of the end of humanity. Some have predicted that the Boston Red Sox or Chicago Cubs winning the World Series was a sure sign of the end of days; others view the Detroit Lions as an NFL contender in the same light. More religious types have looked to the “Rapture,” or other modes of the Second Coming of Jesus. Other deep cultural traditions have their own versions of the end.

We—both as individuals and societies—seem to have some difficulty in comprehending such big numbers and long periods. As a species (maybe as with any living thing) we’re highly focused on the “here-and-now.” At one level there’s a correlation between survival and attention to immediate threats, so this primacy of presentism makes sense. Yet, I can’t help but wonder if one of the benefits (purposes?) of a larger brain and human consciousness is the ability to think ahead. After we’re pretty sure no sabre-toothed tiger is nearby and that we have enough food safely stored for a few days, we can extend our perspectives. Modern folks increasingly have such confidence in short-to-medium-term survivability and can afford to commit an increasing portion of our attention to longer-term issues: saving for college or a home, planning for climate change, or retirement. As one moves up the ranks in military or commercial organizations, more time is spent thinking about “the strategic,” or the “long-term,” leaving the tactical and day-to-day to those lower down the organizational ladder.

Historically speaking, “modern” folks seem to have a different sense of time than pre-moderns. This is due to several reasons. First, our awareness of time is a function of our awareness of change. Traditional societies faced considerably less change than we have seen in the last 250 years. As a result, the “future” has become meaningful as a concept since it has become increasingly apparent as something different from the past and present. Second, modern societies have (generally speaking) increasingly mastered short-term survivability and can spare some bandwidth for a longer-term future. Third, the emergence of modern historical practice has made us aware of the length, complexity, and change of the past and opened up the prospect of the reciprocal: the future. Fourth, our longer individual life expectancy means that, as compared with a few hundred years ago, the prospects and conditions of our (potential) lives multiple decades hence actually could have some meaning (not so if your life expectancy is only 45). Finally, 19C science—especially geology and evolution—has forced us to come to terms with the vastness of time. Charles Lyell’s Principles of Geology (1830) showed that the planet was millions of years old (not a few thousand, as clerics had inferred from Scripture).
This created a chronological space for Darwin and his theory of evolution, showing that the emergence of species could actually occur, one genetic modification at a time, since there were now enough generations to accommodate the development of human variability, for example; something that would not have been feasible, without divine direction, in the few thousand years since Adam. Modern rationality has also insisted on counting and specificity. So traditional stories of end times—of whatever culture and nature—can’t be so open-ended: floating out there as something that would happen…sometime. Mythos doesn’t fit well—stylistically—with spreadsheets. This has led to scientifically-grounded projections which are far more bounded, even if not precise to the last detail (we’ve only got 7.6B years with the Sun, not 8B). So, not only has our time horizon expanded, but the mentality with which we contemplate what lies ahead has changed too.

The acceleration of change in the 20C—whether in terms of technology, geopolitics, or culture—has brought with it (among other things) an expectation of further accelerating change. This makes the future inherently—and consciously—different from the present and increases the interest in what’s coming next. When we add in a dollop of an Enlightenment-stimulated sense of human power and control, it’s no wonder that the 20C saw a burgeoning of “futures studies,” scenarios, and efforts to at least conceive of potential future vectors of development: possibilities which could be planned for. Planning connected the present with the future. In contrast, ancient and traditional modes of envisioning the future—the Second Coming, the Kali Yuga, the Mayan Long Calendar—all existed “out there” somewhere in the indeterminate future; not in the present, and not really according to any calendar that people could comprehend. Railroads and naval fleets, on the other hand, required plans—with schedules and budgets. With the incremental advance of technologies as a model, the arrival of the future could be projected as emerging, piece by piece, out of the present. It became immanent (of and in the world), no longer transcendent (dropping out of the skies without much human agency).

Much the same can be said of avowedly fictive futures. Utopias from Plato (4C B.C.) to More (16C) existed away from reality; indeed, that was their point. Modern “science fiction” wrestled with societies based on the present-plus; Verne and Wells being the pioneers here. The combination produced an extension of the culture of the present, so that the realm of the imaginable, and of the implementable, grew from merely the present to years and decades ahead. Our world today is filled not just with the present, but with this future: extrapolations of current trends, either through literary imaginations or statistical models. The premodern world didn’t conceive of itself in this way. Its future was preconscious, dominated by the here-and-now. We are willing to contemplate a span of years ahead as something that is integral to who we are now, something which we have some chance of steering, even if we never actually know what will happen.

The problem with most of the rhetoric uttered in times of stress (and these days is there much else?) is that it’s good for rousing people, exercising their adrenals and other brain chemistries, and flinging them into action for some cause or another. Outrage, insult, doom: we must all push hard against these incipient evils.
On the other hand, it’s not good for governing, solving problems, or living together.
My title today, “Ma non troppo,” is an Italian musical term typically affixed to the composer’s direction to the player as to tempo, or how briskly or languorously the piece is to be played; as in “allegro, ma non troppo” or “lively, but not too much.” It’s a delightful phrase with useful application far beyond the recital hall: telling me not to get carried away; to be focused on my target, but to remain conscious of my context at the same time.

There is much to be said for capitalism, socialism, individualism, cohesive group identification, social justice, rule of law, democracy, governmental effectiveness, national security, individual rights, promoting moral standards at home and abroad, fiscal rectitude, self-defense, respect for authority, a sense of aspiration, incrementalism, liberty, equality, fraternity, a responsibility for the future, a responsibility to the past, human rights, communal responsibilities, faith, science, basic quality of life, environmentalism and, indeed, hope [did I leave anything out?]. All are good, but “ma non troppo.”

I’ve found that it’s a good practice when in a confrontational situation to try to construct a plausible rationale and to identify the omissions/blind spots for each side: landlords and tenants, Palestinians and Israelis, advocates for a universal basic income and advocates for lower taxes, those who want to choose gender identities different from traditional appearances and those who have embedded decades of habit in reacting to others by those appearances, etc., etc. I’ve found it’s a good practice not to presume malicious or insulting intent. Not that there isn’t often reason for such a belief, but to presume it without assessment doesn’t generally get me where I want to go. Indeed, I suspect that well over ninety-five percent of what’s bad in the world is due to negligence, loss of attention and (especially) incompetence; evil and malice are pretty rare. I’ve found that binary thinking, simplistic categorization, painting people and ideas as either black or white—period—is usually laziness, arrogance, blindness, or anger on my part. I’ve found that being a victim of some crime or evil doesn’t make a person incapable of criminal or other evil actions, and that to merely recite their victimhood as a justification rather than assessing their own actions is disingenuous.

One of the downsides of despotism/authoritarianism is that such regimes’ insecurity/arrogance usually means that they can’t tolerate consideration of alternatives or constraints or balance. Lenin found this out in the early 1920s when, despite the then-new triumph of Marxist doctrine, it was necessary to carve out market-oriented exceptions if people were to be fed. Mao didn’t learn the lesson, and millions starved in China around 1960. Unbridled [fill-in-the-blank with any of the items from the list above] rarely works. This is mostly due to the inherent distance from theory to practice and the complexities of having lots of people with different views and priorities living together. Liberty and initiative have brought many benefits to the modern world, but we read every day about the excesses of 21C oligarchs/billionaires who throw lavish parties while millions starve. Each is its own mini, privatized version of a self-serving authoritarian regime. Socialism for the public good is noble, too; but it is also subject to corruption, arrogance, and bureaucracy. One of my favorite examples has to do with the level of taxation on the rich.
Any attempt to raise funds for public benefits is met with the pained cries of those who insist that a heavier tax burden will suppress investment and initiative, that entrepreneurs will be deterred because they won’t be able to make as much money, and that society will suffer the loss of innovation and competition. Yet few entrepreneurs I know or know of would work less hard due to a higher tax rate. They’re motivated by their own ideas, their own energy, and their own drive for recognition and success. They “keep score” with money, to be sure; but if a steeper tax bill meant that all their competitors also ended up with a bit lower net worth, the rankings would still be the same. So, capitalism…sure, but ma non troppo.

Self-defense is another example. The doctrine that a person’s home is their “castle,” defensible with weapons, is a plausible theory of criminal defense. Pushing that idea out into the streets via the “stand your ground” theory might be seen as an incremental extension. But it runs into other people’s liberty and security. So, let’s dial it back a bit; let’s not push things to (past?) their logical limits. Let’s leave the last ten percent of every idea off the table. Abortion/women’s rights, capitalism/socialism, free speech, the mare’s nest of the Middle East, US/China, etc., etc. In a Supreme Court case (whose name I can’t recall) on the question of due process under the 14th Amendment, Justice Benjamin Cardozo described the decision point as “implicit in the concept of ordered liberty.” It’s not a bad phrase, even if it’s overwhelmingly ambiguous (more a signal of the difficulty of balancing principles than a useful predictor of what the Constitution allowed). Order is good; so is liberty. They often (usually? always?) clash. Ma non troppo is more elegant.

I like to think that I’m in control of myself. I rather take pride in my rationality and ability to solve problems; it’s pretty central to my self-image. So, I don’t know if two recent incidents constitute a karmic telegram to stop kidding myself (Remember telegrams? The last one (physical—not karmic) was delivered about ten years ago, apparently).
Of course, any sense of control is an illusion, and often a dangerous one. The ability to “go wild” seems to have all manner of positive psychological and physical benefits (at least in doses and with some limits), as evidenced by popular dance music for centuries. Alcohol, tobacco, and other drugs are much to the same end. Regardless, the illusion has provided me with no small sense of self-satisfaction, even if part of me can also acknowledge the costs. And beyond satisfaction, a sense of security, both situational and ethical. So, on to recent history….

Incident #1: Last month, I was doing some yard work (man-of-the-land that I am!) when I apparently disturbed a ground nest of yellowjackets (wasps), which swarmed me instinctively. Before my “normal,” control-predilected self was aware of this, my amygdalic brain started flailing my arms—foolishly, I later learned—and propelling my legs away quite rapidly. A few seconds later (real time; or an extended period as it seemed in the moment), I was in the house with—mercifully—only four stings on my hands and wrists. By the time I had dashed to the computer to do an internet look-up for appropriate remedies, grabbed appropriate creams, and dunked my hands into ice water, I caught my breath and realized that my flailing had left my glasses out in the yard at the spot of the initial onslaught. For the next several hours, I felt drained physically. Mentally, I didn’t feel scared (I did retrieve my glasses), but a touch wary and with a definite preference for “hunkering down.” I spent some time observing myself. I guess I don’t fire off the brain chemicals and short-circuit my normal, well-processed thought processes very often. In fact, I can’t remember the last time I reacted as instantaneously/intensely. As a result, it was strange to recognize the guy who moved through this situation in this way. I don’t regret acting the way I did; not that “I” had much control over what I did. So, both in the moment and in the aftermath, some quite apparent demonstrations of Steve not being “in control.”

Incident #2: Almost a week later, my wife started to feel increasingly bad: fatigue, aches, respiratory inflammation. We had, for three and a half years, avoided being caught by the COVID bug, but our days of innocence were gone. I followed about two days later. Fortunately, for both epidemiological and pharmacological reasons, we only had a few days of being miserable and have both more-or-less returned to normal health. Nonetheless, my two-ish days of moderate misery: spaciness, comprehensive body aches, a bad sore throat, and occasional chills/fever, were, for me, remarkable. I’ve been quite fortunate to have avoided acute illness over my life. Other than a couple of out-patient procedures, a light-to-moderate set of cold/flu infections, and an increasing prevalence of age-appropriate chronic physical conditions, I have been pretty healthy. COVID presented in me in a manner similar to colds/flu, but more severe. Since I’ve had colds/flu since I was a kid, at one level it wasn’t remarkable. And yet…even though the chances of severe complications were small, it was different. It was new. Or, perhaps I just looked at it (i.e., me with “it”) differently. I was regularly aware of struggling to clear my head, to wake up from my (more frequent) sleeping, and deciding (repeatedly) that I didn’t have to or want to.
When sitting at my desk, I was “just fine” to sit there vacantly and not do much (if any) work (once I had emailed my students with the revised class schedules for the week). I didn’t have a chronic condition, but I could see that I could very easily feel the same way indefinitely. I got to wondering whether, at some point—for any number of reasons or conditions—such limited acuity and attention (…and self-control) might become my “new normal,” a possibly terminal, if indefinite, state. What if that reduced sense of connection with the world (my characteristic interests in ideas and affairs, my role in managing my life) was “as good as it got”? Perhaps I would mind, perhaps I would be upset with my new smaller world; but perhaps that’s just the current me standing up now when—by definition—that Steve wouldn’t be present anymore. I’m not sure how to characterize how I feel about such a prospect. Not “scared,” certainly not “resigned to it;” aware, as I say, that any idea of such a future is more projection than prediction. It is all well-and-good to declaim: “Rage, rage against the dying of the light.” But that presumes a certain level of synaptic firing and energy to spark such rage. A noble dream, but not everyone’s reality. So, to return to where I started, this mild-to-moderate COVID bout gave me a second taste (and a hint of a third) of not being in control of myself in the way I am used to thinking. One due to a hyped-up system, the other due to a spaced-out processor. I take from these two (+) situations an appreciation of how much I rely on my constructed sense of myself, the fragility of that control, and a question of whether to lean on it as much as I have. Or, as T.S. Eliot asked (and the Allman Brothers affirmed): “Do I dare to eat a peach?”

There are many bizarre and ironic aspects of the war which Russia unleashed on Ukraine over a year and a half ago; Orwellian tropes abound. Not least on this score are the statements of the latter-day tsar Putin, criticizing Ukraine for its “terrorist” attacks on Russia. From someone who is responsible for launching the overall military invasion, as well as unprovoked attacks on Ukrainian civilians, this is a bit much. After all, the Ukrainians are doing nothing by way of basic tactics that the Russians have not done themselves. So, beyond his actual war crimes, Putin is also guilty of (gasp!) hypocrisy and propaganda-mongering.
If we try to find something solid in the history and concept of terrorism, and not let Putin (or Bush/Cheney for that matter) pluck a word and twist it into a scare tactic du jour, we have to turn to the chaos of the great French Revolution of the 18C. “Terrorist” was a derogatory term applied to those exercising wanton physical force. Robespierre invoked terror to justify the power of his Committee of Public Safety (the de facto government of 1793-4) and its efforts to dominate counter-revolutionaries and the populace generally during the so-called “Reign of Terror.” The term gained wider use in the 19C to describe those non-state actors (often anarchists) who promoted violence, disorder, and revolution in various parts of Europe. Beginning in about the 1970s, the latest round of terrorism flourished, and it is still with us. Sometimes it was motivated by nationalism, sometimes by separatism, sometimes by civilizational/religious anxieties, and sometimes it was merely a violent expression of animosity towards whatever regime/government/elite was in power in a particular area. We can recall the spate of hijackings and bombings from that era, and the notable attack by Palestinian militants on Israeli Olympic athletes in Munich over fifty years ago. The most recent wave, of course, was marked by 9/11 (twenty-two years ago) and the resulting “War on Terror” that brought US invasions of Iraq and Afghanistan, as well as a variety of deployments and actions by US forces and our allies in a wide range of situations ever since. The campaigns of “shock and awe” that we launched certainly generated a commensurate amount of terror among innocents as well as any actual enemies, but since we were fighting “terrorism,” any impact on them was only collateral damage.

On the domestic front, there is a long line of actions, stemming back to the 18C, in which coercion, destruction, and fear were deployed against civilian targets. Examples include the Puerto Rican separatists attacking the Capitol (1954), Timothy McVeigh’s bomb in Oklahoma City (1995), the Boston Marathon bombing (2013), and MAGA-ites on January 6, 2021. If such an action was aimed at or justified by a critique of the incumbent power structure (i.e., the Government, Big Business, etc.), then it was characterized as terrorism. In contrast, in foreign and official “war” contexts, comparable tactics and effects were described as ordinary and inherent in the nature of the conflict. This distinction echoes the 19C origins. Terrorism is seen as a special kind of violence because it is 1) aimed at the state and the established order of society and 2) caused by someone other than another country’s military (which is called “war”). The recent attacks in Israel by Hamas might fall somewhere in between. Since the purpose of the state, as I have noted elsewhere, is to ensure public order on behalf of society as a whole, the purpose of terrorism is public disruption and the undermining of public faith in order. In other words, beyond the immediate destruction/death, the goal of “terrorists” is to raise the specter of societal collapse, anarchy, and chaos; i.e., the creation of a special kind of fear. The existence of terrorism is thus a product of the democratization of “civil” power structures and of the extensive distribution of coercive (“military”) power.
Attacks on pre-democratic power structures (monarchies/aristocracies/oligarchies) weren’t intended to undermine the public sense of order and security, since there wasn’t really a “public,” and few cared what the hoi polloi thought. By the same token, the widespread distribution of coercive physical power (i.e., effective firearms or easily manageable explosives) is a predicate to the spread of terrorism, since the threat of disruption from a spear or sword or even pre-19C firearms couldn’t cause such widespread fear, and the deployment of more powerful devices was effectively limited to states or organized insurrectionist organizations. Weber said that the modern state was characterized by its claim to a monopoly of legitimate violence. Yet the spread of firearms and explosives has posed an essential challenge to that claim. All of this lends a certain perversity to those who claim an absolute right to violence as the premise of the 2d Amendment. They are, in effect, saying that the Constitution guarantees the right to violently resist the state (i.e., those exercising constitutional authority). They want to be terrorists. Never mind that the folks back in 1787 thought they were getting rid of arbitrary government (and, for the most part, successfully). There are other problems with the broad reading of the 2d Amendment, too; they’re not relevant here. So, terrorism is a label, much abused in both international and domestic discourse. Its hallmark is not “terror” (i.e., deep fear) per se, nor violence against civilians; rather it goes to a certain conception of society and public order. It’s too often bandied about as a ready-made license to kill/terrorize in retaliation/prevention; but as long as it’s “our guys” doing it, it is, apparently, OK.

Back in the late 20C, there was a pretty active and engaging intellectual movement called “post-modernism,” which held (and this is hard to attribute, because rarely has there been a more diffuse and anti-coherent “body” of thought) that society’s sense of truth was idealized, particularly as a result of the rational, integrative thinking of the scientific revolution/Enlightenment (the shorthand for which is “modernity”). Instead, these folks argued, “truth” was merely a construct—the connection of a few dots of information out of millions into particular patterns (much like the constellations). The structure of these connections was more a reflection of the perceptions and personalities of those constructors than of any underlying reality. In other words, the Big Dipper could as readily be characterized as depicting a bear (major or minor) or a ladle or, for that matter, a zigzaggy graph showing the history of post-war inflation rates in Bulgaria.
Post-modernism retains some presence in the arts (think Gehry’s Bilbao Museum), with its rejection of linearity, and on the fringes of academic discourse (where it remains an ongoing cautionary tale of skepticism about objectivity). It was always too obscure for the mainstream. Which is another way of saying that the social construction of “truth” remained resilient enough to shunt post-modernism off to the side. At least for a while. As I have noted elsewhere, cultural change always takes a while and, as with other modes of change, doesn’t usually move in straight lines. The echoes of post-modernism continue to reverberate. The erudition and obscure theory are gone. But the talk is of “competing narratives” and alternate realities. In popular culture, particularly in the past 10-15 years, the demolition of specific truths, standards of proof, and the underlying premise of logical analysis itself has been recognized and much commented on. Some even argue that post-modernism was the source of the post-truth trope in our modern politics, but I think that’s mostly intellectuals wanting to feel that they are more culturally influential than they actually are. Our current Immediate Past President has proved a master of this demolition (although, like his real estate empire, he is much less adroit at construction on the now-empty lot). Social media, to be sure, has been an accelerant of this process, although previously established modes of media were already moving in this direction. The signal-to-noise ratio in the public square has gone down radically, principally due to the increased noise (semi-automatic retweets, ad-driven hyping of popular fizziness, and way too many Instagrams of dinner plates), accompanied by the slow-motion collapse of the mechanisms (e.g., newspapers) by which coherent and substantiated (aka “conventional”) stories were generated and circulated.

That such developments would undermine the democratic process is not surprising, but the reason is not obvious. Lies, slander, and distraction have been central parts of the political process since even before that process became “popular” and “democratic” (pretty much starting in the 19C). Just think of Julius Caesar, or the European monarchical courts of the early modern era, or even our own furious to-dos between Federalists and Republicans in the early United States. However, two of the premises of democracy are 1) a shared community and 2) a shared epistemology. The former cannot stand in the absence of the latter, and the latter cannot stand without a coherent sense of truth. In other words, democracy is as much a part of modernity as industrialization, urbanization, and a sense of “progress.” So, just as “post-modernity” attacked, in effect, the coherence/confidence (arrogance?) of modernity, it couldn’t but have a follow-on effect on democracy too. Indeed, democracy is especially susceptible. Since the time of Aristotle, the central problem of democracy has been the risk of mobocracy. Plato argued that we should entrust government only to those who were well-educated and well-trained in morality. Exogenous stresses (as social scientists like to say) have us careening down these parlous paths. Close analysis and rational thinking too easily go by the boards. An epistemic standard centuries in the construction would not, it would seem, be discarded lightly. But underneath the tradition of Socrates, Galileo, and Voltaire lies an equally robust stream of superstition and nescience; of astrology and absurdism and surrealism.
Witches burn, and tulips (not to mention bitcoins) get sold for more than houses. In the midst of modernity, many political leaders have offered alternate histories and futures, sufficiently attractive to motivate millions of supporters. So Putin, Xi, Orban, Boris Johnson, Bolsonaro, Modi, Khamenei, and dozens of others of our era are not really new. The particulars of their motivations and speeches are far less important than the desire of many for a reality which seems manageable and energizing. We like to think that truth is the foundation of how we see the world; that science-based analysis gives us the confidence to deal with the world. But we have the order wrong. The hunger for epistemological confidence and psychic security “trumps” our traditional mainstream mindset. A narrative which provides comfort is fundamental; more fundamental than the construction of a narrative with roots in logic and experience. When modernity offers uncertainty and disruption, fear drives us to construct a “reality” which soothes. After all, if you look closely, it’s not a ladle or a bear up there; it’s a bunch of stars which appear, from our particular location in the galaxy, to line up in some pattern or shape we strain to recognize. Science has told us as much for hundreds of years, and still we all know under what “sign” we were each born.

I recently heard from my department chair (in the nicest way possible under the circumstances) that it’s pretty unlikely that I will be teaching history at SF State next term. Nothing is set in concrete and scheduling flukes do happen, but the bureaucratic/budgetary processes are often inexorable.
It’s not (so it seems) me and my teaching; rather the decision (calling it a decision makes it much more personal than I feel it actually is) is based on several broad and intersecting trends. The first is short-term demographics: the number of adolescents has been going down for several years, which means that there is too much collegiate capacity, and state funding is tied to the number of enrolled students, with clear consequences for those hired to teach. Second, the cost-benefit analysis for students as they determine whether to spend time and money on college is skewing slightly against the traditional clear-cut preference for college attendance. Third, there is a marked decline in the study of the humanities, about which I have previously spoken. History, in particular, is attracting fewer majors and fewer casual students. Again, lower demand justifies lower supply (of teachers). Fourth, SF State presents an especially challenging economic proposition to students, on top of the increasingly dubious economic proposition of a college education generally. The Bay Area is extraordinarily expensive, even for the minimal food/housing expectations of public university students. Our campus enrollments reflect this too. Finally, I am a lecturer, which means I sit at the lower end of the pecking order in terms of job security. I had been moving up the list for the past several years, but now there’s hardly anyone left below me. Apparently, all of us at this work status in the department are being offed; University-wide, it’s reportedly about 300 teachers.

It’s too bad for the students on several levels, and not just because I believe that I have brought something distinctively valuable to my efforts to engage with the young people I have had in class. As is common in large, lumbering organizations facing cost reductions, the cuts are made crudely and broadly. It would take a much more limber organization than a public university—with its unimaginative management style, bound by union and other rules—to redefine itself on the fly. Few private companies, profit-driven amid the “creative destruction” of capitalism, can pull it off. The CSU System has feet of clay. This is true at the System level, at the level of SF State, and in terms of the History Department. As one example, the elimination of lecturers (who are shockingly underpaid compared to regular tenured faculty) saves relatively little. The smart move—economically—would be to cut a few senior profs and keep more lecturers (equally capable as teachers). A second approach (even more radical!) would be to put students first. We would have to design a departmental faculty line-up to offer the courses most important to our majors and attractive to a wide range of students who come to us for some aspect of their “general education.” Instead, the cuts are made with little regard to curriculum and pedagogy. Thinking “outside the box” is too sensible and creative a process to expect from a faculty that is reeling from the pandemic and uninspiring working conditions, and which has no particular capability for innovation and zero-based institutional design.

On a personal level, it’s a big bummer. The health care benefits and (modest) income will be missed, to be sure; but I never went down this road for the money. Much more important to me is the loss of plugging into the energy of youth and the chance to share some perspectives (wisdom?) and gain some of theirs. The loss of intellectual engagement is also a problem.
Designing a course, preparing lectures and discussion plans and exercises were great learning vehicles for me, even before the students saw any of them. That intellectual engagement is, however, perhaps the most easily remedied, not least by my research and writing projects (several of which have appeared on these pages). I will likely teach adult classes some more. I will have to try to figure out a way to keep my full library privileges.

I started teaching at SF State in 2013 while I was working on my dissertation at UC Davis. I’ve been extremely lucky to have been able to do so. When I started down this history road in 2005, I had no reason to expect 1) to get into a Ph.D. program, 2) to get any sort of teaching gig thereafter, and 3) to get a gig that was only one bus ride away. So, don’t get me wrong: I am extremely grateful to the folks at SF State, especially the four department chairs who hired me. I was (am) drawn to history, both the research and teaching aspects. That is to say, I didn’t go in this direction to “re-pot” myself or have a “second career,” even though both have eventuated. As I talked to my peers (even back eighteen years ago, before I went down the History road), I was acutely aware of the perils of non-directional retirement. I have been able to dodge that bullet for a long time. The prospects of “re-potting” grow more challenging as one ages: internal defeatism and the social conventions of age discrimination, to name just the principal obstacles. What now? I don’t play golf. I haven’t played cards (bridge) with any focus in forty years. I have (as I noted recently) a coin collection from my childhood; but accumulation is not where my head is at. I like our house and I’m happy to do some work here, but I need to get out and mix it up with others at least several days a week (and I don’t mean the neighborhood pub). My next project is to figure this out. A few years ago, Gina asked me when I planned to retire from teaching. I responded: “Ask me again when I’m 75.” Well, I didn’t make it that far. The future, as they say, is now.

Well, there’s only about 400 days until the election—a seemingly meaningless mark on a seemingly perpetual calendar amid a seemingly meaningless process driven by a recursive media/political-operative frenzy. We’ve really done it to ourselves here.
And all this is without regard to the individuals involved, particularly He-Who-Shall-Not-Be-Named. His peculiarities and poisonous politics have merely aggravated the situation, not caused it. Nonetheless, his polarizing personality (I can’t really attribute anything to his “policies,” since he doesn’t seem to have many) and popularity have created a bizarre situation on the GOP side. All these wannabes who, by reason of their history, would plausibly be on the debate stage are either sitting it out or basically sniping a bit at each other while ignoring the 600 lb. orange-haired elephant in the room. They can’t attack him, lest they put off his base, which they need for both the nomination and a general election. Those that haven’t figured out a plausible way to do this are sitting on the sidelines (Pompeo, Youngkin, Cruz, Rubio), leaving the “field” to those who haven’t figured it out either but have convinced themselves that they had to try. Yet, unless HWSNBN falls, they have no hope. None have articulated their conundrum; they’re all waiting around, twiddling their thumbs, hoping for a heart attack (it’s doubtful that one or more convictions (all of which would be appealed until after the election) would change either his mind or the minds of his supporters). Then the myriad also-rans will scramble madly and start their campaigns. I’m sure they each have scenario books for the contingencies (especially the Veepables). All the rest is just waiting.

The situation is strangely parallel on the Demo side. Whatever might be said about Joe’s accomplishments (and they’ve been considerable), there’s not really any debate going on here. Most folks think he’s too old, but he’s got inertia which, combined with the fear of a disruptive nomination fight, has led all of the potential candidates to sit on the sidelines. No one is interested in doing anything to upset the applecart and increase the risk that HWSNBN gets re-elected. So, we have, effectively, two non-campaigns going on. What’s a poor attention-desperate media machine to do? They’re struggling hard to create the impression of meaning and action, but no one really cares. After all, there is still more than a year to go; many (most) minds are already made up (mostly out of fear of the other). It seems we’re all just waiting for the media to exhaust themselves. In terms of the election itself, then, barring something really cataclysmic that leads to a “rally ‘round the flag” scenario, we’re in a long-term holding pattern—sort of like circling around Pittsburgh because bad weather has backed up the landing patterns at JFK.

Unless,… Unless there is a medically-forced vacancy at the top of either ticket. The later it happens, the more exciting it would be. Immense chatter among the chattering classes. High-stakes wheeling-and-dealing, real polls, real debates…wow. If it doesn’t happen until spring, then previously-selected nominating convention delegates might actually have to make decisions. Or, perhaps, after a convention, a party National Committee would have to actually choose a candidate. The shock to the system would be great (not that I’m wishing ill health on anyone) and would force us (i.e., the body politic) to actually pay attention and perhaps even care about politics for a change.

After the Nazis blitzkrieged through Poland 84 years ago, there was an eight-month period when pretty much nothing happened in the European War.
Historians refer to this phase as the “phony war.” We’re pretty much in the same situation in our politics now, even though there has been no blitzkrieg either before or (likely) to follow. This strange politics also highlights the surreal nature of campaigning. The media (by which I mean not only the national print and broadcast press, but also the social media) still talk in terms of campaigning as if we were back in the early 20C, with whistle-stop tours and substantive policy positions. They call it “retail politics” and feature all sorts of amusing/strange local events around the country: corn-dog chomping in Iowa, maple syrup slurping in New Hampshire, unending barbeques. But in an age of Amazon vs. Walmart and electronic transmission of information, it’s not at all clear to me why we care and why we think this actually has an effect on national politics. I mean, how many folks actually go to such events? Their impact is far more driven by the media coverage of such events than by the live “retail” political customers. The same is true for the other staple of local politics: the “rally.” The percentage of attendees who haven’t already made up their minds (“been there, bought the tee shirt”) is tiny. The media coverage is all about snooping around for gaffes. In sum, the whole local angle of national politics is a charade. During the 2020 mid-pandemic election, candidates (mostly) stayed home and the quality of campaigning didn’t suffer; it was actually quite nice. One has to ask whether—other than the media hand-wringing over the loss of “retail politics”—candidates couldn’t usefully return to the 19C style of campaigning from their home’s front porch, rather than dashing around the country, dropping into 2…3…4 states each day for “appearances” in their pre-election frenzy. So, what if Governor X didn’t actually visit State Y during the campaign? Do a couple of 2-hour stops in, say, Tucson and Mesa, AZ really demonstrate local knowledge and engagement? And don’t even get me started on all the wasted money and its inevitable corruption. Again, I emphasize that most of these distortions and problems arise without considering the personalities and age of our two leaders. More signs of system crash. Time for a reboot.